Google Shutters Deepfake Training
Google's Colaboratory computing platform will no longer let users create deepfakes, the company announced quietly. Launched in 2017, Colaboratory (also known as Colab) is a tool that lets people write and run Python code from their web browsers. The service is free and open to nearly everyone, and it gives users considerable computing power for data analysis, machine learning, and other demanding tasks in the spirit of academic research. That openness, however, has not always been used as intended.
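To give a sense of what Colab is built for, here is a tiny, self-contained Python snippet of the sort a notebook cell might run; the numbers are invented purely for illustration:

```python
# A toy example of the kind of Python workflow Colab hosts in the browser:
# take some measurements and compute summary statistics.
import statistics

measurements = [12.1, 11.8, 12.4, 12.0, 11.9, 12.3]
mean = statistics.mean(measurements)
stdev = statistics.stdev(measurements)
print(f"mean={mean:.2f}, stdev={stdev:.2f}")
```

In practice, of course, Colab's draw is the free GPU and TPU backends behind cells like this, which is exactly what makes it attractive for training large models.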
What Is a Deepfake?
If you're not already familiar with the term, "deepfakes" are hyper-realistic fabricated videos depicting events that never took place. Although they are occasionally made for well-intentioned laughs, they are frequently created to spread ideological disinformation or to make it appear that somebody featured in pornographic material they knew nothing about (i.e., nonconsensual pornography). Around the 2020 national elections, some people made deepfakes of politicians "saying" shocking things; others have used technologies like Colab's to make revenge pornography featuring former partners.
Sophisticated retouching techniques, combined with machine learning models that can be trained on Colab's powerful hardware, are used to create digital illusions like these. A face-swapping system built on face detection is typically involved, letting the operator make it appear as though the target is showing particular facial expressions or speaking particular words. When it comes to disinformation and pornography, the possibilities are nearly limitless, and they can be dangerous.
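As a purely illustrative sketch of the training scheme behind most face-swap deepfakes: one shared encoder learns features common to both identities, each identity gets its own decoder, and "swapping" means decoding identity A's features with identity B's decoder. Everything below is a hypothetical toy, with random arrays standing in for aligned face crops and single linear layers standing in for deep networks; it is not any real deepfake tool:

```python
import numpy as np

rng = np.random.default_rng(0)
D, H, N = 64, 16, 200             # "pixel" count, latent size, crops per identity

# Random arrays standing in for flattened, aligned face crops of two people.
faces_a = rng.normal(size=(D, N))
faces_b = rng.normal(size=(D, N))

# One shared encoder, one decoder per identity -- the classic face-swap layout,
# reduced here to linear layers so the sketch stays self-contained.
enc = rng.normal(scale=0.1, size=(H, D))
dec_a = rng.normal(scale=0.1, size=(D, H))
dec_b = rng.normal(scale=0.1, size=(D, H))

def train_step(X, enc, dec, lr=0.1):
    """One gradient step on the mean per-crop reconstruction error."""
    Z = enc @ X                   # encode all crops at once
    R = dec @ Z - X               # reconstruction residual
    loss = float(np.mean(np.sum(R ** 2, axis=0)))
    g_dec = (2.0 / X.shape[1]) * R @ Z.T
    g_enc = (2.0 / X.shape[1]) * dec.T @ R @ X.T
    return loss, enc - lr * g_enc, dec - lr * g_dec

losses = []
for _ in range(300):              # alternate identities, sharing the encoder
    la, enc, dec_a = train_step(faces_a, enc, dec_a)
    lb, enc, dec_b = train_step(faces_b, enc, dec_b)
    losses.append(la + lb)

# The "swap": encode a crop of identity A, decode it with B's decoder.
swapped = dec_b @ (enc @ faces_a[:, :1])
print(f"loss: {losses[0]:.1f} -> {losses[-1]:.1f}; swapped crop shape {swapped.shape}")
```

Real pipelines replace the linear layers with deep convolutional networks and add face detection and alignment stages, but the shared-encoder trick shown here is what makes the swap possible, and it is this training loop that benefits from Colab-class GPUs.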
With the enormous power Colab puts in users' hands comes the potential for abuse, and some users have evidently misused it. Google may have imposed the ban out of ethical considerations, to head off a problem before it grew; then again, given the high cost of the computing resources that deepfake training requires, the ban may have been motivated purely by economics. Whatever the reason, Google outlawed deepfake training in mid-May.
The company has also banned decryption, cryptocurrency mining, and torrenting on Colab, activities it sensibly does not want to be associated with. "You may be executing code that is disallowed, and this may restrict your ability to use Colab in the future," reads the warning shown to anyone who attempts them.
Media outlets note that deepfake enthusiasts often point newcomers to Colab as a place to hone their skills at producing highly detailed fake media. Google's prohibition could therefore have far-reaching consequences for the network of deepfake creators, whether or not particular users have malicious intent.

