
Controversial deepfake app DeepNude turned photos of women into nudes

THE ASIAN AGE.
Published : Jun 28, 2019, 9:43 am IST
Updated : Jun 28, 2019, 9:43 am IST

The app rose quickly and fell even faster.

The DeepNude app will no longer be sold, and future versions will not be released.

A programmer created an app called DeepNude which used AI to create fake nude images of women. Thankfully, the app, built on controversial deepfake technology, was shut down less than a day after it became popular. The team behind it stated that they had greatly underestimated interest in this passion project and that “the probability that people will misuse it is too high.”

The DeepNude app will no longer be sold, and future versions will not be released. The creators also warned against sharing the app online, saying it would violate DeepNude’s terms of service. However, they acknowledged that some copies would still circulate.

First reported by Motherboard, the app was available for both Windows and Linux. It used AI to generate fake nudes, and it worked only on photos of women. The free version of DeepNude placed a large watermark across the photo marking it as fake; the paid version carried the same message, but smaller and placed in a corner. Either watermark could easily be cropped out or removed. The app is said to have been on sale for the past few months, and the team behind it admitted that “honestly, the app is not that great” at creating nudes.

No matter what the developers said about its quality, the app gained significant traction and provoked widespread outrage over its capabilities. Creating fake nude images is nothing new, but DeepNude differed in that it produced them instantly and was available to anyone. The outrage is justified: such images are used to harass women, and deepfake technology has already been used to insert women’s likenesses into porn videos without consent. Once an image has spread online, there is little the victim can do.

Alberto, the creator of the app, said he felt that if he did not build it, someone else soon would. “The technology is ready (within everyone’s reach),” he said, adding that the DeepNude team would certainly quit the project if the app were being misused.

A report in The Verge states, “DeepNude’s team ends their message announcing the shutdown by saying “the world is not yet ready for DeepNude,” as though there will be some time in the future when the software can be used appropriately. But deepfakes will only become easier to make and harder to detect, and ultimately, knowing whether something is real or fake isn’t the problem. These apps allow people’s images to be quickly and easily misused, and for the time being, there are few if any protections in place to prevent that from happening.”

Tags: deepfake, deepnude, app, software, nudes