![deepfake app nude deepfake app nude](https://cdn.vox-cdn.com/thumbor/8iQ_FZVJ99RI_ignonIbalCsI5k=/0x0:1932x1070/1200x800/filters:focal(812x381:1120x689)/cdn.vox-cdn.com/uploads/chorus_image/image/64139543/Screen_Shot_2019_06_27_at_11.05.16_AM.0.png)
Since Motherboard discovered deepfakes in late 2017, the media and politicians have focused on the danger they pose as a disinformation tool. But the most devastating use of deepfakes has always been against women: whether experimenting with the technology on images of women without their consent, or maliciously spreading nonconsensual porn on the internet. DeepNude is an evolution of that technology that is easier to use and faster than deepfakes, and it dispenses with the idea that this technology can be used for anything other than claiming ownership over women’s bodies.

“This is absolutely terrifying,” Katelyn Bowden, founder and CEO of revenge porn activism organization Badass, told Motherboard. “Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public.”

This is an “invasion of sexual privacy,” Danielle Citron, professor of law at the University of Maryland Carey School of Law, who recently testified to Congress about the deepfake threat, told Motherboard. “Yes, it isn’t your actual vagina, but… others think that they are seeing you naked,” she said. “As a deepfake victim said to me, it felt like thousands saw her naked; she felt her body wasn’t her own anymore.”

DeepNude launched on June 23 as a website that shows a sample of how the software works, along with downloadable Windows and Linux applications.

That accessibility and ease of use are what make this Telegram bot so dangerous. With so many people using social media – a veritable goldmine of source images for users of this kind of software – we need to evaluate exactly how important our social media platforms really are to us, with more stringent regulation leveled at this sort of content and significantly more done to catch and punish the creators and users of these sites.

Women are being exploited and harassed daily via various online platforms. Innocent women are having their lives ruined by all manner of online abuse, and it can’t be allowed to continue. The longer we normalize this kind of objectification of women, the longer this sinister side will persist in a space originally intended for everyone.
![deepfake app nude deepfake app nude](https://aniportalimages.s3.amazonaws.com/media/details/app_jun27.jpg)
In countries whose social practices are more conservative, we could potentially see women subject to further harassment due to these deepfake images, particularly if they were to end up in public hands. Thanks to these images, women could lose employment or be subject to domestic violence from a jealous partner. They could potentially be used to coerce a woman into doing something that she otherwise wouldn’t.
![deepfake app nude deepfake app nude](https://www.insidehook.com/wp-content/uploads/2020/10/telegram-app.jpg)
Sensity, an Amsterdam-based intelligence company, discovered this week that the messaging app Telegram is being used to exploit women. The startling report, Deepfake bots on Telegram, states that victims’ photographs are being uploaded to a bot that creates deepfake nude images. According to BuzzFeed News, just under 700,000 women had been made targets by people using the bot. Most of the women targeted are not celebrities, as you might think, but private individuals who have absolutely no idea that their images are being used in such a manner.

This is the first time that Sensity has come across anything like this bot on Telegram’s platform. Rather than being a space in which people share explicit images (which we also do not condone here at KnowTechie), the bot has actually been embedded in the site. The company discovered a Telegram network of 101,080 users associated with the bot, and around 70,750 of these profiles seem to be based in Russia and Eastern Europe. They use just one image of the victim to create deepfake content, rather than needing thousands of images. This makes the risk of exploitation and harassment greater, simply because it is much easier to achieve. According to the report, around 105,000 images of women have been uploaded to the site, and some of the victims appear to be underage.

We have seen a lot of discourse around deepfakes of late, but most of it features a political figure such as Donald Trump or Vladimir Putin as the subject. Now we see deepfakes starting to encroach upon the lives of thousands of innocent women. This opens up a whole new consequence of deepfake technology, and if that isn’t disastrous enough already, it can only become more so in the future.