Deepfake pornography

Deepfake pornography, or simply fake pornography, is a type of synthetic pornography created by altering existing pornographic material, applying deepfake technology to replace the faces of the actors. The use of deepfake pornography has sparked controversy because it involves making and sharing realistic videos featuring non-consenting individuals, typically female celebrities, and it is sometimes used for revenge porn. Efforts are being made to combat these ethical concerns through legislation and technology-based solutions.

History

The term "deepfake'' was coined in 2017 on a Reddit forum where users shared altered pornographic videos created using machine learning algorithms. It is a combination of the word “deep learning”, which refers to the program used to create the videos, and “fake” meaning the videos are not real.[1]

Deepfake porn was originally created on a small, individual scale using a combination of machine learning algorithms, computer vision techniques, and AI software. The process began with gathering a large amount of source material (both images and videos) of a person's face, then using a deep learning model to train a generative adversarial network (GAN) to create a fake video that convincingly swaps the face from the source material onto the body of a porn performer. However, the production process has evolved significantly since 2018, with the advent of several public apps that have largely automated the process.[2]
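At the heart of this process is adversarial training: a generator network learns to produce images that a discriminator network cannot tell apart from real ones. The following is a minimal PyTorch sketch of that training loop only; the network sizes, image dimensions, and random stand-in data are illustrative assumptions and do not reproduce the pipeline of any actual application.

```python
# Minimal GAN training loop (PyTorch). All shapes and data are
# illustrative assumptions; real pipelines train on large face datasets
# with specialized encoder/decoder architectures.
import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # flattened 64x64 RGB image (assumed size)
Z = 100            # latent noise dimension (assumed)

generator = nn.Sequential(
    nn.Linear(Z, 512), nn.ReLU(),
    nn.Linear(512, IMG), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(IMG, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 1),  # raw logit: real vs. generated
)

loss_fn = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

for step in range(200):
    real = torch.rand(32, IMG) * 2 - 1  # stand-in for real face crops
    fake = generator(torch.randn(32, Z))

    # Discriminator step: push real images toward label 1, fakes toward 0.
    d_loss = (loss_fn(discriminator(real), torch.ones(32, 1))
              + loss_fn(discriminator(fake.detach()), torch.zeros(32, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator step: try to make the discriminator label fakes as real.
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```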

DeepNude

In June 2019, a downloadable Windows and Linux application called DeepNude was released which used a GAN to remove clothing from images of women. The app had both a free and a paid version, the paid version costing $50.[3] On June 27, the creators removed the application and refunded consumers, although various copies of the app, both free and paid, continue to exist.[4] An open-source version of the program called "open-deepnude" was deleted from GitHub.[5] The open-source version had the advantage that it could be trained on a larger dataset of nude images to increase the accuracy of the resulting nude image.[6]

Deepfake Telegram Bot

In July 2019, a deepfake bot service was launched on the messaging app Telegram that uses AI technology to create nude images of women. The service is free and has a user-friendly interface, enabling users to submit photos and receive manipulated nude images within minutes. The service is connected to seven Telegram channels, including the main channel that hosts the bot, technical support, and image-sharing channels. While the total number of users is unknown, the main channel has over 45,000 members. As of July 2020, it is estimated that approximately 24,000 manipulated images had been shared across the image-sharing channels.[7]

Notable cases

Deepfake technology has been used to create non-consensual pornographic images and videos of famous women. One of the earliest examples occurred in 2017, when a deepfake pornographic video of Gal Gadot was created by a Reddit user and quickly spread online. Since then, there have been numerous instances of similar deepfake content targeting other female celebrities, such as Emma Watson, Natalie Portman, and Scarlett Johansson.[8] Johansson spoke publicly on the issue in December 2018, condemning the practice but declining to pursue legal action because she views the harassment as inevitable.[9]

Rana Ayyub

In 2018, Rana Ayyub, an Indian investigative journalist, was the target of an online hate campaign stemming from her condemnation of the Indian government, specifically her speaking out against the rape of an eight-year-old Kashmiri girl. Ayyub was bombarded with rape and death threats, and a doctored pornographic video of her was circulated online.[10] In a Huffington Post article, Ayyub discussed the long-lasting psychological and social effects the experience has had on her, explaining that she continues to struggle with her mental health and that the images and videos resurface whenever she takes on a high-profile case.[11]

Twitch streamer controversy

In 2023, Twitch streamer Atrioc stirred controversy when he accidentally revealed deepfake pornographic material featuring female Twitch streamers during a live stream. He has since admitted to paying for AI-generated pornography and apologized to the women and to his fans.[12][13]

Ethical considerations

Deepfake CSAM

Deepfake technology has made the creation of child sexual abuse material (CSAM), often also referred to as child pornography, faster, safer, and easier than it has ever been. Deepfakes can be used to produce new CSAM from already existing material or to create CSAM of children who have not been subjected to sexual abuse. Deepfake CSAM can nonetheless have real and direct implications for children, including defamation, grooming, extortion, and bullying.[14]

Combatting deepfake pornography

Technical approach

Deepfake detection has become an increasingly important area of research as fake videos and images have proliferated. One promising approach is the use of convolutional neural networks (CNNs), which have shown high accuracy in distinguishing between real and fake images. One CNN-based algorithm developed specifically for deepfake detection is DeepRhythm, which has demonstrated an accuracy score of 0.98. The algorithm uses a pre-trained CNN to extract features from facial regions of interest, then applies a novel attention mechanism to identify discrepancies between the original and manipulated images. While the development of more sophisticated deepfake technology presents ongoing challenges to detection efforts, high-accuracy algorithms like DeepRhythm offer a promising tool for identifying and mitigating the spread of harmful deepfakes.[15]
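The specifics of DeepRhythm are beyond this summary, but the general shape of a CNN-based detector can be sketched briefly. The hypothetical PyTorch example below defines a small convolutional classifier that maps a face crop to a single real-versus-fake probability; the layer sizes and random input are assumptions for illustration only and do not reproduce DeepRhythm's actual architecture.

```python
# Generic CNN real-vs-fake classifier sketch (PyTorch). Illustrative
# only: an actual detector would be trained on labeled real/fake data.
import torch
import torch.nn as nn

class FakeDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.AdaptiveAvgPool2d(1),  # global average over spatial dims
        )
        self.head = nn.Linear(64, 1)  # single logit; sigmoid gives P(fake)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = FakeDetector()
face_crop = torch.rand(1, 3, 64, 64)             # stand-in for a face crop
p_fake = torch.sigmoid(model(face_crop)).item()  # ~0.5 while untrained
print(f"Estimated probability the image is fake: {p_fake:.2f}")
```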

Aside from detection models, video-authentication tools are also available to the public. In 2019, Deepware launched the first publicly available detection tool, which allowed users to easily scan and detect deepfake videos. Similarly, in 2020 Microsoft released a free and user-friendly video authenticator. Users upload a suspected video or input a link, and receive a confidence score assessing the level of manipulation.
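Neither tool's internals are public, but such authenticators typically aggregate per-frame predictions into a single video-level confidence score. The sketch below illustrates that aggregation step in Python; the score_frame function is a hypothetical stand-in for a trained per-frame classifier.

```python
import random

def score_frame(frame_index: int) -> float:
    # Hypothetical stand-in: a real authenticator would run a trained
    # detection model on the decoded frame and return its manipulation
    # probability in [0, 1].
    return random.random()

scores = [score_frame(i) for i in range(300)]  # e.g. 10 s of video at 30 fps
confidence = sum(scores) / len(scores)         # simple mean aggregation (assumed)
print(f"Estimated manipulation confidence: {confidence:.0%}")
```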

Legal approaches

As of 2023, there is a lack of legislation that specifically addresses deepfake pornography. Instead, the harm caused by its creation and distribution is being addressed by the courts through existing criminal and civil laws.

The most common legal recourse for victims of deepfake pornography is pursuing a claim of "revenge porn", because the images are non-consensual and intimate in nature. The legal consequences of revenge porn vary from country to country.[16] For instance, in Canada the penalty for publishing non-consensual intimate images is up to five years in prison,[17] whereas in Malta it is a fine of up to €5,000.[18]

The "Deepfake Accountability Act" was introduced in the United States Congress in 2019. It aimed to make the production and distribution of digitally altered visual media not disclosed as such a criminal offense. Under the act, anyone who produces sexual, non-consensual altered media with the intent of humiliating or otherwise harming the participants may be fined, imprisoned for up to five years, or both. However, the act has yet to be passed into law.[19]

Controlling the distribution

Several major online platforms have taken steps to ban deepfake pornography. As of 2018, Gfycat, Reddit, Twitter, Discord, and Pornhub have all prohibited the uploading and sharing of deepfake pornographic content on their platforms.[20][21] In September of that same year, Google also added "involuntary synthetic pornographic imagery" to its ban list, allowing individuals to request the removal of such content from search results.[22] However, while Pornhub has taken a stance against non-consensual content, searching for "deepfake" on its website still yields results, and the site continues to run ads for deepfake websites and content.[23]

References

  1. Gaur, Loveleen; Arora, Gursimar Kaur (2022-07-27), DeepFakes, New York: CRC Press, pp. 91–98, doi:10.1201/9781003231493-7, ISBN 978-1-003-23149-3, retrieved 2023-04-20
  2. Azmoodeh, Amin; Dehghantanha, Ali (2022). "Deep Fake Detection, Deterrence and Response: Challenges and Opportunities". arXiv.
  3. Cole, Samantha; Maiberg, Emanuel; Koebler, Jason (26 June 2019). "This Horrifying App Undresses a Photo of Any Woman with a Single Click". Vice. Retrieved 2 July 2019.
  4. Vincent, James (3 July 2019). "DeepNude AI copies easily accessible online". The Verge. Retrieved 11 August 2023.
  5. Cox, Joseph (July 9, 2019). "GitHub Removed Open Source Versions of DeepNude". Vice Media.
  6. Redmon, Jennifer (July 7, 2019). "DeepNude- the AI that 'Undresses' Women- is Back. What Now?". Cisco. Archived from the original on March 1, 2023. Retrieved March 11, 2023.
  7. Hao, Karen (2020-10-20). "A deepfake bot is being used to "undress" underage girls". MIT Technology Review. Retrieved 2023-04-20.
  8. Roettgers, Janko (2018-02-21). "Porn Producers Offer to Help Hollywood Take Down Deepfake Videos". Variety. Retrieved 2023-04-20.
  9. Harwell, Drew (2018-12-31). "Scarlett Johansson on fake AI-generated sex videos: 'Nothing can stop someone from cutting and pasting my image'". The Washington Post. ISSN 0190-8286. Retrieved 2023-04-20.
  10. Maddocks, Sophie (2020-06-04). "'A Deepfake Porn Plot Intended to Silence Me': exploring continuities between pornographic and 'political' deep fakes". Porn Studies. 7 (4): 415–423. doi:10.1080/23268743.2020.1757499. ISSN 2326-8743. S2CID 219910130.
  11. Ayyub, Rana (2018-11-21). "I Was The Victim Of A Deepfake Porn Plot Intended To Silence Me". HuffPost UK. Retrieved 2023-04-20.
  12. Middleton, Amber (2023-02-10). "A Twitch streamer was caught watching deepfake porn of women gamers. Sexual images made without consent can be traumatic and abusive, experts say — and women are the biggest victims". Insider. Retrieved 2023-04-20.
  13. Patterson, Calum (2023-01-30). "Twitch streamer Atrioc gives tearful apology after paying for deepfakes of female streamers". Dexerto. Retrieved 2023-06-14.
  14. Kirchengast, Tyrone (2020-07-16). "Deepfakes and image manipulation: criminalisation and control". Information & Communications Technology Law. 29 (3): 308–323. doi:10.1080/13600834.2020.1794615. ISSN 1360-0834. S2CID 221058610.
  15. Gaur, Loveleen; Arora, Gursimar Kaur (2022-07-27), DeepFakes, New York: CRC Press, pp. 91–98, doi:10.1201/9781003231493-7, ISBN 978-1-003-23149-3, retrieved 2023-04-20
  16. Kirchengast, Tyrone (2020-07-16). "Deepfakes and image manipulation: criminalisation and control". Information & Communications Technology Law. 29 (3): 308–323. doi:10.1080/13600834.2020.1794615. ISSN 1360-0834. S2CID 221058610.
  17. Branch, Legislative Services (2023-01-16). "Consolidated federal laws of Canada, Criminal Code". laws-lois.justice.gc.ca. Retrieved 2023-04-20.
  18. Mania, Karolina (2022). "Legal Protection of Revenge and Deepfake Porn Victims in the European Union: Findings From a Comparative Legal Study". Trauma, Violence, & Abuse. doi:10.1177/15248380221143772. PMID 36565267. S2CID 255117036.
  19. Kirchengast, Tyrone (2020-07-16). "Deepfakes and image manipulation: criminalisation and control". Information & Communications Technology Law. 29 (3): 308–323. doi:10.1080/13600834.2020.1794615. ISSN 1360-0834. S2CID 221058610.
  20. Kharpal, Arjun. "Reddit, Pornhub ban videos that use A.I. to superimpose a person's face over an X-rated actor". CNBC. Retrieved 2023-04-20.
  21. Cole, Samantha (2018-01-31). "AI-Generated Fake Porn Makers Have Been Kicked Off Their Favorite Host". Vice. Retrieved 2023-04-20.
  22. Harwell, Drew (2018-12-30). "Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'". The Washington Post. ISSN 0190-8286. Retrieved 2023-04-20.
  23. Cole, Samantha (2018-02-06). "Pornhub Is Banning AI-Generated Fake Porn Videos, Says They're Nonconsensual". Vice. Retrieved 2019-11-09.