Deepfakes: Nightmares of the Future Occurring Now
I. Introduction
A deepfake is a type of media that depicts a person’s likeness or voice in a way the person never actually appeared or spoke.[1] A Reddit user coined the term, which later expanded to cover “synthetic media applications” more broadly.[2] That Reddit user created a video showing a famous actress performing lewd acts.[3] People were astonished because this was the first time a single person had been able to make “high-quality and convincing fake videos.”[4] The user claimed to have created the video with free software available to anyone on the internet.[5] Creating a deepfake video is not as difficult as it may seem: a person needs only the software, the video to be altered, and other videos containing the person whose likeness will be inserted.[6] An artificial intelligence program can study video clips of the target person to learn their facial movement and structure.[7] From there, the system can map the target’s face onto another individual who shares common features.[8] The ease of swapping a person’s face or voice is problematic in an age of social media, and it requires individuals viewing online media to be wary of deepfakes.
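The face-mapping process described above is commonly implemented as an autoencoder with one shared encoder and a separate decoder trained on each person’s face; open-source face-swap tools broadly follow this design. The toy sketch below (random, untrained weights; all names and dimensions are illustrative, not any actual deepfake software) shows only the data flow of the swap, not a working generator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: flattened 8x8 grayscale "face" crops, small latent space.
FACE_DIM, LATENT_DIM = 64, 16

# One SHARED encoder plus one decoder PER PERSON. Here the weights are
# random stand-ins; a real system learns them from many video frames.
W_enc = rng.normal(size=(FACE_DIM, LATENT_DIM))    # shared encoder
W_dec_a = rng.normal(size=(LATENT_DIM, FACE_DIM))  # decoder for person A
W_dec_b = rng.normal(size=(LATENT_DIM, FACE_DIM))  # decoder for person B

def encode(face):
    # Compress a face crop into a latent code meant to capture
    # person-agnostic features (expression, pose, lighting).
    return np.tanh(face @ W_enc)

def decode(latent, w_dec):
    # Reconstruct a face in one person's identity from the latent code.
    return latent @ w_dec

def swap_face(face_of_a):
    # The core trick: encode a frame of person A, but decode it with
    # person B's decoder, yielding B's face with A's expression and pose.
    return decode(encode(face_of_a), W_dec_b)

frame = rng.normal(size=FACE_DIM)  # stand-in for one detected face crop
fake = swap_face(frame)
print(fake.shape)  # same size as the input crop
```

In a real system the encoder and both decoders are trained jointly on thousands of face crops of each person, which is why the creator needs, as noted above, other videos containing the person whose likeness will be inserted.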
II. Problems
One example of a deepfake used in a harmless setting is the video of actor Kit Harington apologizing for the disappointing ending of Game of Thrones. A video surfaced of Jon Snow, played by Harington, apologizing for the poorly received Game of Thrones finale.[9] The video was mainly harmless because its creator intended it as a joke. Other instances, however, pose real problems. For example, someone created fake audio of a CEO authorizing the company to send money to the creator of the recording.[10] This is not the only example of fraudsters using deepfake audio to steal money from companies. In 2020, a bank manager in Hong Kong received a call from someone he believed to be a company director.[11] In reality, the voice on the line was trying to swindle the manager into transferring $35 million to fraudulent accounts.[12] The fraudsters succeeded, marking the second known time criminals swindled a company out of money using deepfake technology.[13] The problem of using deepfakes to fool an audience is not confined to one place but felt across the globe. In June 2019, members of the United States Congress warned that deepfakes could play into the 2020 elections through the spread of disinformation.[14] Although there were no reports of deepfakes being used in the 2020 elections, the 2024 presidential elections could experience those problems.[15] To combat these problems, there has been a push for companies, and even the federal government, to ban the use of deepfakes.
III. The Solution
In 2019, President Trump signed a law that: “(1) requires a comprehensive report on the foreign weaponization of deepfakes; (2) requires the government to notify Congress of foreign deepfake-disinformation activities targeting US elections; and (3) establishes a ‘Deepfakes Prize’ competition to encourage the research or commercialization of deepfake-detection technologies.”[16] That same year, Senator Rob Portman of Ohio introduced the Deepfake Report Act of 2019, a bill that would require the Department of Homeland Security to report on the development of deepfake technology.[17] The Deepfake Report Act of 2019 passed the Senate, however, but has never come to a vote in the House.[18] In 2021, Representative Yvette Clarke of New York introduced the DEEP FAKES Accountability Act, a bill aimed at preventing the disinformation that has been, and could continue to be, spread through deepfakes.[19] The bill would require individuals posting a deepfake to disclose that the video or audio has been altered.[20] An individual who fails to make that disclosure and who intends to humiliate or harass, cause violence or physical harm, commit fraud, or influence voting may be fined and imprisoned.[21] Like the Deepfake Report Act of 2019, however, the DEEP FAKES Accountability Act has not been passed by the House and is therefore not law.
Where the federal government has failed to act, states have used their legislative powers to help alleviate the problems caused by deepfakes. Texas was the first state to step forward and try to lessen the damage deepfakes can cause: in 2019, it enacted a law prohibiting individuals from creating deepfake videos with the intent to influence public elections.[22] California enacted a similar law prohibiting individuals from distributing “materially deceptive audio or visual media…of the candidate with the intent to injure the candidate’s reputation or to deceive a voter into voting for or against the candidate.”[23] California law also allows the victim of a deepfake pornography video to sue the creator of the deepfake for “profits, economic and noneconomic damages, or statutory damages up to $150,000 if the act was committed with malice.”[24]
While some states have yet to enact laws prohibiting deepfakes, social media platforms are taking their own stand against the spread of misinformation. Facebook announced that it would remove videos it considers deepfakes.[25] Facebook employees will review videos to determine whether they were altered or intended to mislead an audience into believing someone said something they did not.[26] The policy does not, however, restrict videos that are parody or satire.[27] TikTok, the popular video app, announced that it will ban deepfakes in an effort to reduce negative influence on the election cycle.[28] Finally, Twitter announced that it would remove videos it found were fabricated to mislead people through alterations making it seem as if a person said something they did not say.[29] Twitter adopted this policy to help prevent misinformation from spreading during the election.[30]
IV. Conclusion
Deepfakes, while initially uncommon, are becoming more abundant as the necessary technology reaches people who never have to leave the comfort of their homes. In an age when everybody has a cell phone and access to social media, misinformation can spread like wildfire. That is why federal and state governments must put laws in place to prevent the harm that can come from individuals creating deepfakes. While deepfakes can be used for satire, they can just as easily be used to spread misinformation, to commit fraud, or to create videos of people performing lewd acts they never performed. It is likewise important for social media companies to monitor the use of deepfakes and ensure they do not hurt individuals. Before believing something on the internet, you must ask whether a video is real or whether someone altered it to make you believe something.
[1] Meredith Somers, Deepfakes, Explained, MIT Sloan Sch. of Mgmt. (July 21, 2020), https://mitsloan.mit.edu/ideas-made-to-matter/deepfakes-explained.
[2] Id.
[3] Benjamin Goggin, From Porn to ‘Game of Thrones’: How Deepfakes and Realistic-Looking Fake Videos Hit it Big, INSIDER (June 23, 2019, 10:45 AM), https://www.businessinsider.com/deepfakes-explained-the-rise-of-fake-realistic-videos-online-2019-6.
[4] Id.
[5] Id.
[6] Dave Johnson, What is a Deepfake? Everything You Need to Know About the AI-Powered Fake Media, INSIDER (Jan. 22, 2021, 11:46 AM), https://www.businessinsider.com/what-is-deepfake.
[7] Id.
[8] Id.
[9] Alexis Nedd, Of Course There’s a Deepfake of Jon Snow Apologizing for Season 8 of ‘Game of Thrones’, Mashable (June 16, 2019), https://mashable.com/video/game-of-thrones-jon-snow-deepfake-video.
[10] Avani Desai, Taking Fakeness to New Depths with AI Alterations: The Dangers of Deepfake Videos, ISACA (Oct. 25, 2021), https://www.isaca.org/resources/news-and-trends/isaca-now-blog/2021/taking-fakeness-to-new-depths-with-ai-alterations-the-dangers-of-deepfake-videos.
[11] Thomas Brewster, Fraudsters Cloned Company Director’s Voice in $35 Million Bank Heist, Police Find, Forbes (Oct. 14, 2021, 7:01 AM), https://www.forbes.com/sites/thomasbrewster/2021/10/14/huge-bank-fraud-uses-deep-fake-voice-tech-to-steal-millions/?sh=5c813a477559.
[12] Id.
[13] Id.
[14] Tom Simonite, What Happened to the Deepfake Threat to the Election, WIRED (Nov. 16, 2020, 7:00 AM), https://www.wired.com/story/what-happened-deepfake-threat-election/.
[15] Id.
[16] Jason Chipman et al., First Federal Legislation on Deepfakes Signed Into Law, JDSupra (Dec. 24, 2019), https://www.jdsupra.com/legalnews/first-federal-legislation-on-deepfakes-42346/.
[17] Deepfake Report Act of 2019, S. 2065, 116th Cong. (2019).
[18] Id.
[19] DEEP FAKES Accountability Act, H.R. 2395, 117th Cong. § 1041(c) (2021).
[20] Id. § 1041(f)(1)(2)(a).
[21] Id.
[22] Chipman et al., supra note 16.
[23] Cal. Elec. Code § 20010 (West 2020); K.C. Halm et al., Two New California Laws Tackle Deepfake Videos in Politics and Porn, Davis Wright Tremaine LLP (Oct. 11, 2019), https://www.dwt.com/insights/2019/10/california-deepfakes-law.
[24] Cal. Civ. Code § 1708.86 (West 2020); K.C. Halm et al., supra note 23.
[25] Sam Shead, Facebook to Ban ‘Deepfakes’, BBC (Jan. 7, 2020), https://www.bbc.com/news/technology-51018758.
[26] Id.
[27] Id.
[28] Nick Statt, TikTok is Banning Deepfakes to Better Protect Against Misinformation, The Verge (Aug. 5, 2020, 9:00 AM), https://www.theverge.com/2020/8/5/21354829/tiktok-deepfakes-ban-misinformation-us-2020-election-interference.
[29] Shirin Ghaffary, Twitter is Finally Fighting Back Against Deepfakes and Other Deceptive Media, Vox (Feb. 4, 2020, 4:00 PM), https://www.vox.com/recode/2020/2/4/21122653/twitter-policy-deepfakes-nancy-pelosi-biden-trump.
[30] Id.