Artificial intelligence (AI) seems to be everywhere these days. Fast-food chains such as McDonald’s and Wendy’s have tested AI ordering in their drive-thru lanes (and, in some cases, have since abandoned it). Businesses such as Microsoft have begun using customer service bots in their call centers. While the rise of AI brings many advantages, such as increased efficiency and productivity, it also carries distinct disadvantages, including unemployment concerns, hallucinations, and bias. There are also very real societal risks. One of these perils is the use of deepfakes: images or videos created or modified using AI, especially those involving intimate depictions of identifiable people. Increasingly, the victims of such unwanted publications are minors.
Last week, Senator Ted Cruz (R-Texas) and several bipartisan colleagues introduced a bill to protect victims of cyberbullying and “revenge porn,” also known as non-consensual intimate imagery. The bill, the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, or TAKE IT DOWN Act, would, if passed, require a “covered platform,” such as a website, online service, or mobile app, to establish a removal process and to take down the imagery, along with any copies, within 48 hours of a valid request from the identifiable individual. The Act would also make it unlawful “to use an interactive computer service to knowingly publish an intimate visual depiction of an identifiable individual,” and it would apply to both adults and minors. This legislation would give victims an avenue for halting the spread of false and nonconsensual images.
The TAKE IT DOWN Act is just one of many pieces of legislation recognizing the dangers of disclosing intimate images. In 2022, Congress created a federal civil cause of action for the disclosure of non-consensual intimate images (NCII). Section 1309 of the Consolidated Appropriations Act gives victims of NCII the right to bring a civil action against a party who knowingly disclosed the image, or who disclosed it with reckless disregard as to whether consent was given.

Texas has acted as well. Last year, the state amended its Penal Code to make it a criminal offense to knowingly produce or distribute electronically a nonconsensual deepfake video that appears to depict a person engaged in sexual conduct or with the person’s intimate parts exposed. See Tex. Penal Code § 21.165. Additionally, students in Texas can file with the court a sworn application and petition to stop cyberbullying under Texas Civil Practice and Remedies Code Chapter 129A. Texas Education Code § 37.0832(a)(2) defines cyberbullying as “bullying that is done through the use of any electronic communication device, including through the use of a cellular or other type of telephone, a computer, a camera, electronic mail, instant messaging, text messaging, a social media application, an Internet website, or any other Internet-based communication tool.” To qualify as cyberbullying, the conduct must physically harm the student or the student’s property, cause fear, create a detrimental learning environment, substantially disrupt school activities, or infringe on the student’s rights at school. Tex. Educ. Code § 37.0832(c) requires school districts to adopt policies and procedures for handling incidents of bullying or cyberbullying, and those policies must also be posted on the district’s website if practicable.
If you would like to learn more about cyberbullying and the disclosure of NCII, please see the resources listed below.
Additional Reading
Cyberbullying - TexasLawHelp.org
Legislation - David’s Legacy Foundation
Federal Civil Action for Disclosure of Intimate Images: Free Speech Considerations - Congressional Research Service
Increasing Threat of Deepfake Identities - U.S. Department of Homeland Security
Reporting Bullying, Cyberbullying, and Harassment - ACLU of Texas
Social Media and Youth Mental Health - U.S. Surgeon General