Attorneys General Nationwide Urge Congress to Act Against AI-Generated Child Abuse Content

Nicholas September 5, 2023 10:42 PM

Attorneys general from all states have sent a joint letter to Congress calling for stronger protections against AI-generated child sexual abuse material. They urge lawmakers to bring AI-generated content within the scope of existing child abuse laws and to establish an expert commission to study how AI could be exploited to harm children.

Proposal for an expert commission studying AI's misuse

In the open letter, addressed to the leaders of both major parties in the House and Senate, the attorneys general request the establishment of an expert commission to examine the ways AI could be used to exploit children. The letter stresses the urgency of the matter, pointing to the rapid advancement of AI technologies and their potential for misuse.

Expanding current laws to include AI-generated content

The attorneys general are also pressing for an expansion of existing legislation on child sexual abuse material to cover AI-generated images and videos. Because the technology is relatively new, no current laws specifically categorize AI-generated imagery as child sexual abuse material. The attorneys general view this gap as a significant concern, given how realistic AI-generated images and videos have become.

Alan Wilson, the Attorney General of South Carolina, initiated the open-letter campaign. He has urged his colleagues to review their state statutes to determine whether they keep pace with rapid advances in AI. Wilson raised particular concern about 'deepfakes' that could place real images of children in abusive scenarios, warning that current laws may not address the virtual nature of such exploitation.

Potential misuse of AI to create fictitious children

Another concern raised in the letter is that AI could be used to generate entirely fictitious children from a library of data and then depict them in sexual abuse material. The attorneys general reject the argument that no one is harmed because no real child is involved, pointing out that such material would still fuel demand in an industry that exploits children.

While deepfake child sexual abuse material is a new issue, the technology sector is already working to combat deepfake pornographic content. In February, companies including Meta, OnlyFans, and Pornhub began participating in an online tool called 'Take It Down', which lets teenagers report explicit images and videos of themselves on the Internet. The tool accepts reports of both real and AI-generated content, an early proactive step against deepfake material in the industry.
