AI Dataset Contains Child Abuse Images
Overview
LAION-5B, a large AI training dataset of more than 5 billion images and captions, has been found to contain at least 1,008 instances of child sexual abuse material (CSAM), raising concerns that AI products trained on it could be used to create new and realistic child abuse content.
Ask a question
How can AI developers ensure the exclusion of potentially harmful content from AI datasets?
What measures can be taken to prevent the use of AI tools for creating new and realistic child abuse content?
What role does the public play in addressing the issue of child sexual abuse material in AI datasets?
[Chart: Article Frequency (Coverage), Sep 2023 to Nov 2023]