Published 19:36 IST, December 24th 2023

Stanford Internet Observatory unveils disturbing link between AI models and child abuse material

The LAION-5B dataset, temporarily taken down for safety measures, has contributed to the evolution of AI tools, particularly diffusion models.

Reported by: Business Desk

In a shocking revelation, the Stanford Internet Observatory (SIO) has exposed the presence of over 1,000 images of child sexual abuse material (CSAM) in LAION-5B, a widely used open-source dataset. The dataset serves as training data for AI text-to-image generation models, including Stable Diffusion, and could therefore influence the creation of hyper-realistic fake images of child exploitation.

The investigation sheds light on the grim reality that rapid progress in generative machine learning enables the creation of realistic imagery that facilitates child sexual exploitation. The LAION-5B dataset, which contains billions of images scraped from a wide range of sources, inadvertently included known CSAM drawn from mainstream social media websites and adult video sites.

Researchers from Stanford's Internet Observatory utilised hashing tools such as PhotoDNA to identify suspected CSAM, which was then reported to the National Center for Missing and Exploited Children (NCMEC) in the US and the Canadian Centre for Child Protection (C3P). Notably, the study relied on matching image fingerprints against known hashes, so researchers never directly viewed the abusive content.

As AI image generators increasingly become the tools of choice for pedophiles, the revelation raises serious concerns. The report sets out safety recommendations for collecting datasets, training models, and hosting models trained on scraped data. Suggestions include checking future datasets against known lists of CSAM using detection tools such as Microsoft's PhotoDNA, and collaborating with child safety organisations.
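
By way of illustration only, hash-based screening of the kind recommended here amounts to comparing a fingerprint of each candidate image against a blocklist of known hashes maintained by child safety organisations. This is not the SIO's actual pipeline, and PhotoDNA itself is proprietary and access-controlled; the Python sketch below instead uses the open-source imagehash library's perceptual hash as a stand-in, and the blocklist file, image folder and distance threshold are hypothetical placeholders.

# Illustrative sketch of hash-based dataset screening. Not the SIO pipeline;
# uses the open-source imagehash library's pHash as a stand-in for proprietary
# tools such as PhotoDNA. File names and the threshold are hypothetical.
from pathlib import Path

import imagehash
from PIL import Image

HASH_DISTANCE_THRESHOLD = 4  # max Hamming distance to treat as a match (assumption)

def load_blocklist(path):
    # One hex-encoded perceptual hash per line, supplied by a child safety body.
    with open(path) as f:
        return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

def screen_dataset(image_dir, blocklist):
    # Return image paths whose perceptual hash is near any blocklisted hash.
    flagged = []
    for img_path in sorted(Path(image_dir).glob("*.jpg")):
        candidate = imagehash.phash(Image.open(img_path))
        if any(candidate - known <= HASH_DISTANCE_THRESHOLD for known in blocklist):
            flagged.append(img_path)  # quarantine and report; never redistribute
    return flagged

if __name__ == "__main__":
    blocklist = load_blocklist("known_hashes.txt")   # hypothetical blocklist file
    for path in screen_dataset("candidate_images/", blocklist):
        print(f"flagged for review: {path}")

In practice, curators would rely on vetted hash lists and secure reporting channels rather than local files, but the matching step is the same idea the researchers describe: fingerprints are compared, and the underlying images are never viewed.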

The LAION-5B dataset, temporarily taken down as a safety measure, has contributed to the evolution of AI tools, particularly diffusion models. These models, trained on billions of internet images, enable the creation of convincing images with minimal technical expertise. The presence of more than a thousand CSAM photos in the training data raises alarms about potential misuse.

David Thiel, Chief Technologist at Stanford's Internet Observatory, highlights the advantage these CSAM images give AI models in producing content that resembles real-life exploitation. The study marks a shift in the understanding of how AI tools generate abusive content: rather than merely combining textual concepts, the models can draw on actual images of abuse to refine their output.

The researchers advocate for regulatory measures, including screening and removal of explicit content from databases, transparent training datasets, and mechanisms to teach AI models to forget how to create explicit imagery. This revelation underscores the urgent need for robust safeguards and ethical considerations in the development and deployment of AI technologies.
 

Updated 19:36 IST, December 24th 2023
