Dataconomy
Did Apple’s CSAM system allow child abuse to go unchecked?

Apple is facing a lawsuit in Northern California and the file cites that Apple “failed to implement those designs or take any measures to detect and limit” CSAM on its devices

By Kerem Gülen
December 9, 2024
in News, Cybersecurity

Apple is facing a lawsuit from victims of child sexual abuse for failing to implement a system to detect child sexual abuse material (CSAM) on iCloud, according to The New York Times. The suit seeks over $1.2 billion in damages on behalf of approximately 2,680 individuals, alleging negligence in protecting users from harm.

Apple sued for $1.2 billion over child sexual abuse material detection failure

In 2021, Apple announced a CSAM detection tool that would scan iCloud for abusive images and alert the National Center for Missing and Exploited Children, an initiative aimed at combating child exploitation. However, following substantial backlash over privacy concerns, Apple abandoned the project, leaving victims without the promised protection.

The lawsuit, filed in Northern California, alleges that Apple “failed to implement those designs or take any measures to detect and limit” CSAM on its devices. The complaint includes the account of a 27-year-old woman who says she still receives law enforcement notices about the online sharing of images taken of her when she was a child. The suit emphasizes the emotional turmoil victims endure as these materials circulate unchecked.


In response, Apple affirmed its commitment to combating child sexual abuse while prioritizing user privacy.

“Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts,” Apple spokesperson Fred Sainz told Engadget.

Despite these efforts, Apple recently faced additional scrutiny when the UK’s National Society for the Prevention of Cruelty to Children accused the company of underreporting CSAM found on its platforms, adding to the mounting pressure over how Apple handles the issue.


Featured image credit: Niels Kehl/Unsplash

Tags: Apple, Featured
