UK watchdog accused Apple of failing to report sexual images of children


The UK watchdog NSPCC (National Society for the Prevention of Cruelty to Children) has accused Apple of failing to report sexual images of children. The charity claims that Apple is not adequately monitoring its platforms or scanning for photos and videos of child sexual abuse.

The accusation also raises concerns about how Apple will handle the growing volume of child sexual abuse content generated with artificial intelligence.

The NSPCC alleges that Apple is vastly undercounting how often child sexual abuse material (CSAM) appears in its products. According to police data obtained by the charity, child predators used Apple’s iCloud, iMessage, and FaceTime to store and exchange CSAM in more cases in England and Wales alone over a single year than the company reported across all other countries combined.

AI-Generated Child Sexual Abuse Images Go Unchecked on Apple’s Services

The NSPCC says it identified 337 recorded offenses involving child abuse images linked to Apple products between April 2022 and March 2023.

By contrast, Apple reported only 267 cases of suspected CSAM on its platforms worldwide to the National Center for Missing & Exploited Children (NCMEC) in 2023. This gap is the basis of the NSPCC’s accusation that Apple is failing to report child sexual abuse images.

According to NCMEC’s annual report, Google reported more than 1.47 million cases in the same year, while Meta reported more than 30.6 million.

Every US-based tech company is required to notify NCMEC of any CSAM it finds on its platforms. NCMEC collects these reports from around the world, reviews them, and forwards them to the appropriate law enforcement agencies.

Apple allegedly reported far fewer cases of child sexual abuse to NCMEC than UK police data suggests


Because Apple’s iMessage and Meta’s WhatsApp are both end-to-end encrypted messaging services, neither company can see the contents of users’ messages. Even so, WhatsApp reported roughly 1.4 million cases of suspected CSAM to NCMEC in 2023.

Richard Collard, head of child safety online policy at the NSPCC, said there is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple’s services and the almost negligible number of global reports of abusive content the company makes to authorities.

According to Collard, Apple is also clearly lagging behind many of its competitors in tackling child sexual exploitation.
