Apple is not doing enough to protect its most vulnerable users and is underreporting the prevalence of child sexual abuse material (CSAM) exchanged and stored on its services, such as iCloud, iMessage, and FaceTime, child safety experts claim.
The National Society for the Prevention of Cruelty to Children (NSPCC), a British child protection charity, said data obtained through a Freedom of Information request shows hundreds of CSAM incidents involving Apple were recorded in England and Wales alone, more than the number of incidents the company officially reported worldwide in a single year.
The NSPCC found that Apple was involved in 337 recorded offenses of child abuse images in England and Wales between April 2022 and March 2023. Across all of its platforms worldwide in 2023, however, Apple reported only 267 suspected CSAM cases to the National Center for Missing & Exploited Children (NCMEC), The Guardian reported.
This is a stark contrast with other tech giants such as Google and Meta, which reported more than 1.47 million and 30.6 million cases, respectively, last year, according to NCMEC's annual report. Other platforms that reported more potential CSAM cases than Apple in 2023 include Discord (339,412 cases), Pinterest (52,356 cases), and 4chan (1,657 cases). For reference, NCMEC asks all US-based tech companies to report possible cases of CSAM detected on their platforms; it then forwards those reports to relevant law enforcement agencies around the world.
Apple services such as iMessage, FaceTime, and iCloud all feature end-to-end encryption, meaning only the sender and recipient of a message can see its contents. As the NSPCC notes, however, WhatsApp also uses end-to-end encryption and still reported nearly 1.4 million suspected CSAM cases to NCMEC in 2023.
Richard Collard, head of child safety online policy at the NSPCC, told The Guardian: “There is a worrying discrepancy between the number of UK child abuse image crimes occurring on Apple's services and the almost negligible number of global reports of abusive content that Apple makes to authorities.”
“Apple is clearly lagging behind many of its peers in tackling child sexual abuse when all tech companies should be investing in safety and preparing for the rollout of online safety laws in the UK,” he continued. Apple declined The Guardian's request for comment on the NSPCC report.
The accusations follow years of controversy over Apple's plans to scan its platforms for child sexual abuse material. In August 2021, Apple announced a child safety toolset that would scan images on iOS devices for child abuse imagery, but it suspended the effort just one month later after digital rights groups raised concerns that such scanning could threaten the privacy and security of iCloud users worldwide. In 2022, Apple announced that it would discontinue the project.
Asked about the NSPCC report, Apple pointed us to a letter it sent to the child safety group Heat Initiative, which Wired published. After Apple announced plans to discontinue its iCloud photo scanning tool, the Heat Initiative organized a campaign urging Apple to crack down on CSAM and provide more tools for users to report it, prompting Apple's unusual response. In that response, Apple said it would move away from the scanning approach and focus instead on developing a series of on-device tools to connect users more directly to local resources and law enforcement.
“The subject of child sexual abuse is an abhorrent one, and we are committed to breaking the cycle of coercion and influence that makes children vulnerable to sexual abuse,” Eric Neuenschwander wrote in the company's response to the Heat Initiative (which can be read in full here). “We are proud of the contributions we have made and will continue to work with child safety organizations, technologists, and governments on lasting solutions to protect the most vulnerable members of society.”
“Our goal has been and always will be to create technology that empowers and enriches people's lives while helping them stay safe. We have developed a number of innovative technologies to help keep children safe and have made meaningful contributions towards this goal. We have deepened our commitment to the Communication Safety feature, first made available in December 2021. Communication Safety is designed to intervene and provide helpful resources when children receive or attempt to send messages containing nudity. The goal is to disrupt the grooming of children by making it harder for offenders to normalize such behavior.
“In our latest release, we have expanded the functionality to protect children more easily and more broadly. First, this feature is turned on by default for all child accounts. Second, we have expanded it to cover video content in addition to still images. We have also extended these protections to more areas throughout the system, including AirDrop, photo pickers, FaceTime video messages, and contact posters in the Phone app. In addition, a new warning feature helps all users avoid seeing unwanted nude images or videos they receive in Messages, AirDrop, FaceTime video messages, and Phone app contact posters. To extend these protections beyond our built-in features, we have made them available to third parties as well, and communication app developers are actively incorporating this advanced technology into their products. All of these features use privacy-protecting technology, and all image and video processing takes place on the device, so Apple does not have access to the content. We intend to continue to invest in innovative technologies such as these.
“We decided not to proceed with the hybrid client-server approach to CSAM detection in iCloud Photos that we proposed several years ago, for several good reasons. After consulting extensively with child safety advocates, human rights groups, privacy and security technologists, and academics, and considering scanning techniques from virtually every angle, we ultimately concluded that it was practically impossible to implement without compromising user security and privacy.
“Scanning personal data in the cloud is regularly used by companies to monetize users' information. While some companies justify such practices, we have chosen a very different path: one that prioritizes user security and privacy. We believe that scanning every user's personally stored iCloud content would have significant unintended consequences for users. Between 2013 and 2021, the total number of data breaches globally more than tripled, and in 2021 alone, 1.1 billion personal records were compromised. As threats become increasingly sophisticated, we are committed to providing our users with the world's best data security, constantly identifying and mitigating new threats to our users' personal data, on their devices and in the cloud. Scanning every user's personally stored iCloud data would create new threat vectors for data thieves to find and exploit.
“It would also inject the potential for a slippery slope of unintended consequences. For example, scanning for one type of content opens the door to bulk surveillance and could create a desire to search other encrypted messaging systems across content types (images, video, text, audio, etc.) and content categories. How can users be assured that a surveillance tool has not been reconfigured to monitor other content, such as political activity or religious persecution? Mass surveillance tools have widespread negative effects on free speech and, by extension, on democracy as a whole. In addition, designing this technology for one government may require its application in other countries across new types of data. Scanning systems are also not perfect, and other platforms have documented evidence of innocent people being caught up in dystopian dragnets and victimized simply for sharing pictures of their babies.
“We firmly believe that by working together and collaborating we can do a lot of good. As we have done in the past, we would be happy to meet with you to continue the conversation about these important issues and how to balance the various equities we have outlined above. We remain interested in working with the child safety community, for example, on streamlining user reporting to law enforcement, expanding adoption of child safety tools, and developing new shared resources among businesses to combat grooming and exploitation.”