Apple has the power to make their devices safe for children and stop the spread of child abuse on their platforms. With your help, Apple can build a safer future for kids.
Find out how you can help build a better Apple.
Sensitive content warning: child sexual abuse
Common questions that can help ground and inform challenging conversations
In 2023, the CyberTipline received an astonishing 36.2 million reports of child sexual abuse images and videos, up more than 12% from 32 million reports in 2022. Nearly 90% were reported and taken down by electronic service providers Facebook, Instagram, WhatsApp, and Google. In contrast, Apple reported just 267 instances of child sexual abuse material in 2023. That is not due to a lack of abusive content found on its platforms, such as iCloud, but rather because the company has chosen not to monitor for this type of content across its full suite of services.
When it comes to end-user reporting, Apple and Omegle (now shut down) were the only major companies that did not provide relatively easy-to-find information and processes for users to report illegal and harmful content.
As for the App Store, Apple does not consistently monitor and enforce its own App Store policies. As a result, age ratings and descriptions can be inaccurate, and unsafe apps remain accessible or are even recommended to children.
Child sexual abuse material (CSAM) refers to images, videos, and other materials depicting child rape and sexual abuse. It is no longer called “child pornography” because that term implies some degree of performance and consent. Federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of CSAM. Read more about how companies use verified hashes to stop the spread of illegal CSAM.
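To illustrate the general idea behind hash matching, here is a minimal sketch in Python. It is not Apple's or any specific vendor's implementation: the verified-hash list and function names are hypothetical, and real systems typically use perceptual hashes (such as PhotoDNA) rather than plain cryptographic hashes, so that re-encoded or lightly edited copies of a known image still match.

```python
import hashlib
from pathlib import Path

# Hypothetical set of verified hashes supplied by a clearinghouse such as NCMEC.
# Real deployments use perceptual hashes rather than SHA-256; this sketch only
# shows the matching step, not how the verified list is built or distributed.
VERIFIED_HASHES: set[str] = {
    # "d2a84f4b8b650937ec8f73cd8be2c74a...",  # placeholder entry
}

def file_sha256(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """Return True if the file's hash appears in the verified-hash list."""
    return file_sha256(path) in VERIFIED_HASHES
```

The point of this approach is that content is only compared against hashes of material that has already been verified as illegal; the system does not interpret or classify new, unknown content.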
Apple can prioritize making iPhones safe for children by:
Privacy is crucial. Our focus is on identifying specific, harmful content – not mass surveillance. Apple stated two years ago that they “have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands.” We believe Apple can both protect kids against online harms and safeguard user privacy.
You can tell us your opinions and/or sign up for updates on this effort. You can also ask questions and advocate for change internally. We also welcome your ideas on how to help. Get in touch with us at [email protected].
The Heat Initiative conducted a national public opinion poll to understand the public’s interest in more proactive measures to stop the spread of child sexual abuse images and videos on tech platforms. Bellwether Research conducted a representative survey of 2,041 adults (18+), online, May 7-11, 2023. The full sample was balanced to approximate a target sample of adults in the United States based on the Census (CPS 2020).
We found that Americans overwhelmingly support technology companies adopting new policies and features to remove and report child sexual abuse material from their platforms:
On August 29, 2022, the eSafety Commissioner of Australia issued notices to seven online service providers, compelling them to report on their adherence to the Basic Online Safety Expectations concerning child sexual exploitation and abuse. The information gathered from these notices offers valuable insights that were not previously available through voluntary initiatives. The primary goal of this report is to enhance transparency and accountability among providers by shedding light on their actions, or lack thereof, in safeguarding children on their platforms.
The report highlights that Apple falls behind its peers in three fundamental online safety procedures:
The Heat Initiative commissioned a review of 158 publicly available cases where child sexual abuse images and videos were found on Apple products. The majority of cases were related to iCloud and iPhone and included abuse images of children ranging from infants and toddlers to teens. Review the research in this report. TBD