Apple: Let's build a future without child abuse.

Apple has the power to make its devices safe for children and to stop the spread of child abuse on its platforms. With your help, Apple can build a safer future for kids.


Find out how you can help build a better Apple.

"We believe people with passion can change the world for the better."

Steve Jobs

The Facts

Sensitive content warning: child sexual abuse

86% of Americans want Apple to take action to better protect children from sexual abuse.
85% of Apple customers say it’s very important that technology companies remove and report child sexual abuse images and videos from their platforms.
92% of Apple users agree that “Apple has a responsibility to identify, remove and report child sexual abuse images and videos on all of their platforms.”

Apple reported 234 pieces of child sexual abuse material in 2022; four other tech companies reported 29 million.

The Australian eSafety Commissioner’s Basic Online Safety Expectations report shows Apple falling behind.

Heat Initiative commissioned a review of publicly available cases where child sexual abuse images and videos were found on Apple products.

Complete this survey if you'd like to help create safer online experiences for the next generation.

FAQs

Common questions that can help ground and inform challenging conversations

In 2023, the CyberTipline received an astonishing 36.2 million reports of child sexual abuse images and videos, up 12% from 32 million reports in 2022. Nearly 90% were reported and taken down by the electronic service providers Facebook, Instagram, WhatsApp, and Google. In contrast, Apple reported just 267 instances of child sexual abuse material in 2023. That’s not due to a lack of abusive content on its platforms, such as iCloud, but because the company has decided not to monitor for this type of content across its full suite of services.

When it comes to end-user reporting, Apple and Omegle (now shut down) were the only major companies that didn’t provide relatively easy-to-find information and processes for users to report illegal and harmful content.

Apple also doesn’t consistently monitor and enforce its own App Store policies. As a result, age ratings and descriptions can be inaccurate, and unsafe apps are accessible or even recommended to children.

Child sexual abuse material (CSAM) refers to images, videos, and other materials depicting child rape and sexual abuse. It is no longer called “child pornography” because that term implies some kind of performance and consent. Federal law prohibits the production, advertisement, transportation, distribution, receipt, sale, access with intent to view, and possession of CSAM. Read more about how companies use verified hashes to stop the spread of illegal CSAM.
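To make the “verified hashes” idea concrete, here is a minimal Python sketch of the general technique: compute a fingerprint of a stored or uploaded file and check it against a list of hashes that a clearinghouse (such as NCMEC) has verified as known CSAM. Everything in the sketch is illustrative; the hash value and function names are hypothetical, and production systems typically rely on perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding, rather than the exact cryptographic hash used here.

import hashlib
from pathlib import Path

# Hypothetical set of verified hashes. In practice these are supplied
# by a clearinghouse; the value below is only a placeholder (the
# SHA-256 digest of an empty file).
VERIFIED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_sha256(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_verified_hash(path: Path) -> bool:
    """Return True if the file's fingerprint appears on the verified list."""
    return file_sha256(path) in VERIFIED_HASHES

The key property of this approach is that files are matched against fingerprints of already-verified illegal material; the system never needs to inspect or classify the content itself, which is why hash-matching is generally seen as compatible with user privacy.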

Apple can prioritize making iPhones safe for children by:

  • Letting kids report abuse: Create easily accessible reporting in iMessage so kids and parents can report inappropriate images and harmful situations. Right now, children can only report via a generic email address on Apple’s website that is not easily accessible and requires them to leave the environment in order to report.
  • Requiring safe apps: Monitor the App Store to ensure exploitative and dangerous apps are removed and that only age-appropriate apps are advertised to children.
  • Stopping the spread of child abuse: Stop the storage and spread of child sexual abuse images and videos in iCloud and other Apple products. 

Privacy is crucial. Our focus is on identifying specific, harmful content – not mass surveillance. Apple stated two years ago that they “have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands.” We believe Apple can both protect kids against online harms and safeguard user privacy.

You can tell us your opinions and/or sign up for updates on this effort. You can also ask questions and advocate for change internally. We also welcome your ideas on how to help. Get in touch with us at [email protected].