Trigger warning: child sexual abuse

We are calling on Apple to:

Create a robust reporting mechanism for users to report child sexual abuse images and videos to Apple.

Detect, report, and remove child sexual abuse images and videos from iCloud.

Join us in holding Apple accountable.

Read the Case Studies

Read the coalition letter to Apple.

Child safety organizations, advocates, and those with lived experience support these baseline recommendations for Apple to take action to better safeguard children.

Apple falls short in detecting child sexual abuse material.

National polls show that Americans overwhelmingly support technology companies adopting new policies and features to remove and report child sexual abuse material from their platforms.

Australia's eSafety Commissioner's Basic Online Safety Expectations report shows Apple falling behind.

Apple reported 234 pieces of child sexual abuse material in 2022, while four other tech companies reported 29 million.

The Heat Initiative commissioned a review of publicly available cases where child sexual abuse images and videos were found on Apple products.

Join us

Provide your email address to stay informed and help us hold Apple accountable!

All images are AI generated and do not depict real people or real images in iCloud.
