All images are AI-generated and do not depict real people or actual images stored in iCloud.

Child sexual abuse is stored on iCloud.
Apple allows it.

Apple’s landmark 2021 announcement that it would detect child sexual abuse images and videos was quietly rolled back, impacting the lives of children worldwide. Every day that passes, children suffer because of this inaction, which is why we’re calling on Apple to deliver on its commitment.

We are calling on Apple to:

Detect, report, and remove child sexual abuse images and videos from iCloud.

Create a robust reporting mechanism for users to report child sexual abuse images and videos to Apple.

The Heat Initiative is a collective effort of concerned child safety experts and advocates encouraging leading technology companies to detect and eradicate child sexual abuse images and videos on their platforms. The Heat Initiative envisions a future where children’s safety is at the forefront of all existing and future technological development.

Join us in holding Apple accountable.

Read the Case Studies

Sensitive content warning: child sexual abuse

Read the coalition letter to Apple.

Child safety organizations, advocates, and those with lived experience support these baseline recommendations for Apple to take action in better safeguarding children.

Apple falls short in detecting child sexual abuse material.


Australia’s eSafety Commissioner’s Basic Online Safety Expectations report shows Apple falling behind.


Apple reported 234 pieces of child sexual abuse material in 2022, while four other tech companies reported 29 million.


National polls show that Americans overwhelmingly support technology companies adopting new policies and features to report and remove child sexual abuse material from their platforms.

Join us

Provide your email address to stay informed and help us hold Apple accountable!

