One of the most common concerns has been what might happen if other governments try to take advantage of this system for other purposes, even though the feature will only be available in the United States at launch. The concerns have come from a variety of notable sources, including Edward Snowden and the Electronic Frontier Foundation.

The EFF wrote:

All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.

Apple traditionally launches features first in the United States because the US is the company’s largest market and the market whose local laws and regulations it is most familiar with. That is again the case with the new CSAM detection system.

Apple’s implementation of this CSAM detection feature is highly technical; more detail is available at the links below, and a simplified sketch of the general hash-matching idea follows the list.

  • Detailed technical summary for CSAM detection
  • Technical Assessment of CSAM Detection by Professor Benny Pinkas
  • New Child Safety Landing Page on Apple.com
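The documents above describe Apple’s actual system. For readers who only want the general flavor of how detection of known images can work, the Swift sketch below shows a toy version of perceptual-hash matching: an image is reduced to a fixed-length hash, and it is treated as a match when that hash differs from a known hash in at most a few bit positions. The hash values, threshold, and function names here are purely illustrative assumptions, not Apple’s NeuralHash or its on-device matching protocol.

```swift
import Foundation

// Illustrative only: a toy perceptual-hash comparison, NOT Apple's NeuralHash
// or its cryptographic matching protocol. The idea: each image is reduced to a
// fixed-length bit string, and two images are considered a match when their
// hashes differ in at most `threshold` bit positions.

/// Hamming distance between two equal-length hashes, given as byte arrays.
func hammingDistance(_ a: [UInt8], _ b: [UInt8]) -> Int {
    precondition(a.count == b.count, "hashes must be the same length")
    return zip(a, b).reduce(0) { $0 + ($1.0 ^ $1.1).nonzeroBitCount }
}

/// Returns true if `imageHash` is within `threshold` bits of any known hash.
func matchesKnownHash(_ imageHash: [UInt8],
                      knownHashes: [[UInt8]],
                      threshold: Int = 4) -> Bool {
    knownHashes.contains { hammingDistance(imageHash, $0) <= threshold }
}

// Hypothetical 64-bit hashes used purely for demonstration.
let uploadedImageHash: [UInt8] = [0xA3, 0x5F, 0x01, 0xEE, 0x42, 0x17, 0x9C, 0xD0]
let knownDatabase: [[UInt8]] = [
    [0xA3, 0x5F, 0x01, 0xEE, 0x42, 0x17, 0x9C, 0xD1], // differs by 1 bit -> match
    [0x00, 0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77],
]

print(matchesKnownHash(uploadedImageHash, knownHashes: knownDatabase)) // true
```

In the real system, the matching is wrapped in additional cryptographic machinery so that the device and server learn about matches only under defined conditions; the technical summary linked above explains those details.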