CSAM


Automatically tagging or filtering child sexual abuse material (CSAM) cannot be both effective and privacy-preserving, regardless of what kind of tech one throws at it, because what is and what is not CSAM depends heavily on context.

Literally the same photo, bit-for-bit identical, can be an innocent family keepsake when sent between relatives, and CSAM when shared in a child porn group.

The information needed to tell whether it is CSAM is simply not present in the file being shared, so no technical means operating on the file alone can tell the two cases apart. The current debate about filtering CSAM on end-to-end encrypted messaging services, like the many debates before it, mostly ignores this basic point.
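
To make the point concrete: any detector that is a function of the file's bytes alone (a cryptographic hash, a perceptual hash, or a classifier run over the pixels) must return the same verdict for bit-for-bit identical files, whoever shares them and wherever they share them. The Python sketch below is purely illustrative; the scanner, the blocklist and the photo bytes are hypothetical placeholders, not any real system's API.

    import hashlib

    def content_only_verdict(file_bytes: bytes, blocklist: set[str]) -> bool:
        """Hypothetical content-only scanner: it sees the bytes of the
        file and nothing else -- no sender, no recipient, no conversation."""
        fingerprint = hashlib.sha256(file_bytes).hexdigest()
        return fingerprint in blocklist

    # One bit-for-bit identical photo (placeholder bytes for illustration).
    photo = b"...identical JPEG bytes..."
    blocklist = {hashlib.sha256(photo).hexdigest()}

    # Context A: a parent sends the photo to the grandparents.
    # Context B: the very same file is posted in an abuse group.
    # The scanner's only input -- the bytes -- is identical in both cases,
    # so its verdict is necessarily identical too: it either flags both
    # or flags neither.
    verdict_family = content_only_verdict(photo, blocklist)
    verdict_abuse = content_only_verdict(photo, blocklist)
    assert verdict_family == verdict_abuse

The context that distinguishes the two cases lives outside the file, in who is sharing it with whom and why, and so it never reaches the function.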