Summary (AI generated)


Google has refused to reinstate a father's account after its systems wrongly flagged medical photos he took of his son's groin as child sexual abuse material (CSAM). Experts warn about the limits of automated CSAM detection, particularly as companies face regulatory and public pressure to help address the problem. Google identifies CSAM with a combination of hash-matching technology and artificial intelligence, but its human reviewers are not medical experts, which can let false positives stand and cause serious harm to those wrongly accused.
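To illustrate the hash-matching half of such a pipeline, here is a minimal sketch in Python. Everything in it is an assumption for illustration: the `KNOWN_BAD_HASHES` values are placeholders, and real systems reportedly use perceptual hashes that tolerate resizing and re-encoding, whereas this toy uses exact cryptographic hashing, which only matches byte-identical files.

```python
import hashlib
from pathlib import Path

# Hypothetical digests standing in for a clearinghouse database of
# known-abuse imagery (placeholder values, not real data).
KNOWN_BAD_HASHES = {
    "3f786850e387550fdab836ed7e6dc881de23001b",
    "89e6c98d92887913cadf06b2adb97f26cde4849b",
}

def flag_image(path: Path) -> bool:
    """Return True if the file's SHA-1 digest is in the known-bad set.

    Exact hashing misses any altered copy of an image; production
    systems are said to use perceptual hashing for that reason.
    """
    digest = hashlib.sha1(path.read_bytes()).hexdigest()
    return digest in KNOWN_BAD_HASHES

if __name__ == "__main__":
    # Scan a hypothetical upload directory and queue matches for review.
    for image in Path("uploads").glob("*.jpg"):
        if flag_image(image):
            print(f"{image}: hash match, queue for human review")
```

Even in this toy form, the design point the article turns on is visible: a hash match only says an image resembles known material, so the final judgment falls to the human reviewers, which is exactly where the lack of medical expertise becomes a problem.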