Google Classifies Innocent Photo of Naked Child as Child Porn and Closes Account

A man was locked out of his Google account and received a visit from the police for sending a photo of his child’s genitals to his doctor.

The story sounds like a privacy nightmare. The toddler in question had swelling in his groin area. In preparation for a video consultation, the father sent pictures of the condition to the doctor at the request of medical staff. The incident dates back to February 2021, when many doctors’ practices in the US were hard to visit in person because of the coronavirus pandemic.

The condition was treated with antibiotics. Two days later, however, Google notified the man that his account had been blocked for harmful content that violated Google’s policies and might constitute illegal activity.

As a result, the man lost access to his emails, contacts, and online photos, and even his own phone number was reportedly unusable for a while. The man appealed Google’s decision, but that request was rejected.

The case was investigated in December by the police in San Francisco, where the man lives, but it was later dropped once it became clear that the whole thing was a misunderstanding and that no criminal offense had been committed.

The father’s story has now been chronicled by The New York Times. But how does a photo shared privately end up triggering such a mechanism?

Google scans photos and other material passing through its servers for possible child abuse. Most tech companies do this by hashing images, reducing each one to a compact digital fingerprint, and checking those fingerprints against a database of known child abuse imagery.
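
The matching step can be illustrated with a minimal sketch, here in Python and using exact cryptographic hashes for simplicity; real systems such as Microsoft’s PhotoDNA use perceptual hashes that also match resized or slightly altered copies. The file names `known_hashes.txt` and `uploaded_photo.jpg` are hypothetical placeholders:

```python
import hashlib
from pathlib import Path

# Hypothetical database of hex-encoded hashes of known abusive images,
# one per line (in practice supplied by organizations such as NCMEC).
KNOWN_HASHES = {
    line.strip()
    for line in Path("known_hashes.txt").read_text().splitlines()
    if line.strip()
}


def image_hash(path: Path) -> str:
    """Reduce an image file to a fixed-length fingerprint (SHA-256 here)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def is_flagged(path: Path) -> bool:
    """Flag the upload if its fingerprint appears in the known-abuse database."""
    return image_hash(path) in KNOWN_HASHES


if __name__ == "__main__":
    upload = Path("uploaded_photo.jpg")  # placeholder for an uploaded photo
    if is_flagged(upload):
        print("Match against known-abuse database: escalate for human review.")
    else:
        print("No match found.")
```

Exact hashing like this only catches copies of material that is already in the database; it is the AI classifier described below, which flags images that have never been seen before, that introduces the kind of false positive at the heart of this story.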

In 2018, Google went a step further and released its Content Safety API, an AI toolkit that, in its own words, enables it to proactively identify previously unknown child pornography. Other organizations can use the technology as well. But it evidently does not always work flawlessly, so even innocent nude photos from the private sphere can end up classified as child pornography.
