Google refuses to reinstate man’s account after he took medical images of son’s groin

Experts say case highlights dangers of automated detection of child sexual abuse images


Google has refused to reinstate a man’s account after it wrongly flagged medical images he took of his son’s groin as child sexual abuse material (CSAM), the New York Times first reported. Experts say it’s an inevitable pitfall of trying to apply a technological solution to a societal problem.

Experts have long warned about the limitations of automated child sexual abuse image detection systems, particularly as companies face regulatory and public pressure to help address the existence of sexual abuse material.

“These companies have access to a tremendously invasive amount of data about people’s lives. And still they don’t have the context of what people’s lives actually are,” said Daniel Kahn Gillmor, a senior staff technologist at the ACLU. “There’s all kinds of things where just the fact of your life is not as legible to these information giants.” He added that the use of these systems by tech companies that “act as proxies” for law enforcement puts people at risk of being “swept up” by “the power of the state.”

The man, only identified as Mark by the New York Times, took pictures of his son’s groin to send to a doctor after realizing it was inflamed. The doctor used that image to diagnose Mark’s son and prescribe antibiotics. When the photos were automatically uploaded to the cloud, Google’s system identified them as CSAM. Two days later, Mark’s Gmail and other Google accounts, including Google Fi, which provides his phone service, were disabled over “harmful content” that was “a severe violation of the company’s policies and might be illegal”, the Times reported, citing a message on his phone. He later found out that Google had flagged another video he had on his phone and that the San Francisco police department opened an investigation into him.

Mark was cleared of any criminal wrongdoing, but Google has said it will stand by its decision.

“We follow US law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms,” said Christa Muldoon, a Google spokesperson.
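
The “hash matching” the spokesperson refers to means comparing a fingerprint of each uploaded file against databases of known abuse imagery. As a rough illustration only, and not a description of Google’s actual system, which relies on proprietary perceptual hashing and machine-learning classifiers rather than exact byte matching, the following minimal Python sketch shows the basic idea; the folder name and hash values are hypothetical.

```python
# Minimal sketch of exact hash matching against a blocklist of known digests.
# Illustrative only: real deployments use perceptual hashing, which matches
# visually similar images rather than identical bytes, combined with AI models.

import hashlib
from pathlib import Path

# Hypothetical database of SHA-256 hex digests of known prohibited images.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_flagged(path: Path) -> bool:
    """Flag a file only if its digest exactly matches a known hash."""
    return file_digest(path) in KNOWN_HASHES

if __name__ == "__main__":
    # Hypothetical upload folder scanned at upload time.
    for upload in Path("uploads").glob("*.jpg"):
        if is_flagged(upload):
            print(f"{upload} matched a known hash and would be escalated for review")
```

A scheme like this only catches files already present in a hash database; detecting previously unseen imagery requires the AI classifiers the spokesperson mentions, which is where the false positives at issue in Mark’s case arise.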

Muldoon added that Google staffers who review CSAM were trained by medical experts to look for rashes or other issues. Those staffers, however, were not themselves medical experts, and medical experts were not consulted when reviewing each case, she said.

That’s just one way these systems can cause harm, according to Gillmor. To address, for instance, any limitations algorithms might have in distinguishing between harmful sexual abuse images and medical images, companies often have a human in the loop. But those humans are themselves inherently limited in their expertise, and getting the proper context for each case requires further access to user data. Gillmor said it was a much more intrusive process that could still be an ineffective method of detecting CSAM.


“These systems can cause real problems for people,” he said. “And it’s not just that I don’t think that these systems can catch every case of child abuse, it’s that they have really terrible consequences in terms of false positives for people. People’s lives can be really upended by the machinery and the humans in the loop simply making a bad decision because they don’t have any reason to try to fix it.”

Gillmor argued that technology wasn’t the solution to this problem. In fact, it could introduce many new problems, he said, including creating a robust surveillance system that could disproportionately harm those on the margins.

“There’s a dream of a sort of techno-solutionists thing, [where people say], ‘Oh, well, you know, there’s an app for me finding a cheap lunch, why can’t there be an app for finding a solution to a thorny social problem, like child sexual abuse?’” he said. “Well, you know, they might not be solvable by the same kinds of technology or skill set.”
