More than a dozen prominent cybersecurity experts on Thursday criticized plans by Apple and the European Union to monitor people’s phones for illicit material, calling the efforts an ineffective and dangerous strategy that would embolden government surveillance.
In a 46-page study, the researchers wrote that Apple’s proposal to detect child sexual abuse images on iPhones, as well as ideas floated by members of the European Union to detect similar abusive and terrorist material on encrypted devices in Europe, used “dangerous technology.”
“It should be a national security priority to counter attempts to spy on and influence law-abiding citizens,” the researchers wrote.
The technology, called client-side scanning, would allow Apple, or potentially law enforcement officials in Europe, to detect child sexual abuse images on someone’s phone by scanning images as they are uploaded to Apple’s iCloud storage service.
When Apple announced the planned tool in August, it said that so-called image fingerprints would be compared against a database of known child sexual abuse material to look for matches.
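The matching step described above can be sketched in a few lines. This is a minimal illustration, assuming a toy “average hash” in place of Apple’s actual perceptual-hash system, whose internals the article does not detail: each image is reduced to a short bit string, and an upload is flagged if its fingerprint falls within a small Hamming distance of a fingerprint in the database (the distance threshold here is a hypothetical tuning parameter).

```python
def average_hash(pixels):
    """Toy perceptual hash for an 8x8 grayscale image (a list of 64
    ints, 0-255): each bit records whether a pixel is brighter than
    the image's mean brightness."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of bit positions where two fingerprints differ."""
    return bin(a ^ b).count("1")

def matches(candidate_hash, database, threshold=4):
    """Flag an image if its fingerprint is within `threshold` bits of
    any known fingerprint in the database."""
    return any(hamming(candidate_hash, h) <= threshold for h in database)
```

A real deployment would use a learned, rotation- and crop-tolerant hash rather than raw brightness bits, but the compare-against-a-database step is conceptually the same.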
But the plan sparked an uproar among privacy advocates and raised concerns that the technology could erode digital privacy and ultimately be co-opted by authoritarian governments to track down dissidents and other adversaries.
Apple said it would refuse any such requests from foreign governments, but the outcry prompted it to delay the release of the scanning tool in September. The company declined to comment on the report published on Thursday.
The cybersecurity researchers said they had begun their study before Apple’s announcement. Documents released by the European Union and a meeting with EU officials last year led them to believe that the bloc’s regulators wanted a similar program that would scan not only for child sexual abuse images but also for signs of organized crime and terrorism.
The researchers believe that a proposal to allow scanning of photos in the European Union could be introduced as soon as this year.
The group said it was publishing its findings now to inform the European Union of the dangers of its plan, and because “the expansion of the surveillance powers of the state really is crossing a red line,” said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.
Aside from surveillance concerns, the researchers say, their findings indicate that the technology is not effective at identifying images of child sexual abuse. Within days of Apple’s announcement, they said, people had pointed out ways to avoid detection by tweaking the image a bit.
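The evasion the researchers describe can be illustrated with a toy “average hash” standing in for a real perceptual hash (whose internals are not given in the article): editing just a handful of pixels shifts the fingerprint enough to fall outside a small matching threshold, so the image would no longer be flagged.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean brightness."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of bit positions where two fingerprints differ."""
    return bin(a ^ b).count("1")

# An 8x8 image in a fingerprint database, and a lightly edited copy
# with only 5 of its 64 pixels darkened.
original = [100] * 32 + [156] * 32
edited = original[:]
for i in range(32, 37):
    edited[i] = 100

# The small edit moves the fingerprint 5 bits away -- beyond a match
# threshold of 4 bits, so the edited copy would evade detection.
distance = hamming(average_hash(original), average_hash(edited))
```

Real perceptual hashes are far more robust than this toy, but the researchers’ point stands: any fixed fingerprinting scheme invites adversarial edits tuned to push an image just past the matching threshold.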
“It allows scanning of a personal private device without any probable cause for anything illegitimate being done,” added Susan Landau, a professor of cybersecurity and policy at Tufts University and another member of the group. “It is extremely dangerous. It is dangerous for business, national security, public safety, and privacy.”