Apple Says It Won't Let the Government Turn Its Child Abuse Detection Tools Into a Surveillance Weapon

Technology

Gizmodo 09 August, 2021 - 06:45pm

Is Apple going to scan photos?

Apple announced last week that it will begin scanning all photos uploaded to iCloud for potential child sexual abuse material (CSAM). It's come under a great deal of scrutiny and generated some outrage, so here's what you need to know about the new technology before it rolls out later this year. (Macworld: "Apple's child abuse photo scanning: What it is and why people are worried")

Apple wants to check your phone for child abuse images – what could possibly go wrong?

The Guardian 09 August, 2021 - 09:30pm

Apple, which has spent big bucks on ad campaigns boasting about how much it values its users’ privacy, is about to start poking through all your text messages and photos. Don’t worry, the tech company has assured everyone, the prying is for purely benevolent purposes. On Thursday Apple announced a new set of “protection for children” features that will look through US iPhones for images of child abuse. One of these features is a tool called neuralMatch, which will scan photo libraries to see if they contain anything that matches a database of known child abuse imagery. Another feature, which parents can enable or disable, scans iMessage images sent or received by accounts that belong to a minor. It will then notify the parents when a child receives sexually explicit imagery.

On the surface Apple’s new features sound both sensible and commendable. Technology-facilitated child sexual exploitation is an enormous problem; one that’s spiralling out of control. In 1998 there were more than 3,000 reports of child sex abuse imagery, according to a 2019 paper published in conjunction with the National Center for Missing and Exploited Children. In 2018 there were 18.4m. These reports included more than 45m images and videos that were flagged as child sexual abuse. Technology companies have a duty to curb the terrible abuses their platforms help facilitate. Apple’s new features are an attempt to do just that.

But while Apple’s attempts to protect children may be valiant, they also open a Pandora’s box of privacy and surveillance issues. Of particular concern to security researchers and privacy activists is the fact that this new feature doesn’t just look at images stored on the cloud; it scans users’ devices without their consent. Essentially that means there’s now a sort of “backdoor” into an individual’s iPhone, one which has the potential to grow wider and wider. The Electronic Frontier Foundation (EFF), an online civil liberties advocacy group, warns that “all it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content … That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.” You can imagine, for example, how certain countries might pressure Apple to scan for anti-government messages or LGBTQ content.

Jillian York, the author of a new book about how surveillance capitalism affects free speech, is also concerned that Apple’s new parental controls mean images shared between two minors could be non-consensually shared with one of their parents. “This strikes me as assumptive of two things,” she told me. “One, that adults can be trusted with these images, and two, that every other culture has the same ideas about what constitutes nudity and sexuality as the US does.”

Edward Snowden, who knows a thing or two about abuses of surveillance, has also voiced concerns about Apple’s new features. “No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this,” Snowden tweeted. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs–*without asking.*”

But why would a technology company bother asking the public what it wants? We all know that big tech knows what’s best for us plebs. While mass surveillance may sound scary, I’m sure we can all trust Apple et al. to do the right thing. No need to worry about hackers or Apple contractors accessing and uploading your nudes! No need to worry about Apple employees exploiting the technology to spy on people, in the same way that Uber employees did with their “God View” tool! I’m sure it will all be perfectly fine.

While we’re on the topic of tech companies policing inappropriate content … Facebook’s state-of-the-art technology recently censured a New York gardening group for using the word “hoe”. This wasn’t the first time Facebook has had trouble with the word – as residents of Plymouth Hoe in the UK know all too well.

Indonesia’s army has reportedly stopped subjecting female recruits to so-called virginity tests. The (highly unscientific) procedure was known as the “two-finger test” because doctors would probe a potential recruit’s vagina to check if her hymen was intact. If the doctor decided you weren’t a virgin you wouldn’t get into the army. While it may seem mindboggling that Indonesia is only now banning this barbaric practice, virginity testing is a lot more widespread than you might think – it’s still legal in certain parts of the United States.

The state channel CCTV also asked Gong Lijiao, who won gold in the women’s shot put, if she had plans for a “woman’s life.” What does that mean exactly? Losing weight, getting married and having kids, obviously.

The New York attorney general’s office released a damning 165-page report on Tuesday that determined Andrew Cuomo had illegally abused and harassed women subordinates. Despite widespread calls for him to resign, Cuomo is still stubbornly clinging to power. The governor is “reckless, disrespectful, misogynist and allergic to taking responsibility,” Moira Donegan writes. “He has demonstrated not merely an unfitness for power but a personal moral vacuity – an unwillingness to think of other people, of women, as equals, or to imagine his own actions as having consequences.”

“Gritty paternalism was the political brand Mr. Cuomo had been building his whole career, an image shrewdly forged in a blend of aggressive masculinity and performed compassion,” Ginia Bellafante writes in the New York Times. Voters keep electing men like this because we’ve been trained to believe that’s what good leadership looks like. However, as Cuomo proves, posturing as a “tough guy” isn’t the same as being a good leader. (Funnily enough, I’ve written a book all about this – and how we desperately need to embrace a more ‘feminine’ model of leadership – which is now available for pre-order!).

A Russian woman has reportedly sued McDonald’s after a tempting cheeseburger commercial made her break her fast during Lent. “When I saw an advertising banner – I could not help myself,” the woman complained. She is asking the fast food chain to compensate her for moral damage in the amount of 1,000 rubles ($14).

Apple to check U.S. iPhones for images of child sexual abuse

CBS This Morning 09 August, 2021 - 09:30pm

On Friday, Apple revealed plans to tackle the issue of child abuse on its operating systems within the United States via updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

The most contentious component of Cupertino's plans is its child sexual abuse material (CSAM) detection system. It will involve Apple devices matching images on-device against a list of known CSAM image hashes provided by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organisations before an image is stored in iCloud.

"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result," Apple said.

"The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image."

Once an unstated threshold is reached, Apple will manually look at the vouchers and review the metadata. If the company determines it is CSAM, the account will be disabled and a report sent to NCMEC. Cupertino said users will be able to appeal to have an account re-enabled.

Apple is claiming its threshold will ensure "less than a one in one trillion chance per year of incorrectly flagging a given account".
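Apple has not published the threshold or the per-image false-match rate behind that figure, so the numbers below are purely illustrative assumptions rather than Apple's parameters. The sketch just shows the arithmetic such a claim rests on: the chance that an account full of innocent photos accumulates enough coincidental matches to cross a review threshold, treating false matches as independent.

```python
def false_flag_probability(n_photos: int, per_image_fp: float, threshold: int) -> float:
    """P(X >= threshold) for X ~ Binomial(n_photos, per_image_fp): the chance an
    account of entirely innocent photos crosses the match threshold, assuming
    independent per-image false matches."""
    p, q = per_image_fp, 1.0 - per_image_fp
    term = q ** n_photos                     # probability of exactly zero matches
    total = term if threshold <= 0 else 0.0
    for k in range(n_photos):
        # Binomial recurrence: term(k+1) = term(k) * (n-k)/(k+1) * p/q
        term *= (n_photos - k) / (k + 1) * (p / q)
        if k + 1 >= threshold:
            total += term
        if term == 0.0:                      # remaining terms underflow to zero
            break
    return total

# Illustrative numbers only (not Apple's): 20,000 photos in an account, a
# one-in-a-million per-image false-match rate, and review after 30 matches.
print(false_flag_probability(20_000, 1e-6, 30))   # vanishingly small under these assumptions
```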

The two other features Apple announced on Friday were having Siri and search provide warnings when a user searches for CSAM-related content, and using machine learning to warn children when they are about to view sexually explicit photos in iMessage.

"When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it," Apple said.

"Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it."

Apple's plans drew criticism over the weekend, with the Electronic Frontier Foundation labelling the features as a backdoor.

"If you've spent any time following the Crypto Wars, you know what this means: Apple is planning to build a backdoor into its data storage system and its messaging system," the EFF wrote.

"Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

EFF warned that once the CSAM system was in place, changing the system to search for other sorts of content would be the next step.

"That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change," it said.

"The abuse cases are easy to imagine: governments that outlaw homosexuality might require the classifier to be trained to restrict apparent LGBTQ+ content, or an authoritarian regime might demand the classifier be able to spot popular satirical images or protest flyers."

The EFF added that because iMessage would begin scanning images sent and received, the communications platform could no longer be considered end-to-end encrypted.

"Apple and its proponents may argue that scanning before or after a message is encrypted or decrypted keeps the 'end-to-end' promise intact, but that would be semantic manoeuvring to cover up a tectonic shift in the company's stance toward strong encryption," the foundation said.

Head of WhatsApp Will Cathcart said the Facebook-owned platform would not be adopting Apple's approach and would instead rely on users reporting material.

"This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable," Cathcart said.

The WhatsApp chief asked how the system would work in China, and what would happen once a spyware crew figured out how to exploit the system.

WhatsApp does scan unencrypted imagery -- such as profile and group photos -- for child abuse material.

"We have additional technology to detect new, unknown CEI within this unencrypted information. We also use machine learning classifiers to both scan text surfaces, such as user profiles and group descriptions, and evaluate group information and behavior for suspected CEI sharing," the company said.

Former Facebook CSO Alex Stamos said he was happy to see Apple taking responsibility for the impacts of its platform, but questioned the approach.

"They both moved the ball forward technically while hurting the overall effort to find policy balance," Stamos said.

"One of the basic problems with Apple's approach is that they seem desperate to avoid building a real trust and safety function for their communications products. There is no mechanism to report spam, death threats, hate speech, NCII, or any other kinds of abuse on iMessage."

Instead of its "non-consensual scanning of local photos, and creating client-side ML that won't provide a lot of real harm prevention", Stamos said he would have preferred if Apple had robust reporting in iMessage, staffed a child safety team to investigate reports, and slowly rolled out client-side machine learning. The former Facebook security chief said he feared Apple had poisoned the well on client-side classifiers.

"While the PRC has been invoked a lot, I expect that the UK Online Safety Bill and EU Digital Services Act were much more important to Apple's considerations," he said.

Whistleblower Edward Snowden accused Apple of deploying mass surveillance around the globe.

"Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow," he said.

"They turned a trillion dollars of devices into iNarcs—*without asking.*"

Late on Friday, 9to5Mac reported on an internal memo from Apple that contained a note from NCMEC.

"We know that the days to come will be filled with the screeching voices of the minority," NCMEC reportedly said.

Apple pushes back against child abuse scanning concerns in new FAQ

The Verge 09 August, 2021 - 09:30pm

‘We will not accede to any government’s request to expand it’

Apple’s new tools, announced last Thursday, include two features designed to protect children. One, called “communication safety,” uses on-device machine learning to identify and blur sexually explicit images received by children in the Messages app, and can notify a parent if a child aged 12 or younger decides to view or send such an image. The second is designed to detect known CSAM by scanning users’ images if they choose to upload them to iCloud. Apple is notified if CSAM is detected, and it will alert the authorities when it verifies such material exists.

The plans met with a swift backlash from digital privacy groups and campaigners, who argued that they introduce a backdoor into Apple’s software. These groups note that once such a backdoor exists there is always the potential for it to be expanded to scan for types of content that go beyond child sexual abuse material. Authoritarian governments could use it to scan for politically dissident material, or anti-LGBT regimes could use it to crack down on sexual expression.

“Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” the Electronic Frontier Foundation wrote. “We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of ‘terrorist’ content that companies can contribute to and access for the purpose of banning such content.”

However, Apple argues that it has safeguards in place to stop its systems from being used to detect anything other than sexual abuse imagery. It says that its list of banned images is provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations, and that the system “only works with CSAM image hashes provided by NCMEC and other child safety organizations.” Apple says it won’t add to this list of image hashes, and that the list is the same across all iPhones and iPads to prevent individual targeting of users.
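Apple's assurance leans heavily on the claim that one identical hash list ships to every device. One way such a claim could be made checkable, at least in principle, is to compare a digest of the shipped database against a single published value per OS release; the snippet below is a hypothetical sketch of that idea, not a description of any verification mechanism Apple has documented.

```python
import hashlib

def database_root_digest(hash_list: list[bytes]) -> str:
    """Digest over the sorted entries of the on-device CSAM hash database, so
    every device carrying the same list computes the same value."""
    h = hashlib.sha256()
    for entry in sorted(hash_list):
        h.update(entry)
    return h.hexdigest()

# Hypothetical check: a device (or an outside auditor) compares its local
# digest with one value published for that OS release. Any per-user change to
# the list would change the digest.
PUBLISHED_DIGEST = "<value published per OS release>"   # placeholder, not a real value

def list_matches_published(local_hash_list: list[bytes]) -> bool:
    return database_root_digest(local_hash_list) == PUBLISHED_DIGEST
```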

The company also says that it will refuse demands from governments to add non-CSAM images to the list. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” it says.

It’s worth noting that despite Apple’s assurances, the company has made concessions to governments in the past in order to continue operating in their countries. It sells iPhones without FaceTime in countries that don’t allow encrypted phone calls, and in China it’s removed thousands of apps from its App Store, as well as moved to store user data on the servers of a state-run telecom.

The FAQ also fails to address some concerns about the feature that scans Messages for sexually explicit material. The feature does not share any information with Apple or law enforcement, the company says, but it doesn’t say how it’s ensuring that the tool’s focus remains solely on sexually explicit images.

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts,” wrote the EFF. The EFF also notes that machine-learning technologies frequently classify this content incorrectly, and cites Tumblr’s attempts to crack down on sexual content as a prominent example of where the technology has gone wrong.

Epic Games CEO slams Apple 'government spyware'

AppleInsider 09 August, 2021 - 09:30pm

I've tried hard to see this from Apple's point of view. But inescapably, this is government spyware installed by Apple based on a presumption of guilt. Though Apple wrote the code, its function is to scan personal data and report it to government. https://t.co/OrkfOSjvS1

I agree one hundred percent with this article. Someone commented that when you work for a company your data can be scanned. I don't use Facebook or Twitter and never post anything online. When I saw this comment I had to create an account and respond. This article has nothing to do with companies monitoring employees' computers. Apple has no business scanning customers' personal files. Should they also scan your text messages and send your call logs to the feds? Would that make the world even safer? The Apple cloud services are for private storage and to share files with friends and family. Should the government open every storage locker and closet and make sure there is nothing illegal in there?

As an IT pro with over 20 years' experience I've been advising clients about the pros and cons of cloud storage for decades. Cloud storage is supposed to make data management and backup easier for the end user. If you store your personal data on your hard drive it is private, and you should expect the same when storing it in the cloud. I have never stored any of my data in the cloud and still back up my iPhone to my computer using a USB cable. Cloud storage has been pushed on people over the past decade. Companies offer free storage to try and get you interested, and then when it fills up they can start charging you for more space. Cloud storage is not meant to encroach on people's privacy. If someone is arrested for something illegal their devices and accounts can be analyzed at that time.

Apple says it will reject any government demands to use new child sexual abuse image detection system for surveillance

CNBC 09 August, 2021 - 02:13pm

Apple defended its new system that will scan iCloud for illegal child sexual abuse materials, or CSAM, on Monday amid a controversy over whether the system reduces Apple user privacy and could be used by governments to surveil citizens.

Last week, Apple announced it has started testing a system that uses sophisticated cryptography to identify when users upload collections of known child pornography to its cloud storage service. It said it can do this without learning about the contents of a user's photos stored on its servers.

Apple reiterated on Monday that its system is more private than those used by companies such as Google and Microsoft because its system uses both its servers and software that will be installed on people's iPhones through an iOS update.

Privacy advocates and technology commentators are worried Apple's new system could be expanded in some countries through new laws to check for other types of images, such as photos with political content.

Apple said in a document posted to its website Sunday that governments cannot force it to add non-CSAM images to a hash list — the file of numbers corresponding to known child sexual abuse images — that Apple will distribute to iPhones to enable the system.

"Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups," Apple said in the document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

It continued: "Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it."

Some cryptographers are worried about what could happen if a country such as China were to pass a law saying the system also has to include politically sensitive images. Apple CEO Tim Cook has previously said that the company follows laws in every country where it conducts business.

Companies in the U.S. are required to report CSAM to the National Center for Missing & Exploited Children and face fines of up to $300,000 when they discover illegal images and don't report them.

Apple's reputation for defending privacy has been cultivated for years through its actions and marketing. In 2016, Apple faced off against the FBI in court to protect the integrity of its on-device encryption systems in the investigation of a mass shooter.

But Apple has also faced significant pressure from law enforcement officials about the possibility of criminals "going dark," or using privacy tools and encryption to prevent messages or other information from being within the reach of law enforcement.

The controversy over Apple's new system, and whether it's surveilling users, threatens Apple's public reputation for building secure and private devices, which the company has used to break into new markets in personal finance and health care.

Critics are concerned the system will operate partly on the iPhone itself, instead of only scanning photos that have been uploaded to the company's servers. Apple's competitors typically only scan photos stored on their servers.

"It's truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours," technology commentator Ben Thompson wrote in a newsletter on Monday.

Apple continues to defend its systems as a genuine improvement that protects children and will reduce the amount of CSAM being created while still protecting iPhone user privacy.

Apple said its system is significantly stronger and more private than previous systems by every privacy metric the company tracks and that it went out of its way to build a better system to detect these illegal images.

Unlike current systems, which run in the cloud and can't be inspected by security researchers, Apple's system can be inspected through its distribution in iOS, an Apple representative said. By moving some processing onto the user's device, the company can derive stronger privacy properties, such as the ability to find CSAM matches without running software on Apple servers that check every single photo.

Apple said on Monday its system doesn't scan private photo libraries that haven't been uploaded to iCloud.

Apple also confirmed it will process photos that have already been uploaded to iCloud. The changes will roll out through an iPhone update later this year, after which users will be alerted that Apple is beginning to check photos stored on iCloud against a list of fingerprints that correspond to known CSAM, Apple said.
