Apple explains how iPhones will scan photos for child-sexual-abuse images


Ars Technica 05 August, 2021 - 06:29pm

Does Apple scan your photos?

Apple has also been scanning user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images. ... The company has been under pressure from governments and law enforcement to allow for surveillance of encrypted data.

USA TODAY: Apple wants to look in your iPhone: Company plans to scan U.S. phones for images of child abuse

Apple's Plan to "Think Different" About Encryption Opens a Backdoor to Your Private Life

EFF 05 August, 2021 - 10:20pm


Apple to start checking iPhone and iCloud photos for child abuse imagery

WSB Atlanta 05 August, 2021 - 05:23pm

The new service will turn photos on devices into an unreadable set of hashes -- long strings of numbers -- stored on user devices, the company explained at a press conference. Those numbers will be matched against a database of hashes provided by the National Center for Missing and Exploited Children.
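
To make the matching step concrete, here is a minimal Python sketch. The hash function, the example database entry, and all names below are placeholders invented for illustration; Apple's system derives a perceptual hash from image content rather than using an ordinary cryptographic hash like the one shown here.

    # Minimal sketch of matching a photo's hash against a database of known
    # hashes. The SHA-256 stand-in and the example database entry are
    # illustrative placeholders, not Apple's actual algorithm or data.
    import hashlib

    def photo_hash(image_bytes: bytes) -> str:
        # Apple's system computes a perceptual hash of the image content;
        # SHA-256 is used here only to keep the sketch self-contained.
        return hashlib.sha256(image_bytes).hexdigest()

    # Hypothetical database of known-image hashes (values made up).
    known_hashes = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def is_match(image_bytes: bytes) -> bool:
        """True if this photo's hash appears in the known-hash database."""
        return photo_hash(image_bytes) in known_hashes

In the design described here, this comparison happens on the device itself; the article below describes the "safety voucher" mechanism that carries match information to Apple's servers.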

In taking this step, Apple (AAPL) is following some other big tech companies such as Google (GOOG) and Facebook (FB). But it's also trying to strike a balance between safety and privacy, the latter of which Apple has stressed as a central selling point for its devices.

Some privacy advocates were quick to raise concerns about the effort.

"Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the US, but around the world," says Greg Nojeim, co-director of the Security & Surveillance Project at the Center for Democracy & Technology. "Apple should abandon these changes and restore its users' faith in the security and integrity of their data on Apple devices and services."

In a post on its website outlining the updates, the company said: "Apple's method ... is designed with user privacy in mind." Apple emphasized that the tool does not "scan" user photos and that only images whose hashes match entries in the database will be flagged. (This should mean a user's harmless picture of their child in the bathtub will not be flagged.)

Apple also said each device will create a doubly-encrypted "safety voucher" for photos -- a packet of information that is sent to Apple's servers. Once a certain number of safety vouchers are flagged as matches, Apple's review team will be alerted. It will then decrypt the vouchers, disable the user's account and alert the National Center for Missing and Exploited Children, which can inform law enforcement. Users who think their accounts have been mistakenly flagged can file an appeal to have them reinstated.
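
The threshold behavior described in that paragraph can be sketched as follows. In Apple's published design, threshold secret sharing means the vouchers cannot even be decrypted until the threshold is crossed; this sketch models only the counting behavior, and the threshold value is an invented placeholder.

    # Simplified sketch of the "certain number of flagged safety vouchers"
    # logic; the cryptography is omitted and the threshold is made up.
    MATCH_THRESHOLD = 10  # placeholder value; not stated in the article

    class AccountVouchers:
        def __init__(self) -> None:
            self.flagged = []  # vouchers whose photos matched a known hash

        def record(self, voucher: bytes, matched_known_hash: bool) -> None:
            # Only vouchers for matched photos count toward the threshold.
            if matched_known_hash:
                self.flagged.append(voucher)

        def ready_for_review(self) -> bool:
            # Only past this point would human review be triggered (and, in
            # the real design, decryption of the vouchers become possible).
            return len(self.flagged) >= MATCH_THRESHOLD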

Apple's goal is to ensure that identical and visually similar images result in the same hash, even if an image has been slightly cropped, resized or converted from color to black and white.
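
A simple "difference hash" (dHash) illustrates the general idea of a hash that survives resizing and color removal. Apple's own hashing algorithm is different and is not described in this article, so the sketch below, which relies on the third-party Pillow imaging library, is only a stand-in for the technique.

    # Illustrative perceptual "difference hash": visually similar images tend
    # to produce identical or nearly identical bit strings, even after
    # rescaling or grayscale conversion. Generic technique, not Apple's.
    from PIL import Image  # third-party: pip install Pillow

    def dhash(image: Image.Image, hash_size: int = 8) -> int:
        # Shrinking to a tiny grayscale grid discards the detail that
        # cropping, resizing, or recoloring would otherwise change.
        small = image.convert("L").resize((hash_size + 1, hash_size))
        pixels = list(small.getdata())
        bits = 0
        for row in range(hash_size):
            for col in range(hash_size):
                left = pixels[row * (hash_size + 1) + col]
                right = pixels[row * (hash_size + 1) + col + 1]
                bits = (bits << 1) | (1 if left > right else 0)
        return bits

    def hamming_distance(a: int, b: int) -> int:
        """Number of differing bits; small values indicate similar images."""
        return bin(a ^ b).count("1")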

"The reality is that privacy and child protection can co-exist," John Clark, president and CEO of the National Center for Missing & Exploited Children, said in a statement. "We applaud Apple and look forward to working together to make this world a safer place for children."

The announcement is part of a greater push around child safety from the company. Apple said Thursday a new communication tool will also warn users under age 18 when they're about to send or receive a message with an explicit image. The tool, which has to be turned on in Family Sharing, uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. Parents with children under the age of 13 can additionally turn on a notification feature in the event that a child is about to send or receive a nude image. Apple said it will not get access to the messages.
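
The communication-safety feature described above can be pictured as a purely on-device gate. The score threshold and decision rules below are invented for illustration; Apple has not published its model or an API for this feature, and the explicitness score here is assumed to come from an on-device classifier.

    # Sketch of an on-device check like the one described above. Thresholds
    # and rules are placeholders; nothing leaves the device in this sketch.
    from dataclasses import dataclass

    @dataclass
    class ChildAccount:
        age: int
        parent_notifications_on: bool  # opt-in via Family Sharing

    def handle_image(account: ChildAccount, explicit_score: float) -> dict:
        """explicit_score: assumed model output in [0, 1]; 1.0 = clearly explicit."""
        actions = {"warn_user": False, "notify_parent": False}
        if account.age < 18 and explicit_score >= 0.9:  # threshold is made up
            actions["warn_user"] = True
            if account.age < 13 and account.parent_notifications_on:
                actions["notify_parent"] = True
        return actions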

That tool will be available as a future software update, according to the company.

© 2021 Cable News Network. A Warner Media Company. All Rights Reserved.
