#CSAM detection tool for iCloud Photos
orbitbrain · 1 year
Apple Scraps CSAM Detection Tool for iCloud Photos
By Ryan Naraine on December 08, 2022
Apple has scrapped plans to ship a controversial child pornography protection tool for iCloud Photos, a concession to privacy rights advocates who warned it could have been used for government surveillance. Instead, the Cupertino, California device maker said it would expand…
macnews-org · 2 years
Apple’s CSAM detection system may not be perfect, but it is inevitable
Over a year ago, Apple announced plans to scan for child sexual abuse material (CSAM) with the iOS 15.2 release. Despite its imperfections, and despite Apple’s silence on the subject since, the technology is inevitable.
[Image: logos for Messages, Siri, and Photos]
Apple announced the scanning technology on August 5, 2021, to appear in iCloud Photos, iMessage, and Siri. These tools were designed to improve the safety of children on its…
just-stop · 3 years
Apple Will Scan iCloud Photos for Child Sexual Abuse Images and Report Matches to Legal Authorities
Apple is adding a series of new child-safety features to its next big operating system updates for iPhone and iPad.
As part of iOS 15 and iPadOS 15 updates later this year, the tech giant will implement a feature to detect photos stored in iCloud Photos that depict sexually explicit activities involving children.
“This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC),” the company said in a notice on its website. NCMEC acts as a reporting center for child sexual abuse material (CSAM) and works in collaboration with law enforcement agencies across the U.S.
According to Apple, its method of detecting known CSAM is “designed with user privacy in mind.” The company says it is not directly accessing customers’ photos but instead is using a device-local, hash-based matching system to detect child abuse images. Apple says it can’t actually see user photos or the results of such scans unless there’s a hit.
If there is a match between a user’s photos and the CSAM database, Apple manually reviews the report to confirm the presence of sexually explicit images of children, then disables the user’s account and sends a report to NCMEC. If a user feels their account has been mistakenly flagged, according to Apple, “they can file an appeal to have their account reinstated.” According to Apple, the system is accurate enough that the chance of incorrectly flagging a given account is less than one in one trillion per year.
Apple has posted a 12-page technical summary of its CSAM detection system on its website.
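To make the flow described above a bit more concrete, here is a minimal, hypothetical Python sketch of threshold-gated matching against a set of known-CSAM fingerprints. Everything in it is a stand-in: Apple's actual system uses its NeuralHash perceptual hash together with private set intersection and threshold secret sharing, so the device itself never learns whether an individual photo matched, and none of that cryptography is reproduced here.

```python
# Hypothetical sketch of threshold-gated matching against known-CSAM fingerprints.
# Not Apple's NeuralHash/PSI protocol; all names and values are illustrative.
import hashlib
from dataclasses import dataclass, field

MATCH_THRESHOLD = 30  # illustrative; Apple has not published its exact threshold


@dataclass
class Account:
    account_id: str
    matched_vouchers: list = field(default_factory=list)


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint. A real system would use a perceptual hash such as
    NeuralHash; plain SHA-256 is used here only so the sketch runs."""
    return hashlib.sha256(image_bytes).hexdigest()


def record_upload(account: Account, image_bytes: bytes, known_hashes: set) -> None:
    """Called when a photo is uploaded to iCloud Photos; keep a voucher on a match."""
    if fingerprint(image_bytes) in known_hashes:
        account.matched_vouchers.append(fingerprint(image_bytes))


def ready_for_human_review(account: Account) -> bool:
    """Nothing is escalated to manual review (and, if confirmed, a report to NCMEC
    and account suspension) until enough matches have accumulated."""
    return len(account.matched_vouchers) >= MATCH_THRESHOLD
```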
In addition, with Apple’s iOS 15 update, the iPhone’s Messages app will add new tools to warn children and their parents if they are receiving or sending sexually explicit photos.
“When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo,” Apple said. “As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos.”
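A rough sketch of the gating flow that quote describes, assuming some on-device classifier supplied as a callable; none of the names below correspond to an actual Apple API, and the parental notification applies only to child accounts whose parents have enabled it.

```python
# Conceptual sketch of the Messages warning flow quoted above. The classifier is
# passed in as a callable because Apple's on-device model is not public; all
# names here are hypothetical, not Apple's API.
from typing import Callable


def handle_incoming_photo(photo_bytes: bytes,
                          classifier: Callable[[bytes], bool],
                          *, is_child_account: bool,
                          parent_notifications_enabled: bool) -> dict:
    """Return how the Messages UI should treat an incoming photo."""
    if not (is_child_account and classifier(photo_bytes)):
        return {"display": "normal"}
    response = {
        "display": "blurred",                       # photo is blurred by default
        "warning": "This photo may be sensitive.",  # child is warned and reassured
        "resources": ["link_to_help_resources"],    # helpful resources are offered
    }
    if parent_notifications_enabled:
        # The child is told in advance that viewing the photo will notify a parent.
        response["on_view"] = "notify_parent"
    return response
```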
Apple’s iOS 15 also will provide updates to Siri and Search to “provide parents and children expanded information and help if they encounter unsafe situations.” Siri and Search will intervene when users try to search for child sexual abuse material, displaying prompts that will “explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”
The iOS 15 update is slated for release in the fall of 2021 and will be available for iPhone 6s and later models.
cpoetter · 3 years
Through the Week - 2021-09-04
Autumn has begun. Very nice! I can't really stand warmth and heat, so this summer was a relief for me. It reminded me of my youth, when a summer like this felt like the norm.
But hey, you haven't landed on a weather or climate blog. Back to the unimportant things in life.
Apple continues to make headlines.
The announced scanning of images for child sexual abuse is being re-evaluated, under pressure from security experts, human rights activists, and generally rather negative feedback, via Techcrunch:
Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.
More on the topic at Techdirt.
Bad news for Apple is also coming out of Asia. The Japan Fair Trade Commission has ruled that Apple must allow "reader" apps (Spotify, Netflix, Hulu, ...) to send their customers directly to their own sites to sign up. This also lets them avoid in-app payments. Apple will apply this regulation worldwide. However:
The rule change has an extremely limited scope, as Apple claims it only agreed to let developers of so-called reader apps to “share a single link to their website to help users set up and manage their account.”
In South Korea, it was decided that Apple and Google must also allow in-app purchases outside their app stores (see Techcrunch). Apple had already agreed to a similar rule for the US last week.
Protocol:
The South Korean law is a fierce repudiation of the mobile app store business model, and also offers a roadmap for regulators in the EU and the U.S. to adopt similar approaches to reining in Big Tech.
We'll see.
In this context, I don't want to leave out John Gruber's piece, which highlights the upsides of app store subscriptions for users. Valid points.
Last but not least, Apple also has employees. The company has intertwined its employees' work and private lives quite a bit when it comes to the devices they use (e.g. iPhones) and their iCloud accounts:
The blurring of personal and work accounts has resulted in some unusual situations, including Gjøvik allegedly being forced to hand compromising photos of herself to Apple lawyers when her team became involved in an unrelated legal dispute.
In addition, more and more employees are publicly speaking out about harassment and discrimination at Apple.
It is really fascinating for me to watch Apple come under fire more and more. Many of the problems (e.g. App Store commissions) have been known for a long time. But the fact that more and more of Apple's problems are also leaking from the inside out is remarkable, and it shows that Apple no longer controls every aspect of its business.
The EU is planning new rules for smartphones and tablets. Starting in 2023, there are to be energy labels like the ones we know from other electronic devices such as refrigerators and televisions. A requirement to provide security updates for at least five years is also to be introduced (hello, Android manufacturers!). Spare parts are to be available for the same period, and devices will have to be repairable accordingly (hello, Apple!). Devices will probably become more expensive.
After the OnlyFans saga, Twitter is now becoming the target of anti-porn groups. Vice:
Twitter filed a motion to dismiss the lawsuit; a judge ruled to dismiss all of the claims against the platform where it was protected under Section 230 of the Communications Decency Act, but allowed one claim to move forward, under the Trafficking Victims Protection Act: that Twitter was a “beneficiary” from sex trafficking.
But not entirely unimportant at Twitter:
The confluence of journalists and sex workers using the same platform also helps marginalized communities get the word out when they're under attack.
As The Cut reports, many editors and editors-in-chief of fashion magazines have left their posts for Silicon Valley. The reason the companies are hiring these people:
By this point, these platforms are less interested in convincing more people to use them than in getting the people who already do to linger and buy something.
That's it, have a sunny weekend!
macnews-org · 3 years
Apple privacy head explains privacy protections of CSAM detection system
Apple’s privacy chief Erik Neuenschwander has detailed some of the protections built into the company’s CSAM scanning system that prevent it from being used for other purposes, including clarifying that the system performs no hashing when iCloud Photos is turned off. The company’s CSAM detection system, which was announced alongside other new child safety tools, has caused…
cpoetter · 3 years
Through the Week - 2021-08-07
Somehow this week dragged on slowly, the tech news never really caught fire, all a bit unsatisfying. Let's see what stuck.
Twitter is now working with Reuters and the Associated Press.
We’re excited to share that Twitter is collaborating with The Associated Press (AP) and Reuters to expand our efforts to identify and elevate credible information on Twitter.
Credible information is supposed to become more visible; fine, that's one point. But in my view, Twitter is thereby also taking another step toward becoming a standalone news source. For many people, Twitter already is one today. But with the closer integration of news agencies, the difference from classic news sources such as newspapers shrinks. Personalized news is also already possible thanks to past acquisitions (see last week's review).
I also mentioned China last week. In addition, I'd like to point to a post on the blog of Holger Schmidt's Original Platform Fund. It makes very clear how strongly the regulation of Chinese platforms has already hit their share prices.
From February 2021 through July 7, the Chinese stocks have already lost around $831 billion.
Stack Overflow surveyed more than 80,000 developers about almost every aspect of their work. The results of the survey are quite interesting. They show, for example, that Microsoft is doing rather well (use of Git[Hub], Visual Studio Code, Azure). The dinosaur COBOL brings up the rear among the languages in use, at 0.53% of developers (looked at from a purely professional point of view). At the same time, it is then fairly surprising that only 0.22% of those are 45 or older. The background to that surprise: a developer once asked me whether we even had anyone under 50 who develops in COBOL. ;)
Apple's announcement that upcoming versions of iOS and iPadOS will detect child sexual abuse material in iCloud Photos has drawn a big response.
Apple’s method of detecting known CSAM [Child Sexual Abuse Material] is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC [National Center for Missing and Exploited Children ] and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
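To make the quoted idea of an "unreadable set of hashes" slightly more concrete, here is a deliberately simplified sketch in which a server-side keyed transformation (an HMAC, purely as a stand-in for Apple's elliptic-curve blinding) is applied to the known-CSAM hash list before it ships to devices. In the real design, matching is resolved through a private set intersection protocol on the server side, which this sketch does not attempt to reproduce.

```python
# Simplified illustration only. The HMAC stands in for Apple's elliptic-curve
# blinding; actual match detection happens via private set intersection.
import hashlib
import hmac
import os

SERVER_SECRET = os.urandom(32)  # known only to the server, never shipped to devices


def blind(image_hash: bytes) -> bytes:
    """Turn a known-CSAM hash into an opaque value suitable for on-device storage."""
    return hmac.new(SERVER_SECRET, image_hash, hashlib.sha256).digest()


def build_device_database(known_csam_hashes: list) -> set:
    """What devices receive: blinded values they cannot map back to real hashes."""
    return {blind(h) for h in known_csam_hashes}

# Because a device lacks SERVER_SECRET, it cannot blind its own photo hashes and
# therefore cannot tell locally which, if any, of its photos are in the database.
```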
Another part of the announcement is the scanning of the Messages app for photos with sexual content.
The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.
Apple's intention is expressly welcome. But the scanning of the Messages app in particular is also drawing objections. The Electronic Frontier Foundation in a statement:
We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.
Difficult.
This week, Facebook shut down the accounts of New York University researchers involved in the Ad Observatory project, citing alleged violations of its terms of service. Background and a good summary can be found at Techdirt.
Shira Ovide has looked into the question of whether YouTube is a financial success. Her conclusions are rather mixed.
jamesstegall · 3 years
Apple defends its new anti-child abuse tech against privacy concerns
Apple has boasted a few iconic ads during the company’s 45-year history, from the famous 1984 Super Bowl ad for Macs to the company’s combative 2019 ad campaign promising that “what happens on your iPhone stays on your iPhone.”
On Thursday, Apple announced new technologies to detect Child Sexual Abuse Material (CSAM) right on iPhones—and it seems like suddenly, what’s on your iPhone no longer always simply stays there. The controversial new features strike at the heart of concerns about privacy, surveillance, and tech-enabled crimes. 
Apple says these new features simultaneously preserve privacy and combat child abuse. For critics, though, the biggest question is not about what the technology might do today, it’s about what it could become tomorrow. 
Will Apple’s new scanning technology enable even broader surveillance around the world? Will governments start demanding Apple scan for all sorts of forbidden content on the iPhones in their respective countries? 
Apple says the new detection tool, called NeuralHash, can identify images of child abuse stored on an iPhone without decrypting the image. The company also says it has implemented multiple checks to reduce the chance of errors before images are passed to the National Center for Missing and Exploited Children (NCMEC) and then to law enforcement. (For example, the scanner must detect multiple images rather than just one.) The feature will roll out in the United States this year.
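NeuralHash itself is a neural-network-based perceptual hash and has not been published, but the basic idea of a perceptual hash, that visually similar images should produce nearly identical fingerprints while a cryptographic hash would change completely, can be illustrated with a classic "average hash". The toy sketch below uses Pillow and is only an analogy, not Apple's algorithm.

```python
# Toy "average hash" to illustrate what a perceptual image hash is.
# NeuralHash is a learned, far more robust fingerprint; this is merely an analogy.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """64-bit fingerprint: one bit per pixel, set where the pixel is above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Visually similar images should yield hashes differing in only a few bits."""
    return bin(a ^ b).count("1")
```

A small crop or recompression typically flips only a handful of bits in such a fingerprint, which is what lets matching survive minor edits, whereas a cryptographic hash of the edited file would change entirely.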
After a long & forceful push from me & @LindseyGrahamSC, Apple’s plans to combat child sexual exploitation are a welcome, innovative, & bold step. This shows that we can both protect children & our fundamental privacy rights. https://t.co/c8Hp7jQY3a
— Richard Blumenthal (@SenBlumenthal) August 5, 2021
Google, Microsoft, Dropbox, and other big cloud services already scan material stored on their servers for child abuse material, so the general premise is not new. The difference here is that some of Apple’s scans will occur on the iPhone itself—and Apple argues that this is the defining pro-privacy feature of the new technology.
In a briefing with journalists on Friday, Apple said that while other cloud services scan nearly everything their users upload, Apple’s on-device scanning is meant to only send an unreadable hash to the company, a code that identifies images that depict child abuse based on a database maintained by NCMEC rather than Apple.
Apple has also pointed out that the feature applies only to people who upload photos to iCloud (automatic uploads are the default setting on iPhones but can be disabled). iCloud accounts are not currently encrypted, so law enforcement can already peer into them.
Why, then, doesn’t Apple just do what other big tech companies do and scan images when they’re uploaded to the cloud instead of when they’re still on someone’s phone? Why build a new and complex set of technologies when it could just take existing tech from off the shelf?
Apple’s next move
There is an enormous amount we don’t know about what Apple is doing now and what comes next. One popular theory is that this week’s announcement will be the first of several moves in the coming months.
Apple’s marketing has boasted about privacy for years. The iPhone was one of the first personal devices to be automatically encrypted. iMessage is one of the most popular messaging apps with end-to-end encryption. But many Apple customers automatically back everything up to their iCloud accounts—and iCloud has never been encrypted. So even in 2019, that famous iPhone privacy ad could have used a tiny asterisk.
Governments have for years exerted enormous pressure on all tech companies, Apple included, to allow law enforcement special access to otherwise encrypted data in order to prevent what they see as the most heinous of crimes. Child abuse is always at the top of that list, followed closely by terrorism. Apple’s public embrace of encryption has made some cops’ jobs more difficult—but it’s made some kinds of surveillance and abuse of power harder too.
The big loophole has always been iCloud. Cops have to work hard to get directly into iPhones, but if most of the data is already in an unencrypted iCloud account, a warrant will do the trick.
Following this week’s announcement, some experts think Apple will soon announce that iCloud will be encrypted. If iCloud is encrypted but the company can still identify child abuse material, pass evidence along to law enforcement, and suspend the offender, that may relieve some of the political pressure on Apple executives. 
It wouldn’t relieve all the pressure: most of the same governments that want Apple to do more on child abuse also want more action on content related to terrorism and other crimes. But child abuse is a real and sizable problem where big tech companies have mostly failed to date.
“Apple’s approach preserves privacy better than any other I am aware of,” says David Forsyth, the chair of computer science at the University of Illinois Urbana-Champaign, who reviewed Apple’s system. “In my judgement this system will likely significantly increase the likelihood that people who own or traffic in [CSAM] are found; this should help protect children. Harmless users should experience minimal to no loss of privacy, because visual derivatives are revealed only if there are enough matches to CSAM pictures, and only for the images that match known CSAM pictures. The accuracy of the matching system, combined with the threshold, makes it very unlikely that pictures that are not known CSAM pictures will be revealed.”
What about WhatsApp?
Every big tech company faces the horrifying reality of child abuse material on its platform. None have approached it like Apple.
Like iMessage, WhatsApp is an end-to-end encrypted messaging platform with billions of users. Like any platform that size, they face a big abuse problem.
“I read the information Apple put out yesterday and I’m concerned,” WhatsApp head Will Cathcart tweeted on Friday. “I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no.”
WhatsApp includes reporting capabilities so that any user can report abusive content to WhatsApp. While the capabilities are far from perfect, WhatsApp reported over 400,000 cases to NCMEC last year.
“This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control,” Cathcart said in his tweets. “Countries where iPhones are sold will have different definitions on what is acceptable. Will this system be used in China? What content will they consider illegal there and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?”
In its briefing with journalists, Apple emphasized that this new scanning technology was being released only in the United States for now. But the company went on to argue that it has a track record of fighting for privacy and expects to continue to do so. In that way, much of this comes down to trust in Apple.
The company argued that the new systems cannot be misappropriated easily by government action—and emphasized repeatedly that opting out was as easy as turning off iCloud backup. 
Despite being one of the most popular messaging platforms on earth, iMessage has long been criticized for lacking the kind of reporting capabilities that are now commonplace across the social internet. As a result, Apple has historically reported a tiny fraction of the cases to NCMEC that companies like Facebook do.
Instead of adopting that solution, Apple has built something entirely different—and the final outcomes are an open and worrying question for privacy hawks. For others, it’s a welcome radical change.
“Apple’s expanded protection for children is a game changer,” John Clark, president of the NCMEC, said in a statement. “The reality is that privacy and child protection can co-exist.” 
High stakes
An optimist would say that enabling full encryption of iCloud accounts while still detecting child abuse material is both an anti-abuse and privacy win—and perhaps even a deft political move that blunts anti-encryption rhetoric from American, European, Indian, and Chinese officials.
A realist would worry about what comes next from the world’s most powerful countries. It is a virtual guarantee that Apple will get—and probably already has received—calls from capital cities as government officials begin to imagine the surveillance possibilities of this scanning technology. Political pressure is one thing, regulation and authoritarian control are another. But that threat is not new nor is it specific to this system. As a company with a track record of quiet but profitable compromise with China, Apple has a lot of work to do to persuade users of its ability to resist draconian governments.
All of the above can be true. What comes next will ultimately define Apple’s new tech. If this feature is weaponized by governments for broadening surveillance, then the company is clearly failing to deliver on its privacy promises.
from MIT Technology Review: https://ift.tt/2VnENXH
macnews-org · 3 years
New child safety features coming to iOS 15, iPadOS 15, and macOS Monterey
Apple’s new features aim to protect children and limit the spread of Child Sexual Abuse Material (CSAM).
What you need to know:
Apple commits to adding extra protection for children across its platforms.
New tools will be added in Messages to help protect children from predators.
New tools in iOS and iPadOS will help detect CSAM in iCloud Photos.
It’s an ugly part of the world we…