Privacy with Photos.app’s Enhanced Visual Search
Jeff Johnson (Mastodon, Hacker News, Reddit, 2, The Verge, Yahoo):
This morning, while looking through the settings of a bunch of apps on my iPhone, I discovered a new setting for Photos that was enabled by default: Enhanced Visual Search.
(…)
There appear to be only two relevant documents on Apple’s website, the first of which is a legal notice about Photos and Privacy:
Enhanced Visual Search in Photos lets you search for photos using landmarks or points of interest. Your device privately matches places in your photos to a global index that Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides your IP address. This prevents Apple from learning about the information in your photos. You can turn off Enhanced Visual Search at any time on your iOS or iPadOS device by going to Settings > Apps > Photos. On Mac, open Photos and go to Settings > General.
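For readers unfamiliar with the term: homomorphic encryption lets a server compute on data it cannot read. The sketch below is a toy illustration using the Paillier cryptosystem, which is additively homomorphic; it is not Apple’s scheme (Apple’s research post describes a lattice-based BFV construction, plus differential privacy and an OHTTP relay, none of which is modeled here). It only demonstrates the core property the legal notice leans on: the server can score an encrypted query against its index and return a result that only the device can decrypt.

```python
import math
import secrets

# Toy Paillier keypair. Tiny primes for readability; real deployments
# would use primes of 1024+ bits.
P, Q = 2_147_483_647, 2_147_483_629     # both prime
N, N2 = P * Q, (P * Q) ** 2
LAM = math.lcm(P - 1, Q - 1)            # Carmichael's lambda(N)
G = N + 1                               # standard generator choice

def encrypt(m: int) -> int:
    """Enc(m) = g^m * r^N mod N^2, for a random r (coprime to N w.h.p.)."""
    r = secrets.randbelow(N - 2) + 1
    return (pow(G, m, N2) * pow(r, N, N2)) % N2

def decrypt(c: int) -> int:
    """Dec(c) = L(c^lambda mod N^2) * mu mod N, where L(u) = (u-1)/N."""
    def L(u: int) -> int:
        return (u - 1) // N
    mu = pow(L(pow(G, LAM, N2)), -1, N)
    return (L(pow(c, LAM, N2)) * mu) % N

def encrypted_dot(enc_query: list[int], index_row: list[int]) -> int:
    """Server side: Enc(a)*Enc(b) = Enc(a+b) and Enc(a)^k = Enc(a*k),
    so the server can compute Enc(<query, row>) without ever seeing
    the query in plaintext."""
    acc = encrypt(0)
    for c, w in zip(enc_query, index_row):
        acc = (acc * pow(c, w, N2)) % N2
    return acc

# Client: a quantized embedding of a photo's landmark region.
query = [3, 1, 4, 1, 5]
enc_query = [encrypt(v) for v in query]

# Server: one row of its landmark index, scored blindly.
row = [2, 7, 1, 8, 2]
enc_score = encrypted_dot(enc_query, row)

# Only the client can decrypt; the server learned nothing about `query`.
assert decrypt(enc_score) == sum(a * b for a, b in zip(query, row))
print("similarity score:", decrypt(enc_score))
```

In the system Apple describes, the encrypted query would be an embedding of the photo’s landmark region, and the OHTTP relay strips the IP address before the query reaches the index; the toy above models only the encrypted-compute step.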
Apple’s second online document is a Machine Learning Research blog post titled Combining Machine Learning and Homomorphic Encryption in the Apple Ecosystem, published on October 24, 2024. (Note that iOS 18 and macOS 15 were released to the public on September 16.)
As far as I can tell, it was added in macOS 15.1 and iOS 18.1, not the initial releases, but it’s hard to know for sure because Apple’s release notes don’t mention the feature by name.
It should be up to the individual user to decide their own tolerance for the risk of privacy violations. In this specific case, I have no tolerance for the risk, because I have no interest in the Enhanced Visual Search feature, even if it worked flawlessly. There’s no benefit to me that outweighs the risk. By enabling the “feature” without asking, Apple disrespects users and their preferences. I never wanted my iPhone to phone home to Apple.
Remember this ad? “What happens on your iPhone, stays on your iPhone.”
Apple is being thoughtful about doing this in a (theoretically) privacy-preserving way, but I don’t think the company is living up to its ideals here. Not only is it not opt-in, but you can’t effectively opt out, because it starts uploading metadata about your photos before you’ve ever used the search feature. It does this even if you’ve chosen not to upload your photos to iCloud. And “privately matches” is kind of a euphemism. There’s no plain English text that says it uploads information about your photos, or specifically what that information is. You might think it’s just sharing GPS coordinates, but apparently it’s the actual contents of the photos that are used in the search.
It’s clear that what’s shared is not just location data, because some of my photos of the London skyline get misidentified as different cities, including San Francisco, Montreal, and Shanghai.
What I’m still confused about is what this feature actually is. It seems to compare locally identified landmarks against a database too large to store locally, thereby enabling more accurate searches. It also seems that the matching is done entirely on visual data, not on photo metadata. But since Apple didn’t announce this feature and doesn’t document it well, we don’t know for sure. One document says trust us to analyze your photos remotely; the others say here are all the technical reasons you can trust us. Apple has never plainly said what is happening.
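Here’s how I’d sketch that flow, as a reading aid only. Everything below is hypothetical Python of my own, not Apple’s code or API; it models just the shape Tsai describes: an on-device gate, a purely visual embedding, and a remote match against an index too large to ship with the OS.

```python
from dataclasses import dataclass

@dataclass
class Photo:
    pixels: bytes           # image data: the only input to matching
    gps: tuple | None       # EXIF metadata: deliberately unused below

def looks_like_landmark(photo: Photo) -> bool:
    """Stage 1 (on-device): a cheap classifier gates the remote query."""
    return b"skyline" in photo.pixels       # stand-in for a real model

def visual_embedding(photo: Photo) -> list[float]:
    """Stage 2 (on-device): embed the pixels. Note that photo.gps is
    never read, which is consistent with a London skyline coming back
    as San Francisco: the lookup is purely visual."""
    return [b / 255 for b in photo.pixels[:8]]

def remote_match(embedding: list[float]) -> str:
    """Stage 3: in the real feature this query would be encrypted and
    routed through an OHTTP relay; here it's a local mock."""
    return "some landmark"

def tag_photo(photo: Photo) -> str | None:
    if not looks_like_landmark(photo):
        return None                         # nothing leaves the device
    return remote_match(visual_embedding(photo))

print(tag_photo(Photo(pixels=b"a london skyline", gps=(51.5, -0.1))))
```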
(…)
This feature seems to be implemented responsibly and privately in almost every way, but, since it’s poorly explained and enabled by default, it’s hard to trust. Photo libraries are extremely sensitive. It’s entirely fair for users to be suspicious of this feature.
In a way, it’s less private than the CSAM scanning that Apple abandoned, because it applies to non-iCloud photos and uploads information about all photos, not just those with suspicious neural hashes. On the other hand, the data supposedly remains encrypted, assuming no design flaws or bugs, and isn’t linked to your account or IP address.
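The scope difference Tsai points to is easy to state in code. A schematic with made-up stand-in functions on both sides; it models only the gating logic, not the cryptography:

```python
def neural_hash(photo: str) -> int:
    return hash(photo)                  # stand-in perceptual hash

def looks_like_landmark(photo: str) -> bool:
    return "tower" in photo             # stand-in classifier

def csam_uploads(photos, icloud_enabled, blocklist):
    """Abandoned design: only iCloud-bound photos whose hash matched a
    known blocklist would have produced an upload (a safety voucher)."""
    if not icloud_enabled:
        return []
    return [p for p in photos if neural_hash(p) in blocklist]

def evs_uploads(photos, icloud_enabled):
    """Enhanced Visual Search, as described: every landmark-looking
    photo triggers an (encrypted) upload, iCloud or not."""
    return [p for p in photos if looks_like_landmark(p)]

photos = ["cat on sofa", "eiffel tower", "clock tower at dusk"]
print(csam_uploads(photos, icloud_enabled=False, blocklist=set()))  # []
print(evs_uploads(photos, icloud_enabled=False))   # both tower photos
```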
jchw:
What I want is very simple: I want software that doesn’t send anything over the Internet without some explicit intent first. All the work that went into making this feature private is cool engineering work, and there’s absolutely nothing wrong with implementing a feature like this, but it absolutely should be opt-in.
Trust in software will continue to erode until software stops treating end users and their data and resources (e.g. network connections) as the vendor’s own playground. Local on-device data should never leave the device’s radio interfaces unexpectedly, period. There should be a user intent attached to any action where local data is sent over the network.
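jchw’s rule can even be enforced structurally. A design sketch of my own, not anything Apple ships: make proof of user intent a required parameter of any routine that moves local data off the device, so a default-on background task has nothing to pass.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class UserIntent:
    """Proof of an explicit user action (a tap, a toggle). In a real
    codebase, only UI event handlers would construct this."""
    action: str

def send_to_server(payload: bytes, intent: UserIntent) -> None:
    """Any routine that sends local data off the device must be handed
    evidence of intent; there is no default and no overload without it."""
    print(f"sending {len(payload)} bytes, authorized by: {intent.action!r}")

# Fine: the user explicitly searched, so an intent object exists.
send_to_server(b"encrypted-embedding", UserIntent("tapped Search"))

# A default-on background indexer has no UserIntent to pass, so the
# unconsented call below cannot even be written:
# send_to_server(b"encrypted-embedding")   # TypeError: missing argument
```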
Recall that Apple complained about how, if Meta’s interoperability requests were granted, apps that users had installed on their devices and granted permission to would be able to “scan all of their photos,” and that “this is data that Apple itself has chosen not to access.” Yet here we learn that in an October OS update Apple enabled a new feature that sends unspecified information about all of your photos to Apple.
I’ve seen a lot of reactions like this:
I’m so tired of all the privacy nonsense… Yes, it sends photo data anonymously to make a feature work or improve it. So what? Apple and iOS are the most private company/software out there.
But I’m tired of the double standard where Apple and its fans get to start from the premise of believing Apple’s marketing. So if it’s quietly opt-out, and a document somewhere uses buzzwords like “homomorphic encryption” and “differential privacy” without saying what data they apply to, that’s good enough. You can just assume your privacy is protected because Apple is a good company with good intentions that doesn’t ship bugs.
See, another company would be “scanning” your photos, but Apple “privately matches” them. The truth is that, although they’re better than most, they also have a history of sketchy behavior and of misleading users about privacy. They define “tracking” so that it doesn’t count when the company that runs the App Store does it, and they’ve sent information to data brokers despite claiming not to.
With Apple making privacy a big part of its brand, it’s a little surprising that this is on by default and that Apple doesn’t show a dedicated prompt for it, like the access-permission prompts for the photo library, contact list, location, etc. Some minor changes to the way the software works and interacts with the user could go a long way toward building and maintaining trust.
I love that Apple is trying to build privacy-preserving services, but this just showed up at the bottom of my Settings screen over the holiday break when I wasn’t paying attention. It sends data about my private photos to Apple.
I’d love the opportunity to read about the architecture and think about how much this scheme leaks, but I only learned about it in time to see that it had already been activated on my device. That coincided with a vacation where I took about 400 photos of recognizable locations.
This isn’t how you launch a privacy-preserving product when your intentions are good; it’s how you slip something under the radar while everyone is distracted.
Jeff Johnson:
The issues discussed in Apple’s blog post are so complex that Apple had to address them in two of its research papers, Scalable Private Search with Wally and Learning with Privacy at Scale, which are even more complicated and vague than the blog post. How many of my critics have read and understood the papers? I suspect approximately zero. (…)
In effect, my critics are demanding silence from almost everyone. By their logic, an iPhone user has no right to question an iPhone feature. Anything Apple says should be trusted completely. These random internet commentators have become self-appointed experts just by parroting Apple’s words and nodding along as if everything is obvious, despite the fact that it’s not obvious even to a real expert, a famous cryptographer.