On Friday, security researcher Jeffrey Paul published a scathing article regarding Apple’s recent Big Sur-related security snafu. After Apple launched its new OS on Thursday, Mac users began reporting problems launching applications stored on their local machines. An initial investigation showed the cause of the problem was a connection initiated by end-user devices when apps were launched.
At launch, apps attempted to connect to ocsp.apple.com to verify their developer certificates. This process is supposed to fail gracefully and allow the application to launch if the servers are not available, but the OCSP servers were available, just running slowly. Because the connection succeeded and only the response lagged, the fail-gracefully path never triggered, and some users’ computers hung for minutes at a time waiting for app authentication before launching the targeted application.
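The failure mode is easier to see in code. The sketch below is a minimal, hypothetical model of a “fail open” check, not Apple’s actual trustd implementation (the host name is real; everything else is an illustration). The key idea is a hard timeout on the whole check: an unreachable server, or one that is too slow, causes the check to be skipped rather than stalling the app launch.

```python
import socket

def ocsp_check(host="ocsp.apple.com", port=80, timeout=2.0):
    """Hypothetical 'fail open' certificate check.

    If the server can't be reached within `timeout` seconds, the
    check is skipped and the app launches anyway. The real trustd
    logic is more involved; this only shows why a hard timeout
    prevents multi-minute hangs when a server responds slowly.
    """
    try:
        # Timeout covers the connection attempt; a slow or dead
        # server raises an OSError instead of blocking the launch.
        with socket.create_connection((host, port), timeout=timeout):
            return "checked"          # server reachable: do the real check
    except OSError:
        return "skipped (fail open)"  # unreachable/slow: launch anyway
```

With a short timeout like this, the worst-case launch delay is bounded by `timeout` seconds instead of however long the server takes to respond.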
On Big Sur, trustd is in Apple’s “ContentFilterExclusionList”
….meaning firewalls can’t block it! 😭
— patrick wardle (@patrickwardle) November 12, 2020
Data gathered during the outage and shared online pointed to a few key characteristics: The problems began after Big Sur was released, and systems would behave and launch programs normally if their internet access was disabled. Then, on Friday, Paul’s article hit. Titled “Your Computer Isn’t Yours,” it lays out some damning assertions against Apple, namely that the company logs every single application you run, every single time you run it, and that it sends this data directly to Apple via an unencrypted HTTP (not HTTPS) connection.
This means that Apple knows when you’re at home. When you’re at work. What apps you open there, and how often. They know when you open Premiere over at a friend’s house on their Wi-Fi, and they know when you open Tor Browser in a hotel on a trip to another city.
He also points out that the transmissions are sent in plaintext and that they run through Akamai, a third-party CDN. Apple, of course, is a partner of the US government via programs like PRISM, though frankly, so is everyone else. He then discusses the fact that all of this data transmission is much harder to block under Big Sur than under previous versions of macOS, that the company’s upcoming M1-based Macs won’t run alternate operating systems, and that all of this represents a gigantic land grab by Apple, both in terms of what it records about your private habits and what it represents as far as the company deciding what you can and cannot run.
These are fairly damning allegations. An Italian security researcher named Jacopo Jannone took a look at Paul’s allegations and came back with a more nuanced portrayal of the situation. According to him, what macOS connects to the internet to transmit isn’t a hash of every single application that you run. It transmits developer certificate information — and multiple applications developed by the same company are signed with the same certificate. Think of this as less of an “Apple knows you’re running Firefox,” and more of an “Apple knows you are running software certified by Mozilla.”
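A toy illustration of that distinction, using entirely made-up certificate serials and hashes (none of these values are real): two apps signed by the same developer produce identical certificate-revocation queries, so the query alone cannot distinguish which app was launched.

```python
# Made-up example data: two apps signed with the same hypothetical
# Mozilla Developer ID certificate, one signed by another developer.
APPS = {
    "Firefox.app":     {"cert_serial": "MOZ-0001", "binary_hash": "a1b2c3"},
    "Thunderbird.app": {"cert_serial": "MOZ-0001", "binary_hash": "d4e5f6"},
    "Premiere.app":    {"cert_serial": "ADB-0042", "binary_hash": "789abc"},
}

def certificate_query(app: str) -> str:
    """What an OCSP-style check reveals: only the signing certificate."""
    return APPS[app]["cert_serial"]

def hash_query(app: str) -> str:
    """What Paul's reading assumed: a unique per-application hash."""
    return APPS[app]["binary_hash"]
```

Here `certificate_query("Firefox.app")` and `certificate_query("Thunderbird.app")` return the same value, while `hash_query` would tell them apart. That gap is the difference between “running software certified by Mozilla” and “running Firefox.”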
Whether this distinction matters to you is going to depend on how comfortable you are with how much data our devices regularly share with the corporations that write the software running on them. Objectively speaking, much of Paul’s critique is correct, even if he’s wrong about the “Apple gets a hash of every single app you run” angle. It’s true that Apple is locking down its ecosystem with the M1, stepping back from cross-OS compatibility, and that Big Sur exempts its own system processes, including trustd, from any firewall restrictions the end user attempts to create.
Microsoft does something very similar with Windows 10. The company deploys several different defensive strategies to protect users from potentially malicious software, including warning the end user before allowing them to run files from unverified locations. Apple also requires all developers, including those distributing apps online, to have their applications notarized by Apple. Apps that are not notarized will not run by default. Catalina-era discussions of Mac app permissions suggest that non-notarized applications can still be run, they just won’t run by default, and that this is more of an effort to help end users avoid malicious software than an attempt to control their PCs.
It’s not always easy to separate profit motives from security goals. Apple pitched its T2 chip to users as a superior security solution compared with ordinary PCs. It may be that, but it’s also a tool Apple can use to lock out third-party repairs. Certificate verification and app notarization can protect against some (though certainly not all) threat vectors. Does that make it a good idea for OS developers to insert online checks and verifications into the process? (Jannone argues, for example, that Apple avoids HTTPS for these periodic certificate checks to prevent a circular dependency: verifying the OCSP server’s own certificate could itself require another revocation check.) I’m not sure.
A few things do seem clear, as of this writing. First, Apple isn’t literally sending a hash of your applications to its servers. Second, the company needs to fix the soft-fail behavior that caused the problem in the first place; a hard timeout after a short period of time would do it. Third, we do continue to see companies using more consumer data, claiming it’s for our own good, and only later do we discover that there have been some whopping unintended side effects. Apple didn’t intend for its software verification system to cause this issue. It still did. Fourth, Apple’s Big Sur takes some further steps toward limiting your own ability to control your PC. Microsoft pioneered some of these with Windows 10, and we can’t say we’re thrilled to see them coming to Apple. Fifth, control of its own ecosystem has been central to Apple’s DNA for the entirety of the company’s existence.
Ultimately, what’s happened here lands somewhere between “serious land grab” and “nothing to care about.” Apple has made changes under the hood to how its operating systems operate, and some of those changes make its user base uneasy. Having gone through them on the Windows 10 side of things, I understand why Paul is unhappy at the idea of having to use an external router to block traffic from his PC. Even if these changes are made for benign reasons, they don’t feel benign. Unfortunately, beyond the stereotypical “use Linux,” I don’t have a great solution to propose. Microsoft has some of the same issues. Jeffrey Paul may not be right about the specifics of what Apple is tracking with this information, but he’s not wrong about the ongoing damage to our collective sense of ownership. If you buy a PC from Apple or use Microsoft’s Windows 10 in 2020, you have less control over it than you did in 2000 or 1990.
Update (11/16/2020): Apple has published a support document update in response to Paul’s concerns. It reads:
We do not use data from these checks to learn what individual users are launching or running on their devices.
Notarization checks if the app contains known malware using an encrypted connection that is resilient to server failures.
These security checks have never included the user’s Apple ID or the identity of their device. To further protect privacy, we have stopped logging IP addresses associated with Developer ID certificate checks, and we will ensure that any collected IP addresses are removed from logs.
In addition, over the next year we will introduce several changes to our security checks:
A new encrypted protocol for Developer ID certificate revocation checks
Strong protections against server failure
A new preference for users to opt out of these security protections
This may or may not change how anyone feels about this kind of policy. It does seem to be true that Big Sur is locking down the ability to run some applications, but what this means in terms of running non-notarized apps remains fluid and, honestly, still a bit unclear. How Big Sur locks down the OS and what data it sends back to Apple are also two different questions, though both have been raised in this discussion. I think the broad point Paul raises, that Big Sur represents an assertion of control over Apple’s ecosystem and user experience, is true. The details vary in ways that will matter to some people and that others will still see as a bridge too far. There’s an intrinsic tension between security and user freedom here that isn’t easily resolved, and Apple has always come down on the “more control” side of the fence.