After years of circling the issue, a tech firm has decided to risk an all-out battle of wills with the most powerful policing organisation in the world, the FBI. At issue is how much assistance it should give to police trying to access data on the company's devices. Many see what happened this week as a moment of truth that has been years in the making.
The FBI v Apple
At the centre of it all is a single encrypted iPhone that belonged to Syed Farook, one of the two individuals who carried out the murderous San Bernardino attack on 2 December 2015. The FBI wants access to the data on the device (which belongs to Farook's employer, the San Bernardino County Department of Public Health). Apple doesn't want to go as far as the FBI demands in making that possible, and the Feds are now using an 18th-century law called the All Writs Act to force it to comply with what is, in effect, a search warrant by creating special 'one-off' software to bypass device security.
Earlier this week, a judge agreed with the FBI's demand, which Apple says it will now appeal.
What is the FBI's technical beef?
A lot has been written about how difficult it is to break something called Secure Enclave, a security coprocessor architecture that arrived with devices based on the A7 processor in 2013. Even Apple says it can't bypass this because it is separate from the OS. Ironically, the iPhone the FBI wants to access, a 5C, runs on the A7's predecessor, the A6, and so uses an older protection scheme based on some of the same principles as Secure Enclave but with some important weaknesses.
Breaking the 5C's encryption is therefore easier, but not easy. As with Secure Enclave, the 5C uses hardware to encrypt data on its way into and out of storage using a special 'ephemeral' key that disappears when the phone is turned off. A copy of this key is stored in flash memory but - and this is the tricky bit - it can't be accessed without the 4-digit (or longer) PIN entered by the user when the iPhone is turned on. The keys work together like a sort of double lock.
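The 'double lock' idea can be sketched in a few lines of Python. This is a deliberately simplified illustration, not Apple's actual key hierarchy: the point is only that entangling the user's PIN with a secret that never leaves the device means neither the PIN nor the device secret alone yields the decryption key, so the guessing has to happen on the phone itself. The names `DEVICE_UID` and `derive_unlock_key` are hypothetical.

```python
import hashlib
import os

# Hypothetical stand-in for a device-unique secret fused into the
# hardware at manufacture; in a real device it never leaves the chip.
DEVICE_UID = os.urandom(32)

def derive_unlock_key(pin: str) -> bytes:
    """Entangle the user's PIN with the device secret so the resulting
    key can only be derived on this particular device."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 100_000)

key_a = derive_unlock_key("1234")
key_b = derive_unlock_key("1235")

assert key_a != key_b                       # a different PIN gives a different key
assert key_a == derive_unlock_key("1234")   # same PIN, same device: same key
```

The deliberately slow key-derivation function (here PBKDF2 with many iterations) is also what makes each PIN guess cost real time rather than microseconds.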
Because the 4-digit PINs most people use aren't very secure (only 10,000 possibilities), there are extra protections to stop attackers simply working through every possibility, such as a data-wipe function after 10 incorrect attempts as well as time delays for every guess. If the PIN uses more than four digits, the cumulative delays stretch from seconds to days, months and years. All of this must be done on the iPhone itself.
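The arithmetic behind those protections is easy to check. The sketch below assumes a fixed per-attempt cost of 80 milliseconds for the key derivation alone (an assumption for illustration; the real escalating lockout delays and the wipe-after-10 rule make a live device far harder than this suggests).

```python
def worst_case_hours(digits: int, seconds_per_attempt: float = 0.08) -> float:
    """Worst-case time to try every numeric PIN of the given length,
    assuming each guess costs a fixed key-derivation delay."""
    attempts = 10 ** digits
    return attempts * seconds_per_attempt / 3600

# 4 digits: 10,000 combinations - roughly 13 minutes at 80 ms per try.
print(f"4-digit: {worst_case_hours(4):.2f} hours")
# 6 digits: a million combinations - nearly a full day.
print(f"6-digit: {worst_case_hours(6):.1f} hours")
```

This is why the FBI's request targets the guess-rate limiters rather than the encryption itself: without the wipe and the escalating delays, a 4-digit space collapses in minutes.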
What does the FBI actually want?
The demands are summarised in a writ submitted to a US court on 16 February. They include disabling the auto-erase function triggered after 10 incorrect PIN attempts and removing the time delays in that process. More contentiously, the FBI also wants Apple to create a special version of iOS for that phone to facilitate the whole guessing process and so reach the data encrypted behind the device key.
The focus on overcoming PIN security is very important because it was through this mechanism that the FBI thought it could access data on a device-by-device basis without asking for the dismantling of encryption in general. Encryption is incredibly sensitive to theoretical vulnerabilities (including legal ones) and the FBI presumably didn't want to undermine it conceptually for fear of causing more problems down the line.
What does Apple think?
Legally, Apple plans to appeal against the court order: "We oppose this order, which has implications far beyond the legal case at hand," claimed Apple in a public response to the world on 16 February.
As for the technical issues, Apple said the following: "Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software which does not exist today would have the potential to unlock any iPhone in someone's physical possession."
The heart of the matter
"The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control."
Whether what the FBI is proposing would break not only the security on the 5C but also the more advanced Secure Enclave is unclear - assuming it is possible at all. Apple seems to think it would, and sees the undermining of its security architecture as an existential issue.
In other words
Rather than asking Apple to undermine its encryption technology directly, the FBI is asking it to bypass the PIN security, a less important security layer. But Apple thinks the way it is being compelled to do this undermines the encryption anyway, forcing it to design in a gigantic backdoor. Arguably, this wouldn't be a backdoor (i.e. an unknown means of access) but a conceptual frontdoor (a known bypass).
Are Apple's fears well founded?
Some security watchers think Apple has gone overboard over the FBI's approach, with several pointing out that the request relates to the specific phone in question, which would be accessed by Apple itself and not the FBI. Describing this as mass surveillance sounds like an exaggeration. Assuming the backdoor/frontdoor bypass is theoretically possible at all, what difference would it make for Apple to use such a facility on one phone?
On this view, all companies must be prepared to bypass the security of their own devices on a case-by-case basis where demanded by national security, the upholding of the law and (because this is the US) the Constitution. Others have argued that Apple could create a single-use image that would work only on the San Bernardino shooter's phone and no other device. It's not clear how easy that would be to achieve, however.
There is a strange uncertainty in all of this even for technical people who understand software security on mobile devices.
Why pick on Apple now?
None of the issues raised by this case are new. Encryption and security have been a battleground for years, although the growing sophistication of what is now available on consumer devices has started causing real issues for police services. What changed recently is the underlying politics. The FBI knows it will look bad asking any tech firm to bypass security without a very good reason, one supported by the bulk of the public.
It seems that the San Bernardino attacks offered a test case just too good to pass up. Apple undoubtedly feels that if it gives in over this issue it will only be the beginning of the compromises it will be asked to make. It is probably right on that score. For Apple and privacy advocates, the use of an old law to force a change in the balance of power is a critical moment. This one could go all the way to the higher courts.
Apple's encryption dilemma explained - conclusion
The argument against Apple's stance is that a mechanism for legal, legitimate bypass must be created or much worse will follow. Intelligence services will start attacking the underlying hardware and firmware of these devices, assuming they haven't already done so. Apple's stance will simply slow down the NSA and set public opinion against the sector.
A lot rides on this case - doubtless GCHQ and other intelligence services in social democracies will be watching events carefully.