Under iOS 11, authorities won’t be able to image your device without a passcode (arstechnica.com)
331 points by gopalakrishnans | 2017-09-11 | 296 comments




> These changes are coming in conjunction with another privacy-minded feature that will disable Touch ID by pressing the power button five times.

Wow, that's really nice. I wish Google were so forward-thinking about things like this. I see no reason why fingerprint authentication should be forced upon someone any more than a password unlock would be. The only reason it works this way today is that it's much "easier" for the government to force your finger onto the phone, or take blood or hair from you, and so on - and they can't really do that with passwords. But we can fight back with technology and ingenuity and ensure that fingerprint auth is "just as good" as a password, at least from this point of view (the government forcing you to give it away).


>Unlike most Silicon Valley companies, Apple’s business model is one of "Data Liability." Unlike Google or Facebook which use advertising to extract value from users’ personal information, Apple focuses on selling things that protect a user's data from all unauthorized access — including by Apple. Basically, Apple views user data as a headache, not a monetization opportunity.

https://lawfareblog.com/ios-11-may-complicate-border-searche...


> I see no reason why a fingerprint authentication should be forced upon someone anymore than a password unlock would be.

In Canada, there is a difference. A fingerprint is something you have. A password is something you know. Police can compel you to use your fingerprint to unlock the phone. They can't compel you to disclose the password.


You shouldn't be compelled to give up anything that is "you" without a court order and a warrant - not indiscriminately photographed, fingerprinted, or, in some cases, tested for drugs against your will.

Canada quite frequently forces people to unlock their phones when crossing the border - and has for years. They will make you unlock it, or else disallow entry and/or detain you.

There's a whole series that shows this. It's Canadian Border Guard, or similar. I see it when I cross the border, which I do with some frequency. I just unlock my phone for them.


What do they do with it after you give it to them? Does the device leave your sight?

They just flip through it. I'm a citizen so they don't do much with it. They usually look at texts and emails, to see if people are going to work there illegally. I'm able to work there so it's pretty silly.

It has never left my sight. I can't speak for others and didn't watch the show that carefully. I live so close to the border that my neighbors get Canadian television. So, I've seen it there. I don't actually have TV hooked up, so I don't see it often.


As does the States.

What I meant was that inside of Canada, there are situations where the police don't need a search warrant for your phone. You can unlock it with your fingerprint, so it's legally "open" and searchable.

If you have a passphrase, then they need to know your mind in order to unlock it, and they can't force you to disclose something you know.


Ah, okay. Yes, inside the US it is pretty much the same. Borders are the exception and, I think, you need to actually be crossing in order for them to demand it. Citizens are immune, for the most part.

Curiously, citizens aren't immune in Canada. I have dual citizenship and they still sometimes want to flip through my phone. No, I'm not sure why. They have had me power on and unlock my laptop a couple of times, as well.



It's not "forward thinking" to add that feature 4 years after Touch ID debuted, after people have already been forced to unlock their phones. They should be issuing warnings that fingerprint sensors are for convenience and reduce security unless they're used in multi-factor authentication.

The alternative was passcodes and Apple's research suggested that people disabled passcodes rather than use anything secure. TouchID was determined to be far more secure than no passcode and having both TouchID and a passcode is secure enough for most users. You can't add devices without both (along with an Apple ID password) and, at that point, you have physical access to the device which negates most security features anyways.

Interesting and pragmatic. Do you have a reference for this? I'd be interested in reading more.

There is a reference for this as Apple published a security white-paper about it prior to the release of TouchID. Unfortunately, I'm not at a point where I can search for it. If I find the time today, I will update this post with a link.

Cheers. I'd appreciate it.

Edit to add: I came across this one:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

It was linked from an article in 2014[1], though this PDF document is dated March 2017.

[1]: http://www.biometricupdate.com/201402/apple-publishes-whitep...


Yeah, that's the updated one. Prior to the release of TouchID, something similar was released.

Here's the whitepaper about iOS security: https://www.apple.com/business/docs/iOS_Security_Guide.pdf

In the February 2017 case of a California artist who was questioned at San Francisco International Airport upon re-entry, after he finally agreed to unlock his iPhone, it was taken out of his sight for several minutes and could have been imaged without his knowledge. Under iOS 11, unless the artist, Aaron Gach, decided to actually give up the passcode (rather than type it in himself), he could at least have been reasonably confident that the phone could not be imaged without his knowledge.

So, doesn't this just mean that border agents will force you to write down your password, key it in themselves to verify that it works, then walk away with the phone to image it?


Rubber Hose Cryptanalysis?

Can't wait for iOS to have the deniability factor of having different passwords unlock different things :)

For the sensitive stuff, have periods of time that require several computers to solve cryptographic challenges in order to unlock the phone, some of which may be your friends' devices. If they don't hear from you and your intended hosts within a certain amount of time, the phone stays locked.

Or one of the devices can be an NFC or Wifi hotspot in a certain area, and one at home. If you don't reach it, phone stays locked.
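
A minimal sketch of the underlying secret-splitting idea, assuming an n-of-n XOR scheme (a real design would more likely use Shamir's k-of-n secret sharing; all names here are illustrative):

    import os

    def split_key(master_key: bytes, n: int) -> list[bytes]:
        # All n shares are required to rebuild the key; each friend's
        # device or home beacon holds one share, the phone holds none.
        shares = [os.urandom(len(master_key)) for _ in range(n - 1)]
        last = master_key
        for s in shares:
            last = bytes(a ^ b for a, b in zip(last, s))
        return shares + [last]

    def combine_key(shares: list[bytes]) -> bytes:
        out = bytes(len(shares[0]))
        for s in shares:
            out = bytes(a ^ b for a, b in zip(out, s))
        return out

    key = os.urandom(32)               # the phone's unlock key
    shares = split_key(key, 3)         # one share per beacon/friend
    assert combine_key(shares) == key  # all three answered: unlock

If any share-holder is unreachable, the key simply cannot be reconstructed, which is exactly the "phone stays locked" behavior described above.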


That's the worst case for rubber hose.

Even if you give up all your passwords they can still keep beating you to get the "real" pass.


They should have an "erase everything" password.

If you are under the threat of violence that won't save you.

But maybe death is preferable to living with your anime collection out there.

I'm calling Poe's law here. I hope you do not seriously mean that anime collections are shameful.

s/anime/youknow/ aka Rule34.

better yet, a passcode option to show a bare user account without access to the real user data

If you are realistically in a situation where that would happen, and you would be harmed, they may just decide to kill you as you're no longer of any value.

In this case, a blank phone probably means no entry and not actually harming you. So, there's that, I guess.


Why can't they do that ANYWAY?

So you are afraid of actors that would torture you to get access to the data.

And your "solution" is to share this experience with your friends by tying their devices to yours and setting up a device in such a manner that that actor would not be able to verify if they indeed have access to the real data stored on it.

This is a recipe for a one way ticket to a very dark place for you and your friends.

For the most part, the number one rule of actual data safety is to never implement protection that would put your well-being, or the well-being of others, at risk. Unless you are protecting the nuclear launch codes, no data is worth being physically harmed for.


> unless you are protecting the nuclear launch codes no data is worth being physically harmed for.

What if you’re just protecting the location of your daughter/sister from her physically abusive ex? Not worth enduring some violence for?


In that case they would come to break your face to begin with, not your 256-bit encryption.

So, since the vendor will undoubtedly inject themselves into this complex situation for 'usability', you are basically arguing for devices that are never actually controlled by their user because terrorism. Good 9/11 argument there.

What are you TALKING about?

The user controls their device to the exact same extent now. You have to trust the apps and OS you use. And iOS isn't exactly open source either.

I am saying the user can choose to select beacons to unlock their phone during a trip somewhere. The user CHOOSES to lock their phone in a way that requires things additional to the password, things that represent having made it safely past the security lol.


This is a rare case where multi-factor authentication, and the option to involve biometrics as hardened sub-layer auth, would be nice.

To be adequately effective, though, all back-up activities would require the full authentication credentials for all verification factors, which might only be possible with, say, a fingerprint scanner (touchbar) equipped laptop, or additional external hardware peripherals for other types of systems.


"Multi-factor authentication" doesn't make sense unless you're communicating with a server over the network. Biometrics can be copied and faked. Ultimately it's security theater.

Nothing beats a strong password.


It does. But if you're a US citizen, or are OK with being barred from entering the US for a long time, then you have the option to not unlock.

If you go the US citizen route, be prepared to never get your phone back and many future border crossings to be 4+hr affairs where they confiscate most of your things.

As a US citizen they have to let you in.


I've done this before. It was a 2 hour ordeal but they simply have no right, and I was let in to go home with all of my belongings.

If you do have money and effort to spare though, perhaps you can file a suit that gets up to the Supreme Court after enough of those 4hr border crossings.

My guess is that those with money and effort to spare will not be required to endure this ritual (they know who they're dealing with).

> As a US citizen they have to let you in

True, and of course they can arrest you the instant they let you in.


Yes, but an arrest requires evidence that you actually committed a crime, whereas border searches require no particular standard and are conducted relatively indiscriminately. Refusing to unlock your device is not a crime in itself. (However, lying to a federal agent is a crime, so it's important for anyone attempting this strategy to be forthright about their refusal, rather than trying to trick the border agent with fake data or whatnot.)

As mentioned elsewhere, Apple can't prevent "rubber hose decryption" (where someone compels/coerces/tortures you to get access to an unlocked phone). I suspect this feature wasn't designed to prevent that threat.

My guess is this somehow foils or mitigates the workaround that the Israeli company sold to the FBI after the San Bernardino phone issue.


They could add a "duress passcode", with an alternate reality of data. I don't expect them to, but it's possible. Bitcoin hardware wallets have this kind of feature to help avoid wrench attacks.
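
As a toy illustration of the duress-passcode idea (purely hypothetical - not anything Apple ships): each passcode deterministically derives its own data key, so the device itself carries no marker of which dataset is "real".

    import hashlib, os

    SALT = os.urandom(16)  # stored on the device

    def data_key(passcode: str) -> bytes:
        # Every passcode derives a valid-looking encryption key; the
        # real and duress passcodes simply decrypt different volumes.
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), SALT, 200_000)

    real_key  = data_key("correct horse battery staple")
    decoy_key = data_key("123456")  # mounts the decoy volume instead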

Lying to a federal agent is a crime.

Edit: No seriously. See https://en.wikipedia.org/wiki/Making_false_statements. Refusing to unlock your device is one thing. Claiming that you did unlock it but in fact just used a "duress passcode" is a lie and can land you in jail.


Has the 5th amendment been officially repealed? And the 4th?

The 5th amendment does not protect you from lying. And the 4th has nothing to do with lying either.

And neither necessarily applies at a border crossing

Beating a human is a crime.

This makes me wonder something. I know that while they can't force you to enter your passcode, they can force you to unlock with your fingerprint. I've heard people ask for a feature where you basically have a "duress fingerprint," since fingerprint is already different than password, would using a duress fingerprint be acceptable?

If you're instructed to unlock the phone with your finger, and your "duress fingerprint" unlocks fake data, then you're misrepresenting that fake data as the real data and would presumably be considered to be making a false statement (though this is just speculation on my part).

On the other hand, a "duress fingerprint" that locks out TouchID wouldn't be misrepresenting anything. But it wouldn't surprise me if they could still get you in trouble some other way for knowingly locking out the device after it's been confiscated.


That would be the technical equivalent to the fifth amendment.

The 5th amendment says you can refuse to incriminate yourself. It does not say you can lie.

Refusing to unlock your phone would be equivalent. Unlocking fake data would not.


I believe if you tap the power button 5 times on the iPhone, it disables fingerprint unlock. Perfect for a scenario where you are afraid you might be compelled to fingerprint.

This is apparently an upcoming feature in iOS 12, but doesn't work in iOS 11 now

What does work is putting the wrong finger on the sensor until it tells you TouchID is disabled and needs your passcode (which takes 5 attempts), or rebooting your device without logging in.


So don't lie about it.

"I just entered my duress pass code, the device is now wiped".

Can they charge you with destruction of evidence when they have no idea if/what evidence was on the phone?


I'm pretty sure they can. And they will probably successfully argue that since you did destroy evidence, you were hiding something.

How do you honestly think that will go down for you or anybody else that tries it, telling a federal officer that you just destroyed the thing they asked you to open?

If you're willing to destroy the data to prevent it being read, why not just wipe the phone before you leave and restore it from an encrypted online backup when you arrive?

Arriving with "empty phone" has also caused refusal of permission to enter in the past.

With the facial recognition tech they might be able to measure duress. Or geofence known customs checkpoints. I would love to be able to reasonably claim that I can't give up my data while I fear for my freedom.

I believe that they simply dumped the NAND to circumvent the passcode entry limit, then brute-forced. But it was also an iPhone 5C, a model which does not include the Secure Enclave. This method could not be reproduced on the 5S or any later model.

What Apple could do is move the bar from "rubber hose decryption" to "rubber hose account access" by offering dead simple cloud recovery. Imagine a "Flight mode" where everything on your phone is backed up to Apple and your phone presents an "enter backup id and password" dialog, which allows you to load anyone's cloud backup and is essentially a factory reset. Border control might still force you to enter a backup id and password under the assumption that you surely have one, but (to my limited knowledge of the US legal framework) it would not be a felony to load a "duress" backup id (because there is no such thing as a duress backup id, just different data sets).
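
A rough sketch of what that flow could look like, as a toy model (every name here is hypothetical):

    class Phone:
        def __init__(self):
            self.data = {"photos": [], "messages": []}

        def flight_mode(self, cloud, backup_id, password):
            # Stand-in for an encrypted upload; afterwards the device
            # is indistinguishable from a factory-fresh one.
            cloud[(backup_id, password)] = self.data
            self.data = {}

        def restore(self, cloud, backup_id, password):
            # Any valid (id, password) pair restores *some* account;
            # nothing flags one backup as the traveler's "real" data.
            self.data = cloud.get((backup_id, password), {})

    cloud = {}
    phone = Phone()
    phone.flight_mode(cloud, "traveler@example.com", "hunter2")
    assert phone.data == {}   # nothing left to image at the border
    phone.restore(cloud, "traveler@example.com", "hunter2")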

Yeah, this XKCD has never been so true as it is now. https://xkcd.com/538/

I have never seen a company both so technically capable and supportive of user privacy as Apple. It stands completely apart from all of the tech giants of today.

This is the only reason why I moved from Android to iOS even though I enjoy the Android phones and Google Fi more.

I wish Google Fi would support iOS devices.


This change alone is tempting me, and I have been diehard anti-apple for about a decade now for walled-garden reasons.

I know, I know: the walled garden prevents users from opening their phones up to vulnerabilities and helps guarantee a secure experience. I guess I wish I could have my cake and eat it too.


Android has required the user to enter the unlock code to back up the device for years now. This is Apple playing catch-up.

iOS devices have always required the device to "trust" the computer before allowing the backup. Looking at the article, the only real difference now appears to be that you can't just use TouchID, you're forced to use the passcode.

You are correct. Android does not require you to enter a passcode to trust a computer if you have a fingerprint registered. The change is that Apple now requires you to enter a passcode if you have set up a passcode on the device in the past. If you haven't, the attack vector still exists, and if you are traveling with a trusted laptop with fingerprint unlock, the attack vector still exists with one level of indirection.

>If you haven't, the attack vector still exists, and if you are traveling with a trusted laptop with fingerprint unlock, the attack vector still exists with one level of indirection.

You can't set up TouchID without a passcode. The "attack vector" only exists if you have zero security on your iPhone to begin with. (So, yes, technically the attack vector exists if you choose to use no locks at all.)


You misunderstand. The vector is: I take a random person's phone, compel them to provide their fingerprint, and now when I plug the phone into a computer, I can unlock the device and then tap "trust" in order to back up the phone to the computer. If the iPhone was already powered on and the user had entered their passcode after it turned on - even if that was two weeks ago (there may be a timeout I'm not aware of, but you get the point) - I'd be able to back up their device to my computer.

That's changed now, from the article:

> Under iOS 11, this sequence has changed to also specifically require the passcode on the device after the "Trust This Computer?" prompt.
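
A toy model of the gate the article describes (illustrative only; the names and logic are mine, not Apple's actual implementation):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Device:
        unlocked: bool
        passcode: str

    def try_pair(device: Device, tapped_trust: bool, entered: Optional[str]) -> bool:
        if not device.unlocked or not tapped_trust:
            return False
        # iOS 11 addition: pairing also demands the passcode itself,
        # so a fingerprint-unlocked phone can't be silently imaged.
        return entered == device.passcode

    # Pre-iOS 11, an unlocked phone plus a "Trust" tap sufficed; now:
    assert not try_pair(Device(True, "482916"), True, None)
    assert try_pair(Device(True, "482916"), True, "482916")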


> Looking at the article, the only real difference now appears to be that you can't just use TouchID, you're forced to use the passcode.

That's part of it. Another part is that the passcode has to be entered after the device has been connected to the computer, as opposed to starting with an unlocked phone (maybe you took it while it was in use, maybe you forced the user to unlock it) and connecting it to a computer later.


I am not sure I understand.

Once the device is unlocked, what is the obstacle to changing the passcode?

Needing the old one?

Or, if you prefer: what happens if you have TouchID set up and forget the passcode?

Do you need to reset the phone?


Not always... just since iOS 7.

Android = Google

(No, aosp is unusable, and Google is working to make it more unusable).

Google = no privacy

Therefore, Android = no privacy

Q.E.D


What data does Google collect that Apple doesn't collect? The only difference I can see is that Google is competent at making the data it collects useful to the user and barely usable to advertisers.

Your comment looks like trolling. We were discussing a feature to prevent others from backing up your device.


In the first part, you're setting up a premise that nobody here can or will answer publicly even if they have the knowledge of the exact information collected by either company. That gives you a free run at the second part without anything supporting it.

The information they collect is in their OSes' respective privacy policies. I've looked at them. They are pretty much exactly the same.

You must have looked at the privacy policies but not read them.

You and GGP are the ones making ridiculous claims without evidence (that the data the OSes collect are unknowable instead of outlined in a privacy policy and that Android collects more data than iOS out of the box).

To answer your first question: a lot. Apple makes all your data locally encrypted by default, in Messages, Siri, iCloud, Maps, etc - they don't have the key to access it. See more at https://www.apple.com/lae/privacy/approach-to-privacy/

All of a user's device data is encrypted by default on Android as well. In this respect, they are no different.

The difference is that Google owns the keys that encrypt your files. Services like GDrive, GMail, Google Photos, Assistant, all depend on processing your data in the cloud. Apple does all of that locally (because they can't access any of it otherwise), trading some scalability and sophistication for privacy.

We were discussing data stored on the device. Just like Apple, Google does not have the keys to decrypt the data on the device. And just like Google, Apple has the keys to decrypt the data in their equivalents to the GDrive, Gmail, and Google Photos cloud services for web access. The only difference is that Apple's services are significantly less useful.

> What data does Google collect that Apple doesn't collect?

I don't know what Google collects, but the simple answer to this is "virtually everything". Apple collects very little data from their users and does everything they possibly can on-device (and the stuff that requires the cloud is either encrypted or heavily anonymized). Nearly all data that Google collects, Apple doesn't.


On the other hand, Android services everyone, whereas Apple tells the developing world to fuck off. Apple's privacy is only for the social elite who can afford their expensive toys, and they're not interested in providing service on a global scale.

Let's not get too hippy-feelgood about Apple's intentions regarding its users here.


Well hold on dude, are you arguing that it's not Apple's right to charge money for quality / service?

I am not a diehard Android user, but I'm replying to you from a LineageOS phone without the Gapps, using microG as an alternative to the Google services, and Firefox as a browser.

Privacy is achievable on Android. However, I do agree it is very involved.


The services provided by Gapps that send data to Google (push messaging, aGPS, crash reporting, app installs, safe browsing, etc.) also exist on iOS with exactly the same privacy policies. The only difference is that you can remove or disable them on Android devices if you are paranoid, while no such remedy exists on iPhones.

By your standards, privacy is achievable with difficulty on Android and not at all on iOS.


There is another, more fundamental difference. With Apple, you are the customer. With Google, you are the product.

Yet this "fundamental difference" does not change the fact that both companies' devices provide exactly the same amount of privacy out of the box, while only the Google device is truly yours to put whatever software you want on it, down to the OS.

I trust the vendor whose primary revenue stream comes from me, rather than the one whose primary revenue stream is me (or my private data). I trust Google to abide by their T&C's about as much as I trust the NSA to abide by the laws governing the extent of their actions. Which is to say, precisely none.

You can argue until you're blue in the face but Apple have little to no incentive to exploit my private data while Google have every incentive to do so.


I understand your position now. You're a conspiracy theorist. Google and Apple have exactly the same incentive to exploit your private data — money. They also have exactly the same disincentive to disobeying their privacy policies — lawsuits and bad press leading to loss of money.

The Google device is NOT truly yours, because you have to look for cracks if you want to root it or remove the Google stuff. The law may also be against you if you crack it. You are just a guest on your device... you are "licensed" to use it, just like you get a license to play music on Spotify, watch movies on Netflix, etc... nothing is owned anymore, and very few people care.

This is not true on the actual Google devices (Nexus and Pixel lines) that have a supported path to flash your own rom, and root. Google so far has been supportive of the custom rom community. Some manufacturers (cough samsung cough) have not.

Many Android devices, including all the ones that Google make and sell themselves, Sony devices, Motorola devices, have an officially supported path to running any code you want on YOUR own device.

Usually that path is: boot the phone into bootloader mode, plug it into a computer, run `fastboot oem unlock`, accept the warning that this will void your warranty, then `fastboot flash <image file to flash>`. An officially documented and supported way to do exactly what you want with the device that you own.

Yes, it voids your warranty, but it's not forbidden or illegal in any way.


Sure. That's why Magisk exists, right? Because OEMs have nothing against you rooting your device. At all. They are perfectly okay with it.

Please. Actions speak louder than words, I am sure you would agree with that.


We were discussing Google devices and Apple devices. All Google phones have an unlockable bootloader.

By google devices, do you mean only the Pixel and Nexus phones? Or does that include all android devices? Because for example Sprint Galaxy s5's can't be unlocked.

I mean devices sold by Google, so Nexuses, Pixels, and a few Google experience phones sold from the Google Store.

Unlockable bootloader, yes. Officially supported ROM without Google Play Services, absolutely not.

I see you defending Google around here. That's fine but a grand-grand-grand...-parent of yours was saying that AOSP is unusable, and that point still stands and is true.


You've confused yourself. The whole point of a device being yours is that you can install not officially supported ROMs.

The officially supported experience with Google Play Services is no more privacy invasive than the officially supported iOS ROM, but in the case of Google devices, the device is yours, and you are not limited to officially supported ROMs.


Safe Browsing and Crash Reporting can be turned off with no repercussions. You can turn off location altogether, though I'm not sure if you can turn off aGPS only.

That leaves app installs (Android allows side loading) and push messaging (I'm not sure about this one) as the ones that you can't disable on iOS.


But please think twice before turning off safe browsing! Contrary to what some seem to believe, it doesn't send your entire browsing history to Google. See https://news.ycombinator.com/item?id=9779990 for an explanation.
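
For the curious, the gist of that lookup, greatly simplified (ignoring URL canonicalization and the local prefix database): the client sends only a short hash prefix over the wire, never the URL itself.

    import hashlib

    def lookup_prefix(url: str, prefix_len: int = 4) -> bytes:
        # Only these few bytes are sent; the server returns all full
        # hashes sharing the prefix and the match finishes locally,
        # so the server never learns which exact URL was visited.
        return hashlib.sha256(url.encode()).digest()[:prefix_len]

    print(lookup_prefix("https://example.com/").hex())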

Good reminder.

iOS = closed source, therefore only their word that there is no backdoor

So is every other phone on the market.

No. There are open source Android phones.

There are no open source ios phones


Yeah? The baseband and chip ROMs are open source? Really?

Or the proprietary kernel patches and binary blob drivers?

If your point is that there are partially closed source Android distros, I agree with you.

more that there are no open source Android phones. Just "partially open source" Android phones.

This ignores essentially all the reverse engineering techniques that are actually required to find backdoors, and it is not something security professionals largely believe.

How hard is it to jailbreak with ADB shell and get root on a random phone? Does that always require a code?

It depends on the device, but at least on Nexus/Pixel devices you can just plug the phone in and `fastboot oem unlock` and it will unlock the bootloader, i.e. allow you to flash anything onto it that you want.

This does not let you access the user's data, because fastboot oem unlock wipes the data partition. Before you can unlock the bootloader, you have to enable the option in the developer menu, which requires the phone owner's unlock code.

not sure why there are massive downvotes

AFAIK this is true


> I wish Google Fi would support iOS devices.

I'm curious - how can a carrier support or not support certain devices? Isn't that like saying I wish my ISP supported Linux?


Google Fi only allows their Nexus and Pixel phones on their network. It is like an ISP seeing that you are running Linux and not letting your traffic through, just a bit more complicated than that (SIM card, modems, frequencies).

You can pop a T-Mobile SIM in the phone and it partially works, but Google Fi requires hardware support in the radio to work fully

It's not that they're blocking iOS, but more that iPhones don't support the network.


Understood. Just wish they would both get on board with it.

Not quite. You need those phones to activate service but you can move that sim into other phones later. Once you have service you can also get data-only sims that openly support all devices.

Can you still receive SMS and normal phone calls? Because last time I tried, it didn't work.

yes

This sounds a bit misguided. IIUC, Google Fi is really a mechanism for accessing two carriers from the same device. It consists of 1. a partnership between Google Fi (the cellular provider), T-Mobile, and Sprint, and 2. special radio hardware on the phone that is capable of working with two different "channels" at the same time. Since my Nexus device broke, I have personally been using "Google Fi" (the provider) on my iPhone, which means I do NOT reap the full benefits of Fi without having the special radio hardware.

So, I don't think it is physically possible to do the "two provider at the same time" thingy without a special hardware (you'd need two parallel "radio" circuits on your phone).

And with that, your analogy falls sadly on its face, as you seem to indicate an artificial "software" limitation has been set to prevent us from using Google Fi on our iPhones.


Sorry, I did not mean to say that is how it is, but how it seemed. My understanding is that the modems aren't capable. That being said, activation is done only with a Nexus or Pixel device (software), and while switching might be a hardware issue, the iPhone works on both Sprint and T-Mobile (I'm on an iPhone on T-Mobile after moving from Verizon), and so Google Fi should still work.

As I mentioned, I AM already using Google Fi on my iPhone. What does not work is the hardware feature of handing off between providers mid-flight. With my [now deceased] Nexus X, a call would seamlessly [to me, anyways] switch between providers during medium/longer commutes. With my iPhone using just T-Mobile, calls get dropped during my daily route, as there are certain "T-Mobile blind spots".

Sorry, I completely misunderstood. Thanks for clarifying!

Is it actually a special radio, or just features which normally get disabled by e.g. Verizon programming the baseband? I'm assuming that the "special" radios are simply NOT using crippleware and setting some other registers which could be enabled on many more devices...if carriers played ball, which they don't want to for obvious reasons.

I have to research deeper to be sure (there is some FAQ at https://fi.google.com/about/faq/#supported-devices-7). But skimming through that page, I suspect this is a new-ish feature not available on most smartphone radios.

Project Fi does some things behind the scenes to switch between several carriers (T-Mobile, US Cellular, others) that don't work as well on iOS. Also, you need a Pixel or Nexus device to activate initially anyways.

In this case, Fi relies on software in order to switch between different carriers and wifi.

They are not able to implement it to their satisfaction with iOS, so they stick with their own devices.


Largely the same here. I'm giving up on Windows too, and I've been using MS OSes since DOS. I'll pay a premium to keep my data secure.

All other tech giants today make money from advertisement and/or "big" data, which are inherently anti-privacy. This is why Microsoft has done a 180° on privacy and has filled their first-party apps with ads.

MS was always in bed with the feds, e.g. Skype and crypto backdoors.

Historically, Microsoft has made a lot of money from government institutions being locked-in to their software, so that makes perfect sense.

That is very true, but I find it very sad that, as much as Apple is pro-privacy, it's very anti-freedom. I'd love to switch to a device as private as an iPhone, but at this point you're just renting the device from them.

It's not anti-freedom. It's just pro-ease-of-use, and it's easier to guarantee a quality experience when there are fewer unknowns and fewer variables to account for. When everyone is using the same device with the same software, it's much easier to guarantee a great experience.

Allowing sideloading of apps would not impact usability in any way.

Edit: It would massively increase the usability, allowing any apps to be installed, not just Apple approved apps, making the phone more usable to many people. Back in the day I had to buy a dev account just to load an emulator I wanted to use (it was open source) without rooting.


But can ruin your privacy guarantees.

How? Sideloaded apps would still be sandboxed.

Sideloaded apps make attacks on non-sideloaded stuff a lot easier.

Not to sound like a broken record, but, how?

It's more than just sandboxing: Apps going through the App Store have to meet a stringent set of requirements, chiefly being no private API usage, no circumvention of features intended to give the user control/privacy, and no third party web engines (and all the performance and security implications that come with those). Apple regularly rejects apps that violate the first two, and while I'm sure some get through they tend to get buried by scores of more legitimate alternatives. Sideloaded apps get to bypass all of this.

If they ever do offer sideloading/gatekeeper, it should be turned off by default and turning it on should make the risks crystal clear with a scary alert and passcode prompt. Average folks should be heavily discouraged against using it. Gatekeeper works on the Mac, but iOS devices are both far more numerous and far more personal, so the stakes are much higher.


Apple's checks are not very deep. They suffice to keep honest people honest but do little against an actual adversary.

The number of safeguards they have in place make that a moot point. If it's not discovered in testing, it's sandboxed. If it gets past the sandbox, they have a killswitch that's registered to the app's specific ID. There's very little damage that an app can do if it's gone through the App Store proper.

That could be done for sideloaded apps too.

Sideloaded apps, by definition, are not tracked individually by Apple. You can't make a killswitch for an app if you don't know what the app is.

And yet the Mac has a malware killswitch. Clearly it can be done.

Source? The Mac only has the ability to prevent unsigned apps from being installed which is easily disabled. There's no killswitch for unsigned apps.

That is objectively untrue. Every system that has allowed sideloading of apps has been compromised by that very feature.

"Every society that has been allowed to drive cars has had car accidents."

I know the risks, I want to do it anyway. I don't want a nanny over my head.


That may be a better analogy than you intend. The problem with cars is that everyone suffers the consequences of your driving. In fact, you may have the least exposure to the carbon monoxide, pollution, or even accident risk. Pedestrians, cyclists, property owners, and other drivers are all at risk of your driving.

For that reason, societies tightly regulate car ownership and driving. More so than phones: I don’t need a license to use a phone, nor do I have to register my ownership or have it regularly inspected.

But technology these days has this same characteristic: Others bear the costs of your decision. Every device connected to the internet is a DDOS vector.

I don’t want you deciding whether to keep your device up to date with the latest security patches, because if you (and a few million others) don’t, GitHub is down for me.


So, are you doing the responsible thing and not using a laptop, or any device that's vulnerable, like a router?

Bear in mind that the vast, vast majority of Apple's customers have zero interest in sideloading apps. If you're Apple, it makes a great deal of sense to favor security for the masses over the few power users who occasionally can't sideload an app here and there.

A compromise doesn't affect you, it affects Apple and the app developers. iOS has the best apps because it has people buying them instead of everyone and his brother pirating.

Anyway, if you don't want to deal with how an iPhone works, how about buying a different phone.


Technically, they do. You just have to pay more for it (aka buying a developer account).

Actually you don't even need to pay for it these days. Free Apple Developer accounts can side load apps through Xcode

You need to pay for the Mac to use Xcode though...

Correct. You need to own a computer to run computer software.

An Apple computer. To run Mac software. That's what you probably meant.

You don't have to own an Apple computer to run Mac software, so your statement is factually false and his isn't.

Legally.

No, you need to own an Apple computer to install things on your phone. I don't need a computer to install APKs on Android.

No, you don't need an Apple computer to install an IPA any more than you need a computer to install an APK. Let's stop comparing Apples and Oranges, shall we? You need a computer to compile and sign an IPA, the same as you need one to compile and sign an APK.

> You need a computer to compile and sign an IPA, the same thing you need to compile and sign an APK.

You can compile and sign (and, AFAIK, upload to the Google Play Store) an APK on an Android device, using AIDE and perhaps other dev toolkits. No computer needed.


Only for seven days at a time though, no? Either way, that effectively means anybody can sideload if they have a Mac.

You don't even need Xcode. You can do it fine on windows or linux using cydia impactor[1]

1: http://www.cydiaimpactor.com/


I see your point, but: That's not an excuse, that's an explanation. It is anti freedom whether that's their end game or not. I don't really care about Apples motives, I care about the product.

Really, Apple is anti Nothing and pro only one thing: money. Everything else is a corollary. Including ease of use, freedom, and privacy. It's just that those things do matter to me (and GP).


What's an example of a common use case in which Apple is anti-freedom in your opinion?

Presumably they're talking about the App Store walled garden.

Web browser engines. Default apps.

Is it really that big of a deal for the web browser engines?

What's the issue with the default apps? I believe you can now uninstall just about every one of them.


Not being able to unlock your phone or sideload programs. I don't accept the "we know what's best for you" argument. They could have allowed jailbreaking and added huge warnings.

Apple allows you to 'sideload' apps as long as you get a (free) developer account.

Can I just download an IPA from a website and install it on my phone if I have a developer account?

Can you with jailbreaks?

Yes, but Apple doesn't get points for those; they actively oppose them.

It is anti-freedom, in the sense that all apps are censored under Apple's own rules. They specifically said that apps do not constitute free speech.

You can't run a mass market open platform without security or privacy issues, as users will install things that cause problems.

...and IMHO those "security or privacy issues" are an acceptable cost of freedom, the same way that I see crime in general: no one wants to be a victim, but you can see that, as long as crime still exists, so does freedom. If no one can do "bad things", then everyone has already had the essence of life taken out of them. I wrote this comment a few months ago when (almost) everyone was wondering what could be done, in the wake of a massive malware attack:

https://news.ycombinator.com/item?id=14339031

More timely, the WTC attacks 16 years ago marked another notable event whereafter great freedom was lost in the name of safety.

As that old saying goes, "Those who sacrifice freedom for security deserve neither."


As I mentioned in a sibling comment, and it's almost ironic thinking about it, this "freedom for security" tradeoff we're making here is not what it seems.

I see your point in the context of e.g. WannaCry and WTC attacks emboldening authoritarians, however this privacy move we're discussing is actually an anti-authoritarian move. It gives the user more privacy and more freedom.

Advocating more lax privacy/security in this case is siding with authoritarian tendencies.


Apple is only shifting the authority to itself. They still have the signing keys and control what you can or cannot do with the device you "purchased".

In that vein, it would be awesome to see some base open sourced a la Darwin. Even if all it did was drop your iPhone into a command prompt after boot, it would be enough.

That argument would be more reasonable if users had no platform choice. A cellphone can track your location and listen in on your conversations, so you may reasonably want more security than on, say, a PC.

Unfortunately in the device world you have to pick one over the other. The more open a device is the less it can protect your privacy. We haven't yet seen a device that can do both.

I believe you got it backwards. When the device is open, it can protect one's privacy. The problem is, we don't have open devices.

I mean, the hardware is not open, so developers can't generally implement any proper secure boot schemes (starting from a trusted bootloader), and can't generally control what goes on in the radio modules. Because there are no devices that have those parts open (or am I unaware of something I'd want to buy?), there's no security/privacy possible.

Apple can provide privacy there because they're damn huge and they can purchase or design any hardware they want, fully documented down to the last logic gate. An average free software developer can't.

As for the userland - I believe there are AOSP derivatives + 3rd party apps that result in a reasonably good privacy and security experience.


otoh the pro-privacy approach they're taking also increases freedom, such as in the use-case this article is about. Which imo is probably a more tangible freedom to many than the ability to hack around iOS devices or the idea that the app store isn't free.

You can compile and run your own software on the device. You can jailbreak it to get the same results. You also have access to the entire internet which should allow you to do nearly any kind of development you want to do and use the browser in your phone as a UI the same as any other computer.

The only part of their ecosystem that has major restrictions is the App Store. Which kind of makes sense, as that's one of the few ways in which someone other than Apple could completely trash their platform and make a lot of Apple's customers unhappy.

I'm not saying your "anti-freedom" point is completely invalid, but it's really not a major concern for most tech-savvy people, IMO.


What about Open Whisper Systems?

I like Open Whisper Systems also, but do they really qualify as a tech giant?

It's great. They essentially have a monopoly on providing privacy, which is and will continue to be a bigger competitive advantage.

totally agree. i think designing software so you don't have access to customer data (zero knowledge or customer-held keys) will eventually be a big selling point for SaaS companies

The first time I saw the concept (and got mind blown by it) was when I discovered the original version of Passpack (online password manager) made by @sullof

Also for governments themselves.

Any posts you'd recommend?

I thought the same thing reading this title, and it makes me want to invest. Companies who ignore user privacy are in for a hit, I think, in the wake of Equifax.

I'd support companies with excellent privacy / data security records being exclusive brokers of my information. Apple is on that very short list.


> Companies who ignore user privacy are in for a hit I think, in the wake of equifax.

Are they? Do people care that much? I think, except for those who actually have their credit stolen, the average person just sees it as another headline of the world today, and will move on, the way people still fly United.


> the average person just sees it as another headline of the world today, and will move on, the way people still fly United.

Until people's lives are actually impacted by the lack of privacy, they'll continue doing what they always do. Once something requires action on the person's part, (either credit being stolen from them, or something along those lines) most people won't take preemptive actions. And this would need to happen on a massive scale for giant corporations to change their ways.


I don't entirely agree. Pre-emptive action could be a 5% higher chance of buying an iPhone, or a bump in existing iPhone user's stickiness to the platform (you value keeping something you have more than not getting something you don't). That's enough to make a significant difference.

Other than financial data I don't particularly value my privacy. I post on many online forums under my own name and even my alias here is a composition of elements of my name. I use Google services quite happily and have no material gripes with the company. But still as an iPhone user it's good to know that Apple has my back on this and I'd really hate to lose that feeling.

Also I don't think this is entirely a commercial decision on Apple's part. They must know very well that taking an absolute stance on customer privacy puts them diametrically at odds with the interests of the Chinese government. China hasn't taken many overt steps against them (though they shut down iTunes movies and iBooks, and are chipping away at Apple Pay and other services with onerous regulations), but does anyone really doubt that they're working away in the background to help Apple's competitors and slow down Apple sales and service expansion in China? There's no way the Chinese government can be happy with Apple's privacy stance, and unlike in the West, Apple doesn't have the shield of the law to protect them in China. They stand naked against the Chinese government, but they're still doing it. That takes some real conviction. I really don't think I'm exaggerating in saying that this stance on privacy may very well cost them any chance of a dominant position in the Chinese market. It's certainly running a real material risk of that anyway.


Flying United is a choice. In fact, some companies do suffer lasting harm from bad behavior (Volkswagen comes to mind). Being an Equifax product is not a choice, just a symptom of being a pawn in today's mass-surveillance world where (we are told) privacy and security are illusions.

> some companies do suffer lasting harm from bad behavior (Volkswagen comes to mind).

Volkswagen is basically back to its pre-dieselgate sales levels. Admittedly, its stock price is still quite a bit below its pre-dieselgate high, largely due to the still unknown factors surrounding possible future fines.


I just wish an Apple ID wasn't required for everything. Especially one that requires giving them my full name, phone, and physical address.

I understand that it would be needed for purchasing apps, but just to download a free one?


Yeah, I really hope something like Gatekeeper comes to iOS. As time moves on (and tech companies stop being apolitical), I'd like the App Store to not be the only way to get software onto my device.

For example Gab was rejected by the App Store because of the content people were posting. Of course, they won't ban Chrome despite being susceptible to the same content violation.


Is it not still possible to set up an Apple ID with gift cards you paid cash for? As far as I know it used to be. I still have an Apple ID I set up in another country using a free iTunes download code I got from a bottle of soda, iirc; I never used a credit card and downloaded free apps that weren't available in the country of my main Apple ID.

I choose to read this as sarcasm, but the fruit fandom here doesn't seem to think so.

I'll trust Google's engineering prowess and rigor before I trust Apple, who couldn't even bother to verify emails before leaking Apple ID attributes associated with that email, and who allowed things to "fappen."

But I'll trust none of them to "guard my privates." Please.


It sounds an awful lot like you've had an embarrassing personal experience and are blaming Apple

There's the iTunes hack, the SSL bug.

Apple's security record is not better than, say, Adobe's.


You are reminding people of serious security issues, but comparing them to Adobe? That's going too far.

Personally, I dropped a lot of Google products when the Snowden leaks revealed to shocked Google engineers that they shouldn't have been sending unencrypted traffic between data centers and assuming it was safe because it was on leased lines. There are bugs, and then there's just negligence.


I agree. But it's merely the consequence of them being a tech giant whose product is not their user's privacy.

There is no “merely” about it when it is a conscious choice to pursue this business model.

Thinking about it, Apple is benefiting right now from the same debt-based economy that benefits the FBI/NSA. I don't think there even is much local US manufacturing, so Apple imports products made in China and sell them at a high profit margin. Apple's cash piles came from debt, and I think national debt is exactly the thing that increases when FBI/NSA spends more money, right? (Of course, it is only one kind of debt, but it is important for example because of the bonds other countries buy with their FX reserves)

I think you've confused several different trends with each other. I can't see a coherent point here.

Apple is a partner of the NSA's PRISM program.

We know that since Snowden, if you remember those revelations.


Yeah but what choice do you have? Apple is the lesser evil here.

You can choose non-US companies. OnePlus, Samsung, etc. I don't think China or South Korea is less inclined to spy on you, but I suspect they are less competent at it.

But then you end up with Google. It doesn't matter who builds the hardware, if it runs Google's system.

Is Apple good? Probably not. Is it less of a threat to your privacy and freedom than Google? Probably.


This is where I would mention that I'm running CyanogenMod, rooted of course, with Protect My Privacy (PMP) installed.

http://www.android.protectmyprivacy.org/

While I really like non-Google Android + PMP in theory, the reality is that this solution is extremely complicated, limiting, error prone, buggy and probably not worth it.

So I concede that Apple is the better choice, but nevertheless, persist in my futile efforts to make my 1+ somehow work securely.


How does PMP differ from PrivacyGuard? From the description it sounds the same.

After Ubuntu dropping out (officially, at least), your best bet is probably LineageOS: https://lineageos.org

ah man.. i just needed to clean my monitor of coffee reading this. :)

Samsung, really? :D weeping angel rings a bell?


Yeah, it looks like the choices are between competent+PRISMed and incompetent. The former can likely only be accessed by the Five-Eyes. The latter may be secure, but may also be insecure and accessed by anyone. What would you bet on? For me, a deciding factor is that I don't live in a Five-Eyes country.

Pretty much agreed, though not living in a five eyes country doesn't mean any of those eyes aren't 'looking' at you. :)

Quite the opposite actually, if you're a citizen of it you have some constitutional protection against your own government spying on you, at least in the U.S. AFAIK. If you are outside of it, you're fair game.

In theory I agree, if you read up on what is shared about it.

In practice I have very strong doubts this is a fact. Especially since there are '5 eyes', '6 eyes', '9 eyes' and '14 eyes' arrangements. And all of the members monitor domestic terrorism, so I fail to see how they are not spying on their own citizens.


Isn't Samsung considered to be kind of an arm of the South Korean government? They also make weapons of war and other military equipment, and they haven't proven to be ethically or morally responsible, if not outright reprehensible.

For many people, their only positive quality seems to be "Not Apple."


>but I suspect they are less competent at it.

Maybe they offload it to the US, as the NSA is so good at it.


Also, there's a big difference between 1) a company collecting and selling your data to other companies, and 2) a company giving authorities access to your data to comply with anti-terror laws.

I'm not implying I agree with how far governments go to intrude into people's privacy. Just pointing out that I agree with the OP that there's a big difference between, for example, Google's and Apple's stances towards privacy.


How is Google selling your data to another company?

That's literally their business

Google may be using the data for their own benefit but they don't sell it to third parties.

Yeah that's like their entire business model. They use it to sell ads, etc but they won't sell their most valuable asset.

They sell ads that they target based on your data but AFAIK they don't give this data to any third party (except gov agencies of course like all US companies)

They, like Facebook, are big enough that they don't need to sell data in order to monetise it. This doesn't make their swirling vortex of personal data extraction less creepy.

All the other tech giants are in the business of selling your data to advertisers. Apple is in the business of selling their stuff to you.

Mozilla comes to mind.

I would have loved to see FirefoxOS be a real thing.

Unlike any of the tech giants, they have no incentive to pull you into their ecosystem, they embrace open source, etc.

If only FirefoxOS had arrived earlier and had not been tied to a catastrophic 'web on mobile' idea.


So the fact that they've previously been lying about device encryption stands out as supportive of user privacy to you?

Face ID is going to screw this all up. Cops won't even need your cooperation to unlock.

Apple is adding a kill switch to iOS 11 that lets you discreetly disable Touch ID (and presumably Face ID) by hitting the power button 5 times. And of course if you're concerned you can opt out of Touch or Face ID entirely and simply use a pass-code.

https://www.theverge.com/2017/8/17/16161758/ios-11-touch-id-...


I think I would prefer the passcode which can be set to a length much larger than 4 digits when travelling or getting anywhere near fed-gov.

iOS lets you set a complex password instead of a 4 digit pin (turn off the "simple passcode" setting).

Default PIN has been 6 digits since iOS 10, and you can make it a long passphrase instead.

I am aware. I don't know why I phrased it the way I did. I meant I would rather use that feature of the longer passcode instead of facial recognition. Bad grammar for the win.

Thanks for sharing that info, but I think the problem is that Apple is marketing these methods of identification as methods of authentication. Sacrificing security for convenience.

Even if they have a kill switch in place, the users least likely to know about it are the users who are most likely to use insecure methods to "secure" their phone, I would think.


There were two sentences in his post. You only read the first one.

Something that I think people underestimate is just how easy it is to observe you entering your password on a phone, and why that (in my opinion) makes thumbprints much more secure than passwords for casual usage - e.g. every time you unlock your phone.

All you need is a camera over your shoulder, and you don't even need to observe the key presses, as the current character is generally displayed on screen. An overhead camera at a transit station could likely observe hundreds or thousands of passcodes a day.

The same thing goes for "Tap and Go" contactless payments not requiring a PIN under $100. Everyone goes on about how people can run up a few hundred dollars at different stores with your card if they steal it. But consider instead exposing your PIN to surveillance during most common transactions: a stolen card plus an observed PIN lets someone pull cash from an ATM, which is much harder to recover and much higher value than the generally $30-$100 per-transaction limit without a PIN.

Next minute you'll freak out when I tell you I can clone your house key from a photo of it hanging off your belt...

The general point is that security trade-offs generally run deeper than you might realise on the surface, especially at the "public outrage" level of scrutiny that so frequently haunts the public mind these days.

I'm not sold on this Face ID business yet, though. We'll see how it's presented tomorrow.


How is that different from Touch ID?

Presumably they can just point the phone at your face instead of forcing you to put your finger on the sensor.

Touch ID allows 3-5 tries before it gives up and requires the passcode, and with ten fingers to choose from (plus deliberate failed presses) those tries are easy to exhaust. It can also be turned off under duress (or it will deactivate after a few hours, or after a reboot). The sketch below shows how the same biometrics-then-passcode fallback looks from an app's perspective.

I always turn off my iPhone for checkpoints or traffic stops. If they want it, they can decrypt it the hard way.
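For the curious, here is a minimal Swift sketch of how that fallback looks to an app through the LocalAuthentication framework. Purely illustrative; the lock screen enforces its own limits in the Secure Enclave, and the error code is .biometryLockout as of the iOS 11 SDK (previously .touchIDLockout):

    import LocalAuthentication

    // Minimal sketch: try Touch ID first; once the biometric retry
    // limit is exhausted (.biometryLockout), fall back to a policy
    // that also accepts the device passcode.
    func unlock() {
        let context = LAContext()
        let reason = "Unlock your data"
        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: reason) { success, error in
            if success {
                print("Unlocked with Touch ID")
            } else if (error as? LAError)?.code == .biometryLockout {
                // Too many failed fingerprints: biometrics stay locked
                // until the passcode is entered.
                context.evaluatePolicy(.deviceOwnerAuthentication,
                                       localizedReason: reason) { ok, _ in
                    print(ok ? "Unlocked with passcode" : "Denied")
                }
            } else {
                print("Denied")
            }
        }
    }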


Too soon to tell. For all we know, Face ID may give you the option to provide a facial expression to unlock.

That's a great capability I hadn't considered. Your passcode wouldn't just be a 3D composite of your face; it could be a 3D composite of your face performing a specific gesture over a short amount of time (wink, kiss, tilt, etc.). That would be harder to spoof.

"I'm sorry, Bob. You didn't make a stink face. Would you like to try again?"


Is a facial expression speech? One of the issues with Touch ID is that biometrics aren't protected by the Fifth Amendment, while passwords and the like are. It would be an interesting question whether a court can compel you to make a certain expression.

I'm patenting the Duck(face)Lock(TM)(R)(C)(Registered)(DontStealMe) immediately

Here's hoping for multiple password options that unlock secret partitions. The best way past authority is plausible deniability.

When you need Apple or any technology to fight for your privacy against your own government, you know you are in serious trouble.

The government has no right to interfere with your personal effects; this is fundamental to freedom, democracy, and the idea of the private individual.

Yet it seems this too has been 'normalized', and citizens are more interested in technological workarounds than in confronting this abuse by the state.


With it being such a closed platform, what are the odds this is actually true? There's no way to verify that they don't have a skeleton key.

At least I'm confident the French, Turkish, and North Korean governments don't have the keys. The FBI wouldn't share such a secret with ordinary police, except for extremely rare reasons, in which case iPhone security isn't your main problem.

You think they lie to the US government every time it asks them to decrypt phones and they can't?

I regularly reverse engineer components of iOS and have done so since the first iPhone was released. I've never seen anything like a "skeleton key", and there is certainly nothing like that in place now.

Starting with iOS 10, firmware components are no longer encrypted (obfuscated), so you or anyone else can reverse engineer them too.


For anyone concerned that authorities might force you to give up your password (thus allowing them to image your device): you can pair your iPhone to a single computer with a supervision/MDM (managed device) profile, which will prevent any other device from pairing with it. iOS security researcher (now Apple employee) Jonathan Zdziarski has two blog posts on this (a rough sketch of the underlying restriction follows the links):

Counter-Forensics: Pair-Lock Your Device with Apple’s Configurator:

https://www.zdziarski.com/blog/?p=2589

Protecting Your Data at a Border Crossing:

https://www.zdziarski.com/blog/?p=6918
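For reference, the restriction those posts rely on is the supervised-only allowHostPairing key in a Restrictions payload. Here is a rough Swift sketch of what generating that payload could look like; the com.example identifiers are placeholders, and the device must already be supervised via Configurator for the key to take effect:

    import Foundation

    // Sketch of the Restrictions payload behind pair-locking.
    // "allowHostPairing" is a supervised-only key; the identifiers
    // below are placeholders.
    let restrictions: [String: Any] = [
        "PayloadType": "com.apple.applicationaccess",
        "PayloadVersion": 1,
        "PayloadIdentifier": "com.example.pairlock.restrictions",
        "PayloadUUID": UUID().uuidString,
        "allowHostPairing": false  // refuse pairing with new host computers
    ]

    let profile: [String: Any] = [
        "PayloadType": "Configuration",
        "PayloadVersion": 1,
        "PayloadIdentifier": "com.example.pairlock",
        "PayloadUUID": UUID().uuidString,
        "PayloadDisplayName": "Pair Lock",
        "PayloadContent": [restrictions]
    ]

    do {
        let data = try PropertyListSerialization.data(
            fromPropertyList: profile, format: .xml, options: 0)
        try data.write(to: URL(fileURLWithPath: "PairLock.mobileconfig"))
    } catch {
        print("Failed to write profile: \(error)")
    }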


That's a great idea! Does anyone know if this technique works with iOS 10? The linked blog posts are for iOS 7 and 8.

Yes, the underlying technique has not changed and works up through iOS 11 (beta).

By the way, the writer of this post now works for Apple on their Security Architecture team.

I would not be surprised if this change came from him.


It still works great; however, the linked post uses Configurator, and Apple has since replaced it with Configurator 2, so some of the options and the workflow are different now.

What happens if your computer breaks? Is the iPhone a paperweight then?

Have you ever considered reading the article?

"you have to start fresh, with a brand new install of iOS."


Until forensic companies get hold of new exploits, which is how high-value targets have been getting dumped for a while now.

With facial unlock, couldn't they just hold my phone up to my face to image it?

The article says they will no longer allow Touch ID to trust a computer; I imagine they won't allow Face ID to, either.

This is why I love Apple.

Could anyone shed light on how difficult it would be to implement the following (a sketch of the second point follows the list):

- All data encrypted by default

- A "dump" of the phone's memory or a MacBook's hard drive makes it look like the whole drive is full, i.e. the free space is populated with random data that is indistinguishable from encrypted data

- The user can switch from their real profile to a fake user profile and import some data into it (like contacts/messages/photos)
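A crude sketch of the second point, assuming you can write ordinary files to the volume (the path and file name are placeholders): properly encrypted data is indistinguishable from CSPRNG output, so padding the free space with random bytes makes a raw image look uniformly "full".

    import Foundation
    import Security

    // Sketch: fill a volume's free space with cryptographically random
    // bytes so a raw image can't distinguish "empty" blocks from
    // encrypted data.
    func fillFreeSpace(on volumePath: String) throws {
        let url = URL(fileURLWithPath: volumePath)
            .appendingPathComponent("filler.bin")
        _ = FileManager.default.createFile(atPath: url.path, contents: nil)
        let handle = try FileHandle(forWritingTo: url)
        defer { handle.closeFile() }

        var block = [UInt8](repeating: 0, count: 1 << 20)  // 1 MiB writes
        while SecRandomCopyBytes(kSecRandomDefault, block.count, &block)
                == errSecSuccess {
            do { try handle.write(contentsOf: Data(block)) }
            catch { break }  // disk full: remaining free space is now random
        }
        // Deleting the filler afterwards frees the blocks but leaves
        // the random bytes in place.
        try? FileManager.default.removeItem(at: url)
    }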


This would be awesome.

No, it is not. It signifies a divide between the tech sector, which strives for privacy, and the government, which is (or was) supposed to protect the people.

Oh? I might have misunderstood your post.

I thought you were suggesting a way to make phone backups always appear the same size as the disk, 100% of the time (meaning the true volume of content is hidden).

Then the second point is that the user could have two (or more) profiles on the device and unlock into either one. A user under duress can unlock the device without revealing the true content, while the person trying to get in has no way of knowing whether they're looking at the real profile.

I figured that would be a pretty sweet feature. It would also tie neatly into allowing users to have multiple profiles on their device, which is currently impossible on iOS...


It is a good feature, but my comment was on "awesome". It is not an awesome situation, far from it.

ahhh, haha ok I see!

yeah, it isn't awesome that we feel we need these security features. However, even without the fear of government (etc.), I think this is still a great feature I would love to have.


What if Apple also added a feature that shows an innocent/clean phone when a specific password is entered? How would law enforcement know the difference? The only thing you'd need to keep genuine is the cellular call log, because the carrier already has it, and a mismatch could prove you were using the decoy "mode".
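Mechanically this is plausible: derive a separate volume key from each passcode, so whichever passcode you type determines which profile decrypts, and the ciphertext alone can't reveal how many passcode-to-profile mappings exist. A toy Swift/CryptoKit sketch, where the passcodes and salt are placeholders and a real design would use a slow KDF like PBKDF2:

    import Foundation
    import CryptoKit

    // Toy sketch: each passcode deterministically derives its own
    // volume key, and the "clean" profile unlocks through exactly the
    // same code path as the real one.
    func volumeKey(for passcode: String, salt: Data) -> SymmetricKey {
        HKDF<SHA256>.deriveKey(
            inputKeyMaterial: SymmetricKey(data: Data(passcode.utf8)),
            salt: salt,
            info: Data("profile-volume-key".utf8),
            outputByteCount: 32)
    }

    let salt = Data("per-device-salt".utf8)  // placeholder
    let realKey  = volumeKey(for: "real passcode",  salt: salt)
    let decoyKey = volumeKey(for: "clean passcode", salt: salt)
    // An observer sees the same unlock flow either way.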

Here is a slide from the PRISM leaks:

http://www.washingtonpost.com/wp-srv/special/politics/prism-...

#neverforget


The mention that it varies by provider is notable. Apple encrypts End-to-end all iMessage chats, as well as FaceTime (VoIP) calls. None of the other providers on that list do that, so at least there's that.

Also, people here act like Apple jumped willingly on board the PRISM program. You can bet your ass their arm was twisted by the government, or they were taken into the program unknowingly (datacenter ISP taps, etc.).


"Apple encrypts End-to-end all iMessage chats, as well as FaceTime (VoIP) calls."

End-to-end encryption does not guarantee that Apple keeps your data encrypted, or that they don't process it for third parties (the NSA would fit as a third party, with Apple acting as a content provider for them, as the slide shows).

"You can bet your ass their arm was twisted by the government or they were taken into the program unknowingly (datacenter ISP taps, etc)."

Can you back that up? How can you be so sure?


> End-to-end encryption does not guarantee that Apple keeps your data encrypted

I'm probably misunderstanding something here but doesn't "end-to-end encryption" mean that A encrypts it with B's key and [whoever is in the middle passing it along] can't decrypt it because they don't have B's key?


You are not misunderstanding, you are correct.

Yup. And yet Apple can still know about your data.

And according to that slide they are a "provider" for NSA.

End-to-end encryption at least maybe guarantees that it is not your ISP that is selling you out to the NSA (which makes it harder for Apple to explain why they are on that list).


How does Apple know about your data unless they have B's key?

Who generates the keys for B?
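That's the crux: in an iMessage-style system, the sender encrypts to whatever public key the provider's directory hands back for B. A toy CryptoKit sketch of why that matters; the names and the "directory" here are illustrative, not Apple's actual implementation:

    import Foundation
    import CryptoKit

    // Toy sketch: A encrypts to whatever public key the directory
    // returns for B. An honest directory returns B's key; a coerced
    // one can return its own and read everything.
    let bKey   = Curve25519.KeyAgreement.PrivateKey()  // B's real pair
    let dirKey = Curve25519.KeyAgreement.PrivateKey()  // directory's pair

    // Substitution attack; an honest directory would return bKey.publicKey.
    let publishedKeyForB = dirKey.publicKey

    let aKey = Curve25519.KeyAgreement.PrivateKey()
    do {
        let secret = try aKey.sharedSecretFromKeyAgreement(with: publishedKeyForB)
        let sym = secret.hkdfDerivedSymmetricKey(
            using: SHA256.self, salt: Data(), sharedInfo: Data(),
            outputByteCount: 32)
        let sealed = try AES.GCM.seal(Data("hi B".utf8), using: sym)
        _ = sealed
        // With the substituted key, the directory operator (not B) can
        // derive `sym` and decrypt. "End-to-end" collapses to trusting
        // whoever vouches for B's public key.
    } catch {
        print(error)
    }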

Technological solutions to a non-tech problem. The US has demonstrated it will quite happily lock people up forever if it can't get to the encrypted data.

Good on Apple, but not a solution.


Do it to one man and it's an oddity; do it to a thousand and lots of people will be demanding change. This makes it more likely it will happen to lots of people.
