> These changes are coming in conjunction with another privacy-minded feature that will disable Touch ID by pressing the power button five times.
Wow, that's really nice. I wish Google was so forward thinking about things like this. I see no reason why fingerprint authentication should be forced upon someone any more than a password unlock would be. The only reason this is how it works today is because it's much "easier" for the government to force your finger onto the phone, or take blood from you, or hair, and so on - and they can't really do that with passwords. But we can fight back with technology and ingenuity and ensure that fingerprint auth is "just as good" as a password, at least from this point of view (the government forcing you to give it away).
>Unlike most Silicon Valley companies, Apple’s business model is one of "Data Liability." Unlike Google or Facebook which use advertising to extract value from users’ personal information, Apple focuses on selling things that protect a user's data from all unauthorized access — including by Apple. Basically, Apple views user data as a headache, not a monetization opportunity.
> I see no reason why fingerprint authentication should be forced upon someone any more than a password unlock would be.
In Canada, there is a difference. A fingerprint is something you are. A password is something you know. Police can compel you to use your fingerprint to unlock the phone. They can't compel you to disclose the password.
You shouldn't be compelled to give up anything that is "you" without a court order and a warrant, nor be indiscriminately photographed, fingerprinted, or, in some cases, tested for drugs against your will.
Canada forces people to unlock their phones when crossing the border quite frequently, and has for years. They will make you unlock it, or deny entry and/or detain you.
There's a whole TV series that shows this; it's called Canadian Border Guard or something similar. I see it happen when I cross the border, which I do with some frequency. I just unlock my phone for them.
They just flip through it. I'm a citizen so they don't do much with it. They usually look at texts and emails, to see if people are going to work there illegally. I'm able to work there so it's pretty silly.
It has never left my sight. I can't speak for others and didn't watch the show that carefully. I live so close to the border that my neighbors get Canadian television. So, I've seen it there. I don't actually have TV hooked up, so I don't see it often.
What I meant was that inside of Canada, there are situations where the police don't need a search warrant for your phone. You can unlock it with your fingerprint, so it's legally "open" and searchable.
If you have a passphrase, then they need to know your mind in order to unlock it, and they can't force you to disclose something you know.
Ah, okay. Yes, inside the US it is pretty much the same. Borders are the exception and, I think, you need to actually be crossing in order for them to demand it. Citizens are immune, for the most part.
Curiously, citizens aren't immune in Canada. I have dual citizenship and they still sometimes want to flip through my phone. No, I'm not sure why. They have had me power on and unlock my laptop a couple of times, as well.
It's not "forward thinking" to put that feature in 4 years after Touch ID debuted and people have already been forced to unlock their phones. They should be issuing warnings that fingerprint sensors are for convenience and reduce security unless they're used in multi-factor authentication.
The alternative was passcodes, and Apple's research suggested that people disabled passcodes rather than use anything secure. TouchID was determined to be far more secure than no passcode, and having both TouchID and a passcode is secure enough for most users. You can't add devices without both (along with an Apple ID password), and at that point you have physical access to the device, which negates most security features anyway.
There is a reference for this: Apple published a security white paper about it prior to the release of TouchID. Unfortunately, I'm not at a point where I can search for it. If I find the time today, I will update this post with a link.
In the February 2017 case of a California artist who was questioned at San Francisco International Airport upon re-entry, after he finally agreed to unlock his iPhone, it was taken out of his sight for several minutes and could have been imaged without his knowledge. Under iOS 11, unless the artist, Aaron Gach, decided to actually give up the passcode (rather than type it in himself), he could at least have been reasonably confident that the phone could not be imaged without his knowledge.
So, doesn't this just mean that border agents will force you to write down your password, key it in themselves to verify that it works, then walk away with the phone to image it?
Can't wait for iOS to have the deniability factor of having different passwords unlock different things :)
For the sensitive stuff, have periods of time during which unlocking the phone requires several computers to solve cryptographic challenges, some of which may be your friends' devices. If they don't hear from you and your intended hosts within a certain amount of time, the phone stays locked.
Or one of the devices could be an NFC or WiFi hotspot in a certain area, and one at home. If you don't reach it, the phone stays locked. (A sketch of the idea is below.)
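That "several devices must cooperate" idea is essentially threshold cryptography. A minimal sketch, assuming a 2-of-3 Shamir split of an unlock secret across a friend's device, a home beacon, and a travel host (all roles and names hypothetical):

```python
# Hypothetical sketch: split an unlock secret 2-of-3 with Shamir secret
# sharing, so the phone only unlocks when at least two share-holders respond.
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte secret

def make_shares(secret: int, threshold: int, count: int):
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

secret = random.randrange(PRIME)      # stands in for the unlock key
shares = make_shares(secret, 2, 3)    # friend, home beacon, travel host
assert recover(shares[:2]) == secret              # any two shares unlock
assert recover([shares[0], shares[2]]) == secret  # any pair works; one alone doesn't
```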
If you are realistically in a situation where that would happen, and you would be harmed, they may just decide to kill you as you're no longer of any value.
In this case, a blank phone probably means no entry and not actually harming you. So, there's that, I guess.
So you are afraid of actors that would torture you to get access to the data.
And your "solution" is to share this experience with your friends by tying their devices to yours and setting up a device in such a manner that that actor would not be able to verify if they indeed have access to the real data stored on it.
This is a recipe for a one way ticket to a very dark place for you and your friends.
For the most part, the number one rule of actual data safety is: never implement protection that would put your well-being, or the well-being of others, at risk. Unless you are protecting the nuclear launch codes, no data is worth being physically harmed over.
So, since the vendor will undoubtedly inject themselves into this complex situation for 'usability', you are basically arguing for devices that are never actually controlled by their user because terrorism. Good 9/11 argument there.
The user controls their device to the exact same extent now. You have to trust the apps and OS you use. And iOS isn't exactly open source either.
I am saying the user can choose to select beacons to unlock their phone during a trip somewhere. The user CHOOSES to lock their phone in a way that requires things in addition to the password, things that represent having made it safely past the security lol.
This is a rare case where multi-factor authentication, and the option to involve biometrics as hardened sub-layer auth, would be nice.
To be adequately effective, though, all back-up activities would require the full authentication credentials for all verification factors, which might only be possible with, say, a fingerprint scanner (touchbar) equipped laptop, or additional external hardware peripherals for other types of systems.
"Multi-factor authentication" doesn't make sense unless you're communicating with a server over the network. Biometrics can be copied and faked. Ultimately it's security theater.
It does. But if you're a US citizen, or are OK with being barred from entering the US for a long time, then you have the option to not unlock.
If you go the US citizen route, be prepared to never get your phone back, and for many future border crossings to be 4+ hour affairs where they confiscate most of your things.
If you do have money and effort to spare, though, perhaps you can file a suit that gets up to the Supreme Court after enough of those 4-hour border crossings.
Yes, but an arrest requires evidence that you actually committed a crime, whereas border searches require no particular standard and are conducted relatively indiscriminately. Refusing to unlock your device is not a crime in itself. (However, lying to a federal agent is a crime, so it's important for anyone attempting this strategy to be forthright about their refusal, rather than trying to trick the border agent with fake data or whatnot.)
As mentioned elsewhere, Apple can't prevent "rubber hose decryption" (where someone compels/coerces/tortures you to get access to an unlocked phone). I suspect this feature wasn't designed to prevent that threat.
My guess is this somehow foils or mitigates the workaround that the Israeli company sold to the FBI after the San Bernardino phone issue.
They could add a "duress passcode", with an alternate reality of data. I don't expect them to, but it's possible. Bitcoin hardware wallets have this kind of feature to help avoid wrench attacks.
Edit: No seriously. See https://en.wikipedia.org/wiki/Making_false_statements. Refusing to unlock your device is one thing. Claiming that you did unlock it but in fact just used a "duress passcode" is a lie and can land you in jail.
This makes me wonder something. I know that while they can't force you to enter your passcode, they can force you to unlock with your fingerprint. I've heard people ask for a feature where you basically have a "duress fingerprint." Since a fingerprint is already treated differently than a password, would using a duress fingerprint be acceptable?
If you're instructed to unlock the phone with your finger, and your "duress fingerprint" unlocks fake data, then you're misrepresenting that fake data as the real data and would presumably be considered to be making a false statement (though this is just speculation on my part).
On the other hand, a "duress fingerprint" that locks out TouchID wouldn't be misrepresenting anything. But it wouldn't surprise me if they could still get you in trouble some other way for knowingly locking out the device after it's been confiscated.
I believe if you tap the power button 5 times on the iPhone, it disables fingerprint unlock. Perfect for a scenario where you are afraid you might be compelled to fingerprint.
This is apparently an upcoming feature in iOS 12, but it doesn't work in iOS 11 now.
What does work is putting the wrong finger on the sensor until it tells you TouchID is disabled and needs your passcode (which takes five attempts), or rebooting your device without logging in.
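A toy model of that lockout behavior, as described in these comments (the real policy is enforced by the Secure Enclave; the class and numbers here are illustrative):

```python
# Toy model of the TouchID lockout policy described above.
class BiometricGate:
    MAX_FAILURES = 5

    def __init__(self):
        self.failures = 0
        self.bio_enabled = False   # after reboot: passcode required first

    def enter_passcode_ok(self):
        self.failures = 0
        self.bio_enabled = True

    def try_fingerprint(self, matched: bool) -> bool:
        if not self.bio_enabled:
            return False           # e.g. freshly rebooted device
        if matched:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= self.MAX_FAILURES:
            self.bio_enabled = False  # fall back to passcode, as described
        return False

gate = BiometricGate()
gate.enter_passcode_ok()
for _ in range(5):
    gate.try_fingerprint(matched=False)  # pressing the wrong finger 5 times...
assert not gate.bio_enabled              # ...forces the passcode
```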
How do you honestly think that will go down for you or anybody else that tries it, telling a federal officer that you just destroyed the thing they asked you to open?
If you're willing to destroy the data to prevent it being read, why not just wipe the phone before you leave and restore it from an encrypted online backup when you arrive?
With the facial recognition tech they might be able to measure duress. Or geofence known customs checkpoints. I would love to be able to reasonably claim that I can't give up my data while I fear for my freedom.
I believe that they simply dumped the NAND to circumvent the passcode entry limit, then brute-forced it. But it was also an iPhone 5C, a model which does not include the Secure Enclave. This method could not be reproduced on the 5S or any later model.
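For scale: Apple's security documentation has said the passcode-derived key takes roughly 80 ms to compute on-device. If re-flashing the NAND lets an attacker reset the retry counter and keep guessing at that rate, short passcodes fall quickly (figures illustrative):

```python
# Back-of-the-envelope: why resetting the retry limit breaks short passcodes.
# Apple's iOS security guide has cited ~80 ms per on-device passcode derivation.
PER_GUESS_S = 0.080

for digits in (4, 6):
    worst_case_h = (10 ** digits) * PER_GUESS_S / 3600
    print(f"{digits}-digit passcode: worst case ~{worst_case_h:.1f} hours")

# 4 digits: ~0.2 h (about 13 minutes); 6 digits: ~22 h.
# The Secure Enclave defeats this by enforcing the counter (and escalating
# delays) in hardware, so re-flashing the NAND no longer resets anything.
```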
What Apple could do is move the bar from "rubber hose decryption" to "rubber hose account access" by offering dead-simple cloud recovery. Imagine a "flight mode" where everything on your phone is backed up to Apple and the phone presents an "enter backup id and password" dialog, which allows you to load anyone's cloud backup and is essentially a factory reset. Border control might still force you to enter a backup id and password under the assumption that you surely have one, but (to my limited knowledge of the US legal framework) it would not be a felony to load a "duress" backup id (because there is no such thing as a duress backup id, just different data sets).
I have never seen a company both so technically capable and supportive of user privacy as Apple. It stands completely apart from all of the tech giants of today.
This change alone is tempting me, and I have been diehard anti-apple for about a decade now for walled-garden reasons.
I know, I know, the walled garden prevents users from opening their phones to vulnerabilities and helps guarantee a secure experience. I guess I wish I could have my cake and eat it too.
iOS devices have always required the device to "trust" the computer before allowing the backup. Looking at the article, the only real difference now appears to be that you can't just use TouchID, you're forced to use the passcode.
You are correct. Android does not require you to enter a passcode to trust a computer if you have a fingerprint registered. The change is that Apple now requires you to enter a passcode if you have set up a passcode on the device in the past. If you haven't, the attack vector still exists, and if you are traveling with a trusted laptop with fingerprint unlock, the attack vector still exists with one level of indirection.
>If you haven't, the attack vector still exists, and if you are traveling with a trusted laptop with fingerprint unlock, the attack vector still exists with one level of indirection.
You can't set up TouchID without a passcode. The "attack vector" only exists if you have zero security on your iPhone to begin with. (So, yes, technically the attack vector exists if you choose to use no locks at all.)
You misunderstand. The vector is: I take a random person's phone, compel them to provide their fingerprint, and now when I plug the phone into a computer, I can unlock the device and then tap "trust" in order to back up the phone to the computer. If the iPhone was already powered on and the user had entered their passcode after it turned on, even if that was two weeks ago (there may be a timeout I'm not aware of, but you get the point), I'd be able to back up their device to my computer.
> Looking at the article, the only real difference now appears to be that you can't just use TouchID, you're forced to use the passcode.
That's part of it. Another part is that the passcode has to be entered after the device has been connected to the computer, as opposed to starting with an unlocked phone (maybe you took it while it was in use, maybe you forced the user to unlock it) and connecting it to a computer later.
What data does Google collect that Apple doesn't collect? The only difference I can see is that Google is competent at making the data it collects useful to the user and barely usable to advertisers.
Your comment looks like trolling. We were discussing a feature to prevent others from backing up your device.
In the first part, you're setting up a premise that nobody here can or will answer publicly even if they have the knowledge of the exact information collected by either company. That gives you a free run at the second part without anything supporting it.
You and GGP are the ones making ridiculous claims without evidence (that the data the OSes collect are unknowable instead of outlined in a privacy policy and that Android collects more data than iOS out of the box).
To answer your first question: a lot. Apple encrypts all your data locally by default, in Messages, Siri, iCloud, Maps, etc. - they don't have the key to access it. See more at https://www.apple.com/lae/privacy/approach-to-privacy/
The difference is that Google owns the keys that encrypt your files. Services like GDrive, GMail, Google Photos, Assistant, all depend on processing your data in the cloud. Apple does all of that locally (because they can't access any of it otherwise), trading some scalability and sophistication for privacy.
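The dispute here is really about who holds the key. A minimal sketch of the client-side model being described, where the provider stores only an opaque blob (illustrative, not Apple's actual scheme; requires the `cryptography` package):

```python
# Client-side encryption: the key never leaves the device, so the provider
# stores ciphertext it cannot read or process. Requires: pip install cryptography
from cryptography.fernet import Fernet

device_key = Fernet.generate_key()     # stays in the device keychain
f = Fernet(device_key)

blob = f.encrypt(b"note: dentist at 3pm")
upload_to_cloud = blob                 # provider sees only this opaque blob

# Server-side features (search, ML suggestions, ad targeting) need plaintext,
# which is exactly what this design forecloses - the trade-off named above.
print(f.decrypt(upload_to_cloud))      # only the device can do this
```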
We were discussing data stored on the device. Just like Apple, Google does not have the keys to decrypt the data on the device. And just like Google, Apple has the keys to decrypt the data in their equivalents to the GDrive, Gmail, and Google Photos cloud services for web access. The only difference is that Apple's services are significantly less useful.
> What data does Google collect that Apple doesn't collect?
I don't know what Google collects, but the simple answer to this is "virtually everything". Apple collects very little data from their users and does everything they possibly can on-device (and the stuff that requires the cloud is either encrypted or heavily anonymized). Nearly all data that Google collects, Apple doesn't.
On the other hand, Android services everyone, whereas Apple tells the developing world to fuck off. Apple's privacy is only for the social elite who can afford their expensive toys, and they're not interested in providing service on a global scale.
Let's not get too hippy-feelgood about Apple's intentions regarding its users here.
I am not a diehard Android user, but I'm replying to you from a LineageOS phone without the Gapps, using microG as an alternative to the Google services, and Firefox as a browser.
Privacy is achievable on Android. However, I do agree it is very involved.
The services provided by Gapps that send data to Google (push messaging, aGPS, crash reporting, app installs, safe browsing, etc.) also exist on iOS with exactly the same privacy policies. The only difference is that you can remove or disable them on Android devices if you are paranoid, while no such remedy exists on iPhones.
By your standards, privacy is achievable with difficulty on Android and not at all on iOS.
Yet this "fundamental difference" does not change the fact that both companies' devices provide exactly the same amount of privacy out of the box, while only the Google device is truly yours to put whatever software you want on it, down to the OS.
I trust the vendor whose primary revenue stream comes from me, rather than the one whose primary revenue stream is me (or my private data). I trust Google to abide by their T&C's about as much as I trust the NSA to abide by the laws governing the extent of their actions. Which is to say, precisely none.
You can argue until you're blue in the face but Apple have little to no incentive to exploit my private data while Google have every incentive to do so.
I understand your position now. You're a conspiracy theorist. Google and Apple have exactly the same incentive to exploit your private data — money. They also have exactly the same disincentive to disobeying their privacy policies — lawsuits and bad press leading to loss of money.
The Google device is NOT truly yours, because you have to look for cracks if you want to root it or remove the Google stuff. The law may also be against you if you crack it. You are just a guest on your device... you are "licensed" to use it, just like you get a license to play music on Spotify, watch movies on Netflix, etc. Nothing is owned anymore, and very few people care.
This is not true of the actual Google devices (the Nexus and Pixel lines), which have a supported path to flash your own ROM and root. Google so far has been supportive of the custom ROM community. Some manufacturers (cough Samsung cough) have not.
Many Android devices, including all the ones that Google makes and sells itself, plus Sony and Motorola devices, have an officially supported path to running any code you want on YOUR own device.
Usually that path is: boot the phone into bootloader mode, plug it into a computer, run `fastboot oem unlock`, accept the warning that this will void your warranty, then `fastboot flash <partition> <image>`. An officially documented and supported way to do exactly what you want with the device that you own.
Yes, it voids your warranty, but it's not forbidden or illegal in any way.
By Google devices, do you mean only the Pixel and Nexus phones? Or does that include all Android devices? Because, for example, Sprint Galaxy S5s can't be unlocked.
Unlockable bootloader, yes. Officially supported ROM without Google Play Services, absolutely not.
I see you defending Google around here. That's fine, but a great-great-great...-grandparent comment of yours was saying that AOSP is unusable, and that point still stands and is true.
You've confused yourself. The whole point of a device being yours is that you can install not officially supported ROMs.
The officially supported experience with Google Play Services is no more privacy invasive than the officially supported iOS ROM, but in the case of Google devices, the device is yours, and you are not limited to officially supported ROMs.
Safe Browsing and Crash Reporting can be turned off with no repercussions. You can turn off location altogether, though I'm not sure if you can turn off aGPS only.
That leaves app installs (Android allows side loading) and push messaging (I'm not sure about this one) as the ones that you can't disable on iOS.
But please think twice before turning off safe browsing! Contrary to what some seem to believe, it doesn't send your entire browsing history to Google. See https://news.ycombinator.com/item?id=9779990 for an explanation.
This ignores essentially all of the reverse-engineering techniques that are required to find actual backdoors, and is not something security professionals largely believe.
It depends on the device, but at least on Nexus/Pixel devices you can just plug the phone in and `fastboot oem unlock` and it will unlock the bootloader, i.e. allow you to flash anything onto it that you want.
This does not let you access the user's data, because `fastboot oem unlock` wipes the data partition. And before you can unlock the bootloader, you have to enable the option in the developer menu, which requires the phone owner's unlock code.
Google Fi only allows their Nexus and Pixel phones on their network. It is like an ISP seeing that you are running Linux and not letting your traffic through, just a bit more complicated than that (SIM card, modems, frequencies).
Not quite. You need those phones to activate service but you can move that sim into other phones later. Once you have service you can also get data-only sims that openly support all devices.
This sounds a bit misguided. IIUC, Google Fi is really a mechanism for accessing two carriers from the same device. It consists of 1. a partnership between Google Fi (the cellular provider), T-Mobile, and Sprint, and 2. special radio hardware on the phone that is capable of working with two different "channels" at the same time. Since my Nexus device broke, I have personally been using "Google Fi" (the provider) on my iPhone, which means I do NOT reap the full benefits of Fi, not having the special radio hardware.
So, I don't think it is physically possible to do the "two providers at the same time" thingy without special hardware (you'd need two parallel "radio" circuits on your phone).
And with that, your analogy falls sadly on its face, as you seem to indicate an artificial "software" limitation has been set to prevent us from using Google Fi on our iPhones.
Sorry, I did not mean to say that is how it is, only how it seemed. My understanding is that the modems aren't capable. That being said, activation is done only with a Nexus or Pixel device (software), and while switching might be a hardware issue, the iPhone works on both Sprint and T-Mobile (I'm on an iPhone on T-Mobile after moving from Verizon), so Google Fi should still work.
As I mentioned, I AM already using Google Fi on my iPhone. What does not work is the hardware feature of handing off between providers mid-flight. With my [now deceased] Nexus X, a call would seamlessly [to me, anyway] switch between providers during medium/longer commutes. With my iPhone using just T-Mobile, calls get dropped during my daily route, as there are certain "T-Mobile blind spots".
Is it actually a special radio, or just features which normally get disabled by e.g. Verizon programming the baseband? I'm assuming that the "special" radios are simply NOT using crippleware and setting some other registers which could be enabled on many more devices...if carriers played ball, which they don't want to for obvious reasons.
I have to research deeper to be sure (there is some FAQ at https://fi.google.com/about/faq/#supported-devices-7). But skimming through that page, I suspect this is a new-ish feature not available on most smartphone radios.
Project Fi does some things to switch between several carriers (T-Mobile, US Cellular, others) behind the scenes that doesn't work as well on iOS. Also, you need a Pixel or Nexus device to activate initially anyways.
All the other tech giants today make money from advertising and/or "big" data, which are inherently anti-privacy. This is why Microsoft has done a 180° on privacy and has filled their first-party apps with ads.
That is very true, but I find it very sad that, as much as Apple is pro-privacy, it's very anti-freedom. I'd love to switch to a device as private as an iPhone, but at this point you're just renting the device from them.
It's not anti-freedom. It's just pro-ease-of-use, and it's easier to guarantee a quality experience when there are fewer unknowns and fewer variables to account for. When everyone is using the same device with the same software, it's much easier to guarantee a great experience.
Allowing side loading of apps would not impact usability in any way.
Edit: It would massively increase usability, allowing any app to be installed, not just Apple-approved apps, making the phone more usable to many people. Back in the day I had to buy a dev account just to load an emulator I wanted to use (it was open source) without rooting.
It's more than just sandboxing: Apps going through the App Store have to meet a stringent set of requirements, chiefly being no private API usage, no circumvention of features intended to give the user control/privacy, and no third party web engines (and all the performance and security implications that come with those). Apple regularly rejects apps that violate the first two, and while I'm sure some get through they tend to get buried by scores of more legitimate alternatives. Sideloaded apps get to bypass all of this.
If they ever do offer sideloading/gatekeeper, it should be turned off by default and turning it on should make the risks crystal clear with a scary alert and passcode prompt. Average folks should be heavily discouraged against using it. Gatekeeper works on the Mac, but iOS devices are both far more numerous and far more personal, so the stakes are much higher.
The number of safeguards they have in place make that a moot point. If it's not discovered in testing, it's sandboxed. If it gets past the sandbox, they have a killswitch that's registered to the app's specific ID. There's very little damage that an app can do if it's gone through the App Store proper.
That may be a better analogy than you intend. The problem with cars is that everyone suffers the consequences of your driving. In fact, you may have the least exposure to the carbon monoxide, pollution, or even accident risk. Pedestrians, cyclists, property owners, and other drivers are all at risk of your driving.
For that reason, societies tightly regulate car ownership and driving. More so than phones: I don’t need a license to use a phone, nor do I have to register my ownership or have it regularly inspected.
But technology these days has this same characteristic: Others bear the costs of your decision. Every device connected to the internet is a DDOS vector.
I don’t want you deciding whether to keep your device up to date with the latest security patches, because if you (and a few million others) don’t, GitHub is down for me.
Bear in mind that the vast, vast majority of Apple's customers have zero interest in sideloading apps. If you're Apple, it makes a great deal of sense to favor security for the masses over the few power users who occasionally can't sideload an app here and there.
A compromise doesn't affect just you; it affects Apple and the app developers. iOS has the best apps because it has people buying them instead of everyone and his brother pirating them.
Anyway, if you don't want to deal with how an iPhone works, how about buying a different phone.
No, you don't need an Apple computer to install an IPA any more than you need a computer to install an APK. Let's stop comparing apples and oranges, shall we? You need a computer to compile and sign an IPA, the same thing you need to compile and sign an APK.
> You need a computer to compile and sign an IPA, the same thing you need to compile and sign an APK.
You can compile and sign (and, AFAIK, upload to the Google Play Store) an APK on an Android device, using AIDE and perhaps other dev toolkits. No computer needed.
I see your point, but: that's not an excuse, that's an explanation. It is anti-freedom whether that's their end game or not. I don't really care about Apple's motives; I care about the product.
Really, Apple is anti nothing and pro only one thing: money. Everything else is a corollary, including ease of use, freedom, and privacy. It's just that those things do matter to me (and GP).
Not being able to unlock your phone or sideload programs. I don't accept the "we know what's best for you" argument. They could have allowed jailbreaking and added huge warnings.
...and IMHO those "security or privacy issues" are an acceptable cost of freedom, the same way that I see crime in general: no one wants to be a victim, but you can see that, as long as crime still exists, so does freedom. If no one can do "bad things", then everyone has already had the essence of life taken out of them. I wrote this comment a few months ago when (almost) everyone was wondering what could be done, in the wake of a massive malware attack:
As I mentioned in a sibling comment, and it's almost ironic thinking about it, this "freedom for security" tradeoff we're making here is not what it seems.
I see your point in the context of e.g. WannaCry and WTC attacks emboldening authoritarians, however this privacy move we're discussing is actually an anti-authoritarian move. It gives the user more privacy and more freedom.
Advocating more lax privacy/security in this case is siding with authoritarian tendencies.
Apple is only shifting the authority to itself. They still have the signing keys and control what you can or cannot do with the device you "purchased".
In that vein, it would be awesome to see some of the base system open sourced, à la Darwin. Even if all it did was drop your iPhone into a command prompt after boot, it would be enough.
That argument would be more reasonable if users had no platform choice. A cellphone can track your location and listen in on your conversations, so you may reasonably want more security than on, say, a PC.
Unfortunately in the device world you have to pick one over the other. The more open a device is the less it can protect your privacy. We haven't yet seen a device that can do both.
I believe you got it backwards. When the device is open, it can protect one's privacy. The problem is, we don't have open devices.
I mean, the hardware is not open, so developers generally can't implement any proper secure boot schemes (starting from a trusted bootloader), and generally can't control what goes on in the radio modules. Because there are no devices that have those parts open (or am I unaware of something I'd want to buy?), there's no security/privacy possible.
Apple can provide privacy there because they're damn huge and they can purchase or design any hardware they want, fully documented down to the last logic gate. An average free software developer can't.
As for the userland - I believe there are AOSP derivatives + 3rd party apps that result in a reasonably good privacy and security experience.
OTOH, the pro-privacy approach they're taking also increases freedom, such as in the use case this article is about. Which IMO is probably a more tangible freedom to many than the ability to hack around on iOS devices, or the idea that the App Store isn't free.
You can compile and run your own software on the device. You can jailbreak it to get the same results. You also have access to the entire internet which should allow you to do nearly any kind of development you want to do and use the browser in your phone as a UI the same as any other computer.
The only part of their ecosystem that has major restrictions is the App Store, which kind of makes sense, as that's one of the few ways in which someone other than Apple could completely trash their platform and make a lot of Apple's customers unhappy.
I'm not saying your "anti-freedom" point is completely invalid, but it's really not a major concern for most tech-savvy people, IMO.
Totally agree. I think designing software so you don't have access to customer data (zero knowledge or customer-held keys) will eventually be a big selling point for SaaS companies.
The first time I saw the concept (and got my mind blown by it) was when I discovered the original version of Passpack (an online password manager) made by @sullof.
I thought the same thing reading this title, and it makes me want to invest. Companies who ignore user privacy are in for a hit, I think, in the wake of Equifax.
I'd support companies with excellent privacy / data security records being exclusive brokers of my information. Apple is on that very short list.
> Companies who ignore user privacy are in for a hit, I think, in the wake of Equifax.
Are they? Do people care that much? I think, except for those who actually have their credit stolen, the average person just sees it as another headline of the world today, and will move on, the way people still fly United.
> the average person just sees it as another headline of the world today, and will move on, the way people still fly United.
Until people's lives are actually impacted by the lack of privacy, they'll continue doing what they always do. Once something requires action on the person's part, (either credit being stolen from them, or something along those lines) most people won't take preemptive actions. And this would need to happen on a massive scale for giant corporations to change their ways.
I don't entirely agree. Pre-emptive action could be a 5% higher chance of buying an iPhone, or a bump in existing iPhone user's stickiness to the platform (you value keeping something you have more than not getting something you don't). That's enough to make a significant difference.
Other than financial data I don't particularly value my privacy. I post on many online forums under my own name and even my alias here is a composition of elements of my name. I use Google services quite happily and have no material gripes with the company. But still as an iPhone user it's good to know that Apple has my back on this and I'd really hate to lose that feeling.
Also, I don't think this is entirely a commercial decision on Apple's part. They must know very well that taking an absolute stance on customer privacy puts them diametrically at odds with the interests of the Chinese government. China hasn't taken many overt steps against them (though they shut down iTunes Movies and iBooks, and are chipping away at Apple Pay and other services with onerous regulations), but does anyone really doubt that they're working away in the background to help Apple's competitors and slow down Apple's sales and service expansion in China? There's no way the Chinese government can be happy with Apple's privacy stance, and unlike in the West, Apple doesn't have the shield of the law to protect them in China. They stand naked against the Chinese government, but they're still doing it. That takes some real conviction. I really don't think I'm exaggerating in saying that this stance on privacy may very well cost them any chance of a dominant position in the Chinese market. It's certainly running a real material risk of that, anyway.
Flying United is a choice. In fact, some companies do suffer lasting harm from bad behavior (Volkswagen comes to mind). Being an Equifax product is not a choice, just a symptom of being a pawn in today's mass-surveillance world where (we are told) privacy and security are illusions.
some companies do suffer lasting harm from bad behavior (Volkswagen comes to mind).
Volkswagen is basically back to its pre-dieselgate sales levels. Admittedly, its stock price is still quite a bit below its pre-dieselgate high, largely due to the still-unknown factors surrounding possible future fines.
Yeah, I really hope something like Gatekeeper comes to iOS. As time moves on (and tech companies stop being apolitical), I'd like the App Store not to be the only way to get software onto my device.
For example, Gab was rejected from the App Store because of the content people were posting. Of course, they won't ban Chrome despite it being susceptible to the same content violation.
Is it not still possible to set up an Apple ID with gift cards you paid cash for? As far as I know it used to be. I still have an Apple ID I set up in another country using a free iTunes download code I got from a bottle of soda, IIRC; I never used a credit card, and I downloaded a few free apps that weren't available in the country of my main Apple ID.
I choose to read this as sarcasm, but the fruit fandom here doesn't seem to think so.
I'll trust Google's engineering prowess and rigor before I trust Apple, who couldn't even bother to verify emails before leaking Apple ID attributes towards that email, and who allowed things to "fappen."
But I'll trust none of them to "guard my privates." Please.
You are reminding people of serious security issues, but comparing them to Adobe? That's going too far.
Personally, I dropped a lot of Google products when the Snowden leaks revealed to shocked Google engineers that they shouldn't have been sending unencrypted traffic between data centers, assuming it was safe because it was on leased lines. There are bugs, and then there's just negligence.
Thinking about it, Apple is benefiting right now from the same debt-based economy that benefits the FBI/NSA. I don't think there is even much local US manufacturing, so Apple imports products made in China and sells them at a high profit margin. Apple's cash piles came from debt, and I think national debt is exactly the thing that increases when the FBI/NSA spends more money, right? (Of course, it is only one kind of debt, but it is important, for example, because of the bonds other countries buy with their FX reserves.)
You can choose non-US companies: OnePlus, Samsung, etc. I don't think China or South Korea is less inclined to spy on you, but I suspect they are less competent at it.
While I really like non-Google Android + PMP in theory, the reality is that this solution is extremely complicated, limiting, error prone, buggy and probably not worth it.
So I concede that Apple is the better choice, but nevertheless persist in my futile efforts to make my 1+ somehow work securely.
Yeah, it looks like the choices are between competent+PRISMed and incompetent. The former can likely only be accessed by the Five-Eyes. The latter may be secure, but may also be insecure and accessed by anyone. What would you bet on? For me, a deciding factor is that I don't live in a Five-Eyes country.
Quite the opposite actually, if you're a citizen of it you have some constitutional protection against your own government spying on you, at least in the U.S. AFAIK. If you are outside of it, you're fair game.
In theory I agree, if you read up on what is shared about it.
In practice I have very strong doubts this is a fact, especially since there are '5 eyes', '6 eyes', '9 eyes' and '14 eyes' arrangements. And the members all monitor domestic terrorism, so I fail to see how they are not spying on their own citizens.
Isn't Samsung considered to be kind of an arm of the South Korean government? They also make weapons of war and other military equipment, and they haven't proven to be ethically or morally responsible, if not outright reprehensible.
For many people, their only positive quality seems to be "Not Apple."
Also, there's a big difference between 1) a company collecting and selling your data to other companies, and 2) a company giving authorities access to your data to comply with anti-terror laws.
I'm not implying I agree with how far governments go to intrude into people's privacy. I'm just pointing out that I agree with the OP that there's a big difference between, for example, Google's and Apple's stances towards privacy.
They sell ads that they target based on your data, but AFAIK they don't give this data to any third party (except government agencies, of course, like all US companies).
They, like Facebook, are big enough that they don't need to sell data in order to monetise it. This doesn't make their swirling vortex of personal data extraction less creepy.
Apple is adding a kill switch to iOS 11 that lets you discreetly disable Touch ID (and presumably Face ID) by hitting the power button 5 times. And of course if you're concerned you can opt out of Touch or Face ID entirely and simply use a pass-code.
I am aware. I don't know why I phrased it the way I did. I meant I would rather use that feature of the longer passcode instead of facial recognition. Bad grammar for the win.
Thanks for sharing that info, but I think the problem is that Apple is marketing these methods of identification as methods of authentication, sacrificing security for convenience.
Even if they have a kill switch in place, the users least likely to know about it are the users who are most likely to use insecure methods to "secure" their phone, I would think.
Something that I think people underestimate is just how easy it is to observe you entering your password on a phone, and why that (in my opinion) makes thumbprints much more secure than passwords for casual usage, e.g. every time you unlock your phone.
All you need is a camera over your shoulder, and you don't even need to observe the key presses, as the current character is generally displayed on screen. You could likely observe hundreds or thousands of them a day with an overhead camera at transit stations and the like.
The same thing goes for "tap and go" contactless payments not requiring a PIN under $100. Everyone goes on about how people can run up a few hundred dollars at different stores with your card if they steal it. But consider exposing your PIN to surveillance during most common transactions: that also lets a thief remove cash from an ATM with the stolen card, which is much harder to recover and much higher value than the generally $30-$100 limit for transactions without a PIN.
Next minute you'll freak out when I tell you I can clone your house key from a photo of it hanging off your belt...
The general point is that security trade-offs are generally deeper than you might realise on the surface, especially at "public outrage" levels of observation which so frequently haunt the public mind in recent times.
I'm not sold on this Face ID business yet though.. will see how it is presented tomorrow.
TouchID gives you 3-5 tries before it gives up and requires the passcode. There are 10 fingers, plus failed presses. It can also be turned off under duress (or will deactivate after a few hours, or after a reboot).
I always turn off my iPhone for checkpoints or traffic stops. They want it, they can decrypt it the hard way.
That's a great capability I hadn't considered. Your passcode isn't just a 3D composite of your face; it could instead be a 3D composite of your face performing a specific gesture over a short amount of time (wink, kiss, tilt, etc.). That would be harder to spoof.
Is a facial expression speech? One of the issues with Touch ID is that biometrics aren't protected by the 5th Amendment, while passwords and the like are. It would be an interesting question whether a court can compel you to make a certain expression.
When you need Apple or any technology to fight for your privacy against your own government you know you are in serious trouble.
The government has no right to interfere with your personal effects, this is fundamental to freedom and democracy, and the idea of the private individual.
Yet it seems this too is 'normalized' and citizens are more interested in technology workarounds to deal with this abuse from the state.
At least I'm confident the French, Turkish and NK governments don't have the keys. The FBI wouldn't share the secret with simple policemen, unless for extremely rare reasons, in which case iPhone security isn't your main problem.
I regularly reverse engineer components of iOS and have done so since the first iPhone was released. Never seen anything like a "skeleton key" and there is certainly nothing like that in place now.
Starting with iOS 10, firmware components have not been encrypted (obfuscated), so you or anyone else can also reverse engineer them.
For anyone concerned that authorities might force you to give up your password (thus allowing them to image your device): you can pair your iPhone to a computer with an MDM (managed device) profile, which will prevent any other device from connecting to it. iOS security researcher (now Apple employee) Jonathan Zdziarski has 2 blog posts on this:
Counter-Forensics: Pair-Lock Your Device with Apple’s Configurator:
It still works great; however, the linked post uses Configurator, and Apple has since replaced it with Configurator 2, so some of the options and workflow are different now.
Could anyone shed light on how difficult the implementation of the following would be? (A sketch of the free-space idea follows the list.)
- All data encrypted by default
- The "dump" of the phone memory or macbook hard-drive makes it looks like the whole drive is full. It means that the free space is populated with random data that is, itself, encrypted.
- The user can switch from his user profile to a fake user profile and import some data (like contacts/messages/photos)
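The second point is the cheap part: ciphertext from a modern cipher is computationally indistinguishable from random bytes, so pre-filling unused space with randomness hides how much real data exists and whether a hidden profile is present. A minimal sketch of the layout idea, purely illustrative (file name and sizes hypothetical):

```python
# Sketch of the "full-looking disk" idea: unused space is random bytes, and
# encrypted data is indistinguishable from them, so an image of the disk
# reveals neither how much real data exists nor whether a second (hidden)
# profile is present. Purely illustrative layout.
import os

BLOCK = 4096
DISK_BLOCKS = 256

with open("disk.img", "wb") as disk:
    for _ in range(DISK_BLOCKS):
        disk.write(os.urandom(BLOCK))   # "empty" space: pure randomness

# Writing a profile means overwriting some blocks with ciphertext (which,
# under AES-CTR/GCM and similar, is itself indistinguishable from random).
# Without the key there is no marker separating data blocks from filler,
# which is the same trick VeraCrypt-style hidden volumes rely on.
```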
I thought you were suggesting a way that meant phone backups all appear the same size as the disk 100% of the time. (meaning the true volume of content is hidden)
Then the second point is the user could potentially have two (or more) profiles on the device, and it is possible to unlock it into one or the other. Meaning a user under duress can unlock a device and not reveal the true content while the person trying to get into the device has no way of knowing if that is the true profile or not.
I figured that would be a pretty sweet feature. It would also tie neatly into allowing users to have multiple profiles on their device which is currently impossible on iOS...
Yeah, it isn't awesome that we feel we need these security features. However, I think even without the fear of government (etc.) this is still a great feature I would love to have.
What if Apple also added a feature for showing an innocent/clean phone if a specific password is entered? How would law enforcement know the difference? The only thing you'd need to show for real is the cellular call log, because they already have that and could use a mismatch to prove that you used the "mode".
The mention that it varies by provider is notable. Apple encrypts all iMessage chats end-to-end, as well as FaceTime (VoIP) calls. None of the other providers on that list do that, so at least there's that.
Also, people here act like Apple jumped willingly onboard the PRISM program. You can bet your ass their arm was twisted by the government or they were taken into the program unknowingly (datacenter ISP taps, etc).
"Apple encrypts End-to-end all iMessage chats, as well as FaceTime (VoIP) calls."
End-to-end encryption does not guarantee that Apple keeps your data encrypted, or that they don't process it for 3rd parties (the NSA would fit as a 3rd party, with Apple being a content provider for them, as the slide shows).
"You can bet your ass their arm was twisted by the government or they were taken into the program unknowingly (datacenter ISP taps, etc)."
> End-to-end encryption does not guarantee that Apple keeps your data encrypted
I'm probably misunderstanding something here, but doesn't "end-to-end encryption" mean that A encrypts it with B's key, and [whoever is in the middle passing it along] can't decrypt it because they don't have B's key?
And according to that slide they are a "provider" for NSA.
End-to-end encryption at least maybe guarantees that it is not your ISP that is selling you out to NSA (making it harder for Apple to explain why they are on that list).
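Roughly, yes, that is what the definition above amounts to. A minimal sketch of that flow using X25519 key agreement plus a symmetric cipher (illustrative only; real systems like iMessage use a more elaborate per-device construction; requires the `cryptography` package):

```python
# Minimal sketch of the "A encrypts with B's key" flow asked about above.
# Requires: pip install cryptography
import base64
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.fernet import Fernet

def to_fernet_key(shared: bytes) -> bytes:
    raw = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"demo e2e").derive(shared)
    return base64.urlsafe_b64encode(raw)

b_priv = X25519PrivateKey.generate()   # B's key; the private half never leaves B
a_priv = X25519PrivateKey.generate()   # A's ephemeral key

# A derives a shared secret from B's *public* key and encrypts.
ciphertext = Fernet(to_fernet_key(a_priv.exchange(b_priv.public_key()))).encrypt(b"hi B")

# The relay in the middle (e.g. Apple's servers) sees only ciphertext and
# public keys; without b_priv it cannot decrypt.
plaintext = Fernet(to_fernet_key(b_priv.exchange(a_priv.public_key()))).decrypt(ciphertext)
print(plaintext)  # b'hi B'
```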
Technological solutions to a non-tech problem. The US has demonstrated it will quite happily just lock people up forever if it can't get to the encrypted data.
Do it to one man and it's an oddity; do it to a thousand and lots of people will be demanding change. This makes it more likely it will happen to lots of people.