
To those who would say "but people would be manipulated into installing malware".

Firstly, Apple could make the unlocking process purposely hard. Many Android phones come with unlockable bootloaders, but the average person has no idea because you need a computer with the SDK to run that specific command to unlock it.
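For reference, that "specific command" is a short sequence run from a computer with Android's platform-tools installed; the exact wording varies by vendor, and this sketch assumes a device that supports the modern `fastboot flashing unlock` form (older devices use `fastboot oem unlock` instead):

```shell
# Enable "OEM unlocking" in the phone's Developer options first,
# then reboot it into the bootloader from the connected computer:
adb reboot bootloader

# Wipe the device and unlock the bootloader
# (older devices use `fastboot oem unlock` instead):
fastboot flashing unlock

# Confirm the warning prompt on the phone's own screen, then reboot:
fastboot reboot
```

The multi-step dance - a hidden developer setting, a USB connection, a command-line tool, and an on-device confirmation that wipes all data - is exactly the kind of friction that keeps the average person from ever stumbling into it.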

Secondly, people get scammed on iOS devices anyway, both through the web and through poorly reviewed apps that slip into the App Store every now and then. Moreover, Apple has a vested interest in not policing these thoroughly because it gets to keep the 30% cut from the scammers' earnings.




I absolutely disagree with your conclusion. It's like forbidding people from working on their own cars because some people are stupid and kill themselves through their work (and that kind of thing is neither rare nor unusual).

And it's not like people with iOS devices are resistant to being scammed - there are hundreds of ways criminals can dupe you into sending them money; the invoice scam is the simplest example, and it doesn't require any special apps.


So what you're saying is Apple cannot secure iOS to prevent this from happening? Like mandatory anti-malware built directly into the OS?

Phone makers should not be able to sell their phones, only lease them, since they're not yours in the first place.


I understand the impulse, but so many security issues happen to people because they're tricked into running things… I can't help but think that this would cause a dramatic rise in the number of security issues on iOS.

More complexity, more chance of problems.

For example, my dad used to use Android phones. Without fail, he would get malware on them; he simply could not resist bypassing the security prompt to click on something he wanted to click on. Or maybe he didn't understand English, or the concept of malware, well enough to properly heed the security warning pop-up.

With an iPhone, that's not possible, so there is no worry and no malware. Same with the hardware changes. People like my dad, or my wife, or even me, who have very little interest in technology, simply want to trust their device. And this device is literally the key to their life, their finances, their personal data.

All I know is my life has been made much easier by handing family members Apple devices that they simply do not have a way to mess up.


Are you sure it isn't iOS security that's the real joke? I mean, just do a search on YouTube for all of the lock screen bypasses. And then there's Cellebrite and their ability to unlock any iOS device for the relatively paltry sum of $1500 - or you could just wait for a 2-for-1 special and do two for $750. The Pixel phones are virtually unhackable, and not one has ever fallen at a Pwn2Own event. iOS devices, on the other hand, always fall at these events.

Apple really doesn't help them. The marketing (lying) that iOS is secure is pretty intense.

Anyone who wants to should be able to buy such a device, as it isn't like any of the machine code you are getting elevated access to is even secret (you can download, from Apple, unencrypted copies of the entire operating system). (You can try to make an argument that this is about keeping you from getting access to third-party encrypted assets to prevent some aspect of piracy in the App Store, but this doesn't accomplish that either as you need only have a single supported jailbroken device for that to be easy, and the world already has millions of those and you can't really prevent them as the act of fixing bugs discloses the bug for the older firmware.)

The real problem here is that Apple is so ridiculously controlling with respect to who is allowed to develop software (in Apple's perfect world, all software development would require an Apple license and all software would require Apple review)--in a legal area that isn't really conducive to that (see Sega v. Accolade, which was important enough to later ensure permanent exemptions on reverse engineering and even jailbreaking for software interoperability purposes in the original DMCA anti-tampering laws)--that they are even working right now on suing Corellium, a company which makes an iPhone emulator (which again, has strong legal precedent), in order to prevent anyone but a handful of highly controlled people from being able to debug their platform.

Apple just has such a history of being anti-security researcher--banning people like Charlie Miller from the App Store for showing faults in their review process, pulling the vulnerability detection app from Stefan Esser, slandering Google Project Zero, denying the iPhone 11 location tracking until proven wrong, requiring people in their bug bounty program to be willing to irresponsibly hold bugs indefinitely so Apple can fix things only at their leisure, and using the DMCA to try to squelch research via takedowns--that this ends up feeling like yet another flat gesture: they should have done much more than this device at least a decade ago. I'd say Apple is in store for a pretty big fall if anyone ever manages to get a bankroll large enough to actually fight them in court for any protracted length of time :/.


I dunno - your average iPhone is a powerful PC running a complex OS with a (largely) permanent network connection and the owner almost certainly will cough up $100 to unlock given how much people rely on their phones these days. That sounds like a perfect target for hackers to me.

It's an interesting hypothetical, but not realistic in countries with loan regulation. Not to mention, NSO Group has shown us that you can install a rootkit using the built-in iMessage and zero-click exploits. I don't think manually-installed malware would lower the current bar, especially considering how "dangerously" capable the phone and web browser already are. Arguing against anything that can be used against the user would see the phone, iPod, and internet communicator removed from your iPhone.

> The answer is 20 years of abominable behaviour by corporate app teams.

I feel like you're not going to like my answer to "Why Apple is facing multinational antitrust scrutiny" then.


Your grandma being scammed is not dependent on being able to install software. The vast majority of phone scams are reliant on browser-based phishing pages, convincing the victim to send a bank transfer, or getting a gift card code from the victim. If you believe it is an issue regardless then safeguards can be implemented such as child safety features or simply allowing you to opt-out (or even not opt-in) when you're setting up your grandma's iPhone for her or whatever.

Yes, you can buy a different phone, but Apple still has a serious hold on the market that affects its competitors, especially when it acts in a way that is anti-competitive. If Apple locking down its store in a way that is extremely user-hostile makes it a billion dollars and it walks away unpunished, how long will its competitors refrain from doing the same? Apple is large enough that it affects me personally even if I do not use its products.


But unlocking is easy if you're a hacker. It's the non-hackers that Apple is screwing the most.

The point of knowing how to hack stuff in this way is to get the most out of your soft/hardware, not to support the most hacker-friendly company with your completely insignificant amount of funds.


That's the hard part for all of this: if you are building an unlock mechanism the first question you need to ask is how you build a UI which clearly communicates to a user that they are likely giving control of all of their data, location/camera/microphone, etc. to whatever they're installing.

The scammers who push malware under the guise of tech support, free porn/games, etc. are effective enough to compromise millions of people and that's before you get to the question of what it'd look like if a government started pushing access for monitoring. How many people might consider installing something which this guy they met in a coffeeshop says will protect their messages from government surveillance? Now consider how many people might have malware installed by an abusive domestic partner, and where control of the device would extend to hiding the existence of spyware.

This is not to say that Apple is acting without self-interest here, only that I think there's really a pretty nasty market failure making it quite difficult to reconcile someone being able to make choices about their device with a fairly high risk of compromise with potentially significant consequences.


You said it didn't need to be easy to run arbitrary code. Why would security not fall under that purview? You can jailbreak your phone and run arbitrary software on it. If there are security problems that's kind of your problem that you introduced by running your arbitrary software.

How would Apple/Google be able to monitor malware here?

How do we know that the secure boot key for example isn't part of the security architecture and by giving it out you basically enable those root level permissions?

Are we really trusting users to not lose or compromise secure boot keys that they manage on their own? If grandpa gets scammed out of his life savings are taxpayers footing that bill?

I'm not necessarily looking for answers to those questions, but it really just seems like there's a lot of open items here that have to be addressed for not a lot of benefit and instead it seems like people just want an effectively jailbroken iPhone that's somehow secured by Apple/Google more easily.


Like, if you're worried about governments tampering with your device when you leave it out of sight: I am sure they could easily get Apple SoCs on the gray market that do not have the boot ROM fuses set, so they could write whatever they want to that chip. Sure, they'd still need to extract the key to decode your data. However, they could then make it appear that the OS has a problem and needs to be restored via iTunes. Sure, it might reduce the sophistication needed, and maybe let smaller governments or companies do things. However, it still involves physical access to the device, and if an attacker has physical access, pretty much any system can be compromised in a lot of ways.
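The role those fuses play can be sketched in a few lines: verified boot only holds if the boot ROM's fused value pins the root of trust, so a part with unset fuses will accept anything. This is a toy model under stated assumptions, not Apple's actual implementation; all names and data here are illustrative.

```python
import hashlib
from typing import Optional


def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()


class BootROM:
    """Toy boot ROM: its fuses pin the hash of the trusted next stage
    (in real silicon, typically the hash of the OEM's public key)."""

    def __init__(self, fused_root_hash: Optional[bytes]):
        # None models an unfused engineering / gray-market part.
        self.fused_root_hash = fused_root_hash

    def boot(self, next_stage: bytes) -> bool:
        if self.fused_root_hash is None:
            return True  # unfused part: anything boots
        return sha256(next_stage) == self.fused_root_hash


genuine = b"vendor bootloader v1"
implant = b"implanted bootloader"

production = BootROM(sha256(genuine))
unfused = BootROM(None)

print(production.boot(genuine))  # True: matches the fused hash
print(production.boot(implant))  # False: fuses reject the implant
print(unfused.boot(implant))     # True: the swap attack described above
```

The point of the sketch is that swapping a production SoC for an unfused one removes the only anchor the rest of the chain hangs from; every later signature check is rooted in that fused value.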

If anything, this should be a boon to users. It allows them to fully use the devices they own. Honestly, it is inexcusable that Apple makes users have to hack their own devices. You should have an option similar to enabling or disabling secure boot on your PC.

As to the market for stolen iDevices: it still exists, but generally involves parting devices out - selling the screen, cameras, battery, etc. Honestly, while it's unfortunate that things get lost or stolen, I don't see how this really changes anything.


The Pixel, for example, already has a secure yet user-unlockable bootloader. So do modern x86_64 PCs. Statements like these, claiming that only Apple can properly secure a device (and hence that users deserve to be locked out), simply show astounding ignorance.

That's just iPhones having good security and sandboxing. None of that goes away if Apple allows people to use enterprise certs to distribute software.

And people who want a hackable iPhone can jailbreak it or whip out the microscopic soldering iron. I basically don't install 3rd party apps but I want it enforced at the OS level. If even 5% of iPhone users were unable to install apps companies would think twice about ruining their website in order to force their mobile app on users.

Currently the iPhone is a great device for almost everyone on the planet and the trust their users have in the 3rd party apps is a big part of that.

If Apple made it easy to put custom apps on the device it would mean you could more easily be tricked into installing malware and so reduce the trust in the security.

The iPhone is as popular as it is today due in no small part to how they have policed the App Store.


Apple’s approach is pragmatic. If they offered anyone the ability to jailbreak, you would see major app makers manipulating naive users into jailbreaking in order to bypass security, payment restrictions, or similar. Even the limited options Apple provides are already abused, such as Facebook convincing users to install VPN profiles with its Onavo product and abusing them to spy on users.

Having the option defeats the fundamental advantages their approach offers.

There are a lot of problems with Apple’s approach from an antitrust perspective and fair competition should be regulated through legislation, but there are good reasons for closed ecosystems to exist and plenty of great alternatives for people who want something more open.

