
We all had that once, but that was in the past. It isn't feasible now.

I mean, you can buy a "safe" smartphone, but first, you can't prove beyond a reasonable doubt that it is actually safe and private, and second, you attract more attention because the same phones are being bought by criminals.




It should be possible to protect oneself. It is extremely difficult only because there are basically only two options if you want to have a smartphone: either iOS or Android. The immense power of nation-state adversaries is focused on just two targets.

And there are no commercial off-the-shelf security solutions that can "protect" you. You have to do your own security. This is why Snowden had to painstakingly teach the journalists how to use GPG to receive his cache. No other way would have been trustworthy enough.


There are plenty of security reasons not to buy a smartphone.

Why not get another "unsafe" phone?

Why is there no market for properly secure smart phones?

I would have thought the market would be huge.


I can't disagree with your point of view; I merely point out that a world where everyone who wants a secure smartphone needs to own their own smartphone factory isn't a very practical world.

And even if you find open-source, audited, secure hardware, your carrier might compromise you. The more I think about it, the more "secure smartphone" sounds like an oxymoron.

So I need to buy a smartphone to enjoy basic security features.

It is an even better idea if it is not a smartphone at all but a dedicated, simple security device! Smartphones on their own are a constant security risk!

I have had a multitude of non-smartphone security devices in my life, cheap and dedicated small secure things, but regrettably those are an endangered species now.


Do actually secure smartphones exist? Ever heard of Pegasus?

Safe from who? I think smartphones are some of the least safe things when trying to avoid law enforcement/3 letter organizations. I don't feel they're particularly safe at all really, but I'm sure someone can elaborate more.

Reading about this, I wonder if there is a market for legitimate secure phones, i.e. not marketed towards criminals. Businesses are afraid of industrial espionage. Political activists are afraid of persecution (and yes, it can also happen in the west, even if you are doing "good" things and are non-violent).

The defenses in both cases can be quite different. For some people, a cheap Chromebook that you can wipe quickly plus 2FA gives great security, or a modern (encrypted) phone with Signal. But what if you are, say, protesting against the construction of a new Google campus on top of your neighborhood? Then you don't want all your secret stuff on their infrastructure. Normally you assume everybody is playing fair, but it would be so easy for them to push out a malicious update to your phone to gather dirt on you.

If one has experience with AOSP and security, there seems to be a market for an open, secure phone. But I wonder how you'd keep organized crime out, or at least keep plausible deniability so you don't get into trouble...


Here is an idea: if security matters to you, get a better phone. They sell them in shops.

Great - how do I get a non-compromised smartphone? :/

I'm kidding but I'm also serious.


You're probably right about what's involved in building a truly secure smartphone from scratch that we can trust.

It's an interesting thought experiment, but I wonder if we can satisfy many use cases without having to build a truly secure smartphone.

For example, if I just want to have voice calls to a handful of people with the content of the calls encrypted, then perhaps I can just plug in a "scrambler box" between my untrusted off-the-shelf phone and my audio headset?

So rather than designing a secure phone where we trust the wifi stack, the baseband stack, the bluetooth stack, the graphics stack, the USB stack, and the flash storage stack because we've designed them all from scratch, all we have to design is a little scrambler box that just has audio in, audio out, some mechanism for key generation and exchange, and only needs a laughably modest CPU to do the encryption.

Don't really need an OS at all - single process and static memory allocation should suffice.

The audio encoding/decoding and encryption/decryption don't sound too hard to implement from scratch. It's the interoperability with the rest of the world and the UI that makes implementing a whole smartphone so hard.
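To make that concrete, here's a minimal sketch of the box's main loop. I'm borrowing libsodium's secretbox purely to keep the example short (a real box might roll its own primitives, per the trusting-trust point below), and stdin/stdout stand in for the headset-side and phone-side audio codecs:

    /* Scrambler-box main loop (sketch): read a fixed-size audio frame,
       encrypt and authenticate it, push nonce + ciphertext toward the phone. */
    #include <sodium.h>
    #include <stdio.h>

    #define FRAME_BYTES 320  /* 20 ms of 8 kHz 16-bit mono audio */

    int main(void)
    {
        if (sodium_init() < 0)
            return 1;                       /* no usable RNG: refuse to run */

        /* Placeholder all-zero key; a real box gets this from the key step below. */
        unsigned char key[crypto_secretbox_KEYBYTES] = {0};

        unsigned char plain[FRAME_BYTES];
        unsigned char nonce[crypto_secretbox_NONCEBYTES];
        unsigned char cipher[crypto_secretbox_MACBYTES + FRAME_BYTES];

        /* stdin/stdout stand in for the two audio codecs on a real board. */
        while (fread(plain, 1, FRAME_BYTES, stdin) == FRAME_BYTES) {
            randombytes_buf(nonce, sizeof nonce);   /* fresh nonce per frame */
            crypto_secretbox_easy(cipher, plain, FRAME_BYTES, nonce, key);

            /* The peer's box runs crypto_secretbox_open_easy() the other way. */
            fwrite(nonce, 1, sizeof nonce, stdout);
            fwrite(cipher, 1, sizeof cipher, stdout);
        }
        return 0;
    }

Note it's all static allocation and a single loop, as you'd want. The nonce plus MAC add 40 bytes to every 320-byte frame, which is part of why getting the result through a voice codec (see below) is the real headache.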

[I do wonder though how well our scrambled audio will make it through the phone network which is applying lots of clever compression designed for speech.]

If we assume we can mostly trust hardware designs that are at least 30 years old then we can probably avoid designing all the hardware from scratch - e.g. there's probably some sort of Z80 clone CPU we can copy.

The mechanism for key generation and management sounds a bit tricky though. The user would need some way to add his contacts' keys to his scrambler box.

A keyboard and LCD display to type keys in by hand would be secure but impractical for long keys.

The level of tech needed to read a key file from a FAT filing system on a USB stick might be too high to be easily implemented securely. Any ideas?
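Just to gauge how impractical hand-typing really is: with a modern key exchange a public key is only 32 bytes, which base64-encodes to about 44 characters, so reading it off the other box's little LCD and typing it in might actually be tolerable, and we avoid the USB/FAT stack entirely. A rough sketch, again borrowing libsodium for brevity and using stdin/stdout in place of the keypad and LCD:

    /* Hand-typed key exchange (sketch): show our public key, accept the
       contact's typed-in public key, derive per-direction session keys. */
    #include <sodium.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        if (sodium_init() < 0)
            return 1;

        unsigned char my_pk[crypto_kx_PUBLICKEYBYTES], my_sk[crypto_kx_SECRETKEYBYTES];
        crypto_kx_keypair(my_pk, my_sk);        /* generated once, kept inside the box */

        /* Show our public key so the contact can type it into their box. */
        char b64[sodium_base64_ENCODED_LEN(sizeof my_pk, sodium_base64_VARIANT_ORIGINAL)];
        sodium_bin2base64(b64, sizeof b64, my_pk, sizeof my_pk,
                          sodium_base64_VARIANT_ORIGINAL);
        printf("My public key (read it to your contact): %s\n", b64);

        /* Accept the contact's public key, typed in from their display. */
        char typed[128];
        printf("Contact's public key: ");
        if (!fgets(typed, sizeof typed, stdin))
            return 1;
        typed[strcspn(typed, "\r\n")] = '\0';

        unsigned char their_pk[crypto_kx_PUBLICKEYBYTES];
        size_t pk_len;
        if (sodium_base642bin(their_pk, sizeof their_pk, typed, strlen(typed),
                              NULL, &pk_len, NULL, sodium_base64_VARIANT_ORIGINAL) != 0
            || pk_len != sizeof their_pk) {
            fprintf(stderr, "typo in key, try again\n");
            return 1;
        }

        /* One key per direction; whoever's encoded key sorts lower plays "client". */
        unsigned char rx[crypto_kx_SESSIONKEYBYTES], tx[crypto_kx_SESSIONKEYBYTES];
        int rc = strcmp(b64, typed) < 0
            ? crypto_kx_client_session_keys(rx, tx, my_pk, my_sk, their_pk)
            : crypto_kx_server_session_keys(rx, tx, my_pk, my_sk, their_pk);
        if (rc != 0) {
            fprintf(stderr, "suspicious public key\n");
            return 1;
        }
        puts("Session keys derived; feed tx/rx to the frame scrambler above.");
        return 0;
    }

That still leaves checking you typed the right person's key (e.g. read a short fingerprint of the derived keys aloud over the encrypted call, like Signal's safety numbers), but it keeps the attack surface down to a keypad and a display.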

I'm aware of the famous "trusting trust" paper, but I'm not sure we need to worry too much about the compiler used to build the software running on our scrambler box. All we need to do is choose a compiler released before we started our project and never upgrade it. It is hard to imagine a compiler backdoor that would automatically recognize that the intent of our code is to encrypt data and undetectably compromise it (though I guess it would be wise to avoid any existing implementations of cryptographic primitives).

Sounds like a hardware kickstarter project :)


Anyone with physical access and enough money to pay for a bespoke exploit (assuming you're using a phone where that's possible), sure, they can have access if they're willing to put in the time and considerable expense it would take.

If you're ok with the risk posture of not locking your phone that's your decision, but you're quite wrong about the level of access granted merely by possession of a modern smartphone.


If criminals were really extremely careful, they'd get burner phones with custom firmware that performs FDE without key escrow. I'm sure there are plenty of sellers who are willing to supply such phones at a high margin given the opportunity.

If it becomes a crime to encrypt your phone, only criminals will encrypt their phones.


That will always remain an issue: Even if someone manages to create a secure app, the software and hardware platform will never be secure.

Hijacking a smartphone by only knowing its number, on the other hand, does not seem realistic to me. So a source for this claim would be great …


Any truly secure phone will attract criminals which in turn will attract law enforcement scrutiny.

It's like building a private house with barbed wire, armed guards, and so on... people will wonder what's going on inside.

Today, the only way to make it as a criminal in the West is to fly under the radar.

Stuff like "the Amazon of drugs", or "the WhatsApp for criminals" will always be taken down.


It seems there is no safe cell phone. They all run closed source software, written in unsafe languages (C and C++) and can be abused by cyber criminals and governments to spy on and track people at will.

Why do we carry these things in our pockets?

And I'm not convinced Signal or any other privacy protecting app is really useful. If we assume all cell phones are owned (or can be at any time) then the criminals own all the private keys on the phones as well.

It's impossible to have private communications with a cell phone.

