Freedom and police power have evolved throughout history. It is highly tendentious to claim that proliferating strong encryption would overturn any particularly strong legal tradition; it would simply make some forms of electronic surveillance less useful. Encryption was used long before mass telecommunications, with no negative impact on law enforcement. Law enforcement will adapt to the lack of useful wiretapping if it needs to, and the means to adapt are in use today: placing bugs, physical surveillance, etc.
People keep talking about an elegant solution, but from what I understand, there can't be one. Either I'm the only one with a key to my home, or some third party also has a key. And we've seen recently that there is no third party that can be completely trusted. Well, maybe with the exception of my Mom. Everyone's mother gets their encryption keys? :)
"I would love to hear from folks who really understand security if such a scheme would work or not (and/or point me to people who have thought about this)."
So again, very little added to the conversation. This really is one of those cases where a lot of technical expertise is required to usefully comment, and I appreciate Obama conceding as much at SXSW. But there seem to be many others just throwing out magical solutions without understanding whether or not they would actually work in practice.
I disagree with him, but even if tech companies were to compromise on this issue, they should get something in return. Like getting the government to stop writing idiotic laws about the internet, for example.
> I am saddened by the tech sector’s absolutist approach to this issue
Is this guy living in the same universe as the rest of us? Keeling over on this would just be a final act of surrender regarding any semblance of privacy defenses.
> Keeling over on this would just be a final act of surrender regarding any semblance of privacy defenses.
If warranted government access represents a final surrender and not the first surrender, then we already lost our privacy defenses ages ago.
In a world where consumer privacy is invaded by hardware vendors, software vendors, and service providers with data to sell, all without a second thought whenever there is a whiff of profitability, what are we even holding on to? When I don't have and can't get root on my own devices, do we even have privacy defenses in computing to begin with?
Never installing software or ever visiting a website isn't a defense. It's a cop-out to pretend like the existence of that option somehow means we're not all already fucked.
And what happens when that capability is not given to those you trust but instead to your enemies or state actors with vast human rights abuses?
Not mentioned in this article is the cost of those we do not trust having the same keys in the locks, and the power that it would give them. Simply calling the new position "absolutist" ignores that it is exactly the same position as before.
The TSA key and the elevator key were recently leaked and no one can put that genie back in the bottle without replacing all the locks in the city/luggage.
If you design your locks with a fail-safe built in, then it will only take someone with specific knowledge of that fail-safe to use it. Saying we should just back down, and that this is just Silicon Valley misinterpreting existing thought, is a complete mischaracterization of the facts and of the vast new authority the government wants granted.
The point is that society has already figured out and agreed on these issues. This is not about a dragnet but specific situations where there is a warrant.
I think we've all been so scared by all the dragnet activity the NSA/FBI have been conducting that we want to throw the baby out with the bathwater.
Too many tech-nerds incorrectly extrapolate the future from false premises. If the tech-nerds had any social skills, they would know that society has built-in safeguards to prevent government abuse.
Tech-nerds: the world doesn't operate under "theory". There are social rules that you are ignoring that completely invalidate your fears of government. Did the Snowden revelations lead to thousands of US citizens being murdered by the government? No? Not even one, in the realm of statistical noise? No? Why do you think that is?
The US could have ruled over the world between 1945 and 1949, and yet we chose not to do that.
Tech-nerds are going to have to reformulate their horrible anti-government rhetoric to account for these social effects.
Learn how people actually live, not how people theoretically might live.
Why do you think that the lack of murders by the government after the Snowden revelations indicates that the fears are unfounded? I don't think you actually understand what you're trying to debunk.
Parallel construction. IRS harassment of politically undesirable groups. More laws than anyone could read or understand, capriciously enforced, often based on political tribalism.
Then an FBI lawsuit to force Apple to unlock a phone the FBI could trivially unlock.
I mean, why wouldn't "tech-nerds" be falling over themselves to submit and comply?
You seem to be ignoring the tens of millions of people who actually died and countless more who were actually oppressed at the hands of governments in the last century.
Government actors, people in positions of power, are known to ignore rules, social and legal, to further their own agenda. From NSA analysts listening to other people's phone sex for personal titillation, to a sitting President who orders the harassment of disfavored groups, then orders some guys to break, enter, and destroy records to cover it up, there are people in positions of power who will do Bad Things that are very much violations of applicable rules, social or otherwise.
You need to seriously read some history. Read about how Hitler went from nobody to killing millions of Jews. It was a process. Same with Stalin. We've been down this road before - it doesn't end well.
Different societies have figured out the issues differently.
In some societies, you get killed for being gay, for example. Or for having the wrong kind of sex. Or for not following a certain religion. Or for saying the wrong things. So... you want to hand the power to that society to peer into the minds of anyone, and do whatever the majority thinks should be done?
You are advocating for tools to enable a tyranny of the majority. Instead of letting society figure it out for you, how about allowing each person to have their own freedom?
> You are advocating for tools to enable a tyranny of the majority. Instead of letting society figure it out for you, how about allowing each person to have their own freedom?
Unfortunately, that's not reality. Each citizen complies with the laws of his/her own country. So do companies.
There is no global law concerning these issues, so companies with a presence in country X are bound to country X's laws. It doesn't mean country Y's laws are great and/or should be the common denominator.
> Each citizen complies with the laws of his/her own country
Actually no. Each subject complies with the laws of "their" country up to the limit of enforcement. When enough people disobey, social change occurs.
What the establishment is advocating is to move these limits of enforcement down to the level of individuals' thoughts. This isn't surprising, as their power deludes them into forgetting that laws follow society, and not vice-versa.
Imagine the setting of a century ago. A homosexual travels across a border, at which point their actions from the past several months are magically analyzed. They're then thrown in jail for buggery, while everyone else smiles and nods! Justice was served, with due process, to suppress repugnant anti-social behavior.
This debate has been framed such that believing that privacy should exist implies one is an "absolutist". Well then, I guess I absolutely believe that every individual action need not be subject to public scrutiny.
The problem is that the existing societal solutions have a pretty hard bottleneck: It requires a physical person to show up at your door with a warrant to "rifle through your underwear drawer" as the president put it. It's relatively hard for someone else to take advantage of this (though, put on a fake police uniform and bring a fake warrant and you can probably get into most people's houses while risking some really nasty charges).
With encryption, any solution I've seen suggested has the risk of mass use by the government in addition to allowing other people besides the government to use it.
It is arguably baked into the nature of policing that such institutions will over-reach their authority and attempt to consolidate it, usually for all the right motivations, but with systematic blinders to the unintended consequences and risks. For this reason there must be effective counterbalances, and court and board oversight has demonstrably been insufficient.
Hence it is important that for some actions that police are legally enabled to do, they remain too expensive (in time, manpower, money, whatever) to do habitually or without thought and prioritization.
Pretty much any technological advantage that facilitates parallel construction should probably be resisted for this reason.
My take is twofold:
1) Law enforcement wants their job to be easy. Sorry, it's not. They see criminals in everyone because they're immersed in it day in and day out. People aren't bad; they don't need constant surveillance.
2) Politicians have to have some way of making themselves look "strong" in the eyes of U.S. Citizens. It's just steeped in our blood at this point. "The War on Terrorism" (tm) is a really easy way for them to look strong even though there's NOT a terrorism issue in the U.S.
Part of the issue is that the dragnet activity the NSA/FBI have been conducting -- and their new moves to share this data with local law enforcement in violation of previous promises -- has destroyed anyone's trust in the social contract. People rightfully do not trust that any powers granted to government via any form of key escrow or backdoor system would be used only in conformance with the law.
Obama himself has acted in such a way as to destroy anyone's trust in him. He promised to be on the side of 'transparency,' but as near as I can see he's no better than Bush II in this regard. He has no credibility on this issue at all.
... and it's also important to note that we in Western democracies are developing and deploying tech that is going to be used in far less liberal countries with far less rule of law. Any backdoor/escrow system that we create is also going to be made available to dictators who want to persecute and kill. If it's bad for us, it's going to be deadly to the unlucky souls who live under much less civilized regimes.
The problem isn't with ways for law enforcement to access all your communication with a warrant, for probable cause, on the suspicion of wrongdoing, by upstanding law enforcement officers, and fair trials.
The problem is the potential for abuse. When your secrets get leaked, when they're used unlawfully, when they get in the wrong hands, when the humans who have access to it do the wrong things, if the wrong people get elected, if the wrong laws get enacted that enable more abuse, if another McCarthy comes along and sweeps people up in hysteria.
And that is why we must support strong encryption without backdoors, for everyone, upstanding citizens and criminals alike. The positives simply don't outweigh the negatives.
You're comparing apples and oranges. The correct comparison is getting a warrant to search someone's house and finding a piece of paper with a strong cryptogram on it. They plead the fifth, and the story ends.
The secret that unlocks the cryptogram belongs to the owner, not to Apple or anybody else. It lives in their mind. You can get a warrant to search someone's house, but you cannot compel someone to share a secret. That has been true from day one, but you're trying to weasel your way around this fact because you're afraid of terrorists and you trust the police.
Are you a lawyer, or have you studied 5A jurisprudence? My impression is that it's not as simple as you're making it out to be. The 5A doesn't prevent a bunch of other compelled incriminating actions, and compelled production of passwords may eventually survive 5A scrutiny at SCOTUS as well.
Prohibitions on compelled self-incrimination predate the bill of rights, and were intended as a measure to limit torture, not as a general privilege of the accused to conceal evidence.
No, I'm not a lawyer, and I'm deliberately simplifying my example to make an argument.
However, I have looked into 5A jurisprudence, and fully agree that there are some scenarios that are not so simple, e.g. if they really did have certain other evidence against you that specifically suggests what is in the cryptogram, and/or that it is pertinent, then you could very well end up in jail indefinitely for contempt of court.
In this instance, I strongly suspect that my simplification is justified because in the San Bernardino case and the myriad others that are on the FBI's short list, they are fishing in order to gain a general precedent that they do not currently have. They want to be able to compel you to release a password even when they don't have any evidence suggesting what is in the cryptogram, arguments for why it is pertinent, etc.
I personally think people should prepare themselves for a future in which compelled production of passwords (or, at least, compelled unlocking of devices and documents) is the norm. The 5A argument about crypto seems really flimsy, and allowing compelled unlocking seems constitutionally more straightforward than a ban on strong cryptography.
There is less organic pressure for deniable encryption than there is for end-to-end encryption, because the latter is helpful for general-purpose security, and the former is not.
Even if a court could order you to produce a key for a cryptogram on paper, there are practical limits, like providing evidence the key continues to exist.
Warrants are powerless against deniability, whether that's about electronic data, cryptograms, or any plausibly unknowable information. Deniability regularly stymies legal cases, but it is a fact of existence, and nobody is begging "the experts" to come up with a fix.
That's true, but I think it misreads the law enforcement concern about crypto. The issue isn't that it's possible to deniably and unbreakably encrypt things. The issue is that unbreakable crypto is becoming the default, and that we can clearly see a future in which no phone call, document, hard drive or device will be available to an adversarial court proceeding.
It is much less likely that deniable encryption will be the default any time soon.
I won't comment on the legal aspect, but there is the practical aspect as well - how would it ever be possible to distinguish someone refusing to produce the password vs someone forgetting the password? Everyone forgets passwords to things all the time. We can't make forgetting your password a crime.
To a first approximation nobody is walking around with a phone that has received a call or sent a text message within the last N weeks that they themselves are incapable of unlocking.
I think the only way the house search warrant vs. phone search warrant comparison would work is if there were a way to know that someone used a warrant to sift through your data. You can know the police have searched your home because doing so requires physical access to the house. If there is a way to use a private key to access your data, anyone could use that key to access your data and you would never know about it. And that is really scary!
But still, nothing prevents you from having cameras hidden throughout your house. There still needs to be physical access to the house. Whereas encrypted data can be distributed anywhere and decrypted without a trace, there is only one copy of a house, and it is yours.
We don't have to abandon the framework at all. The framework never guaranteed law enforcement would have a resource to access. It just says we have to obey the warrant. So, if only I can unlock the device, I can be ordered by a court to unlock it for a search. If I don't, I can be put in jail. So, Congress would just need to pass laws strengthening that principle to achieve what FBI claims to want.
As far as what Fred asks for, the best minds in cryptography (and in breaking it) looked at the industry's past and CompSci's present to see if that was feasible. The consensus is that it is impossible to design a backdoor for the FBI that wouldn't be used by foreign intelligence services or breached by hackers. The write-up of that is here:
My own background is high-assurance security: the kind of rigorous engineering designed to stop nation-states. I have an idea of what works and doesn't. I disagree with the authors that we can't make a backdoor for selective access. We do it all the time under banner of remote administration, VPN's, etc. Crypto in general, damnit! My concerns were hackers accessing all data, FBI reading more than allowed, and FBI/hackers having write access that allows forgery. My attempt at a read-only, high-assurance mechanism for law enforcement is here:
Note: That post reads like I'm in favor of surveillance or something. The context is a situation where courts or Congress forces L.E. into systems. My L.E. scheme is insurance policy against broader ones.
The design lets Feds and courts get what they need while restricting access, esp write access. Such designs can be made arbitrarily difficult for attackers without access. Consumer devices and anything cost-sensitive will have limits. The hardware levels can be hit, esp with emanation attacks. U.S., U.K., Israel, and Russia have that tech. So, there's residual risk of attack that has to be mitigated.
However, the bigger problem is key management. The Fed's original proposal was escrowing the keys to them. The FBI, DEA, and NSA have a long history of poor security leading to leaks, infiltration by foreign intelligence, corruption, and occasionally targeting Americans. I question giving them any full L.E. power to begin with but they certainly shouldn't hold the keys if one is mandated. That means companies will hold the keys themselves protecting them with recommended means (esp HSM's). HSM's themselves have same security issues, esp w.r.t. intelligence agencies, but we'll ignore that for now. ;)
So, the problem then changes from protecting ephemeral communication (like spoken words) to protecting keys to records that will be kept for years. If not initially, government will pass laws to mandate it like they've done here to a degree and in other countries. You have to realize, though, that a product as popular with elites and government as an Apple or Android phone will lead to tons of people's secrets consolidated into hands of a few companies. That's incredible power for them to wield. Let's say we trust them, though.
This leads to the biggest counterpoint I heard in my exploration of lawful intercept. Dirk Praet, well-versed in IT and military INFOSEC, put it something like this: "Nick, let's say you deploy high-assurance L.E. in devices and every security method you can to protect escrowed keys. Once secrets pile up in one place, you must realize that every intelligence agency, hacker, and organized crime group on Earth will be coming for it. You might be smart enough to stop many of them. Will you bet you're smarter than all of them combined?"
I'm confident in myself but not dumb enough to place that bet. My L.E. design might be implemented for remote administration, etc for voluntary use in companies. Yet, after thorough exploration, I strongly recommend against any escrow or backdoor mechanism because it puts us at risk against people who otherwise had limited potential to harm us. It also piles up the risk, too, once laws show up mandating retention of communications. Instead, we should focus on penalizing refusal to allow a search and implement methods to ensure the search is honest.
> I disagree with the authors that we can't make a backdoor for selective access. ... However, the bigger problem is key management.
That right there is why we can't make a backdoor for _selective_ access. The Bad Guys will get the credentials to access the door, (or they'll be inside the Good Guys' organization) and they'll walk right through the door.
Past experience has taught us that law enforcement and government both want backdoors to be quick and rather hassle-free to use. This -obviously- is at odds with a paranoid credential management scheme.
> ...I strongly recommend against any escrow or backdoor mechanism... Instead, we should focus on penalizing refusal to allow a search and implement methods to ensure the search is honest.
1) Heh.
2) If a manufacturer ships a crypto system that _cannot_ produce plaintext on demand (and cannot be modified and field-upgraded to do so) and requires physical access and sophisticated hardware tampering to defeat, will this be considered a "refusal to allow a search"?
Incidentally, I devised such a scheme with foreign, tamper-resistant hardware for exactly that reason. I'll post it here when I'm at my main PC around 10:30p tonight.
Similar song to the first one: Oh so much of that is completely unrelated to the topic at hand (which is -to wit- design of a high-volume (think hundreds of signatures per day), very low-friction, audited code signing system that creates images signed with what is -for all intents and purposes- the same key that's used to sign the official, aboveboard builds of the software).
"The second half of this is largely hand-wavy technobabble. The first half or so of it doesn't apply to the system we're describing."
"Similar song to the first one: Oh so much of that is completely unrelated to the topic at hand "
You have a talent for dismissing without reading. If you had read it, you'd have guessed that I implemented the second scheme in something. That actually included signed messaging and code operating at 10 Mbps wire speed. Besides, if an 8-bit CPU can do ECC/RSA ops in 0.x-3 sec, why would you think GHz CPUs with HW acceleration couldn't do a few hundred signatures a day? Are you running your infrastructure on 4-bit CPUs handwired with TTL chips or something?
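As a rough, back-of-the-envelope illustration of that throughput point (my own sketch, not part of either scheme under discussion), timing Ed25519 signatures with Python's `cryptography` package on a commodity CPU makes it clear that "a few hundred signatures a day" is nowhere near a performance constraint:

    # Rough timing of Ed25519 signing throughput with the `cryptography`
    # package, to put "hundreds of signatures per day" in perspective.
    # Absolute numbers depend on the machine; any modern CPU does thousands
    # of signatures per second without hardware acceleration.
    import time
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    key = Ed25519PrivateKey.generate()
    payload = b"\x00" * 4096                      # stand-in for an image digest

    n = 10_000
    start = time.perf_counter()
    for _ in range(n):
        key.sign(payload)
    elapsed = time.perf_counter() - start
    print(f"{n} signatures in {elapsed:.2f}s ({n / elapsed:,.0f} sigs/sec)")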
"high-volume (think hundreds of signatures per day), very low-friction, audited code signing system that creates images signed with what is -for all intents and purposes- the same key that's used to sign the official, aboveboard builds of the software)."
You said low-friction, high-speed, no "techno-babble," and implied it shows resistance to NSA/FBI scheming. That's impossible to answer given your constraints.
"I also notice that you didn't answer my question."
A non-technical, low-friction answer to that is possible. Let's revisit it.
"If a manufacturer ships a crypto system that cannot_ produce plaintext on demand (and cannot be modified and field-upgraded to do so) and requires physical access and sophisticated hardware tampering to defeat, will this be considered a "refusal to allow a search"?"
No law says it's illegal for a private party to buy intercept-proof equipment unless it's radio. There are foreign countries that don't restrict crypto. So, buying one from a foreign country is legal if it passes import approval. Using it is legal. When hit with a court order, one must show a reason other than dodging the law for why the information can't be produced, to head off the obvious prosecutor argument. My prior scheme's reason is resistance to attacks on personnel by organized crime or foreign spies via extortion, kidnapping, torture, etc. It's easier to resist them compromising everything in a permanent, stealthy fashion if that's made impossible at manufacturing time. So: legal to buy, a justifiable reason for it, and finally great lawyers for presentation. There you go.
No guarantees, though. Our laws are subject to court interpretation. That can be good or bad for the accused.
This article signals that big VCs have officially become a part of the establishment (lobbying, hosting Obama dinners, etc), and Fred Wilson is volunteering to plead compliance to the powers above.
That was my reading too. This isn't a post for public consumption; this is a coded communication to the U.S. national security apparatus that Mr. Wilson is willing to play ball.
Of course, he could have just emailed anybody with the same message, and they would have gotten it...
I can understand where Fred is coming from, and I think Obama's statements on the matter were actually pretty thoughtful. But writing this article without mentioning the fact that the "warrants" that are being granted to law enforcement have grown more dubious by the year is incredibly insincere.
Commenter @charlieok on that site had a great point:
"...If we were to invent a technology to read peoples thoughts should the state demand the absolute right to access our thoughts because after all the bad minds can not be allowed to go dark? This decision is that decision. ..."
No, the proper analogy to locking down a computer is locking secrets away in your mind, and as of this writing, that's still mostly protected by both the 4th and 5th Amendments of the Constitution. I cannot be compelled to testify against myself, and neither should my electronic devices, which are no more than an extension of my mind. Of course, idiots like the author of this article want to literally create something that's impossible, the have-your-cake-and-eat-it-too approach to security that has never worked and cannot possibly work. Yes, it's these kinds of idiots we should be looking to for advice, people who don't even understand the basics of computing, mathematics, or even their own reality, people who defer to our technologically handicapped president as an authority on technology.
So, when they get everything they want ("no absolutism" implemented by US companies), will we finally be able to explain to them it was all for nothing (except making the public less secure) when the next big attack uses open source non-backdoored encryption, maybe even routed through companies not under the US's jurisdiction?
The problem is that surrendering encryption would not restore the old pre-encryption status quo. It would give government and state-linked actors absolutely unprecedented ability to observe, surveil, and monitor our activities now (and in real time) and in the past. These are not old abilities. They are abilities that have never existed.
When the government gets a search warrant to go and rifle through your sock drawer, they do not gain access to:
- A complete history of every physical movement you've made in the past several years to an accuracy of 1-4 meters.
- Every private correspondence you've made in the past several years.
- The history of every home automation device in your home and how it was used.
- Personal health and body monitoring data for years (if you use personal health monitors, etc.).
... and so on. As this technology advances the amount of information things can passively collect is going to keep growing exponentially.
Modern Internet-connected embedded computing devices log, collect, and can reveal more information than has ever been available in all of human history. Without encryption we're living in glass houses -- a true panopticon.
Encryption is critical if we are to have any hope of implementing a "smart" automated future without surrendering absolutely all privacy forever.
In the further-future (maybe not as far as we think) I can imagine things with direct neural interface to the human brain. That's where we might be going. Surrendering encryption and strong security as we head into that kind of world is a recipe for insane dystopian scenarios.
The old world is gone. The old status quo is dead. There is no "balance" that can be legislated that will restore it. The new world is a qualitatively different place where completely different rules and balances apply. In the new world we face a binary choice between an absolute panopticon of total surveillance and an unparalleled level of individual privacy. There is no middle ground -- because math says so. Information is either encrypted or it is not. In security, systems are either secure or they are broken. It's binary.
I actually think most of the people in this debate -- on both sides -- fail to realize just how dramatic this shift is. This is literally the end of an age of human history, and the relationship between individuals and governments (and "society") is about to be radically redefined in ways that many people (liberal, conservative, even libertarian) are not going to like. I know there are aspects of it that I don't like, like the strangely intimate relationship that is emerging between software and "cloud" vendors and human beings, but I don't see these things changing, and I think the best reaction is to try to work through the implications and figure out how to deal with them rather than to dig in and resist.
Edit:
But "it can't happen here!"
Donald Trump is a credible presidential candidate, and has said things like "maybe he should have been roughed up" about protestors. Any power we give to Obama will be passed along to the next administration, and the next, and the next, and the fact that someone like Trump can even be considered should be a reminder that we cannot bet on future leadership being any better than present leadership. If you're right-of-center, replace Donald Trump with Hillary Clinton or Bernie Sanders. They are also credible candidates, and they too would inherit the panopticon we're building. Every power you grant to the current government will be possessed by all future governments -- including many you don't like -- and once granted a power is very hard to revoke.
No society can bet on its future leadership being perfect. That's an impossible pipe dream. Any society that lasts for any length of time is going to weather periodic episodes of fanaticism or corruption.
The way you stop it from "happening here" is by limiting government so as to prevent moral hazard. That way you can weather an awful leadership without descending into tyranny or collapse. The reason we don't live in a corruption-ridden hellhole or a totalitarian dictatorship is because we have laws, procedures, and customs that limit government. Those limits allow criminals to get away with crimes, but that's one of the costs of preserving a nominally free and open society. The alternative is to load up on moral hazard and then wait for the first corrupt or fanatical government to trip over it.
What this misses: take me as an eight-year-old in the eighties versus an eight-year-old today. There was no way to subpoena conversations with one's parents, the comics you read, the music you listened to, etc. None of that was recorded.
"Privacy absolutism" is the direct consequence of the Snowden revelations. If the government had not created such a massive apparatus spying on everyone, there'd be room for compromise (if technically feasible).
By creating secret courts, secret warrants or warrantless spying, the gov itself destroyed all faith in the possibility of a compromise that allows access where necessary while safeguarding meaningful privacy.
The NSA now has a profile of my porn preferences. Congratulations! I hope it was worth it – I've since enabled SSL on every domain I control, taught classes on the use of PGP, and am one of thousands of non-Americans willing & able to create strong encrypted messaging services if Silicon Valley bends to the Gov's demands. The NSA helpfully created a market, and at this point any moral qualms I might have once had have evaporated.
Reflecting back on the last decade, I feel like a case study in 'unwanted side effects'. I never was (and never will be) an Ayn Rand-style libertarian or one of those conspiracy-peddling bitcoin enthusiasts. I've defended the primacy of the law as the only legitimate outlet of power in the face of hands-off-the-internet techno-anarchism. As I'm European, it should also be mentioned that I despise the anti-American undercurrent so prevalent here, being able to recite most of The West Wing from memory. Hey, my major claim to fame is actually a widely copied photo of Obama at his Berlin rally in summer '08.
But I've apparently made much of the same journey as Apple, which probably never foresaw that it would end up in such an open confrontation with the US government (it must be bad for business to take sides in the culture wars).
In fact, it seems possible that neither Apple nor I actually changed. Lesson: if you shift the Overton window until the moderates become extremists, don't be surprised if they start behaving as such.
People say this a lot, but I'm not sure what it's supposed to mean. Ok, Snowden demonstrated that giving the USG centralized access to communications is dangerous. People don't trust the government. Now what? What's the effective policy response to this?
Crypto advocates (I am one of those) eventually need to stop pretending that there isn't a meaningful tradeoff here. In 10-20 years, everything is going to be encrypted: every phone call, every hard drive, every email, every photograph. That is going to pose real problems for the kinds of law enforcement cases that everyone cares about.
That doesn't mean key escrow is a good idea, or that we should prohibit encryption. But saying "that's not going to be a problem" is just not an argument. You have to do better.
> That is going to pose real problems for the kinds of law enforcement cases that everyone cares about.
Those problems have existing solutions: Place a room bug where the evildoers meet. Use real-world physical surveillance. Track their movements.
The only thing you can't do with existing solutions is to scale them up to pushbutton mass-surveillance numbers. And that's only a problem if you want to use surveillance as a tool to suppress political/social movements, and our beneficent government would never do that.
I agree with you. That's something that frustrates me about the "Going Dark" debate. Crypto can pose real problems for justice and safety, and the answer still probably isn't to suppress crypto. Maybe we should just do a better job of funding investigatory work?
> Maybe we should just do a better job of funding investigatory work?
Many of these supposed problems are a product of "searching under the street lamp." Once you shut off the mass-surveillance street lamp, we may find that the quality of law enforcement rises if only because somebody might get off the internet and look at the backlog of burglary and strong-arm robbery cases.
That's almost certainly true. Electronic surveillance has made it easier for investigators to "cheat" their way through a case. But we fund investigations based on how long it takes an investigator to clear a case, so if we make it much harder to do that, something else will have to give.
> Crypto can pose real problems for justice and safety
Exactly like the worries of "shouting fire in a crowded theater".
If communication leads to an action that has real world consequences, then criminalize and investigate the actual action. Don't take the lazy way and attempt to police communication.
"Safety and justice" are abstract concepts. Referencing them without discussing specific situations can only lead one to the conclusion that more is better.
I brought things closer to specifics, while needing to stay general, by asserting that any real crime must have real-world physical involvement, which can always be investigated and policed.
It's always been possible to execute the "perfect murder", although few manage to accomplish it. I don't see how that could possibly change with the addition of bicycles for the mind, unless they lead to psionic powers.
What we're actually "debating" here is law enforcement getting a temporary boost due to criminals' misunderstanding of technology, and now fighting tooth and nail to hold onto the transient. Nobody was breaking out of jail and then posting about it on facialbook in 1950, so in 2050 when someone breaks out of jail and posts on e2e socialnet, police will simply have to use the same classic investigative techniques they've always had.
I'd never say that strong ubiquitous encryption is not going to be a problem. I'd just say that (a) it's the fault of those who proved they couldn't be trusted with that kind of power, (b) crime is at an all-time low, so it weighs lower on my set of priorities, (c) if the "law-enforcement cases that everyone cares about" include drugs, I'm starting to be one of the people who don't actually care about those, and (d) as I'm not American, I think I'm allowed to disregard all law-enforcement arguments until the FBI publishes the framework that allows my local LE to access such data but not the Chinese thought police.
And that is what is happening, which is why this story is hot. Because, encrypting all the things is a way for the people of a democracy to take back power from their abusive governments.
In the same way that an "encrypt everything" future decreases the current power of a warrant, the current power of a warrant is far above its historical level.
The transition from physical to digital massively increased the amount of data being stored. Your "papers" never recorded your historic location, biometrics, metadata and contents of all (or most) of your communication, etc. It used to be that the government could never access most of that information, warrant or not. Now they can, because our smartphones track all that stuff. The more I think about it, the more I see our digital assistants as extensions of the mind rather than an ever-growing stack of "papers and effects;" it's a difference in magnitude that produces a difference in kind.
I'd argue that the encrypt everything future is more similar to the pre-digital past than not. Before the telephone, law enforcement could never access the contents of a private conversation without physically putting a person in the room or questioning someone involved. I wouldn't say that it's exactly the same; there's a big difference in effort between using Signal and enciphering all of your physical letters, which will admit a difference in adoption rates.
Like the other commenter said, it brings back the importance of HUMINT. But like you said, adding a law enforcement loophole to encryption is not the answer; that would only serve to increase government surveillance power, even if it was gated with a warrant. It's the equivalent of putting a surveillance device in everyone's pocket. Historically unprecedented, even if you need a warrant to access the data on it. Problem is that law enforcement desperately wants to keep that power now that they have it.
IMHO we are in fact facing a choice: make it easier for criminals to communicate, or live in a glass-house panopticon. I don't think there's a middle ground. There's no way to weaken or break crypto... without weakening or breaking crypto.
Freedom with (perhaps) a bit more crime, or totalitarian surveillance state. Take your pick.
Neither of these choices is perfect, but nature seldom provides perfect choices or 100% downside-free options.
nit: direct consequence of the out of control government exposed by the Snowden revelations.
But while Snowden was great publicity, the underlying issue was always there and will always be - you're either in charge of your computing or someone else is. And other people have much different incentives than yourself, especially after they aggregate a trove of everyone's personal data.
So, let's say the U.S. tech sector compromises. Who outside of the U.S. is ever going to use software, hardware or services developed by the U.S. again?
Second, I completely agree with others that, well, the government has brought this on itself.
This article links to another blog post where someone yet again proposes not just one master key, but a unique master key for every device manufactured (which would then presumably be protected by a master master key ... it is, as they say, turtles all the way down). But once again, someone will get a hold of that super-master key and it's game over.
Personally, I'm more afraid of the government/law enforcement (and by that I mean not just the US, but every country's government and law enforcement) having access to this. So maybe the US isn't looking at my data - after all they said they'd only look if they have a warrant[1] - but what's to stop the rest of the world? I'm not a citizen anywhere else, so there isn't any legal framework protecting my privacy there.
And besides, the LEA argument that having encryption they can break is no different than having houses, safes, safety deposit boxes, etc. accessible with a warrant is complete bollocks. If the technology existed which let a burglar break into every house in the world with a particular sort of lock, simultaneously and undetectably, and get away with the _entire_ contents of said house, these same agencies would be demanding better locks for everyone (well, actually they'd probably be demanding regulations on who can have access to the "be everywhere at once" tech and insisting that they needed it to fight terrorists and pedophiles). But make no mistake, that's what having an "encryption master key" means. It doesn't mean "great, I have this super key, now when I burglarize houses I don't need to smash the door in", it means "great, now I can break into every house and no one will even notice until all their stuff is on eBay".
I don't believe that government/law enforcement should have this capability and I certainly don't believe that investigating a couple of _dead_ terrorists (or live ones for that matter) is worth giving up our privacy, but if this sort of tech is going to be required, at the very minimum using it must require physical access to and irreversible modification of the target device. For example, this mythical magic decoder ring device should have to work like so:
If I understand correctly how the security on iPhones works, your passcode is used to unlock a private key which is then used to decrypt your data. If a copy of that private key is also encrypted with an asymmetric cipher and a different, per-device keypair, and the private half of that keypair is stored in a special chip (which I describe below), then this should be pretty safe.
1. Get a warrant to take physical possession of the target device
2. Dismantle said device and remove the chip with the encryption keys
3. Insert the chip into the decoder machine
4. The decoder machine burns out one-time fuses (via pins which are left physically disconnected in the target device) in the encryption chip which enables access to aforementioned second private key (which is unique _per-device_) and prevents the device from working should the encryption chip (or a replacement) be reinstalled[2]
5. Data is decrypted using this copy of the private key
6. Now it is very obvious that this has been done, it isn't feasible to do it on a large scale and _no one_ is doing it remotely.
If you're still concerned that there may exist a remote exploit (and those fears are justified, even though as far as I can tell it should be impossible because the hardware to blow the fuses doesn't even exist in the target device), add an additional constraint that blowing those fuses also removes power from the radio baseband chipset, physically and permanently disconnecting the device from all networks. The obvious issue here is the potential for extreme mayhem if someone figures out this exploit, instantly bricking 100's of millions of mobile devices.
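For what it's worth, here is a minimal software-only sketch of the per-device escrow wrapping described above, using Python's `cryptography` package. The hardware parts (fuses, tamper resistance, the baseband cut-off) obviously can't be shown in code, and the choice of RSA-OAEP, scrypt, and AES-GCM is my own illustrative assumption, not a claim about how Apple actually does it:

    # Illustrative sketch only: the device's data key is wrapped twice, once
    # under a key derived from the user's passcode (normal unlock path) and
    # once under a per-device escrow public key whose private half would live
    # only inside the fuse-protected chip described above. Hardware fuses and
    # tamper resistance are not modeled here.
    import os
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.scrypt import Scrypt
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Per-device escrow keypair (the private half would never leave the chip).
    escrow_priv = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    escrow_pub = escrow_priv.public_key()

    data_key = AESGCM.generate_key(bit_length=256)  # key that encrypts user data

    # Path 1: wrap the data key under a key derived from the passcode.
    salt, nonce = os.urandom(16), os.urandom(12)
    passcode_key = Scrypt(salt=salt, length=32, n=2**14, r=8, p=1).derive(b"123456")
    wrapped_for_user = nonce + AESGCM(passcode_key).encrypt(nonce, data_key, None)

    # Path 2: wrap the same data key under the per-device escrow public key.
    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_for_escrow = escrow_pub.encrypt(data_key, oaep)

    # Only a party holding the chip's private key (after blowing the fuses, in
    # the proposed scheme) can recover the data key from the escrow copy:
    assert escrow_priv.decrypt(wrapped_for_escrow, oaep) == data_key

None of this addresses the key-management problem discussed elsewhere in the thread; it only shows that the wrapping itself is the easy part.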
I'm not saying that this is a good idea or that we should back down from a pro-encryption stance, but if it comes down to some stupid law requiring such a thing, acting upon it must be _at least_ as difficult, time consuming, and apparent as my proposal above.
[1] I don't believe them, but that's not actually relevant
[2] This is probably the most hand-wavy part, but technically feasible I think
Unbreakable encryption exists. Trying to legislate it away is like trying to stuff the genie back in the bottle.
There is no way to prevent "bad guys" from writing encryption software and installing it on programmable devices; you can find the code for AES in under one minute, written for any language you desire. It takes an undergrad level of programming ability to create a functioning app that does basic AES encryption. So ISIS just needs one programmer with this ability -- and somehow I doubt that person is going to give a copy of the master private key to law enforcement.
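For a sense of scale (a minimal sketch using the off-the-shelf `cryptography` package; key exchange and everything else that makes a real app is omitted), basic authenticated AES encryption really is only a few lines of Python:

    # Minimal AES-256-GCM encrypt/decrypt with the `cryptography` package
    # (pip install cryptography). Key distribution, identity, and UI are out
    # of scope; this only shows how little code the cipher itself takes.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def encrypt(key: bytes, plaintext: bytes) -> bytes:
        nonce = os.urandom(12)                    # must be unique per message
        return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

    def decrypt(key: bytes, blob: bytes) -> bytes:
        return AESGCM(key).decrypt(blob[:12], blob[12:], None)

    key = AESGCM.generate_key(bit_length=256)     # 32-byte shared secret
    blob = encrypt(key, b"meet at the usual place")
    assert decrypt(key, blob) == b"meet at the usual place"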
To the extent that governments prevent unfettered use of encryption in mainstream devices, "bad guys" will just use Raspberry Pis, custom ASICs they get fabricated, etc. Programmable devices capable of running strong ciphers and communicating over the internet are absurdly cheap now.
Limiting the use of cryptography would therefore only harm stupid and non-resourceful criminals, and regular people with something to hide. Which is everyone, kind of.
I think this is the operative point. Even if we agreed on some key escrow system and mandated its use on common devices, that would just make strong crypto somewhat harder to get. You can bet a thriving black market would spring up in no time to serve the need.
The choice is not between a world where bad actors can communicate securely and one where they can't. The choice is between a world where everyone can communicate securely and one where only criminals can do so. "If strong crypto is outlawed, only outlaws will have strong crypto", one might say.
I guess it's sort of brave for him to come out and say this, since Wilson's livelihood depends in part on his relationship with the tech industry, which on this issue stands where he describes it.
But it's just not enough to point out that the people are entitled to every person's evidence. For that principle to have any sound control over engineering, we need a means for evidence to be produced in court cases but not coughed up to extralegal procedures by the DoD or other countries. Nobody knows how to do that right now.
I'm unlike virtually all of HN in that I think both Obama and Wilson are raising a valid point, as are the "Going Dark" campaigners. But just because the argument is valid doesn't mean that we should build weaker security. We may have to live with real, meaningful challenges to law enforcement for years or decades while we work out a response to the problem.
The problem with their point is that it's a fair-weather argument. It sounds perfectly reasonable in a world of good-faith actors and a checked and regulated law enforcement, filled with upstanding, moral officers.
And even if you trust the current government, how can you trust a future government to continue acting in your best interest? Maybe abuse won't happen today, but what are your guarantees that it won't happen in the future?
Also, we live in a world where there are still plenty of totalitarian regimes. If, for example, Apple develops this capability and gives it to US law enforcement, every single totalitarian regime out there is going to demand access to the same thing for their law enforcement, and then we know they are going to abuse it.
If American technology is generally back-doored for access by US government agencies, nobody elsewhere on the planet will want to touch it.
Heck, the contract cancellation-zilla that already started with the Snowden affair will just start snowballing.
With technology pretty much permeating everything, it may no longer be possible to export almost anything to other countries. Get used to widespread, rampant poverty, because international trade will be mostly gone.
The FBI says that all it wants is to protect the United States, and indeed that will become much easier to do when there is pretty much nothing left to protect! ;-)
First, I love it when the language in a debate evolves. Everyone for secure communication is now an absolutist. It's like what was attempted with pro-life. Oh, so you're anti-life?
What a horrible argument, though. We should ban a readily available technology so the average consumer's communications are no longer secure? Anyone with half a brain is going to just download non-backdoored technology (or write it themselves) when they need to communicate securely. Most criminals committing major crimes are smart enough to be this careful.
Also, saying the tech sector is libertarian? I'd give up the majority of my income to taxes if it meant no one went hungry. I still believe in a right to privacy and secure communication. My mom would never send personal information (mainly credit card numbers) over text when I was younger. I thought she was nuts. Now that I've learned about stingrays and other cell-site simulators, I see how right she was.
You can call me a privacy extremist if you want, and paint me in there next to all the other extremists. If you ever start looking into how broken our communication systems currently are, and how easy it is for criminals to eavesdrop, you'll likely become an extremist too. Political movements can also change rapidly. Backdoored communication means you can be targeted politically, for your race, religion, sexual orientation, and opinions. I'm not willing to gamble on the government's good nature now or in the future.
> if there is probable cause to think that you have abducted a child or you are engaging in a terrorist plot or you are guilty of some serious crime, law enforcement can appear at your doorstep and say we have a warrant to search your home,
How exactly is this supposed to work? Are we pretending that real criminals are suddenly going to stop using safe encryption just because it's illegal? That just doesn't make sense to me.
I mean the software is out there, how are you going to stop people from using it? Or are we going to decrypt ALL the traffic, all the time just to make sure everybody still uses our weak crypto? That seems just as "absolutist" as perfect encryption.
Sometimes when I read the news these days I shudder with the fear that we are rapidly approaching the dystopian nightmare many of us who grew up in the Cold War era came to fear. We were taught that a favorite saying among Soviet "subjects" was "The walls have ears."
I read a story in elementary school about a child (my age at the time) preparing for a State exam. His father was answering questions (quotes are approximate):
"Dad, why are the leaves green?"
"Nobody knows."
"How far away is the sun?"
"1000 miles."
The father was concerned that his kid might get too many questions correct on the exam. Kids who were too smart disappeared.
Hopefully Mr. Wilson would agree that if this goes through then we need a serious fucking discussion about these "warrants" that are "required" to get private information.
One of the problems we have now is that warrants are too easy to get. And they can be kept secret. There is no harm in trying, so the feds might as well keep trying. The FISA court contains judges that are hand-picked by the Chief Justice of the Supreme Court [1], which can leave the FISA court dangerously single-minded. Perhaps new laws compelling law enforcement to be open and transparent about what it is trying to acquire would be enough, but I am suspicious.
A second problem centers on the legal justification for the validity of the National Security Letter, the letter that the FBI can write to get information from things like libraries. The (legal) legitimacy of the process for acquiring an NSL (which does not require judicial oversight) rests on the idea that the target has no expectation of privacy to the information being obtained. If they put a back-door into our private communications, then it could be argued that we now have no expectation of privacy for our phones (etc.). So I'm afraid that lowering the technological barrier may coincidentally lower the legal barrier.
Wow, this thing tanked quickly off of the front page. With 35 points, submitted 4 hours ago, it is on page 7, below a story submitted one day ago that only has 6 points. Is this an intervention by the mods?