He is keenly aware (cf. the privacy-bug fiascos of mid-2010) that trust in the system is hard to build up and easy to lose. It takes enormous, sincere, and public effort to pull out of a loss-of-trust spiral.
It's a feature of the system, precisely because it replaces trust in shaky institutions.
I trust strong cryptography. I don't trust governments not to abuse their power. I don't trust corporations not to maximize profits at our expense.
It's really simple. I don't believe in any system that requires trusting either governments or corporations. A good system will necessarily work despite them.
> we should strive for a future where humans are not motivated to hurt other humans. Not treating it as some sort of default is a good start.
Trust is inherently selective and small scale. You trust specific individuals that you personally know, not all of humanity. Even that trust is easily broken.
Compare that to modern computer and internet technology where servers are expected to receive input from any other computer on earth. The technological equivalent of children being encouraged to talk to strangers.
I think he succinctly summarizes the issue. The public will need to keep pressure on both sides, not just to fight corruption but to get anything done at all. The stalemates and government shutdowns undermine trust as well. And on top of that, there's the increasing distrust in media.
The size of the country and the ease of communication, I think, make rebuilding trust an increasingly uphill battle. The number of people involved is so large, orders of magnitude beyond the number of people we can know and trust personally. In PGP, key signing is a way of extending that circle of trust. How can we rebuild the social/political/institutional equivalent once it's lost?
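To make the PGP analogy concrete, here is a toy sketch of how signatures extend a personal circle of trust outward. It's only an illustration under simplified assumptions (a single trust level, a fixed hop limit, made-up key names), not GnuPG's actual trust computation, which also weighs marginal vs. full trust:

```python
# Toy web-of-trust sketch: signatures extend reachable trust outward
# from the keys you have verified yourself.
from collections import deque

def trusted_keys(my_key, signatures, max_depth=3):
    """Return the keys reachable from my_key via signature edges,
    up to max_depth hops. `signatures` maps a key to the keys it signed."""
    trusted = {my_key}
    frontier = deque([(my_key, 0)])
    while frontier:
        key, depth = frontier.popleft()
        if depth == max_depth:
            continue
        for signed in signatures.get(key, ()):
            if signed not in trusted:
                trusted.add(signed)
                frontier.append((signed, depth + 1))
    return trusted

# Example: I signed Alice's key, Alice signed Bob's, Bob signed Carol's.
web = {"me": ["alice"], "alice": ["bob"], "bob": ["carol"]}
print(trusted_keys("me", web))  # {'me', 'alice', 'bob', 'carol'}
```

The open question in the comment is whether anything can play the role of those signature edges at the scale of a whole society.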
Increased ease of communication makes it so much easier to sow distrust. You hear something -- is it disinformation? just a rumor? an honest mistake? Something you can actually trust? And who can you trust when you look for information to confirm or verify that first piece? What level of openness is necessary to make the populace trust the level of oversight?
We can't each research and verify every piece of information that we come across. We need some kind of chain of trust if we're to get anything done besides manage our paranoia. I'd like to stay on the side of believing that people are generally good -- the general population as well as people working in government -- but we have to stay vigilant against corruption.
The underlying problems aren't new, but I do think we're seeing scaling issues that compound the problem. Anyone know what work or research is being done on these issues of rebuilding trust?
> "We need governments to take action if we’re going to make progress on challenges like avoiding a climate disaster or preventing the next pandemic," Gates wrote. "But declining trust makes it harder for them to be effective. If your people don’t trust you, they’re not going to support major new initiatives. And when a major crisis emerges, they’re less likely to follow guidance necessary to weather the storm."
This is where Bill's message breaks down for me. Simply put: trust is earned. The fact that governments have broken that bond uncountable times is not the fault of those who have had enough. Bill is attacking a symptom, not the root problem: how power is controlled.
Gates is an adult. He should know this. Yet he promotes a narrative that says otherwise. Sadly, he doesn't get the irony (i.e., his false spin itself undermines trust).
> The longer I am alive the more I learn about how much of human civilization depends on good faith and trust.
The way I see it, this is a feature, not a bug. The energy costs of trying to build a civilization like ours on trustless systems would be prohibitive (and no one would be happy about the amount of bureaucracy it would involve).
Trust is not only vital to the continued existence of our civilization, it's also a very powerful tool for advancing it. That's why the growing lack of trust among the general population worries me. The rule of law works only as long as most people trust in its implementation. Money works only as long as most people trust in its implementation. That's why, for instance, I'm very critical of mainstream journalism and media platforms - all they seem to be doing these days is sowing discord, burning people's trust in institutions to get their eyeballs. That's why I've been so critical of Uber since the very day they entered the taxi market: by breaking laws and getting away with it, they were eroding trust in the rule of law (half of the blame goes to the municipal governments that failed to immediately ban them).
In my view, trust is so important that working to destroy it is antisocial behaviour, and one that should be punished swiftly and severely. Much more than it is these days.
--
> If the last few years (decades it seems) have taught me anything, it's that we have to reevaluate many of the assumptions we have been holding about reality and our peers. From the Target/Home Depot/Equifax breaches to Facebook's Cambridge Analytica fiasco, we have just been way too lucky and way too trusting.
IMO the examples you give aren't very powerful. Nothing significant happened because of the Target/Home Depot/Equifax breaches, nor did the Cambridge Analytica fiasco lead to anything of relevance - at least not when compared with the previous scenario of power grid failures. I'd use the 2008 financial crisis as a better example.
> GPS is just one of many such cases, although a surprising one. I never thought the world would put so much trust in a system created by and for the US armed forces.
> Many, many institutions in the USA are built on it being a high-trust society. Now that it's falling into a low-trust state, we can expect those institutions to fail, and perhaps the state to as well.
Not enough people understand this, but I'm encouraged whenever I hear from those who do.
> We are at all-time peak of distrust among individuals, corporations and the gov't. The only way forward is going to be trust-less systems where everyone can inspect why and how things are working.
I agree there's a lot of distrust in society, but maybe not an all-time peak: to interact with so many moving parts in our complex environment today, I think we may actually be trusting more things than ever. But perhaps that's for another conversation.
The main thing I'm curious about is your statement that the only way forward is through trust-less systems. I personally believe we could also try to increase trust in other people and institutions.
Why do you think the only way forward is through trust-less systems?
What he did is analogous to someone breaching a trust system by doing something untrustworthy. If the argument is we shouldn't have trust systems, fucking bravo. Isn't the world a great place?
>People nowadays see the failure of trust and strive to design systems that don't depend on trusting their actors at all. I'm not sure it's a good direction. As we increasingly don't expect trustworthiness from other people, we're digging ourselves deeper into the hole.
It's an extremely bad move: at the sociological level, trust shows up as coordination. To reduce the level of trust means reducing the level of coordination, increasing the chances that someone does something stupid and harmful for what turns out to be no good reason.
>So yes, anti-vaxxers, anti-GMOs, anti-nuclear, et al. are wrong. Stupidly wrong. But it's hard for me to blame them. They've been lied to so many times that it's hard for them to trust authorities anymore.
Moreover, it's not like anyone bothered to teach them the actual truth-finding and truth-tracking methods of science, so that they could understand the difference between science and authority.
> society has got quite a lot out of being generally high-trust
I agree. The vast majority of the time, when you pass someone on the street, you don't expect that person to turn around, stab you in the back, and take your wallet -- and that's not because of the existence of laws, but because of the existence of trust. Laws (particularly the criminal code) must exist for the minority. This is one of the reasons why I hate "trust-less" systems -- societies cannot function that way. It's as if a program were not trusted to access even the memory that was allocated to it, and the OS had to check permissions and run validations on every single CPU instruction the program issued. Trust saves resources.
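To make the OS analogy concrete, here's a toy sketch (plain Python, with an entirely made-up "permission table") that times the same summation once with direct, trusted access and once with a validation on every single access:

```python
# Toy illustration of "trust saves resources": the same work, once with
# direct access and once with a per-operation permission check. The
# check itself is invented for illustration; the point is the overhead,
# which real OSes avoid by trusting a process with its own memory.
import timeit

data = list(range(1_000))
allowed = set(range(len(data)))  # hypothetical per-cell permission table

def trusted_sum():
    return sum(data)

def distrustful_sum():
    total = 0
    for i in range(len(data)):
        if i not in allowed:          # validate every single access
            raise PermissionError(i)
        total += data[i]
    return total

print(timeit.timeit(trusted_sum, number=10_000))
print(timeit.timeit(distrustful_sum, number=10_000))
```

The absolute numbers don't matter; the point is that the distrustful version pays a per-operation tax for checks the trusted one never needed.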