
I think we're in for a slow, painful transition until people (in aggregate) intuitively "get" exactly how invasive and unfriendly data correlation can be when you expose yourself -- and your friends -- by sharing seemingly innocuous facts with our welcomed digital overlords.
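The correlation risk described above can be made concrete with a toy sketch. All names and records below are invented; the point is only to show how a dataset released "safely" without names can be joined against a public roster on a quasi-identifier (ZIP code, birth year, gender) to re-identify everyone in it:

```python
# Toy illustration of data correlation (all names and records invented):
# an "anonymized" dataset with no names, joined against a public roster
# on the quasi-identifier (zip, birth_year, gender), re-identifies everyone.

anonymized_survey = [  # released "safely" -- no names attached
    {"zip": "60601", "birth_year": 1984, "gender": "F", "interest": "fitness tracking"},
    {"zip": "60614", "birth_year": 1990, "gender": "M", "interest": "late-night takeout"},
]

public_roster = [  # e.g. a voter roll or a scrape of social-media profiles
    {"name": "Alice Example", "zip": "60601", "birth_year": 1984, "gender": "F"},
    {"name": "Bob Example", "zip": "60614", "birth_year": 1990, "gender": "M"},
]

def reidentify(survey, roster):
    """Link survey rows to names that share the quasi-identifier."""
    key = lambda r: (r["zip"], r["birth_year"], r["gender"])
    index = {key(r): r["name"] for r in roster}
    return [(index[key(s)], s["interest"]) for s in survey if key(s) in index]

print(reidentify(anonymized_survey, public_roster))
# [('Alice Example', 'fitness tracking'), ('Bob Example', 'late-night takeout')]
```

This is the classic linkage-attack pattern: neither dataset looks sensitive on its own, but the join is what does the damage, and real attacks use far richer quasi-identifiers than these three fields.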



I agree that it is exhausting. But it seems that it's a challenge of information parsing and application - not necessarily the volume of information itself.

I think there can be no doubt that the explosion in the volume and accessibility of information is a good thing. It will take time for us to find our rhythm with the new reality - we need tools, on both an individual and a social level, to cope.


totally - info overload is already a huge problem, and only going to get worse.

Have people become less smart? I think it's more likely you've just been exposed to the reality that most people are not as smart as you.

I certainly think more people are aware of things they wouldn’t have otherwise been aware of. It’s become more difficult for governments to hide facts from their citizens, and that’s been good.

It’s been a mixed bag and I think we should be careful of judging the Internet by the behaviour of companies like Facebook.

We’re also learning. Ten years ago I can’t imagine anyone quitting social media due to concerns about privacy, manipulation, the undermining of democracy or the amoral behaviour of Facebook’s employees, but that happens a lot now. As PG said, it takes time for us to “develop antibodies” to these things.


Or they're following a newsfeed algorithm that determines their information-seeking proclivities for them. I honestly find that much more subtly disturbing and manipulable than willful ignorance - people have ceded control over their information-seeking behavior entirely.

Well said.

We live in a society where more data than ever is stored systematically and widely available, but we haven't yet developed the tools to cope with the resulting information overload. I suspect this is why we see popular trends like using Facebook and Twitter: even if people did have the time to take part in genuinely deep, creative, insightful interactions, there's so much to talk about that soundbite discussions rule.

It doesn't help that the media is increasingly full of superficial "don't make me think" material. Then there's the endless echo chamber that makes up most of the blogosphere and online forums: today, I reckon about 90% of the posts I've seen are just links to other posts that are top 10 lists of other posts that might, if you're lucky, include one or two original bits of writing. Off-line media is just as bad, with banal reality TV shows and sensationalist newspaper headlines boosting audience numbers because so many people just don't want to make any effort any more.

I think in the next couple of years, though, we will start to see a backlash against these trends. People will get bored of one-liner "discussions" with "friends" they haven't talked to more than twice in real life, and start to wonder what happened to, y'know, interesting conversations where they learned something or gained a new insight. This doesn't have to be highbrow or reserved for the well-educated middle classes: just "I liked this movie I saw the other day, because..." rather than "That movie rocked, because." would be a start.

As this trend takes over, I think we'll see improved ways of separating the wheat from the chaff. Just as search engines try to identify the best material on topics covered on bazillions of web pages but mostly not very well, so I think we'll see tools develop that cut out the middle men and go straight to the source of original, interesting material; I suspect the first social networks that enable this effectively will be the next popular trend.

Which brings us back to where we came in, which is to say that we need to have some original, interesting material to talk about for all of the above to work. That, one way or another, is going to involve either thinking more or sharing data that isn't trivially available to others, either because it's not accessible, or because it is but you still have to work out which parts are important enough to highlight. And how are we going to know that one bit of writing is presenting such content where many others aren't? Whatever the answer to that question, if a picture is worth a thousand words then we can be sure that good visualisations that pick out interesting data to discuss are going to play a big part in it.

Of course, sometimes you can be too verbose. I didn't have time to write a short post here, so I wrote a long one instead, as Mark Twain might have said. Maybe I could have just written "Current trend = banal & uninteresting => doomed; new trend = original & creative => interesting => successful; effective visualisations help identify original & creative content". Then again, that's more than 140 characters.


While there is definitely an uptick in polarization, I think the more important and insidious trend is the increasing ubiquity of bullshit. We are flooded with information online, most of which is irrelevant or counterproductive to our interests. Needing to constantly wade through it in triage mode (with a judgmental attitude) has the general effect of wearing down people's psyche and leaving them exhausted, anxious and on edge. In that mindset, of course small things are going to set them off - but that is just the symptom, not the root cause. This is also a much harder problem to grapple with because it is less specific. Its causes are deeply embedded in the incentives we have set up on the web over the last two decades - challenging those will require answering some hard questions. At some level, people realize this (hence small efforts like the slow-tech movement, digital detox, etc.), but they have yet to find the right balance of convenience and sanity (for lack of a better word).

Extreme version of counterpoint: By putting all of this knowledge online, we're building an organism bigger than ourselves.

Weak version: We're learning new types of information, and in many cases creating it rather than simply seeking to retain it.

My version is probably somewhere in the middle. But overall this is needlessly alarmist hand-wringing. Hooray for trend pieces!


The vast majority of people out there have no mental capacity to imagine how the data they post online and provide to various orgs could be -- and most likely eventually will be -- used against them. This is evidenced by the continuing proliferation of dumb comments along the lines of "I am boring", "I am doing nothing wrong", etc.

There is a tremendous cognitive bias in play here: the assumption that because most people around you don't weaponize certain types of information, no one anywhere will ever weaponize that information, even if it is globally and indefinitely available.


I think this could be driven by the personalization of internet content. It started with the innocuous idea of presenting only the information you need, and it turned into a system of echo chambers where your world view is limited to the ideas you are comfortable with, which leads to all sorts of cognitive distortions. Also, the abundance of media on the internet forces content to be louder in order to be visible. Nuance has to give way to crass generalization to get deeper media engagement.

I seriously worry about this trend, and I am not very hopeful about the direction it is taking.
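The feedback loop behind personalization-driven echo chambers can be sketched with a toy simulation (all numbers and the model itself are invented for illustration). Each item gets an "opinion position" in [-1, 1]; a personalized feed serves only the items nearest the platform's estimate of the user, and the estimate in turn tracks the user's clicks:

```python
import random

random.seed(1)
# Toy model (all numbers invented): each item has an "opinion position" in
# [-1, 1]. A personalized feed learns the user's position from clicks and
# then serves only the nearest items -- an echo chamber by construction.
items = [random.uniform(-1, 1) for _ in range(1000)]

def personalized_feed(estimate, k=10):
    """The k items closest to the platform's current estimate of the user."""
    return sorted(items, key=lambda x: abs(x - estimate))[:k]

estimate = 0.3  # the user starts with a mild leaning; the model knows it
seen = []
for _ in range(30):
    feed = personalized_feed(estimate)
    seen.extend(feed)
    clicked = random.choice(feed)
    estimate = 0.8 * estimate + 0.2 * clicked  # the model tracks the clicks

# A non-personalized ("chronological") feed for comparison:
random_seen = [random.choice(items) for _ in range(300)]

print(f"personalized feed spanned opinions {min(seen):+.2f} to {max(seen):+.2f}")
print(f"unpersonalized feed spanned opinions {min(random_seen):+.2f} to {max(random_seen):+.2f}")
```

Even this crude model shows the mechanism: after 30 rounds the personalized feed has only ever shown opinions in a narrow band around the initial leaning, while the unpersonalized feed covers nearly the full spectrum.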


A disturbing trend of the internet age.

Humans are terrible at quantifying these 'movements'. Even 50 people complaining about something can SEEM like a lot.

Our brains just aren't wired to do proper statistics-first analysis on incoming data. Everyone is susceptible to this default mode of thinking, including journalists. I believe it's the cause of some of the crazy anti-science movements we've seen in the last 10 years.
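The point about our poor intuition for scale can be made concrete with a base-rate sanity check (all numbers below are invented for illustration):

```python
# Base-rate sanity check (numbers invented for illustration): 50 vocal
# complaints feel like a movement, but against a large audience they vanish.
complaints = 50
audience = 10_000_000  # hypothetical platform size

share = complaints / audience
print(f"{share:.6%} of the audience complained")  # prints 0.000500%

# Even a tiny 0.01% background rate of strongly contrarian users would
# predict twenty times more dissenters than were actually heard from.
expected_contrarians = audience * 0.0001
print(f"a 0.01% base rate alone predicts {expected_contrarians:,.0f} dissenters")
```

The "statistics-first" habit is simply to divide the loud number by the relevant population before deciding whether it signals a movement or just background noise.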


When I was a kid, ignorant people kept their thoughts in their heads, safe and tucked away, or maybe told a few people near them who knew better. Now they let them out on social media, where they can find other ill-informed people and reinforce one another. I don't know a way around this filter step. Maybe it will resolve naturally. As Stewart Brand said, information wants to be free. I will be optimistic enough to believe that means correct information. The science and empathy just isn't well distributed yet. "The future is already here - it's just not very evenly distributed." -William Gibson

Just wait until they learn about the "behavioral reading" data collected by, oh I don't know, virtually every media site on the Internet.

"people on HN react like some grand conspiracy theory was unveiled while it takes 15 seconds on google" is garden-variety internet snark

Such a factual situation is often the subject of this kind of snark. In this age of search bubbles and personalized (read: algorithm-distorted) views of information, the observation is often true. One group exhibits such reactions even to factual information. This results in frustration on both sides of the statement, creating a state of "War," which "is not a place from which people find it easy to climb down." This is particularly bad because it is precisely such mental-model-breaking information that is the most valuable.

It's very easy to see how groups like the Flat Earthers or anti-vaxxers have such model-breaking information blindness. But throughout history, we've seen that it's particularly hard for privileged knowledge workers -- who have been validated by society through high pay and social status -- to see model-breaking information. (Witness the struggle to get germ theory recognized by gentleman doctors.) In short, we tech folks need to acknowledge that there are factors acting to make us the most blind.

In 2019, we all have it. We've always all had it. The question is how to communicate this without being in the mental state of "war."


I've got to agree with this statement. In my opinion, it's an absolute blessing (and an unavoidable evolutionary outcome) that there are still varying degrees of friction inherent in the process of commenting on many websites. Imagine if every person could throw their uninformed opinions about every topic into the common space with no hassle whatsoever: the signal-to-noise ratio would be positively devastated. In fact, the coupling of frictionless 1->N communication capacity with the continual reinforcement of poisonous egocentrism in advertising appears to be the precise recipe for the construction of Facebook-style intellectual wastelands. The blunt fact of the matter is that the vast majority of people appear to have very little of value to contribute to the vast majority of discussions or processes, except perhaps in a passive role. This is not a judgement of value, merely a fact that should be considered from a societal vantage point when the design of large information systems is undertaken, so that ethical and socially beneficial decisions can be made.

The concept of preventing a dilution of discourse and output quality by intentionally maintaining a degree of friction in the participation process is well embodied by, e.g., the effort required to start contributing to certain critical open-source projects like the Linux kernel (think email-based communications, packaging processes, etc.). The notion of friction extends directly to aesthetics as well - consider the difference in discourse level between HN, Reddit, Facebook, and so on. There are many factors involved in the gradient, but it is worth noting that the aesthetic model employed by a system or product is perhaps the clearest external signal of that system's intended purpose. To that end, the dramatic "gamification" of relatively frictionless communication protocols appears to be fundamentally incompatible with the idea of truth. This also applies to the modern incarnation of popular US "news" organizations - think Fox's dramatic music and hyper-augmented Hollywood-style visuals, or my personal favorite, CNN's "Situation Room".


It definitely isn’t! The problem is when individuals with dubious motivations and skills are given access to a platform that influences tens or hundreds of millions.

The blessing and curse of having the world's information at our fingertips is that we can usually find the data, groups and echo chambers to mirror any narrative.

If you were hesitant about certain medical procedures, you now have access to groups and data that can and will reaffirm that you should not undergo that procedure because of X, Y & Z.

Whether X, Y & Z are directly relevant, or even statistically probable, doesn't matter anymore. Groupthink, peer pressure, societal pressures and desires to be "free" and "independent" start to reign.

The backlash against Spotify is warranted - it's just a large corporation that people can condemn and refuse to do business with. Same with governments.

Perhaps not exactly the same analogy, but Muslims generally won't use the services offered by a church, some people might see their shaman over a GP, and vegans will avoid the meat-heavy BBQ place. We have our beliefs; some are misguided, some are dangerous. Covid is pretty dangerous, and deadly to certain demographics.


Absolutely. This trend has put me on track for a sort of internet retreat. Suddenly I can't know whether anything I'm reading or hearing is real. Previously I could be critical of it, consider sources and try to be rational, but at this point it seems reckless to be immersed in this environment.

Even in terms of work, I’m suddenly put off by purely digital things. I want to interface with the real world, hardware, build systems that humans need. I’m very interested in AI, but it’s not a problem I’m solving and I suppose I don’t really want to compete with it or interface with it more than on an opt-in basis.


It may be true that there is no other path, but we really can't help noticing that the bulk of human knowledge is being replaced by literal noise. That is going to be a real problem.

Honestly, I don't think the problem can get worse - or at least, it won't get worse because of more information. The capacity of the human mind for rationalization is already essentially infinite; people are manipulated because they want to be manipulated.

It's pretty much to be expected: the world is really quite complicated, and without a drive for consensus, independent thought would lead you astray a hundred times before leading you to water.


I think it’s because people are uncomfortable with the idea that information can be harmful to the masses who might not be able to process it ‘correctly.’
