You hire the same kinds of people that these companies hire to design their products to be addictive, and then you draft legislation banning the things those people tell you are addictive.
This is a fair position, but we shouldn't forget that companies have poured billions of dollars into making these "tools" as addictive as possible, because that's ultimately how they make their money. See [1] and [2] for more.
It's bad if you accept that people deserve agency: the ability to freely choose how they act.
The primary purpose of making an addictive product is to remove people's agency by hijacking known deficiencies in our minds and bodies. It's a form of coercion, because the goal is to prevent people from being able to choose whether or not they use your product.
I think that when companies consult behavioral and other psychologists to craft better ways of wringing more engagement out of their users, those companies are intentionally making their products addictive. I don't see it as much different from exploiting the addictive potential of new nicotine salts, or from adding additives to cigarettes to do the same.
It can be spun however anyone wants to spin it, but at the end of the day, that's exploiting biology to subvert the will of others. And yes, I know this can describe advertising, as well.
This is certainly a difficult problem, and I think that as we learn more about this subject, people are going to be increasingly uncomfortable with the answers available to us and with what they mean for the products we create.
What I feel is that, regardless of regulation at a societal level, it is up to us as individuals to decide what we are comfortable with in the products we create and how much risk of harming others we are willing to take on. We should be able to acknowledge that - regardless of whether addiction is a personal failing or an inescapable biological flaw - intentionally leveraging it and profiting from it is unethical. It is exploitation regardless of the nature of the flaw being exploited.
It's complicated, because I think there are many types of products that can't avoid the potential for addictive use. In those situations, we have to ask ourselves whether we're targeting that addictive use or whether it's an undesired side effect a user may run into - and if it's the latter, we have to find ways to mitigate the harm. If we want outcomes that genuinely help our customers and provide them value, we need to accept that we may have to build awkward-feeling, profit-limiting mitigations.
Flawed as it is, education - helping people develop self-awareness about their own addictions - is one of the strongest mitigations we have, as long as it isn't buried somewhere users won't easily find it. Setting limits or ceilings on spending (of both time and money) is another. Avoiding monetization models, such as gambling mechanics, that inherently exploit addiction is another. Promoting and sticking to direct sales of products at discrete prices is another. But there is still a lot we need to learn, and many potential pitfalls.
The advertising business model is why those companies built addictive products. To make money from ads, you have to keep users coming back as often as possible, even if doing so isn't adding value to their lives.
While I don't disagree with you, it's important to remember that these companies spend billions of dollars to literally make their product addictive.
In school we teach kids how to be responsible with alcohol and drugs. But some still become addicted. The same applies to social media.
The difference is that instead of the pusher being some shady character at the back of the school bus, it's a massive company with marketing, public relations, and lobbying teams.
It's not a fair fight. Social media companies employ psychologists to help their UX designers make their products as addictive as possible.
> To the extent that you are designing your application to consume more and more of the user's attention (note that the customer is the advertiser, the addicted user is the product), to make it more and more addictive, you may be ruining some people's lives.
Our economy is intertwined enough that "your actions somewhere down the line may be making it easier for someone to make choices that ruin their lives" is probably true of literally every single person. If you work on a product making low-skill hiring more efficient, you're also lowering McDonald's costs and making it even harder for people with unhealthy relationships with food to resist, and thus "you may be ruining some people's lives".
It's so broad a claim that it's utterly meaningless.
And some people are legitimately addicted and unable to stop despite their sincere desire. In many cases, companies engineer products and services to be as addictive as possible. The techniques used by casinos, for instance, are well known.
This is not an argument for regulation, prohibition or anything like that. Merely a rebuttal to the falsehood that nobody is being "exploited" by such companies and that "value" is being created. In many cases value is being destroyed.
How many businesses can I remember that are built around treating addictions? What's the most famous anti-TV-addiction company? Anti-cigarette? Anti-sex-addiction? If the answer is none, then the opportunity is not underappreciated.
Making an anti-addiction product is by its nature the opposite of a sellable product. And the people who have the frontal lobe to willingly go against their addiction have already solved their problem.
Serious question: how do you feel about having been involved in that feature, knowing that you were creating an addictive product and trying to get people more hooked on it?
You might try to say that people have free will and can do what they want, but you know that the reason it’s done is exactly because it’s addictive.
I'd say it's more about economics. If all the economic incentives point toward making a product more addictive, of course products will be designed that way.