The mass-surveillance-based, creepy-stalker-but-at-scale business model of these companies ought to be illegal.
IMO we should try to outlaw that and see what happens, before legislating on the downstream effects. Might still need stuff like social media account age minimums, but... might not.
Banning dragnet spying, and/or making it very risky to retain data about people, has some chance of killing algorithm-feed social media (by drying up ad dollars) and would be a good thing to do anyway.
Making companies liable not for content they host or deliver/show to an explicit recipient or set of recipients, but for content they both host and actively elect to promote, would kill the algorithm feed, suggested videos, etc., but not ordinary Internet hosting services or messaging platforms (or even traditional forums!).
Neither is impossible, and either would probably be good to do anyway.
If the core problem is companies shoving shit at you based on what drives "engagement" and with little regard for anything else, those are policies to look at.
The dragnet spying that forms the core of social networks' business model ought simply to be illegal. It's disgusting and tremendously dangerous. Banning it would help a lot, both by destroying some of the companies and by removing many of their worst incentives.
Also, they shouldn't be able to launder responsibility for editorial choices by saying "an algorithm did it!" and hiding behind laws meant to protect email and IM providers and web hosting companies and ISPs and such.
I think this (antitrust enforcement) is much overdue. Social networks need to be regulated with regard to free speech, use by children and teens, advertising, and so on.
It should be illegal for companies like Twitter to forbid this. Users should be free to access their data, and free to use any tool to do so. The revered network effect is anti-consumer, and it must be broken.
This is the one regulation that could save a nation from FB, Twitter, Instagram, and their ilk. Any politician who ran on this could shoot somebody in the middle of a crowded street without losing my support.
But then the US would have to ban Twitter and FB/Insta.
It would be better to have an enforceable licensing regime. But that won't happen, because of First Amendment noises (rightly or wrongly; I'm not sure corporations should really be treated as humans in this instance, but US case law has thought so since the '70s).
I have two big points here: governments should ban social media, and to save its reputation, the tech industry must pull the plug on those services before that happens.
Social media is the new smoking, and Meta, ByteDance, and X are the new Philip Morris. Their products are services that feed you addictive content in an addictive way, such that you spend hours and hours using the service, during which they show you as many ads as possible. That's the money firehose. There's no question it's very bad for kids, and we should ban the shit out of it. We shouldn't do it immediately, we shouldn't do it in a draconian way, but we should slowly chisel away at these services with regulations until we've transformed them from addictive mind manipulation apps to chill forums and blogs.
What does regulation like that look like? "Hey it looks like you've got > 10m MAUs, please make your default algorithm recency and not engagement, please restrict users to 1 hour of total use of your feed a day, please only show users 1 ad every 5 minutes, welcome to a consent decree where we monitor all your content moderation choices, also here's the framework you'll be using."
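Just to make those thresholds concrete, here's a toy sketch of what such a rule set might look like as a compliance check. All the names are made up, and the numbers (10m MAUs, one hour of feed time, one ad per five minutes) come straight from the comment above, not from any actual regulation:

```python
# Illustrative only: a hypothetical large-platform rule set, with the
# thresholds taken from the comment above rather than any real law.

LARGE_PLATFORM_MAU = 10_000_000   # "> 10m MAUs" trigger
DAILY_FEED_LIMIT_MIN = 60         # "1 hour of total use of your feed a day"
MIN_SECONDS_BETWEEN_ADS = 300     # "1 ad every 5 minutes"

def rule_violations(mau, default_algorithm, feed_minutes_today, secs_since_last_ad):
    """Return a list of violated rules for one user session on one platform."""
    if mau <= LARGE_PLATFORM_MAU:
        return []  # small platforms are exempt in this sketch
    violations = []
    if default_algorithm != "recency":
        violations.append("default feed algorithm must be recency, not engagement")
    if feed_minutes_today > DAILY_FEED_LIMIT_MIN:
        violations.append("user exceeded 1 hour of feed use today")
    if secs_since_last_ad < MIN_SECONDS_BETWEEN_ADS:
        violations.append("at most 1 ad every 5 minutes")
    return violations

# A big engagement-optimized platform trips all three rules:
print(rule_violations(50_000_000, "engagement", 75, 120))
```

The point isn't the code, it's that these are bright-line rules a regulator could actually audit, the same way a consent decree works.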
Does that sound nuts? It's actually what we want: we don't want our kids using social media more than an hour a day, we don't want them served tons of mindbending ads, we don't want some asleep-at-the-switch content moderation crew letting ISIS recruit on these services or weirdos spam our kids with pro-ana content.
True to nerd form, we've really lost the plot on this one. We're arguing like, "kids can use VPNs" or "there's no difference between Instagram and Wikipedia" (lol) and blah blah. Laws and regulations aren't airtight. Kids can still pay people to buy booze for them! We still don't let them buy it themselves because that's way worse! Don't let the perfect be the enemy of the good here.
---
Importantly, the debate is basically over. In this poll [0]:
- Americans overwhelmingly are concerned about kids' social media use
- Only 51% think it's primarily up to parents to prevent harms
This poll was conducted over a year ago, so I'm sure even more Americans are concerned now and even fewer think it's solely up to parents to prevent harm. Regulation is coming.
Tech is at a crossroads right now. It's been on a downward slide for over a decade. No one thinks tech is on their side anymore. It's very close to being just the latest soulless, corrupt business empire in a long line of soulless, corrupt business empires (tobacco, mining, trains, energy, pharmaceuticals, etc). If social media platforms go down kicking and screaming, it will cement this reputation.
Maybe this is fine! Energy companies are doing pretty well! But I don't imagine the people at these companies think they're working at big oil or big tobacco. If you don't relish the prospect of defending your career at every social gathering you go to for the rest of your life, tech needs to pull itself back from the brink before it's forced to. It's the only way to save its reputation and actually be the positive force for good it thinks it is.
The reasonable basis is that Facebook has consistently shown itself to be a bad actor. I'm not arguing that age checks are a good thing, just that I don't see a reason why the law can't explicitly target companies that are detrimental to society. You might then argue Reddit should also be included, for similar reasons.
I would suggest this is generally better than an attempt to codify in law some kind of definition of what companies should be covered.
This question to me is akin to asking how we make smoking less addictive. Companies aren't built for social welfare, unfortunately, and most of the time it's government restriction that curbs the problems. So what I see likely happening is that we pass some kind of age requirement for social media. How companies enforce this without violating users' privacy is another can of worms.
Social media run by US-owned companies has already been used as a weapon, and still is. Perhaps instead of this legislation we should clearly lay out the regulations that all social media companies need to adhere to.
I've been learning in my law course about how laws are built from precedent and often need to be based on real things that have happened, rather than in anticipation of something bad maybe happening in the future.
So when it comes to facebook and other big tech companies, I feel like they should be regulated similar to how toy companies can't advertise violence to children, and tobacco companies can't advertise to children at all. Those laws are incredibly strict and written with the understanding that kids are vulnerable and naive, and can't really think for themselves. But that vulnerability still has to be demonstrated, and tangible harm has to be shown before a robust law can be written (meaning, one that won't be struck down as unconstitutional or unreasonable or whatever), and such laws are still in the process of being written for the tech companies.
So you could argue that they should back off of marketing to kids as a moral thing, but equally you could argue that they need to stake out their territory before laws are written that block off entire lines of business. The tech companies are behaving rationally, but the results can be awful. Society needs to fight back and protect ourselves with strong laws.
As many point out every time this comes up, regulating access to social media products by requiring ID checks and age gating will almost certainly not work effectively and will harm the web overall, and that's assuming it's not implemented in a half-arsed way. It's a bad idea.
You want people to stop buying cigarettes? Don't let the cigarette companies advertise. Require warning labels, plain packaging, increase tariffs, etc.
Society doesn't exist to support the businesses poisoning it.
That's great in theory, but these companies heavily benefit from network effects and become near-monopolies of their social network niche. You can't just switch to a different social network if you don't like it. And even if you could, it wouldn't make it OK to screw people; it would just mean people could opt out of the screwing.
I'm not saying they are legally wrong, or violating anyone's legal rights, or that the government should do something. I'm saying that it's still wrong and unacceptable, and they deserve criticism.
At some point, social media companies need to realize that they ARE utilities. They should not attempt to control their content, rather, law enforcement should prosecute illegal usage.
Restricting spying and indefinite data retention, and making companies responsible for content they don't just host but promote, would likely kill engagement-first social media and "the feed". And they're things we should do anyway.
Maybe there should be some sort of condition, like minimum MAU or something. Only regulate those that grow to be huge enough for many people to depend on them. This would exclude startups but would still apply to Facebook and Twitter.
I think this discussion applies to TikTok, Snapchat and Reddit as well. Facebook (along with Instagram) is probably the worst offender at the moment, but any laws that apply to Facebook would apply to others.
> This idea that you can ban Facebook and Instagram and suddenly the internet is safe for kids is just ridiculous.
Who said that? I don't think they should be banned, but I do think they should be regulated or otherwise held accountable. A big part of it might be being more transparent with their algorithm, and giving tools to users to control it more.
I think where we are is akin to back in the days where people realized that, no, we aren't going to ban buildings, but we are going to have building codes. We aren't going to ban food, but we are going to have an FDA. Etc. The free market doesn't address all problems.