If true, then surely that's better, not worse, as it means that Facebook simply needs to be suitably (financially) incentivised to change its behaviour - perhaps achievable via tighter regulation, penalties, rewards, and so on.
I think the point is that we need to punish companies like Facebook in ways that might actually incentivize them to behave differently. If they get slapped with a fine that they can easily pay and then see their stock price go up, what's the point?
I don't like Facebook and I don't like their privacy practices. However, $5B is approximately 20% of their cash on hand, so I don't think we're talking about "parking ticket" level fines.
For real changes in behavior though, it'd probably take something radical like fining their board directly instead of punishing the shareholders.
While this certainly makes it much less likely that costly, massive-scale human moderation would actually be implemented, I don't think that qualifies as an argument for why it shouldn't be done.
If you think rossdavidh is right about those issues Facebook is having (and I think he's right), then that's the reality in which the company has to operate. If a business model leads to infeasible costs due to fundamental issues, maybe that business model shouldn't exist as such.
Obviously Facebook (or any other company) doesn't want to do anything that significantly increases its costs without providing revenue if it can avoid it, but that doesn't mean it shouldn't be expected to.
Given the number of times Facebook has brazenly lied to customers, journalists, regulators, and governments, it seems likely that the situation is worse than the picture painted by the article.
Thank you for posting this here. It is exactly the kind of public service announcement other organizations need to see, because it will encourage them to question and reevaluate the benefits and costs of their relationship with Facebook.
If enough organizations change their behavior, I'm hopeful the folks at Facebook will be forced to change their behavior too. I mean, shouldn't it be possible for Facebook to be profitable without also doing all those toxic/evil things?
My point is that Facebook can improve its behavior only by putting its business at risk.
If Zuckerberg, Sandberg, et al could improve Facebook's behavior without putting the business at risk, they would do it in a heartbeat. But it appears they can't.
Their efforts are thus sincere but highly constrained: They will never voluntarily do anything that would put the business -- their life's work -- at risk.
If I may use an imperfect analogy: Facebook is a "polluter of society" that can't afford to stop polluting until all its competitors are forced to stop polluting society too.
It is entirely possible that the incentives are aligned in such a configuration - one where, to maintain that position, Facebook needs to be better for all parties involved.
Think of this with respect to Google. Their business grows linearly with search usage, so their incentive is to create faster and more open access to the internet. Don't believe that "don't be evil" bullshit. They act the way they do because it makes them a great deal of money. AND, it benefits pretty much everyone except dinosaur incumbents.
Seems like the point is that Facebook already "controls" how people think, and that their incentives are misaligned in a way that is leading to public harm.
EDIT: Or at least misaligned so that they are perpetuating the harm-causing aspects of the system and not addressing them (if they can even be addressed at all).
Ok, so Facebook has yet another appalling practice that's been known for quite some time (see the Telegraph article linked in this one). The percentage of users boycotting the company and its products doesn't seem to matter materially. Facebook is making more money now than before. Facebook will not behave well on its own. Regulation and really hefty fines running into a low double-digit percentage of revenue (not profits) are the answer!
Possibly even more so than at malice. If an organization acts out of malice, you can change its behaviour by educating it about its best interests or by moving to disincentivize its malicious behaviour.
But incompetence, especially systemic incompetence, is extraordinarily difficult to change. Facebook appears to have ADD around user privacy and its interface. Designs come and go, defaults change capriciously, and security seems to be an afterthought.
We've seen this movie before, with MSFT in the role of the bumbling centi-billionaire. Hopefully Facebook will steer a different course.
"force them to optimize" doesn't mean putting a gun to somebody's head. But it's clear that the existing socioeconomic arrangements overall incentivize companies, including Facebook, to do a lot of harm while chasing profits. Forcing them to not do so can also mean changing the arrangements to remove the incentives.
That is a good point. We're getting closer to fines that actually hurt the companies enough not to do it again.
Not there yet, but closer.
The frustrating part is that they kept doing it even after they said they wouldn't. And FB amassed sizeable revenue breaking the law while snuffing out competition, growing bigger, and reaping the benefits, whilst still paying a fine that was essentially "worth it".
I heard in a podcast the idea that the next "Facebook" can't do what FB did to grow -- because all that stuff is now regulated, closely watched, and fined.
>As a corporation, Facebook truly seems to be trying to improve its behavior for the benefit of society at large
Based on what? Lip service? Empty gestures? Those are worth as much as Google's "Don't be evil" motto and Apple's and Nike's social justice campaigns...
Yes, I can definitely agree that there is far more Facebook could do by investing enough manual effort into shutting down the most egregious abuse.