My point is that Facebook can improve its behavior only by putting its business at risk.
If Zuckerberg, Sandberg, et al. could improve Facebook's behavior without putting the business at risk, they would do it in a heartbeat. But it appears they can't.
Their efforts are thus sincere but highly constrained: They will never voluntarily do anything that would put the business -- their life's work -- at risk.
If I may use an imperfect analogy: Facebook is a "polluter of society" that can't afford to stop polluting until all its competitors are forced to stop polluting society too.
I don't think it's an emergent property; I think it's a by-product of the constraints. It's all well and good that they want to make Facebook safe and healthy, and I honestly believe plenty of people working there are trying to do just that. However, they are operating under the constraint that they cannot move backwards on profits, and therefore on engagement.
Imagine if you were trying to fix climate change, but under the condition that you weren't allowed to burn fewer fossil fuels. You may try very hard, and very sincerely, but it's a fool's errand.
"Their business model is based on behavior modification"
Not to sound like a broken record, but I always feel I should chime in when these kinds of claims are stated as fact.
Facebook is incapable of intentionally modifying behavior in a way that benefits their business model. That's not how the company thinks about product development, it's not something that happens, and it's just not a serious way to analyze the business. I worked on "the algorithms" and can explain in detail how most of them work. Human civilization simply does not have the technology to do this.
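For what it's worth, here is a rough sketch (in Python, with invented names and weights -- nothing below is Facebook's actual code) of what ranking systems of this kind typically look like: predict a few engagement proxies per post, take a weighted sum, sort. Note what's absent: there is no objective anywhere that says "change this user's behavior."

    # Hypothetical illustration only -- invented names and weights, not real code.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        post_id: str
        p_click: float    # model-predicted probability the user clicks
        p_like: float     # model-predicted probability the user likes
        p_comment: float  # model-predicted probability the user comments

    # Illustrative hand-tuned weights; in practice these get tuned against
    # aggregate engagement metrics, not against any individual user.
    WEIGHTS = {"p_click": 1.0, "p_like": 2.0, "p_comment": 4.0}

    def score(c: Candidate) -> float:
        # A weighted sum of predicted engagement proxies -- that's the whole trick.
        return sum(w * getattr(c, name) for name, w in WEIGHTS.items())

    def rank_feed(candidates: list[Candidate]) -> list[Candidate]:
        # The "algorithm" is a sort by score, highest first.
        return sorted(candidates, key=score, reverse=True)

You can argue that optimizing blunt engagement proxies has bad second-order effects, but that is a different claim from "the business model is based on behavior modification."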
When was Facebook ever about improving society? Oh sure, they'll do that as long as it doesn't jeopardize profits, but after all we've learned about their social engineering, social experimentation, pattern recognition, and various mechanisms to maintain engagement, is it really right to pretend they are any different from any other large corporation? The dollar is the endgame, and if improving society becomes a profitable endeavor, then and only then is it safe to say they will pursue it.
The problem is that "bad behavior" is a concept in flux. Facebook never hid what their business model was: sell your personal data to third parties. Only a few privacy activists were concerned. Others' reactions ranged from "meh" to "it's actually smart!" (Remember when Obama's campaign was praised for its innovative approach to profiling voters?)
It took Cambridge Analytica for people to realize that they did not want this.
I have been paranoid about Facebook since day one, but there is something I won't do: blame them for coming up with a business model that is legal and did not seem to concern users ethically either.
The hearings of Zuckerberg have been shameful. As much as I love seeing him on the grill, I have more contempt for the lawmakers in front of him, who actually enabled Facebook to become such a monster by either facilitating it or simply not understanding what it was doing.
Facebook is a problem, but the ones responsible for this situation are not to be found within the company.
You are comparing very different things. While I agree a lot of bad people are doing bad things out there, not many, if any, have the reach and control Zuckerberg does. I wonder if there is a resolution where Facebook can be fixed without government regulations that would likely have adverse effects on the internet at large. I don't even use Facebook, and yet I am vulnerable: they doubtless have a complete profile on me, built from tracking pixels and from the accounts of the people I socialize with.
Zuckerberg is not responsible for the existence of the Friendship Paradox or our natural instinct to envy, etc. Facebook's mission, in itself, is good. We have no reason to doubt Zuckerberg's honest desire to do good. Do you have any example where Facebook did something that they knew was bad but still did it?
It's true that social networks bring bad side effects; everything does. They can only try to correct them.
Facebook was beyond fixing the day it was founded. Zuckerberg is one of the least ethical operators of a large company that I'm aware of and the company reflects that in every way possible. Fixing Facebook would mean removing Zuckerberg and that's not going to happen, so I applaud the whistleblower stepping forward but I think they're hopelessly naive if they believe that this will really lead to meaningful change.
This article seems to be premised on the idea that Facebook's problems are caused by its designers, not its investors and the regulatory environment that it exists in.
The problems with FB are not a failure of design or engineering but a failure of the business and political culture.
Can that failed business and political culture solve its problems by imposing onerous new laws on other people? I would think not, although they may try.
As a corporation, Facebook truly seems to be trying to improve its behavior for the benefit of society at large, but the company is finding it very difficult to do The Right Thing, because it would reduce future revenue growth, shrink long-term profitability, and hurt the company's competitive standing against the many other companies that are trying to eat Facebook's lunch every day.
In the extreme, Facebook's choices appear to be: (a) act in the best interest of society and get f#cked by competitors; or (b) remain a dominant force in the market, but as a side effect, f#ck everybody. All options for Facebook appear to be a mix of those two horrible choices.
I think you're missing my point. I'm not saying that Facebook should give up. Everyone there should make every effort to improve. My point in bringing up complexity is that it's going to take a lot more than Mark waving his hand and magically making everything better. This isn't some video game where one player hits a button and thousands of minions instantly rearrange themselves and all of their actions are resolved in milliseconds. Anyone who knows anything about complex systems - especially those involving people - knows that's not how they work.
In reality, no matter what leadership says or does, actual change will still require continuous effort from literally thousands of engineers, data scientists, and others. It will take time, as all such things do. Petulantly demanding that things happen faster than they can happen isn't going to make it so.
>As a corporation, Facebook truly seems to be trying to improve its behavior for the benefit of society at large
Based on what? Lip service? Empty gestures? Those are worth as much as Google's "Don't be evil" motto and Apple's and Nike's social justice campaigns...
Sure, but it helps that they are being watched, which aligns good behavior with their self-interest. Seems to be the only factor promoting good behavior at Facebook. Other companies seem to have leaders with more genuinely good intentions (e.g. Google - not Eric Schmidt, but Larry Page and Sergey Brin).
If you're actually arguing that FB doesn't have a positive impact on people, you are so far off the mark that you can't be reasoned with.
You might be able to argue about Zuckerberg's methods, but the end result is that he created something that made hundreds of millions of people's lives better.
Unless you've done the same, you don't really have the moral high ground here.
Your analogy is just wrong. You were trying to argue that a change Facebook actually made could have hurt people, but your scenario could only have happened if Facebook had made the change without making it clear what was going on. If the only hypothetical aspects of your argument were Bob's actions, that would be something worth discussing. However, you're making it seem like what Facebook actually did could have hurt Bob, when you have no idea whether that's actually the case.
I also disagree. The "curation" algorithms and other systems clearly have a serious influence. Facebook is not responsible for human nature, but they are responsible for intentionally guiding it for profit and ignoring the consequences. It's not as if Facebook is merely tabula rasa; they meddle.
There have been many examples of executives and engineers who have spoken to this effect.
I think the point is that Facebook's contribution to society (with its actual product, not its tech work) has been negative; its goal is to turn every knob it can find to get people to spend more time on Facebook, not actually solve any problems. All businesses are selfish, but some manage to meaningfully improve some aspect of people's lives along the way.
If true, then surely that's better, not worse, as it means that Facebook simply needs to be suitably (financially) incentivised to change its behaviour - perhaps achievable via tighter regulation, penalties & rewards, etc.
I don't care if they do better. It's not pointless quitting, I now no longer have Facebook in my life. That was the point. "Challenging them to do better" has a strange assumption that somehow we all stand to gain if we can just make sure they get it right next time, or maybe the time after. They're not our friends, they're not an Oxfam, they're a business with a shitty product. I don't challenge Johnson & Johnson to get them to make a better band-aid, because frankly, I don't give a shit. Same with Facebook.
>"I think that we owe it to the users to challenge Facebook to live up to a higher standard, regardless of what we as individuals may gain or lose from their choices."
People need to learn to think for themselves and live with the consequences of their choices. It's not rocket science.
It’s not false that there is a societal problem that is not unique to Facebook.
But that sidesteps the question of what responsibility they have as a company whose profits are, at minimum, powered by that problem, if not exacerbating it.
“Privatize the profits, socialize the costs” is not sustainable.