"Their business model is based on behavior modification"
Not to sound like a broken record, but I always feel I should chime in when these kinds of claims are stated as fact.
Facebook is incapable of intentionally modifying behavior in a way that benefits their business model. That's not how the company thinks about product development, it's not something that happens, and it's just not a serious way to analyze the business. I worked on "the algorithms" and can explain in detail how most of them work. Human civilization simply does not have the technology to do this.
The point is that fighting Facebook on Facebook is a losing game and "innovate a better business model" has been tried, is being tried, and is not working because it is hard. A plan that does not work is indeed "flawed", no matter how noble and natural the intentions.
This reads like what has become the standard Facebook PR response: they claim to be committed to improvement, but will ultimately do nothing actually effective. On the one hand, because what is relevant is excruciatingly difficult; on the other, because what is actually feasible would come too close to harming their business, and they won't ever do that.
My point is that Facebook can improve its behavior only by putting its business at risk.
If Zuckerberg, Sandberg, et al could improve Facebook's behavior without putting the business at risk, they would do it in a heartbeat. But it appears they can't.
Their efforts are thus sincere but highly constrained: They will never voluntarily do anything that would put the business -- their life's work -- at risk.
If I may use an imperfect analogy: Facebook is a "polluter of society" that can't afford to stop polluting until all its competitors are forced to stop polluting society too.
If Facebook can't maintain a business model without selling their well-trained and dopamine-addicted userbase like a commodity, then their business model does not deserve to be maintained.
Facebook's entire business model is creating people addicted to chaos. It has radicalized people, led to suicides, and caused a huge rift in many countries' political ecosystems. That is their business model, it's the only way they make money.
Amazon, Google, Microsoft etc have flaws but their business models don't depend on creating conflict.
FB's core business model is the root problem. "Working from within" is vain, naive, and futile; only Zuckerberg has the power to change the business model, and we all know that's not happening.
What's even more hypocritical is how the psychological foundations of their business model are never mentioned at launch, but out they come when FB has to justify something or is being criticized (as in Parker's instance). It's always after the fact, even though they are undoubtedly employing experts in the field to develop their features.
Or perhaps the underlying business/product model is inherently flawed in a way that's bad for society, all patches have proven woefully insufficient in mitigating that, and Facebook has been intentionally concealing this.
The reason Om cites for why Facebook won't change is similar to the reasons companies don't change in response to disruptive innovation, as described in Christensen's "The Innovator's Dilemma." Big companies are too invested in a successful business model to easily switch to a new one overnight. Even if Zuckerberg grasps the threat to FB's brand posed by fake news and smartphone addiction, his company is streamlined to operate the way it always has. The best he can achieve is short-term fixes and lip service. Solving these problems would require Facebook to value something other than people's attention. But doing that would cannibalize their advertising business.
This is my take as well. It's the business model itself that's toxic, or at least predatory, as the necessary outcome of it is to lean as hard as possible on known frailties of human behavior and psychology in order to cultivate advertising placement and revenue.
There's no tinkering around the edges of that control loop that's going to fix the problem. What we observe as the deleterious effects on society are in fact systematic (as in, "design failures") faults in the thesis of Facebook's business model.
“We make the Facebook, as people using it.”
This is false. Facebook has hundreds, if not thousands, of product people working to alter, maximize, and monetize user behavior.
Their technical achievements are not my domain. But I do know behavior patterns, and since Facebook's platform was built to harm people's ability to choose for themselves, I consider it a blight on humanity. I doubt they have any interest in serving the common good. But I hear that you don't share my skepticism; time will tell.
Maybe we should just recognize that Facebook / software business models don’t work for operating boots-on-the-ground services. Network effects don’t work with negative unit economics.
The key thrust of this is that at Facebook specifically, the company is agnostic and amoral about the uses or effects of Facebook, and the only goal is maximizing attention to be sold by any means necessary.
I don't know anyone who works at Facebook at high enough of a level to know if this checks out, but I'm curious if it does.
I’d go further - there is one simple idea that is evil, that drives Facebook:
Social networks won’t work unless they are given away for free, and the only way to sustain one you give away for free is to monetize via behavior-modification products.
I was talking about Facebook the tool, i.e. how it is actually being used by people. Not Facebook the business model, which I agree with you is exactly as you describe, but is something completely different and separate from how it is perceived and used by ordinary people.
Does it? Not from my experience on either side of the transaction. In my eyes, all of the “innovation” Facebook has produced (along with every other large tech company) has been straightforward, by-the-book scaling of extant business processes with technology.
Anyway if there’s a process it can hardly be called imagination now can it? It’s just human-level algorithms fed by cash.
If you can’t imagine how to improve this you’re certainly lacking for either observational skill or bound by some dogmatic worldview.
Some companies just have business models that only work if the quality of life for the world as a whole declines. They focus on extracting value for a few at the expense of the many (many of which don't realize what's happening).
Facebook is one of them. Its entire business revolves around collecting and selling information about its users. One day, the information released in this way will be recognized as the obviously harmful stuff that it was all along. We're not there yet, though.