If I friend/follow a lot of people, and I have a simple timeline, I see what they're thinking about.
But if there's an algorithm selecting which posts I see from my social circle, and it happens to shade one way versus another, then even with no ads and no synthetic content, I have still been manipulated.
FB famously experimented with this a while back: they used sentiment analysis to see if they could manipulate people into being more or less depressed.
Without informing users, obtaining their consent, or following any of the other rigorous requirements for psychological research that we demand from medical practitioners.
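Purely as a hypothetical sketch (nothing here is Facebook's actual code; Post, rank_feed, and mood_bias are all made-up names), this is roughly what "shading" a feed looks like: no ads, no fake posts, just a sentiment-weighted sort order.

    # Toy illustration of a feed ranker that "shades" a timeline by sentiment.
    # Entirely hypothetical; the real ranking system is not public.
    from dataclasses import dataclass

    @dataclass
    class Post:
        author: str
        text: str
        sentiment: float  # -1.0 (negative) .. +1.0 (positive), from some classifier

    def rank_feed(posts: list[Post], mood_bias: float) -> list[Post]:
        # A positive mood_bias surfaces upbeat posts first; a negative one
        # surfaces gloomy ones. Only the ordering changes, yet the user's
        # picture of what their friends are feeling shifts with it.
        return sorted(posts, key=lambda p: p.sentiment * mood_bias, reverse=True)

    feed = [
        Post("alice", "Got the promotion!", 0.9),
        Post("bob", "Rough week, honestly.", -0.7),
        Post("carol", "Lunch was fine.", 0.1),
    ]

    for post in rank_feed(feed, mood_bias=+1.0):
        print(post.author, "-", post.text)

Flip mood_bias negative and the same friends read like a gloomier crowd - the manipulation lives entirely in the ordering.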
I think these socially sponsored ads are going to be money-making, and also insidious if done right. It's a nuanced, strange mix of social content and advertising. They're trying to make all ads social, or to turn all their Open Graph actions into ads. If you play this out to its logical conclusion, that's kinda scary.
Imagine a world where a large chunk of your online interactions are actually being used as content/ads targeted at your friends.
Facebook was actually fun until monetization and creepiness destroyed the vibe.
Now, for every two posts from friends (the ones who still post), you get a creepily targeted ad, and then of course another 1-2 posts mixed in from "friends" trying to monetize their daily reiki / personal training / life coach bullshit, but you keep them around because you've known them for a long time.
Manipulative ads that are coarsely targeted, I think we got used to those from television. But manipulative ads with foreknowledge of your medical Google searches are fucking creepy.
The engineers were following the orders of the businessmen. Does that excuse it? No, but I'm hard-pressed to find any salaried employee who isn't in some way working against the greater interests of mankind and the world.
Hell is truly us. We are our greatest and only true enemy.
Are you sure you are not missing the point? Whether curated or not, users' expectation is certainly not that the feed is curated according to Facebook's interest of the day, but according to their own interests (that is, to reduce noise, not to influence them). And whether they do it perfectly or not, people certainly do expect lying in ads and counteract it.
Also, just because ads are generally perceived with some scepticism doesn't mean that certain kinds of ads aren't unethical as well. In particular, ads that exploit methods of bypassing rational thought might indeed be unethical, and certainly are not the same as general self-interested advertisement. And despite what you claim, manipulation is actually not a defining property of ads - if you have a product to sell that is actually a rational thing to buy, advertisement can use perfectly rational arguments to persuade you to buy it. Just because much end-consumer advertisement nowadays is trying to sell bullshit doesn't mean that advertisement can only be used to sell bullshit.
People know ads are trying to manipulate them; they don't know that Facebook is actively trying to make them feel happier or sadder. Just because two things are similar does not make them morally equivalent.
lol yep! what's sad about this is that I totally believe that fb is doing pretty manipulative things - they have the biggest incentive in the world to get people to buy things - and that it is probably adjacent to the level of manipulation this person is talking about. but then, instead of realizing that for-profit companies are bad, some alienated people start thinking "maybe race mixing is the problem", project that onto ads, and that's how you get white nationalism.
Here's a perfect example of how promoting a Facebook post can be really creepy.
A coworker opened up his Facebook account today and this was what we saw. The first post on his feed was something Storylane had sponsored: a post on my behalf. I didn't know that I had allowed Storylane to post on my behalf (my bad, apparently), but more importantly, I had no idea that they could pay to promote that post on my friends' feeds.
So, what just happened here? I log into a website, then that company - through a series of now acceptable events - pays to convert me into a spokesperson... even though I only logged in to check out their design and don't actually care much about their platform at all. Is it just me, or is that going a bit too far?
Yeah, that. It's utterly shocking how poorly targeted Facebook's ads are. Also how poor-quality. This is one of the wealthiest webapps in the world, with a nightmarish amount of personal information about us... and yet the ads seem to be bad scams or generic hook-up stuff.
OP's point was that using the term "horrifically evil" is horrifically hyperbolic.
What do you mean by "tricking people" into handing over a treasure trove of data? Ignoring the many benefits that FB provides (reconnecting with old friends, easy group communication, meeting people, convenient logins for newer applications, discovering good content, etc.), what they are doing with advertising isn't "evil" in any sense.
Facebook is a /free/ service that delivers targeted ads. Would you rather ads be untargeted? Millions of people benefit from having advertisements displayed to them showing stuff that they actually want to buy. Also, millions of businesses benefit from being able to target niche markets due to FB's data collection.
People WANT to find products that are relevant to them, and so many businesses have become successful by being able to reach those customers through FB's platform.
Facebook doesn't actually sell the data to advertisers. They just allow customers to blindly target demographic groups through their platform. It doesn't violate people's privacy in the slightest.
Have they done some unethical psychological experiments? Yes. But come on, I wouldn't call that evil... especially when they do so much good. By that measure, any university that has been around for 50+ years must be evil too!
> are at the forefront of manipulative advertising (using fabricated social proof of friends, for example)
I can't say I'm surprised; stuff like this ALWAYS happens when you have a large platform and you sell things based on what a set of algorithms observes. The problem is, most places I've worked will at least review the algorithm-generated items to make sure they're appropriate. It sounds like Facebook completely lacks this, which makes me wonder: how long have categories like this existed on the platform?
Facebook is one of the most effective, widest-reaching places for advertising, so it raises the question: with enough money, how much can you manipulate a group of people?
"...An acquaintance in the biz once bribed a Facebook employee whose job it was to approve or deny ads on the platform. His inside man set his account to auto approve any ad he wanted. ..."
Just wait till some NSA employee starts selling gossip to TMZ or HR departments.
You have friends who actually spend their free time modifying their profiles to make advertising easier? The fuck? Is this a psyop post from an FB lobbying group?
Absolutely, I'd rather be "manipulated" better, if you count advertising that way. I'd much rather be shown ads for things I might actually want than stuff I have no interest in at all, and I say that as someone who has actually purchased things off of Facebook ads. If the ads are going to be there regardless, they might as well work for the benefit of both me and the company presenting them.
Curated social media that is funded by manipulating people (ads) is manipulating people.