I don't agree no one would care. It seems like you're basically saying no one cares if you don't get caught. That's true for anything from murder to theft to NSA spying.
As for advertising trying to manipulate you: you know it's an ad, and there are regulations for making that clear. For example, things that look like articles in magazines but are actually ads are marked "Special Advertising Section". Similarly, ads for political candidates have to be marked "Paid for by XYZ", etc.
We don't tolerate deception in those areas, so why would we tolerate it from FB?
Fake news is one thing; targeted ads are another. I don't think targeted ads can change that much. Fake news, I agree, can be brainwashing, but FB does all it can to combat it.
If I friend/follow a lot of people, and I have a simple timeline, I see what they're thinking about.
But if there's an algorithm selecting which posts I see from my social circle, and it happens to shade one way versus another, there are no ads, no synthetic content, but I have still been manipulated.
FB famously experimented with this a while back: they used sentiment analysis to see whether they could manipulate people into being more or less depressed.
Without informing users, obtaining their consent, or following any of the other rigorous requirements for psychological research that we demand of medical practitioners.
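To make the mechanism concrete, here's a minimal sketch of how sentiment-weighted feed ranking could work. Everything here is illustrative: the word lists, function names, and the `bias` knob are assumptions, not Facebook's actual code.

```python
# Hypothetical sketch of sentiment-biased feed ranking.
# Word lists and the per-user "bias" knob are made up for illustration.

POSITIVE = {"great", "happy", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "terrible", "hate"}

def sentiment(post: str) -> int:
    """Crude word-count sentiment score: +1 per positive word, -1 per negative."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def rank_feed(posts, bias):
    """Order posts so that, e.g., bias=-1 pushes negative posts to the top.

    A single knob like this is all an experimenter would need to shade a
    timeline one way or the other, with no ads and no synthetic content.
    """
    return sorted(posts, key=lambda p: bias * sentiment(p), reverse=True)

feed = ["what a wonderful day", "everything is awful", "meeting at 3pm"]
print(rank_feed(feed, bias=-1)[0])  # the negatively biased feed surfaces the downbeat post
```

The point is how little it takes: the user still only sees real posts from real friends, yet the emotional tone of what they see is chosen for them.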
This is far from the first time Facebook has used cheap manipulative tricks. I remember when I first registered on Facebook: its contact suggestions were mostly female (I'm a guy), in a proportion far higher than the share of women actually on my contact list or in my social circles. Manipulation is deep in Facebook's DNA.
Can you see how they do it? How they do the manipulation? My acquaintances, who got tricked into accepting the friend request, didn't see the propaganda, not even after I mentioned that it was pro-Trump propaganda. One of them did realize it, a little later.
It's really insidious and manipulative in a clever way.
I don't like this kind of manipulation. (Not pro-Trump, not pro-anyone-else either.) Any tips on what to do about it? Reporting the profile is one thing; however, there are likely 10,000 other similar profiles that I never found out about.
Can't Facebook automatically find these things and ban the IP addresses or something?
That seems like a bit of a slippery slope. Also, I dislike FB as much as the next guy, but I can't really imagine that someone would sign up for FB without realizing that something like this feed manipulation is a real possibility.
I don’t believe people are as easily manipulated as you think.
I don’t buy things I don’t have a pre-existing need for based on advertising, or really in general. If anything, ads make me less likely to buy something, because of the associated irritation.
My political beliefs have not been swayed by my Facebook feed. If anything, I think Facebook has made people more hardline in their existing beliefs by telling them there is mass agreement with them.
You rolled off into unsupported tinfoil-hat land for a bit. Soon-to-be-divorced men aren’t a big enough market to go after.
And if your kid wants gummy bears, that’s on you for buying them.
Absolutely I'd rather be "manipulated" better, if you count advertising that way. I'd much rather be shown ads for things I might actually want than stuff I have no interest in at all, and I say that as someone who has actually purchased things off of Facebook ads. If the ads are going to be there regardless, they might as well work for the benefit of both me and the company presenting them.
It took time, and many thousands of dollars, before I realized that the vast majority of “likes” my pages received as a result of paid campaigns on FB were from accounts which were clearly not real people.
A simple look at enough of their profiles revealed that, as would be expected from any fly-by-night CPA network, FB was using bots, or at least straw-man accounts run by low-cost staff, to like and view content which FB was paid to advertise.
Worse, I found that the clickthrough metrics FB reported for the off-FB destinations I advertised were NEVER anywhere close to what was recorded at the destination itself, including when tracked by Google Analytics.
In short: like-fraud, click-fraud, and more.
I cannot be the only person to notice these things. I assume it persists because most people, self included, simply complain and move on once we notice the “game” but don’t sue.
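The sanity check described above is easy to automate: compare the clicks the ad platform reports against the visits your own destination analytics record, and flag campaigns where too many paid clicks never arrive. A minimal sketch, with made-up numbers and an arbitrary 50% threshold:

```python
# Sketch of a click-fraud sanity check: platform-reported clicks vs.
# visits actually seen by the destination's own analytics.
# All figures and the 50% threshold are illustrative.

def discrepancy(platform_clicks: int, landing_visits: int) -> float:
    """Fraction of platform-reported clicks that never showed up on the site."""
    if platform_clicks == 0:
        return 0.0
    return max(0.0, (platform_clicks - landing_visits) / platform_clicks)

campaigns = {
    "campaign_a": (1200, 310),  # (clicks reported by ad platform, visits in analytics)
    "campaign_b": (800, 760),
}

for name, (reported, seen) in campaigns.items():
    d = discrepancy(reported, seen)
    if d > 0.5:  # more than half the paid clicks unaccounted for
        print(f"{name}: {d:.0%} of reported clicks never reached the site")
```

In practice some gap is expected (blocked trackers, bounces before the analytics tag fires), which is why a generous threshold makes sense before crying fraud.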
I always thought that FB was somewhat complicit (not necessarily out of evilness).
I see a lot of "viral" posts - some like those mentioned in the article, but also a ton of odd woodworking, cooking, and "resin art" videos. The videos are quite repetitive and not really interesting so I wonder if they are maybe hidden ads, but they are not marked as such, and it is not clear what they are selling. (Well maybe they are trying to sell resin, which is really expensive.)
Anyway, it seems there are different kinds of posts on FB. Some stay close to their point of origin and only rarely get shown to people who haven't liked the page or aren't friends themselves. Other posts, if somebody comments on or interacts with them in any way, get shown to their friends and friends-of-friends.
After running a charitable cause / political FB page for a while, I'm convinced that internally there are actually different categories of posts - ones that are shown to followers, and ones that are allowed to float or go viral. I really wonder what the mechanism is to get into the floating category. It doesn't seem to be based on quality, nor on money spent. Maybe it is some interaction metric that somebody learned to game?
You have every reason to believe FB uses message content to build and enhance its internal profile of you, which is where ad targeting originates. To believe otherwise is just being gullible. If there is a profit-enhancing mechanism lying on the floor, it will be picked up.
I wonder how Facebook came up with the 7% fake accounts number.
Certainly social media profits short term from these schemes... as long as they don't get out of control and threaten the entire business model. So the cynic in me suggests Facebook and the like aren't actually interested in eliminating the scams entirely but rather keeping them managed within certain parameters while appearing to be trying to do everything they can to shut them down.
Are you sure you're not missing the point? Curated or not, users certainly expect the feed to be curated according to their own interests (that is, to reduce noise), not according to Facebook's interest of the day; and, perfect or not, people certainly do expect and counteract lying in ads.
Also, just because ads are generally viewed with some scepticism doesn't mean that certain kinds of ads aren't unethical as well. In particular, ads that exploit methods of bypassing rational thought might indeed be unethical, and they are certainly not the same as general self-interested advertisement. And despite what you claim, manipulation is not a defining property of ads: if the product you are selling is actually a rational thing to buy, advertising can use perfectly rational argument to persuade you to buy it. Just because much end-consumer advertising nowadays is trying to sell bullshit doesn't mean advertising can only be used to sell bullshit.
It just doesn't know how to exploit users' influence without allowing them to express their individual views, extreme or not.
More to the point, Facebook is designed to maximize how much marketing influence it has over you, which it does both by maximizing your time on the site and by exploiting the trust you already have in your Facebook friends, converting that into implied endorsements, peer pressure, and popularity-based groupthink. Or it exploits your mental state of engagement to equate content from an ad with content from a friend.
Which I point out because Facebook is basically working as intended; they're just upset that they've built a machine whose content of influence they can't, by its design and very nature, monopolize, since they rely on amplifying users' influence in order to express their own (or their advertisers').
Like most tools, it inherently just does a job, which is to get you to influence your friends. Just like in real life, friends can be a good influence or a bad one.
It could just as easily be said that it's in Facebook's interest to appear to want to find and delete fake accounts. As long as their customers (advertisers) believe click fraud is being addressed, Facebook wins.
It depends on your social network, groups, likes, and shares. Manipulators are always going to try to target people who want to believe their lies.