> THIS GUY LOVES BUYING VACUUMS HERE ARE SOME MORE VACUUMS
Everyone thinks this problem is really annoying, but to me it's a good sign - it indicates that the current technical capability to use and abuse personal information is still limited, and the predicted machine learning apocalypse is not here yet. In other words: despite the large quantity of personal information collected by advertisers, they still cannot exploit it efficiently; mostly all they can do is show you the same things over and over. Advertisers are still unable to develop a perfectly manipulative algorithm that makes you buy things (edit: or they have it, but it's not yet in wide deployment).
The effectiveness of targeted ads directly translates to the effectiveness of mass surveillance. Perhaps we should enjoy the useless shopping recommendations from Amazon for now. When every single shopping recommendation made by the algorithm is something you always wanted (coming soon!), you'll know the cyberpunk dystopia is officially here.
This point of view also reconciles two different opinions on machine learning in the tech community. The first person tells you it's nothing but overhyped marketing: it has some specialized uses, but everything is exaggerated as "AI" to get money from VCs, and in many areas its effectiveness is just too limited. The second person warns you that collecting personal information at an industrial scale is extremely dangerous: with the recent progress in "AI", they can use your data against you in ways never possible before. These two are not a contradiction: the first view is where we were previously, and the second is where we are going.
> "This guy has a vacuum cleaner. Let's recommend him more vacuum cleaners to buy!"
Every single one of the platforms claims to be using cutting-edge tech for targeted ads, yet all of them suck. I don't see them improving even if you gave them a guided tour of your home, closets included.
> The advertiser has a tracker that it places on multiple sites and tracks me around. So it doesn't know what I bought, but it does know what I looked at, probably over a long period of time, across many sites. Using this information, its painstakingly trained AI makes conclusions about which other things I might want to look at, based on...
> ...well, based on what? …Probably what it does is infer my gender, age, income level, and marital status. After that, it sells me cars and gadgets if I'm a guy, and fashion if I'm a woman. Not because all guys like cars and gadgets, but because some very uncreative human got into the loop and said "please sell my car mostly to men" and "please sell my fashion items mostly to women."… You know this is how it works, right? It has to be. You can infer it from how bad the ads are. Anyone can, in a few seconds, think of some stuff they really want to buy which The Algorithm has failed to offer them, all while Outbrain makes zillions of dollars sending links about car insurance to non-car-owning Manhattanites. It might as well be a 1990s late-night TV infomercial, where all they knew for sure about my demographic profile is that I was still awake.
> You tracked me everywhere I go, logging it forever, begging for someone to steal your database, desperately fearing that some new EU privacy regulation might destroy your business... for this? [1]
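To make the quoted guess concrete, here is a minimal sketch (Python) of the kind of coarse pipeline being described: infer a handful of demographic buckets, then apply hand-written campaign rules on top. Everything here is hypothetical - the categories, the rules, and the infer_demographics helper are invented for illustration and don't come from any real ad platform.

    # Hypothetical sketch of the coarse targeting pipeline described above.
    # Categories, rules, and thresholds are invented for illustration.
    from collections import Counter

    def infer_demographics(browsing_history):
        """Guess crude demographic buckets from page categories (illustrative only)."""
        counts = Counter(page["category"] for page in browsing_history)
        return {
            "gender_guess": "male" if counts["gadgets"] > counts["fashion"] else "female",
            "income_guess": "high" if counts["luxury"] > 2 else "average",
        }

    # The "very uncreative human in the loop": hand-written campaign rules.
    CAMPAIGN_RULES = [
        {"ad": "sports car",   "require": {"gender_guess": "male"}},
        {"ad": "gadget",       "require": {"gender_guess": "male"}},
        {"ad": "fashion item", "require": {"gender_guess": "female"}},
    ]

    def pick_ads(browsing_history):
        profile = infer_demographics(browsing_history)
        return [rule["ad"] for rule in CAMPAIGN_RULES
                if all(profile.get(k) == v for k, v in rule["require"].items())]

    history = [{"category": "gadgets"}, {"category": "gadgets"}, {"category": "news"}]
    print(pick_ads(history))  # -> ['sports car', 'gadget']

Nothing in this sketch looks at what the person actually wants; it only buckets them and applies someone's prejudices, which is consistent with how bad the resulting ads are.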
> I have a sense that this is the dirty little secret of the spyware advertising industry: personalization just isn't that great.
Personalized adverts and recommendations can be incredibly, horrendously dumb.
Here's what I see when I hit Amazon's homepage at the moment:
A "buy once again" column featuring blackout curtains I bought 3 months ago (no, curtains don't need to be replaced every month, Amazon), USB cables I bought multiples of in the same time frame, and a wireless charger (I already bought two).
An entire line dedicated to showing me backpacks (I bought one less than a year ago)
An entire line dedicated to headphones (I recently bought wireless IEMs)
An entire line dedicated to watches (same)
I don't get it. Supposedly the best and brightest work at firms like Amazon and Google to brainwash us into buying stuff, yet classic, random, non-targeted advertising is more likely to make me discover products I'd buy than targeted advertising, because the latter only shows me things after I no longer need to buy them!
Here's what I would expect actually intelligent targeted advertising to do:
After buying a smartphone, recommend accessories (cases, screen protectors, USB-C dongles, chargers, whatever)
Here's what targeted advertising actually does:
Show me smartphone ads everywhere I go after I already selected and BOUGHT a smartphone. No, I don't need to buy another smartphone weeks after a recent replacement, Amazon!
The same sort of phenomenon happens after Google locks onto searches I made in order to buy something. I can't wait to see the internet advertising industry crash and burn; it's overvalued nonsense.
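A toy sketch (Python) of the difference being described, under an invented catalog: suppress the category that was just bought and surface complements instead. The COMPLEMENTS table and item names are made up purely to illustrate the accessory-first logic; this is not any real recommender.

    # Hypothetical sketch: recommend accessories for a recent purchase instead of
    # re-advertising the purchased category. Catalog and mapping are invented.
    COMPLEMENTS = {
        "smartphone": ["case", "screen protector", "USB-C dongle", "charger"],
        "vacuum cleaner": ["replacement filters", "dust bags"],
    }

    def naive_retargeting(recent_purchases, catalog):
        # What the comment says actually happens: keep pushing the same category.
        return [item for item in catalog if item in recent_purchases]

    def complement_aware_ads(recent_purchases, catalog):
        # What the comment asks for: drop the bought category, add its accessories.
        suppressed = set(recent_purchases)
        ads = [item for item in catalog if item not in suppressed]
        for purchase in recent_purchases:
            ads = COMPLEMENTS.get(purchase, []) + ads
        return ads

    catalog = ["smartphone", "tablet", "headphones"]
    bought = ["smartphone"]
    print(naive_retargeting(bought, catalog))     # ['smartphone'] - more of the same
    print(complement_aware_ads(bought, catalog))  # accessories first, no repeat phones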
> the predictions being made with it aren’t terribly impressive
So what? They still have the data and can refine their methods tomorrow. Today their predictions might be low quality, but they can retry as many times as they want. The problem is not the predictions they are making today; it's the many predictions (or inferences) they are able to keep making in the future.
> political groups
Remember that some types of advertising are not targeted at all. Some political or brand advertising is intended to reach "all voters", or maybe a very broad category like "every Californian of voting age". Branding campaigns don't care whether you're interested in, e.g., vacuum cleaners. They just want you to think of their name first every time you happen to think of or hear about vacuum cleaners.
edit: (Multiple contradicting groups could be pushing ads at your (very general) demographic.)
> science....or dogs
Many scientists like dogs?
> Amazon thinks that my interests
No, they think that showing you textbooks and vacuum cleaners has a greater chance of increasing their revenue, according to various statistical models. Targeted advertising isn't about targeting what you are interested in. It's about letting other people target you with what they think they can sell you.
edit2: Of course, it could also be a terrible model trying to use data in stupid ways. I'm just suggesting that there are many plausible explanations.
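To spell out the "revenue model, not interest model" point, here is a hedged sketch (Python) that ranks ads by expected revenue (predicted purchase probability times value) rather than by any notion of what the user likes. The candidate ads, probabilities, and values are made-up placeholders.

    # Illustrative only: ranking ads by expected revenue, not by user interest.
    candidate_ads = [
        # (ad, model's estimated purchase probability for this user, revenue if bought)
        ("textbook",       0.060,  60.0),
        ("vacuum cleaner", 0.020, 150.0),
        ("concert ticket", 0.005, 120.0),  # maybe what the user actually wants
    ]

    def expected_revenue(ad):
        _, p_buy, value = ad
        return p_buy * value

    # The slot goes to whatever maximizes expected revenue for the platform,
    # which need not be the thing the user is most interested in.
    for name, p, v in sorted(candidate_ads, key=expected_revenue, reverse=True):
        print(f"{name}: expected revenue = {p * v:.2f}")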
Advertising is likely far more successful at manipulating you than you realize.
When it comes to purchases, it will show you only a narrow subset of what's available to you. Amazon does this already when it shows you the same products over and over again, then starts showing unrelated products or stops showing new results entirely.
Product advertising is bad enough, but it's not just about trying to get you to buy something. AI will also shape your opinions and your political views. It'll hide or downplay information to protect paid partners and advertisers. It'll lie to you in order to convince you that there are things you urgently need to do to protect what you already care about, when in reality doing those things will be ineffective or even counterproductive. It will show you things to keep you angry, or afraid, or feeling helpless. It will distract you from doing things that could bring meaningful change. It will influence how you vote.
Even if you are somehow more immune than most people to cognitive biases and the psychological manipulations employed by advertisers, marketers, and PR firms, most people around you will not be.
> The issue is not advertising, the issue is the construction, classification and use of human profiles through the collection of behavioral data, by-and-large without the subjects being aware or cognizant of the possible short and long-term implications.
I don't disagree that the collection of data is bad, but two things:
1. The collection of data isn't the goal, the advertising is. The data is only practically valuable to do advertising better (and of course for surveillance, but I suspect the market for data for surveillance is orders of magnitude smaller). If you kill the advertising, you kill the need for data. If you kill the data, the advertisers will just find other ways to make their advertising convert better.
2. We should all be outraged at the idea of corporations using psychological tricks to influence our behavior. The human mind was not pentested before shipping to production. Some ads just remind you that a product exists, but many try to create positive associations to make the product more appealing or to make you think of it later.
We just let this happen and don't care, mostly because we don't really notice it working. You don't think "oh yeah, I'll buy the name-brand paper towels because I saw that commercial"; you just buy them because they're the ones your brain associates with cleaning up that whole dribble of blue liquid. Of course you'll buy the one that works. Our brains are unpatched Linux kernels and advertisers are botnets.
Like, be outraged at the data mining and selling. But be more outraged at why it's happening.
> It's worth remembering that "data" as they say stales very quickly. Knowing that you ordered a package two months ago is far less interesting than knowing you're expecting one next week.
Comparatively less, yes, but knowing about all your packages is still very interesting and hugely privacy sensitive.
> You'll still get ads, but they won't be tailored to you.
TBH, all the tailoring I've seen is comically bad. Yeah, they know I bought shoes, so they show me shoe ads for the next three months. As if that's how humans behave: once we're in "shoe heat", we do nothing but buy shoes for three months, and the job of a good ad system is to detect when the "shoe heat" starts and capitalize on it. I think if someone wrote a comedy show about robots trying to understand human society and getting it hilariously wrong every time, the behavior of ad networks could supply great material for years.
> But the concept of "having your data" in a historical context is meaningless.
No it's not. There are a lot of private things in one's past that can be dug up and (ab)used. Ask any politician who has gone through an electoral campaign.
>If systems like Amazon's recommendation system got good enough, and if perhaps AI buddies began watching you and giving you product advice at just the right time, advertisements could be a thing of the past.
That just sounds like targeted advertisements?
Or do you mean, specifically only when you are looking for the thing in question, finding the thing which is most likely to be what you want?
>I believe we have all been served with ads for products and services that we haven’t even searched for.
Sure, including ones I've never even discussed. Simple statistical prediction of likely interest (not even requiring ML), based on the interests of people with similar search histories or the other things Google overtly tracks in its profiles, could well explain that; with ML applied well, it gets even better.
Covertly recording conversations for ads is an unnecessary assumption to explain any effect I've seen or heard described, so while it's not impossible, I don't see any reason besides paranoia to believe it's true.
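For what it's worth, here is a small sketch (Python) of that kind of similarity-based prediction on an invented toy dataset: score items a user never searched for by how much their search history overlaps with other users who did. All users and items are made up; the point is just that plain set overlap, with no microphone anywhere, already produces "how did they know?" moments.

    # Toy illustration of inferring interest from similar users' histories.
    # All users and items are invented; no real data or platform API involved.
    USER_HISTORIES = {
        "you":   {"hiking boots", "tents", "trail maps"},
        "user2": {"hiking boots", "tents", "camping stove"},
        "user3": {"tents", "trail maps", "camping stove"},
        "user4": {"lipstick", "handbags"},
    }

    def jaccard(a, b):
        """Overlap of two search-history sets."""
        return len(a & b) / len(a | b) if a | b else 0.0

    def predict_interests(user, histories):
        """Score items the user never searched for, weighted by neighbour similarity."""
        mine = histories[user]
        scores = {}
        for other, theirs in histories.items():
            if other == user:
                continue
            sim = jaccard(mine, theirs)
            for item in theirs - mine:
                scores[item] = scores.get(item, 0.0) + sim
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    print(predict_interests("you", USER_HISTORIES))
    # "camping stove" scores highest even though "you" never searched for it.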
Computer vision can show greater-than-estimated-human performance while still failing in hilariously unhuman ways once in a while. People remember the 1-in-1000 wonky recommendation that made them do a double-take. Recommendation engines work best for the mean, stereotypical person. That way, you can use information from similar profitable people to recommend effectively.
Facebook, for instance, got mined for "suckers". If you are scummy, you want a list of gullible people who click the stupidest, most poorly designed, shadiest ads. Ad tech knows where they are and delivers them on a silver platter. Going back two decades to serving ads without ML would kill a business. You don't think they thoroughly test a new recommendation engine and see the relevant stats go up before they deploy it? You don't think they can serve you more relevant ads when they know you are a 17-year-old male vs. a 42-year-old woman? Both the data gathering and the algorithms have improved year over year. To say ad-tech personalization is terrible is akin to complaining that we don't have AGI.
> every advertiser I’ve interacted with is either doing individual-level targeting or striving towards it.
Only if they're clueless.
For example: Nike really wants a dataset of "people who buy expensive sneakers for fashion purposes".
This dataset is probably hundreds of millions of anonymous people, and not personal data. If there was a way to get this dataset directly, Nike would do that in a heartbeat.
Unfortunately, as of 2019 the only way to get something like this is by, e.g., cross-referencing credit card purchase info with Twitter browsing logs, which leaks a shitload of sensitive private data.
For ad purposes personal data collection is a bug, not a feature.
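A rough illustration (Python, with entirely invented records) of that contrast: the aggregate audience the advertiser actually wants can be answered with a simple segment count, while today's workaround joins individual-level purchase and browsing data on identity, which is where the privacy damage happens.

    # Illustrative contrast using invented records: an aggregate segment count
    # vs. the individual-level join described in the comment above.
    purchases = [  # pretend credit-card purchase records
        {"person_id": 1, "item": "sneakers", "price": 250},
        {"person_id": 2, "item": "sneakers", "price": 40},
        {"person_id": 3, "item": "sneakers", "price": 310},
    ]
    browsing = [  # pretend social-media browsing logs
        {"person_id": 1, "topic": "streetwear"},
        {"person_id": 3, "topic": "streetwear"},
        {"person_id": 2, "topic": "running"},
    ]

    # What the advertiser actually wants: the size of an anonymous segment.
    segment_size = sum(1 for p in purchases if p["item"] == "sneakers" and p["price"] > 100)
    print("people who buy expensive sneakers:", segment_size)  # 2

    # The privacy-leaking workaround: join the two datasets on individual identity.
    fashion_browsers = {b["person_id"] for b in browsing if b["topic"] == "streetwear"}
    leaky_audience = [p["person_id"] for p in purchases
                      if p["price"] > 100 and p["person_id"] in fashion_browsers]
    print("re-identified individuals:", leaky_audience)  # [1, 3]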
"I'm not sure why most people are okay with companies gathering tons of data about them and trying to use it to manipulate them into buying products they don't actually want or need (among other uses), but I'm not one of them."
It takes a ton of work to prevent it, and it's more or less futile anyway. It's not so much that people are okay with this; rather, it's a part of modern life and it's exhausting trying to mitigate it. And impulsively buying products because of ads is a personal failing.
> to put it another way, all that "show a person who just bought a car an ad for another car" targeting was not really working.
Boy that sure is true for me and has been for a long time. I'm often bewildered not by how accurate my ads are, but rather by how braindead they seem to be. I'm not convinced algorithm-based ads work at all.
In fact I would hypothesize that ads are often more effective when they are unexpected, for something the viewer has never considered buying before. A targeted system would hide such ads.
> Effectiveness of targeted ads directly translates to effectiveness of mass surveillance.
Assuming you're correct about the technical points of efficacy, we can still turn this around as a social problem and hypothesize that the willingness to use unproven, annoying targeted ads directly correlates to the willingness to use intrusive mass surveillance.
"You bought a vacuum, here are more vacuums" might directly translate to "you attended a protest, so you're a permanent malcontent". Worse yet, the "metrics" of social monitoring aren't as black-and-white as future purchases, so badly used surveillance might falsely think itself useful simply through confirmation bias.
Honestly, focusing on AI seems silly since the internet, at this point, is completely neck-deep in deceptive scams being pushed by reputable, mainstream, important orgs.
Google, Facebook, Amazon, etc. take in literal billions of revenue to show people scam ads. The scammers are directly paying them money - presumably these are not anonymous orgs, but people with credit card numbers and ID. And the big internet giants just gladly act as for-profit middle-men for ripping off confused elderly people, because reviewing the ads posted to their platform isn't practical.
>further advertisements for that product aren't necessary to reach you //
I'm one of those "advertising doesn't affect me" people. Except then I realised that advertising acts subconsciously and uses human psychology against us, and I'm not as immune as I thought.
I buy a Dyson, and every ad I see for vacuums is a Dyson; it confirms I made the right choice, everyone is buying them, they're everywhere, etc. Every time I see a shiny new vacuum, it's a chance for my brain to compare it with my old rubbish one: why are all those people "enjoying" vacuuming when I have to suffer through it?
With polo shirts it's like "these new ones look smart/fashionable/etc." vs the old one.
Yes, there is likely a lag in "this person was looking for ..." signals; but I still don't buy (heh!) that it's _entirely_ silly.
>Once I buy a car, or a television, I'm probably not going to be in the market for another one for years. //
Car, probably, depending on the person's wealth; TV, I've heard people say "we liked it so much we got another one for our bedroom" or whatever.
> I still don't get how in the hell could personalized ads be more profitable.
It's because it creates a new profitable business model for BigTech.
Personalized ads require the network to collect as much data about you as possible. The idea is to use this data to build a behavioural model of how and when a user purchases stuff. (Currently this doesn't really work as advertised, as you guessed.)
But the US government, with PRISM, created a new revenue stream for BigTech - buying the raw data of the users directly. As with any government contract, this is actually far more profitable than serving ads to cheapskate internet users. Seeing how easily and successfully BigTech is able to get this data, the US has effectively decided to (partially) privatise spying to BigTech. To sabotage the creation of regulations and privacy laws that might prevent such data harvesting, and to create an even more profitable market for BigTech, they have also started roping in other countries. (See Five Eyes (FVEY) - https://en.wikipedia.org/wiki/Five_Eyes ).
> "They have a tendency to show you ads for exactly the thing you don't need anymore because you already found it. I don't think AI is in any danger of taking over the world just yet."
There's an eschatological trait to targeted advertising, as it seems to be all about past sins. So I'm not too sure about your evaluation and AI's own claims…