
That sounds more like what will be offered when Facebook decides to jump into the hiring game.

Facebook ads allow incredibly narrow targeting, including race, age, gender, and interests across hundreds of dimensions.

I've never looked at a Google Ads console, though. Maybe they allow similarly narrow criteria.




I don't know how it works with Google, but the article says Facebook is building a new ad console specifically for the kinds of advertising where discrimination is illegal. Presumably they'll make sure ads run through it aren't producing any disparate impact. (But note that this wasn't the problem with the cases in the article; in those cases, advertisers were explicitly pressing a "don't send this to women" button.)

We’ve seen Facebook employment ads already used to specifically target younger people.

It’s more about excluding people from advertising than including them. If you think of it from that angle, it won’t take much imagination to see the issues that may arise.


This post is about targeting ads at certain demographics. Those ads happen to be for jobs.

The simple question is: does everyone have an equal opportunity to apply for this position or not? If you can't even see the ad, then the answer is no, and so it's a violation.

If there were different ads for different groups but everyone still had the opportunity to see an ad, apply, and get the position, then it's fair game.

Perhaps Facebook could improve job ads to allow for more specific targeting but always have a fallback ad in that campaign for any non-targeted users. This would help employers without excluding anyone.
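To make the fallback idea concrete, here's a minimal sketch of how a campaign with a mandatory untargeted fallback could pick a creative. The Ad/Campaign shapes and segment names are hypothetical, not Facebook's actual API:

    # Hypothetical sketch: serve a targeted creative when the viewer matches,
    # otherwise fall back to a generic creative so nobody is excluded.
    from dataclasses import dataclass, field

    @dataclass
    class Ad:
        creative: str
        targets: set = field(default_factory=set)  # empty set = fallback ad

    @dataclass
    class Campaign:
        ads: list

        def select_ad(self, user_segments: set) -> Ad:
            # Prefer the most specific ad whose targets the user satisfies...
            matches = [a for a in self.ads
                       if a.targets and a.targets <= user_segments]
            if matches:
                return max(matches, key=lambda a: len(a.targets))
            # ...but every campaign must carry an untargeted fallback.
            return next(a for a in self.ads if not a.targets)

    campaign = Campaign(ads=[
        Ad("Kick-start your career with our grad program", {"recent_grad"}),
        Ad("We're hiring: browse all open roles"),  # fallback for everyone else
    ])
    print(campaign.select_ad({"recent_grad"}).creative)  # targeted creative
    print(campaign.select_ad({"gardening"}).creative)    # fallback creative

Every user who matches the campaign's audience at all still sees something, so nobody is silently excluded from the listing.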


Did I miss something? I thought Facebook sold advertising based on cohorts, which is surely just as discriminatory. I seem to remember age ranges being part of that as well, but even without that specificity a cohort will naturally select for cultural groupings, which will include delineations like age, race, social standing, etc.

I'm not sure if there is a perfect answer there. It might be best to scope it down to a certain medium and channel and say that all users on that channel have the ability to either see the ad or browse a directory of all listings.

If Facebook also had a jobs board that anyone could search to find listings where employers aren't paying for impressions, then I think that would offset a targeted campaign to select groups.


So the argument, as I see it here, is that companies should be forced to spend additional resources advertising to demographics they have identified as weak leads in their campaigns, so that overly virtuous people can feel happy about them spending money on something that likely won't concern those demographics in the slightest.

Righto.

If the article's title was "Facebook is letting job advertisers target only women" or "Facebook is letting job advertisers target only people in their ideal demographic", I wonder if the discussion would be any different here.


> So, you should be able to click a "?" link in the corner of any ad and see which demographics or groups the ad is targeting.

You can do this on Facebook, for maybe four years now. It's not particularly informative, but it exists.


I’ve never tried to place a FB ad, so I don’t know what the interface looks like. But wouldn’t it be fairly straightforward on their end to disable gender (or other protected classes) as options for targeting when it comes to job or housing ads? Or are advertisers not required to choose a “category” for their ad, in the way that on Craigslist, you have to select a category to post in?

Even if there is no such categorization, it’s not a difficult NLP challenge to detect that the content of an ad is for job hiring, or to know that the advertiser is based in the U.S. FB could implement a warning/reminder dialog, similar to how Gmail tells you that you haven’t actually included an attachment yet.
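As a rough illustration of the detection side (a toy scikit-learn classifier over made-up training examples, nothing resembling Facebook's actual pipeline), it might look something like:

    # Toy sketch of flagging likely employment ads so a warning dialog can fire.
    # Training data and the 0.5 threshold are illustrative; a real system would
    # use a large labeled corpus plus human review for borderline cases.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    train_texts = [
        "Now hiring: experienced truck drivers, apply today",
        "Join our team, full-time software engineer position",
        "Summer sale: 50% off all shoes",
        "New burger restaurant opening downtown",
    ]
    train_labels = [1, 1, 0, 0]  # 1 = employment ad

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    clf.fit(train_texts, train_labels)

    ad_text = "We're hiring warehouse associates, competitive pay"
    if clf.predict_proba([ad_text])[0][1] > 0.5:
        print("Warning: this looks like a job ad; restricted targeting applies.")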


If you are creating FB ads, you target the segment that gives you the most leads. I imagine age is just one parameter of those segments.

Personally, I would never think to find a job through FB. If someone wanted to target me, being outside that age group, it would need to be a different platform.


* housing, employment, and credit ads

I think the easiest solution would be to disallow ads of those categories on their platform. I'd think the risk of "facebook/instagram is racist" damaging their brand and the cost of federal discrimination lawsuits would outweigh whatever revenue they project.

As an aside, I know it's a faux pas to bring up any observed (and/or presumed) differences between the protected classes, but maybe (just maybe) Facebook's targeting is smart enough to correlate "most likely to care" with things that tend to have skewed demographics without looking at the demographic data itself. Take the example in the article of truck-driver ads targeting men: what is Facebook using to determine whom to target? And do those data points line up with demographics?

I don't know, but these kinds of systems are tough to introspect from the outside.


I thought the issue here was with demographically targetable ads on e.g. Google or Facebook. I've been a hiring manager in the past, and we never used those. Most of our entry level employees were from career fairs at the nearby university. Most of our senior employees were either from the network or from (ultra-expensive) recruiters. Frankly it never even occurred to anyone to purchase targeted ads because they're so uncommon.

What if the companies that were placing job ads on Facebook were also placing job ads on other platforms that weren't targeted to men only? Playing devil's advocate here...

If you can target "software engineer" on Facebook, then clearly you'd just do that.

If you're targeting more broadly for some stupid reason, then there's an economic incentive to target unfairly by demographic, just to narrow down the sheer number of irrelevant ad views by any means possible.


So they can pay to show the ads to job seekers 40+ but not hire them. Facebook makes more money, and they get to say they did some social good.

Seems like Facebook should be advocating (at least in this use case) for stopping the discrimination, since it would increase their profits.


Even if they remove the explicit gender category in their advertising interface, it seems relatively trivial to find proxies for gender (or race, or age, for that matter) to target a specific subgroup of people with your ads.

It's possible to algorithmically bias the ad-targeting algorithms to achieve parity in whatever sensitive attribute you are concerned about, but that comes at the expense of targeting utility, since you have to show the ad to more people you don't actually want to target just to achieve gender/race/age equality. Of course, that might not be so bad if your hypothesis is that the advertiser has a bad mental model of their target group to begin with and you think Facebook should force them to make it more egalitarian.
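A toy simulation of that trade-off, with synthetic relevance scores and made-up groups, just to show why parity costs targeting utility:

    # Pick an audience of size k (a) by raw predicted relevance, vs
    # (b) enforcing equal counts per group. Scores and groups are synthetic;
    # this is not how Facebook's system works.
    import random

    random.seed(0)
    # Synthetic relevance scores, skewed so group "A" tends to score higher.
    users = [("A" if i % 2 else "B", random.random() + (0.3 if i % 2 else 0.0))
             for i in range(1000)]
    k = 100

    # (a) Unconstrained: take the k highest-scoring users overall.
    top_unconstrained = sorted(users, key=lambda u: -u[1])[:k]

    # (b) Parity-constrained: take k/2 from each group regardless of score.
    by_group = {"A": [], "B": []}
    for group, score in users:
        by_group[group].append(score)
    parity_scores = (sorted(by_group["A"], reverse=True)[:k // 2] +
                     sorted(by_group["B"], reverse=True)[:k // 2])

    print("unconstrained mean relevance:", sum(s for _, s in top_unconstrained) / k)
    print("parity-constrained mean relevance:", sum(parity_scores) / k)
    # The parity audience scores lower on average: the utility cost above.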

I don't think that will happen, though. Facebook might remove explicit gender targeting, but I doubt that will do much to help with biased ads, since people will just use proxies for gender etc. instead.
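To make the proxy point concrete, here's a toy audit one could run to flag targeting attributes that act as gender proxies. The interest/gender sample and the 70% threshold are purely illustrative:

    # Toy proxy audit: flag targeting attributes whose audience skews heavily
    # toward one gender. Data and threshold are made up for illustration.
    from collections import Counter

    sample = ([("likes_football", "m")] * 80 + [("likes_football", "f")] * 20 +
              [("likes_cooking", "m")] * 45 + [("likes_cooking", "f")] * 55)

    def dominant_share(attribute):
        counts = Counter(g for a, g in sample if a == attribute)
        return max(counts.values()) / sum(counts.values())

    for attr in ("likes_football", "likes_cooking"):
        flag = "possible gender proxy" if dominant_share(attr) > 0.7 else "ok"
        print(f"{attr}: {dominant_share(attr):.0%} dominant-gender share ({flag})")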


Facebook already forbade using those categories for targeting ads for those things:

> For over a year, we have required advertisers we identify offering housing, employment or credit ads to certify compliance with our non-discrimination policy.

It looks to me like they're removing this as an option for all types of ads:

> We’re committed to protecting people from discriminatory advertising on our platforms. That’s why we’re removing over 5,000 targeting options to help prevent misuse. While these options have been used in legitimate ways to reach people interested in a certain product or service, we think minimizing the risk of abuse is more important. This includes limiting the ability for advertisers to exclude audiences that relate to attributes such as ethnicity or religion.


I disagree. If they know they are hosting employment ads, they need to be required to comply with legal standards. That means that, at minimum, an employment ad should be treated differently than a regular ad by the algorithm and by the backend: no demographic weighting can be allowed within the protected classes unless the employer applies for a BFOQ (bona fide occupational qualification), makes it clear what that BFOQ is, and has it authorized by either an EEOC rep or an employment-law professional within the ad platform (in this case Facebook). Otherwise they're clearly in violation of the law, and knowingly so.

While I see both sides of the argument, as long as the advertiser pays per view, it seems wrong to require the listing to target demographics that are almost certainly uninterested.

Your example of listing a job in Men's Health but not Cosmopolitan seems good. Technically not a perfect analogue, as a woman could always buy Men's Health, but in practice very similar.

Spinning off into a different angle, what if an advertiser gets the most qualified female applicants advertising through LinkedIn, and the most qualified male applicants through Facebook? (Again assuming pay per view) Is it wrong for them to target women & men on separate platforms with separate ads? Utterly hypothetical of course.

Or yet a different angle, what if they have ad "A" and ad "B", each subtly designed to excite one of the sexes. Perhaps an ad for the Marines, one emphasizing toughness & grit, the other emphasizing teamwork & "stronger together". Target one at men, one at women. Legal?

To me it seems to come down to the overall behavior of the employer, not Facebook or any one ad.

