Google has basically admitted that GPA and where you went to school don't matter and don't correlate with job performance, and yet here we are, still optimizing for the Ivy League. It'd be a safe wager that algorithmic whiteboard interviews don't correlate either, except insofar as they screen out folks who don't pass: once every hire has cleared that baseline, the baseline is by definition meaningless as a differentiator. The other side of the coin is that performance management is just as bad.
All of these claims from Google that say competition performance hurts or that GPA doesn't matter are missing one huge thing: selection bias.
Google only sees the performance of the employees that it hires, not the performance of the employees that it doesn't hire. Because of this, the data they analyze is statistically biased: all data is conditioned on being employed by Google. So when Google says things like "GPA is not correlated with job performance" what you should hear is "Given that you were hired by Google, GPA is not correlated with job performance."
In general, when you have some thresholding selection, it will cause artificial negative correlations to show up. Here's a very simple example that I hope illustrates the point: Imagine a world where high school students take only two classes, English and Math, and they receive one of two grades, A or B. Now imagine a college that admits students with at least one A (AB, BA, or AA) and that rejects everyone without an A (BB). Now imagine that there is absolutely zero correlation between Math and English - performance on one is totally independent of the other. However, when the college looks at their data, they will nonetheless see a stark anticorrelation between Math and English grades (because everyone who has a B in one subject always has an A in the other subject, simply because all the BBs are missing from their dataset).
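Here's a quick simulation of that example (my own sketch in Python, not anyone's actual admissions data) showing the anticorrelation appear purely from the admissions filter:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # 1 = A, 0 = B; Math and English grades are drawn independently.
    math = rng.integers(0, 2, n)
    english = rng.integers(0, 2, n)

    # Population-wide correlation: approximately zero by construction.
    print(np.corrcoef(math, english)[0, 1])

    # Admit only students with at least one A, i.e. drop all the BBs.
    admitted = (math + english) >= 1

    # Among admits, the correlation comes out strongly negative (about -0.5).
    print(np.corrcoef(math[admitted], english[admitted])[0, 1])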
The bottom line is that whenever you have some score that is some positive combination of input variables and then you threshold your observed data on a minimum total score (as is the case in hiring at Google or in college admissions), then you will see much stronger negative correlations between your inputs than exists in real life.
And really, whenever you run some selection algorithm, you should hope that (on the margin) there are no correlations between your selection decision and your inputs. If there still is a correlation post-selection, that means your algorithm has left money on the table. So when Google says that programming competitions are negatively correlated with performance and GPA is uncorrelated with performance, what that likely means is that Google's hiring overvalues programming competitions and fairly values GPA.
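To make the money-on-the-table point concrete, here's a hedged little simulation (all weights and numbers are my own assumptions): performance depends equally on GPA and coding ability, but the hiring rule overweights coding. Among hires, the overweighted input's correlation with performance collapses to roughly zero, while the underweighted input still clearly predicts performance - information the hiring rule failed to use:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 500_000

    gpa = rng.normal(0, 1, n)     # standardized GPA
    coding = rng.normal(0, 1, n)  # standardized coding ability, independent of GPA
    performance = gpa + coding + rng.normal(0, 1, n)  # both matter equally on the job

    # A hiring rule that overweights coding relative to its true value.
    hired = (gpa + 2 * coding) > 3

    # Overvalued input: correlation with performance among hires is roughly 0.
    print(np.corrcoef(coding[hired], performance[hired])[0, 1])

    # Undervalued input: still clearly positive (~0.4) - money left on the table.
    print(np.corrcoef(gpa[hired], performance[hired])[0, 1])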
In fact, if we did a randomized controlled study (the gold standard of proving causation), I think we'd see the expected results. Just imagine - if you grabbed two random programmers from the entire population, one who had won a competition and one who had not, do you really think the competition winner would be the inferior programmer?
Edit: Many other posts here are coming up with plausible sounding narratives to fit the data. "Competition winners are one-trick ponies or loners or write awful code." I encourage everyone to think critically about the data analysis itself.
I agree that exams don't always do a great job of measuring actual competency, but I'm not sure they are worthless either.
> This correlates well with the observations made by Google and other large companies that analyse how to identify valuable hires. Turns out that there's no correlation between grades and workplace performance.
These observations probably suffer from a confounding effect known as range restriction. Basically, if Google generally only hires people with a GPA of 3.0 or above, they may have some idea of how GPA variation above that level correlates with job performance, but they don't know about it below that level. So they won't know if people with GPAs of 1.5 will do just as well at Google as people they normally consider.
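For intuition, here's a rough simulation of range restriction (toy numbers of my own, not Google's data): GPA genuinely predicts performance across the whole population, but once you only look above a 3.0 cutoff, the observed correlation shrinks substantially:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 200_000

    gpa = np.clip(rng.normal(3.0, 0.5, n), 0.0, 4.0)
    performance = gpa + rng.normal(0.0, 1.0, n)  # assume GPA really does matter

    # Full population: a solid correlation (~0.44).
    print(np.corrcoef(gpa, performance)[0, 1])

    # Restricted to the 3.0+ range hiring draws from: much weaker (~0.29).
    hired = gpa >= 3.0
    print(np.corrcoef(gpa[hired], performance[hired])[0, 1])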
Suppose Google started hiring people with much lower GPAs than they do today. Anyone want to make a bet what the outcome of that experiment would be?
Well, Google knows, and that's why they don't do the experiment.
This makes no sense. Surely Google believes GPA is a good predictor of performance, but isn't on-the-job performance the ultimate measure? How long did you work there?
> In the engineering/academic world, grades do matter and are highly predictive of ability.
Can you cite some research studies that conclusively show this to be the case?
I'd like to counter by saying that companies like Google have largely ignored GPA as a measure of aptitude. From the article (link below):
"Google doesn't even ask for GPA or test scores from candidates anymore, unless someone's a year or two out of school, because they don't correlate at all with success at the company. Even for new grads, the correlation is slight, the company has found."
Google won't let you apply to work there without giving your college GPA, despite the fact that all the academic research, and their own internal research, shows that GPA doesn't correlate at all with job performance.
I don't know if it was ever made public; I think I heard this story via backchannels. But Google did a research project at some point to discover why GPA/interview performance in general did not seem to correlate with post-hiring job performance. The project was never allowed to reach full completion because the direction the data and investigation were heading looked like this:
"Our hiring process does not correlate with post-hiring job performance because we have a large and measurable bias in interviews in favour of Ivy League candidates and women."
In other words, GPA didn't correlate with job performance for Google because of a confounder: they were selecting the sample on top-tier universities (which themselves select on GPA) at hiring time, but not when it came to promo reviews (which had a different process and, anyway, less rhetorical ideology involved than hiring, at least at that time).
Didn’t Google find no useful correlation between either GPA or school attended and job performance? I’m not sure if they categorized people by “easy” or “hard” degrees.
Out of Google, Amazon, Microsoft and Facebook, Google was the only company that straight up declined to interview me because I did not meet GPA requirements (3.0 cumulative - I had a 3.4 at the time in CS, but I hadn't done well in freshman chem or calc 3).
I like that they decided to stop asking brain teasers due to the lack of correlation between performance on them and performance once hired. Do they really think that cumulative GPA has a strong correlation with new-hire performance?
I certainly don't.
(I expect to get some push-back from you guys and I'm interested in the discussion to follow :))
Meta-analysis of thousands of studies shows that there is essentially zero correlation between where someone went to school (or their GPA) and how well they perform on the job once hired.
The data may say that, but it likely omits a lot.
For example, if I have a candidate with straight A's from MIT, but I hire someone else who had a 1.2 from Podunk State instead, I suspect that the candidate from Podunk State had some other qualification that offset their school and GPA.
This is the same effect Google sees with its best employees: they often didn't ace their interview loop, and actually had one person give them a very low score. That low score tends to correlate with high performance, because it means someone else fought for the candidate, which is a signal of something else compelling about them.
To put it another way, I suspect that if you randomly selected straight-A students from MIT versus D students from a local community college, the MIT students would outperform for a wide variety of professional jobs.
Interestingly, Google themselves decided GPA doesn't matter much.
"Google doesn't even ask for GPA or test scores from candidates anymore, unless someone's a year or two out of school, because they don't correlate at all with success at the company. Even for new grads, the correlation is slight, the company has found."
Actually, Google found that GPA is a weak indicator of how a person performs: according to them, GPA is only weakly correlated with job success, and only within the first year or two out of school.
I think he is right. Past and future performance don't correlate. That is why employers usually don't look at resumes or references. Or why, say, Google and Microsoft don't ask tricky questions in interviews. Or why admission to graduate programs is never determined by your past grades and achievements. Wait, what?
You've skipped over the original NYT article, which the qz.com article being linked to here quotes from. He does say they're worthless. Here's the complete quote:
> One of the things we’ve seen from all our data crunching is that G.P.A.’s are worthless as a criteria for hiring, and test scores are worthless — no correlation at all except for brand-new college grads, where there’s a slight correlation
This chimes with my understanding and experience that Google now only really uses test scores and GPAs as a filter to manage the vast number of internship/entry-level applicants they get.
This has been circulated around HN and Reddit several times, and it's disappointing that someone of Norvig's stature would present the data in such a misleading way.
Here's a good explanation posted by "tedsanders" the last time this came up on HN:
"""
All of these claims from Google that say competition performance hurts or that GPA doesn't matter are missing one huge thing: selection bias.
Google only sees the performance of the employees that it hires, not the performance of the employees that it doesn't hire. Because of this, the data they analyze is statistically biased: all data is conditioned on being employed by Google. So when Google says things like "GPA is not correlated with job performance" what you should hear is "Given that you were hired by Google, GPA is not correlated with job performance."
In general, when you have some thresholding selection, it will cause artificial negative correlations to show up. Here's a very simple example that I hope illustrates the point: Imagine a world where high school students take only two classes, English and Math, and they receive one of two grades, A or B. Now imagine a college that admits students with at least one A (AB, BA, or AA) and that rejects everyone without an A (BB). Now imagine that there is absolutely zero correlation between Math and English - performance on one is totally independent of the other. However, when the college looks at their data, they will nonetheless see a stark anticorrelation between Math and English grades (because everyone who has a B in one subject always has an A in the other subject, simply because all the BBs are missing from their dataset).
When Google says that programming competitions are negatively correlated with performance and GPA is uncorrelated with performance, what that likely means is that Google's hiring overvalues programming competitions and fairly values GPA.
"""
I've also heard people involved in Google's Code Jam competition say that Norvig's study was done a long time ago, and no longer really applies.
There's a big (and common) error in statistical reasoning behind Google's decision to down-weight GPA based on their data: the fact that GPA does not predict performance among those it hired does not imply that Google should stop using GPA in hiring. Rather, it means that Google used GPA to exactly the right extent among those it hired under its old policy - there was no information left in GPA they didn't use - and therefore they should leave whatever policy they have in place as-is.
Explanation: suppose there are only two things Google observes, GPA and coding ability, and that Google uses some correct decision rule to only hire those people where the sum GPA + coding ability > some threshold. Those who have lower GPAs will thus tend to have higher coding ability, otherwise they wouldn't have met the threshold to make it into the pool of hires they're analyzing - and, therefore, comparing "those with low GPAs that Google hired" and "those with high GPAs that Google hired" is not an apples-to-apples comparison.
In order to assess whether GPA should be used at all, they would need to look at how the people they didn't hire because of their existing policy would have performed.
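Here's a hedged sketch of that two-variable model (all numbers assumed for illustration): performance depends on both inputs, hiring thresholds their sum, and among hires GPA and coding ability come out strongly anticorrelated while GPA's apparent link to performance shrinks sharply:

    import numpy as np

    rng = np.random.default_rng(3)
    n = 500_000

    gpa = rng.normal(0, 1, n)     # standardized GPA
    coding = rng.normal(0, 1, n)  # standardized coding ability, independent of GPA
    performance = gpa + coding + rng.normal(0, 1, n)

    # The decision rule: hire when GPA + coding ability clears a threshold.
    hired = (gpa + coding) > 1.5

    # Independent in the population, strongly anticorrelated among hires (~ -0.67).
    print(np.corrcoef(gpa[hired], coding[hired])[0, 1])

    # GPA vs performance: ~0.58 in the population, much weaker among hires (~0.21).
    print(np.corrcoef(gpa, performance)[0, 1])
    print(np.corrcoef(gpa[hired], performance[hired])[0, 1])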
Before I graduated, when I was interviewing for jobs, lots of companies wouldn't interview anyone with lower than a certain GPA. Google was the most notable one, although they've since gotten rid of that requirement after finding it has almost no correlation with your performance as an employee.
Other larger companies that cared were Lutron, Epic, and a few more I don't really remember. I think Epic even asked me what my ACT score was.
Beyond that first job though, my GPA hasn't mattered at all, which is good because mine was pretty mediocre.
I believe this was done after Google took a look at GPA versus post-hiring performance. This was not mentioned in the OP, but I'd guess they realized that there was no correlation between GPA and performance, or maybe even a negative one in certain positions (non-research, entry-level devs, etc.).