> we use quantifiable work-sample tests as the most important weighted component in selecting candidates
Can you speak to that a little? I'm reading it as either programming challenges in a real environment, or analysis of previous code snippets submitted. In either case, I'm happy to see more companies doing this.
I'm always looking for ways to improve my interview skills, but I want to do it honestly. Studying to the test isn't fun for me, I would rather hack on something.
> Why would a practical test be more effective than a discussion about prior work experience? Short coding tests under pressure are extremely unrepresentative of real work and I'm not sure homework-style interviews are much better. I personally don't want to spend an entire weekend on your test and I've dropped opportunities for that reason in the past.
These are good points, and the tests I have given are specifically designed to avoid these issues.
Personally I do this:
* Come up with a simple but realistic 1- or 2-hour task, and write up the request in plain English (do not write a tech spec). It should be a task you'd expect anyone to be able to do easily on day one without any assistance.
* Include a short narrative about the end user that clearly implies a specific requirement or two, but don't spell it out as an explicit requirement.
* Omit a small but important detail that any reasonable person would ask about.
* Instruct them to provide a solution to the user's problem, and to ask questions if something isn't clear.
These are pretty simple to evaluate too:
* Does the code run? (i.e. can the candidate write basic code?)
* Does it meet the user's explicit needs? (i.e. can the candidate follow basic instructions?)
* Does it meet the user's implicit needs? (i.e. does the candidate write code to spec, or do they think about the bigger picture?)
* Did the candidate ask a question about that obvious missing detail? (i.e. does the candidate speak up, or will they make assumptions?)
You would be surprised how many people can't apply their ability to write code to solving simple problems.
> You’re trying to measure how they’ll perform on your work before you hire them? That sounds like you’re trying to get free work from them.
I wish the hyperbole about trying to get free work during an interview would just die already. Nobody is getting free work by interviewing applicants; interviews are costly to most companies, but necessary to grow the team.
Just because interviews take time and are difficult and scary doesn’t mean someone is taking advantage of you. Try to put yourself in the company’s shoes — they have to evaluate you one way or the other. How would you do it?
A time-limited stress test is extremely effective at identifying certain kinds of performers. Obviously, being able to solve problems fast is an indicator of a certain kind of smart. This doesn't identify all smart people, and nobody claimed it did; that's why a coding test is almost always one part among many in the interview process, and people routinely get hired after passing some parts without passing all of them.
> They are testing for the kind of person that is willing to jump through hoops.
That's true, but it is meant as a different filter when applied to a large number of candidates.
I went through a bunch of leetcode style interviews last year with some very odd interviewers - the oddest was a Sr Principal engineer asking me to manipulate linked lists with recursion.
And the comment he made after I wrote the code was that "We don't care if you write code at work, but we want to know you like writing code enough to not see it as beneath you to do. To make sure you're not some whiteboard architect, because everyone you are supposed to lead will be writing code."
I had been looking at the interview as a sort of idiot catcher before that point, but suddenly it seemed like a good filter for respecting the effort at the lowest level of the profession.
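For reference, the recursive linked-list manipulation mentioned above is usually something on the level of reversing a list. A minimal Python sketch (my own illustration of the genre, not the interviewer's actual prompt):

```python
class Node:
    """Singly linked list node."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def reverse(head):
    """Recursively reverse a linked list; returns the new head."""
    if head is None or head.next is None:
        return head
    new_head = reverse(head.next)  # reverse everything after us first
    head.next.next = head          # point our old successor back at us
    head.next = None               # the old head becomes the new tail
    return new_head

def to_list(head):
    """Collect node values into a Python list for easy checking."""
    out = []
    while head:
        out.append(head.value)
        head = head.next
    return out
```

Reversing 1 -> 2 -> 3 this way yields 3 -> 2 -> 1; the whole exercise is more about whether you can reason about the pointer updates out loud than about the final code.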
> Take an afternoon and skim Cracking the Coding Interview before applying.
First, I know this isn't a black-or-white issue, but the problem I have with this is that I wonder whether they are hiring people with the skills they want, or people with the skill to succeed in their test.
In theory, they should be the same. In practice... I don't know.
> Competency tests have helped me get in the door.
How so? The door was already open if the company decided to give you a test.
My impression is that coding tests either offer neutral or negative feedback during an interview. It's not like a candidate can manipulate linked lists in a way that shows particular brilliance.
In my company (totally remote) we decided to automate hiring (also remote) as most as possible and we are happy with the results. We also value the candidate's time very much so we made the hiring process very straightforward. It works like this:
- First the candidates take some online tests, which are mostly multiple-choice questions. The subjects are logic, English, and a specific test for the coding language. All of these tests were hand-made by us and customized to our needs. Candidates spend about 2 hours in total on these tests, and they immediately see their score on the parts that can be graded automatically (which is most of them).
- The candidates approved in the theoretical tests - about 10% of the candidates who applied for the job - do a practical coding challenge that we also built from scratch. We looked at the solutions available on the market and thought they were too focused on textbook algorithms that are rarely used on the job. So we made our own tests covering practical problems a programmer has to solve daily in our company. The questions are NOT too complex: the objective of this step is to weed out candidates who can't code in the language, not to find the best. We designed the challenge to take about 2 hours. The implementation was a lot of fun: we use the Docker API to run the candidate's code against the automated tests we wrote, and the score is the number of tests that pass. Using Docker means we can work with any language that runs on Linux; so far we have made tests in JavaScript, TypeScript, Ruby, Postgres and Elm.
- The candidates approved in the previous step - about 2-4% of the candidates who applied for the job - are called for an interview. Only now do we look at the candidate's resume and at the human side of things. Given that we have very few candidates left and we know they all meet some minimum skill bar, we are very confident when conducting the interviews. By then we usually have a very clear signal about who the best candidate is, even before the interviews. So in practice the interviews are NOT used to select the best candidate; they are there just to check that no red flag shows up.
We are very proud of our coding challenges app. If someone wants to collaborate on it, just drop me a message. It's a Rails app on GitHub, but it's probably not polished enough to be an open source project yet - one needs to know a lot about Docker and Rails to make it work.
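The runner described above can be sketched roughly like this. This is a hypothetical outline, not the actual Rails implementation; the CLI invocation, image names, and result format are my own assumptions:

```python
import subprocess

def run_candidate_test(image, command, timeout=120):
    """Run one automated test against the candidate's code inside a
    throwaway container; the test passes if it exits with code 0.
    Network is disabled so submissions can't phone home.
    (Hypothetical sketch: image/command are illustration only.)"""
    try:
        proc = subprocess.run(
            ["docker", "run", "--rm", "--network=none", image, *command],
            capture_output=True, text=True, timeout=timeout,
        )
        return proc.returncode == 0
    except subprocess.TimeoutExpired:
        return False  # a hung submission counts as a failed test

def score(results):
    """Score, as described above, is simply the number of tests that
    passed out of the total. `results` is a list of booleans."""
    return sum(results), len(results)
```

Because each test is just "run this command in a container and check the exit code", the same harness works for any language that runs on Linux.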
I’ve hired dozens of Sr. engineers and given coding tests to almost all of them (after a 2nd round interview, never earlier).
However, my tests are time-boxed (1 hr) and are about greenfielding an app (like a ToDo app) to see how candidates approach a problem, rather than testing programming knowledge.
> Why?
About 33-50% of devs who pass the two interviews fail the coding test. The ones who fail are great at interviewing but probably oversold their skills. Some are really poor at time management. Some are really poor at simplifying problems. And some are really poor at following directions.
It’s possible that they were on teams that carried them, or that they were exaggerating their capabilities or prior responsibilities.
The opposite has happened too where I’ve been blown away by the code quality and approach.
But overall it’s been a very helpful datapoint in hiring.
>Don’t do “dog and pony show” interviews. If I have to write code on a whiteboard, or even in an editor live on a call, or look at an extensive take-home test (anything more than a 15-30 minute task), I’m highly likely to pass on anything except a top-shelf package at a company I’m highly motivated to work for.
I'm always interested in this point of view, because I understand why candidates don't like it. But as a hiring manager myself, I also know that there's a fairly large proportion of candidates who look good on paper but can't actually do the most fundamental thing a developer needs to do: solve an abstract problem with code.
What's the alternative to filter those people out?
> Also, hiring managers... Do you find these kind of tests useful? Why? How?
Former hiring manager.
It's unfortunate because it is a bit insulting. However, there are some people who lie. They lie about being able to code. I don't want to hire a senior dev who I then have to fire shortly thereafter.
A simple, short coding test that any senior dev should be able to do quickly in their language of choice or a longer project that they get paid for are my two favorite options to filter out folks who say they are senior devs but can't code.
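As one illustration (my example, not necessarily the parent's actual screener), the "simple, short test" here is often something on the level of FizzBuzz, which any senior dev should be able to do in minutes in their language of choice:

```python
def fizzbuzz(n):
    """Classic screener: 'Fizz' for multiples of 3, 'Buzz' for
    multiples of 5, 'FizzBuzz' for both, else the number itself."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

The point isn't the difficulty; it's that a surprising fraction of applicants who claim years of experience can't produce even this.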
Then read some (maybe this, I didn't look at it in detail, but at first glance looks reasonable) and practice.
> I just want to see how well the candidate can code :-(
All this tells you is if the candidate can code. If you want that, give them a test, look at their Github history (if they have it). Ask for code samples from any project they can share that they're proud of.
I personally find it much more valuable when interviewing someone to talk about what they've done. How they've solved problems. What their interests are both in and outside software.
Setting up a non-stressful experience and using behavioral-interview-style questions is my personal favorite. Something I think is really important, and that many people forget, is that the interviewee is also interviewing you. If you come across as disorganized or unprepared (didn't even read their resume), then you may fail the interview, and even if you want to hire them, they'll say no.
> As imperfect as ALL technical recruiting styles may be, at least a short coding assignment (i.e. < 4 hrs) is halfway-related to the real world.
Yes, but what it is not is economical. There's a huge dropoff in the interview funnel at this step because it's largely not worth the effort for the interviewee. The choice is between spending 5-10 hours per take home test per company or spending 5-10 hours a week studying silly whiteboard exercises that apply to dozens of companies.
It's also just as uneconomical for the interviewer. I've completed about a dozen of these tests over the last year and only one company bothered to look at the test after I sent it in. Which makes sense; it takes time to look at a lot of code, figure out which of it is boilerplate and which of it was written fresh for the test. And even then, there's no guarantee somebody else didn't write it. And while you may not be interviewing with hundreds of companies, most companies easily get hundreds of applicants per position these days.
For whatever reason, it doesn't seem to be worth anybody's time to do these tests. This bums me out because I'm terrible at live coding tests. But, this is the reality of the situation, so I have to adapt.
> Maybe skills tests in an interview process is not such a bad thing.
I'm fine with this in theory, but lots and lots of places bungle it so badly that they're no longer testing programming skills, they're testing rote regurgitation of memorized code under pressure. If your job legitimately requires that, for everyone's sake get out of the industry.
> Most of these white board tests seem to be "Have you revised this specific algorithm for your interview, and found the specific implementation we're looking for?" If you have, great. If not, rejected.
As someone who has given hundreds (maybe thousands) and received dozens of these types of interviews, it's really not.
Generally speaking I'm looking for at least 2-3 of these abilities, depending on which question I'm asking:
1) Can you reason about data structures and algorithms when looking at a problem you haven't seen before?
2) Can you communicate your ideas effectively?
3) Can you integrate new information from a colleague while problem solving?
4) Can you write code that makes sense? Do you understand basic programming concepts?
5) Can you read your own code and reason about it?
There are at least two hidden attributes that I'm not testing for but that do affect performance, so I try to account for them:
1) Are you comfortable with me, a relative stranger?
2) Can you do these things while dealing with a high pressure situation?
You will encounter high pressure situations at work but often interviews feel more high pressure (to some people) than most daily conversations about engineering.
That's it. If I can tell a candidate already knows "the right answer" to the problem, I'm usually disappointed because I'm more interested in watching them think.
> I interview programmers and I've encountered this a few times, and yet, I still don't do code tests. I talk to the candidate and you can easily ask the right questions to tell who can code and who can't.
How do you know? Is this repeatable -- e.g. can you write down the questions and the criteria such that someone who is not you can follow them and come to the conclusions you would?
Do you ask all candidates the same questions, or do you go with your gut and hope there's no systemic bias?
>I don't really understand why asking someone to do a small amount of programming for a programming job interview, when they claim to be an experienced programmer, is any sort of "extreme" practice.
It isn't and that's exactly what you should expect.
Nonetheless it shouldn't be too much to ask that the test and interview ask relevant questions and that the company puts skin in the game that is commensurate with the sacrifice being asked of the candidate.
In other words, 5 minute screener tests are cool but if you make me do a weekend project you put me up in a 5 star hotel.
> We see and also personally observe how coding tests do not effectively measure a person's analytical skills
I'm not convinced of this. It's not a perfect system but a well-orchestrated coding test is a great indicator of performance. In fact I think that's a huge pain point when hiring for a soft-skill oriented role -- it's very difficult to judge candidates until they've spent ~6 months on the job.
So what do established companies do? They rely very heavily on pedigree and experience. Coding interviews suck but I think they're necessary for meritocratic hiring. Plus, most of the complaints I see here are the result of poorly conducting programming interviews, not necessarily programming interviews as a concept.
I'm not a fan of whiteboard coding challenges, even though they got me my first "real" Bay Area tech job.
On the other hand, take home challenges eat up a lot of time and credentialist filters like going to a name brand school (or even having the right major) filter me out.
My go-to for when I'm in a hiring position again would be recommendations from my personal network, but that has its own flaws and doesn't scale very well.
I think at the core of the problem is that most companies are extremely risk averse in their hiring and would gladly miss a Jeff Dean type of super-performer in order to reduce their number of bad hires.
> Simply by doing a number of practice and real interviews you will be much better at interviewing. You as a programmer are probably not that fundamentally better, but you the interviewer are.
That is true, but I see that as a serious weakness of programming interviews rather than a feature. If you can, or even need to, train yourself in things that have no bearing on the job you are being hired for, the interview is flawed. I understand that, to some degree, this is unavoidable. It seems like there is a drive to structure interviews like programming competitions or Jeopardy. But those are games; an interview should be a well-constructed test, not a game. I do think a large number of non-US people participate in these competitions because of their relationship to current hiring practices.
> a software engineer who considers writing any tests beneath them, a job for someone else
I'd also balk at that for an interview take-home test.
I was an Indian QE guy looking for a dev job once. I was that guy whose job was to write tests for someone else, which wasn't fun at all.
If I had interviewed somewhere and I felt the conversation was going towards "how good are you at writing tests, write some", I'd have noped out of it immediately.
Specifically because testing was being offshored (think of the early 2000s) and development work was being held at HQ at my then employer.
I would expect that this work would be my day-to-day based on that interview.
That sort of "stale smell" around the job of writing tests for someone else's code stinks to say the least.
Right now I'm on the other side of the hill: I'd consider it a red flag if someone interviews me for a day without ever asking me to "review this code" or "analyze this design", and merely has me whiteboard a bunch of algorithmic C++ code without any tests.