
Before we criticize the current interview format and propose alternatives, we first need to understand how we got here. This is my understanding of what happened (I wasn't there for most of it!).

Leetcode-style interviews became popular in the mid-00s, primarily because they were used by the hot tech companies of the time. The thing to understand is that back then, the idea of asking people to write code during an interview was sort of revolutionary. Prior to that, interviews had little structure. It wasn't unusual for the hiring manager to make the decision alone, sometimes based on credentials, recommendations, or trivia-like questions.

This type of interview became wildly popular because it allowed budding unicorns to hire high-quality programmers at scale. The process was less biased than the alternatives, and it was reproducible and scalable. Two blog posts [1][2] show the line of thought at the time.

The reality is that big tech has elevated the leetcode-style interview to an art. They have reached a near-local optimum through years of experiments and refinements. It is working well for them, so they have no need to take big risks such as completely revamping their hiring process.

I love the topic of hiring and interviewing and I'd love to truly get to the bottom of which method works best. I like this article because it explicitly calls out shortcomings of the typical alternatives that are not usually mentioned. I hope a future crop of unicorns can take these practices to the next level and do a fair comparison.

[1] https://www.joelonsoftware.com/2006/10/25/the-guerrilla-guid... [2] https://sites.google.com/site/steveyegge2/five-essential-pho...




I once had an interview that I really liked. It was for a bank.

They give the candidate some messy, done-in-a-hurry code (but not purposefully obfuscated) and ask them to refactor it to the best of their ability. The interviewer sits next to the candidate and talks to them throughout the whole process. It's pretty much pair programming, but the candidate has the initiative.

I found it to be a breath of fresh air after all the leetcode interviews. Of course, they have the luxury of doing this because they know exactly what technologies they are using and they don't need to measure the candidate's ability to learn new tech, etc. In their situation they are more interested in whether the candidate can produce maintainable code, name things properly, and communicate with co-workers.


I recently got the opportunity to interview candidates for a few different development positions, with the freedom to shape the technical part of the interview as I saw fit. I figured I'd share my experience of what worked, as well as my thoughts on some of the other approaches, since I'm still a developer who has to experience the other side of the hiring process as well.

Here's a summary, in case you don't feel like reading the whole thing...

LeetCode, code tests and whiteboarding, the flawed standard: being asked to write working code without an IDE or a compiler, and to solve problems without being able to look them up, is quite unlike my day-to-day job. Larger companies may still use this approach because they don't necessarily experience a shortage of applicants even if a certain percentage are turned away by it, but that might not be viable for smaller companies. Furthermore, you might end up hiring people who are good at competitive programming, which may or may not be what you should optimize for.

Take-home exercises, high risk but high reward: one of the better approaches is giving the candidate a problem to solve and letting them tackle it with their own tooling and approach to development, which can later be a great basis for discussion or for reviewing the code together. The problem is that not everyone has time for that sort of thing, and it's not always trivial to figure out whether it's actually their work or whether a friend helped them out bunches (although the discussion should help). That said, allowing the candidate to offer up one of their public past codebases (if any exist) might be worth looking into.

Just having a conversation instead: I've found that simply conversing with the candidate is good enough in many respects: asking them a few technical questions, but also letting them express themselves and learning more about them in the process. You might object that some people are good at explaining things without quite having a grasp of them, but typically that becomes pretty clear once you ask them to elaborate. Not only that, but this is helpful in figuring out their stronger areas, which areas might need a bit more help from mentors, and what they're interested in themselves; you can even recommend some technologies for them to look into!

Doing some design work together: if you have even more time on your hands, something like a system design interview might also be nice. Actually, any kind of design work: creating an ER diagram of a DB schema for some domain, sketching a few UI wireframes and talking about the UX, or just drawing up an example architecture for a made-up system. Regardless, seeing how people reason in detail is good, especially if the process is collaborative in nature, even with the occasional disagreement to explore conflict resolution.

Writing and/or reviewing some code together: while writing code without tools might be troublesome, sitting down to write or review some code with actual IDEs and other tooling is a far better experience. In fact, if there was a take-home task, refactoring some of its code or adding a new feature might work quite well, because the candidate will already be familiar with the codebase in question!

I'd say you don't need to do all of these steps or ask all of these questions, but when the volume of applicants is relatively low and manageable, taking a humane approach to each individual feels like an okay way to go. It's actually good to be at a point where I don't have to scale this up to 25-100 interviews a week and throw much of it out the window. Then again, other approaches are also good; whatever works for each set of circumstances!


I think the more important overarching point is that most candidates aren't prolific GitHub people like you, and likewise most companies are not going to tailor their interviewing practices to your preferred interview style.

Most candidates don't have code samples lying around (e.g. they only write proprietary code for work), so asking for some is a non-starter in a lot of cases.

Another point worth reflecting on: your accomplishments in OSS say nothing about interviewing expertise; they are completely different skill sets. Companies may not hire on the order of 50k people a year, but interviewer calibration in a large org is absolutely a problem that isn't easily solved by just following the opinion of some random six-figure-salary SWE.

As someone who's interviewed on the order of a few hundred senior SWEs, I think there are good and bad ways to go about any interview format, including leetcode-style.

Complaining about how companies don't just throw away their interview practices in favor of [insert pet format here] doesn't really help, since it isn't really actionable.

We talk about the perils of full software rewrites; I think similar concerns apply to hiring as well.


I've been an interviewer and an interviewee recently, so being on both sides of the table has given me some perspective.

This is the current process that I think is fair and holistic:

1. Meeting with the candidate, our manager, and some devs, talking about their past experience, our company, our team, and what they want

2. Take-home coding task based on our day-to-day work: it is linear, with direct instructions for inputs and outputs, plus an optional part at the end that tests trickier concepts. Candidates are instructed to write clean, clear code; there's no stress if they don't finish, and they can take their time, with a week to do it (it's a few hours of work).

3. Interview where they walk through their code on their own machine and describe their thought process; we also field any questions they have left.

Then we decide in a team discussion afterwards.

It gives candidates space to think and reduces the pressure on those who find social situations stressful.

Thoughts?

I personally detest leetcode as a recruiting tool.


Anytime there's a blog post or a thread complaining about the 'broken coding interview process', it ends up being upvoted by everyone and their brother on HN, but I've very rarely seen people offer a reasonable alternative.

I work for a big tech company, and for our org we typically have a couple of algorithms/coding rounds, a system design round, a technical communication round, and a loosely structured interview/chat with a hiring manager. I think this works because people, at least on my teams, have to work on a wide array of features that could involve writing backend queues, Hadoop jobs, or business logic on the API layer that needs to leverage caches/DBs/other appropriate key-value stores. It definitely helps to have people with broader computer science knowledge working on those features.

I feel our interviews have low recall and precision but good accuracy (we end up hiring smart people we can count on to pick new things up). And FWIW I don't think we ask unreasonable questions during the coding rounds. I also feel it's a reasonably scalable way to organize a generic 'objective' and 'fair to all' interviewing process at a big company that can hold a reasonable hiring bar.

If I was running a smaller company then I probably would have kept in place a process that was closer to the bare practical requirements of the job.

Also, if the existing interview process was really that bad in practice with an abysmal correlation with successful hiring, wouldn't the companies have dropped it already?

I know this kind of sucks for applicants who think they have all the skills needed to practically do the things they'd have to do at the job they're interviewing for (and in some cases rightly so), but arguments bashing the current interview process would be more valuable with a proposal for a better alternative that comes off as reasonable after the same amount of critique and scrutiny that the current process gets.

Also, I often get confused: do people not agree that there's any correlation between the current process and hiring good engineers, or do they just think companies should have a more developer-friendly interview process? If it's the latter, do companies have any incentive to change? It can't be that 'their potential hiring pool becomes wider', because if that really were such a big problem they would have changed already.

(All opinions mentioned here are mine and none are my employer's, obviously)

(Edited comment twice to attempt to make thoughts more coherent)


Here's that alternative to the coding interview that everyone has always asked for.

My first 3 or 4 tech jobs started with interviews like this, which seemed to work fine. Then everyone fell in love with coding questions. It's funny to think some hiring managers might be going back to interviewing like every other job.

I did my first programmer hiring for my startup several years back. Thankfully I came across the Guerrilla Guide to Interviewing[1] article by Joel Spolsky, and it helped greatly.

I interviewed 6 candidates within a week, and hired one. I never regretted the outcome.

I set up the test code like this: an input variable, a comparison variable, and an empty function. The function takes the input variable as input, and its output is compared with the comparison variable.

The interviewee was asked to fill in the body of the function and play with it until the function gave the expected output. I set up a computer with two screens, put an IDE on one and Google on the other. Two chairs, and some coffee.

First I chatted with the interviewee for about 10 minutes, trying to make them as comfortable as possible. Then, before the test, I stressed very much that I am also a programmer, that I know how awkward it is to code while someone is watching, that it alone would make me make silly mistakes, and that I expected the same from them and it was completely OK.

Afterwards, I encouraged them to use Google freely, and to feel free to stay silent or explain as they went.

That let me assess their fluid intelligence, their ability to break the task down and progress efficiently, their English proficiency[2] (if they used Google in English), their choice of keywords, their choice among Stack Overflow responses, etc.

In the end I rejected a guy with 8 years of experience on his CV and hired a junior. Looking back, that turned out to be an amazing decision, the best I could have made: work went well with the junior, and I had the (mis)fortune of working side by side with the 8-year guy several years later.

PS 1: The task was to write a recursive function to traverse a multidimensional array and find out whether the first letter of every "value" was a capital letter.

PS 2: The work was to maintain and develop a mid-scale SaaS project along with me.

1: https://www.joelonsoftware.com/2006/10/25/the-guerrilla-guid... 2: The country's native language was not English, and average English proficiency was not good.
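The harness and task described above might look something like this. This is a minimal sketch in Python (the original language isn't stated), and all names here are made up for illustration:

```python
# Task from the comment above: recursively traverse a nested structure
# and check whether every "value" starts with a capital letter.

def all_values_capitalized(data):
    """Recursively walk nested dicts/lists; return True if every
    string value starts with an uppercase letter."""
    if isinstance(data, dict):
        return all(all_values_capitalized(v) for v in data.values())
    if isinstance(data, (list, tuple)):
        return all(all_values_capitalized(v) for v in data)
    if isinstance(data, str):
        return len(data) > 0 and data[0].isupper()
    return True  # non-string leaves don't affect the check

# The harness: an input variable, a comparison variable, and the check.
input_value = {"name": "Alice", "tags": ["Admin", "Staff"], "meta": {"city": "Ankara"}}
expected = True

assert all_values_capitalized(input_value) == expected
```

In the interview setting, the candidate would be given `input_value`, `expected`, and an empty function body, and would iterate until the assertion passes.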


Actually, asking people to write some code in (or for) an interview is the best way to go for technical interviews. I haven't found any other approach that's better given the time constraints of an interview (and I've been interviewing people for 20+ years).

The hard part is coming up with what programming problem to ask a candidate that's fair to the candidate yet provides insight into their capability. For us, I created a simple data structure problem that hits a smattering of things that are taught in a typical undergrad CS curriculum (trees, recursion, etc). The candidate has about 30min to write the code for this in their preferred language using a plain text editor. I care less about perfect syntax when working with libraries, etc. and more about whether the candidate can take a problem statement and turn it into working code.

For an on-site interview, I send the candidate a description (along with screenshots) of a small web app project that hits some UI, some DB, some business layer. I encourage them to innovate around my description and surprise me. They get a couple of days to implement, then show up with their laptop and demo it. I want to see working software, then a deep dive into their code.

Typically by the end of the on-site interview, we have a really good idea whether this person knows their trade or not. It's helped me sniff out the pretenders pretty well over the years. The ones I hire are the ones who obviously take pride in their craft, know why they picked various implementation approaches, and can explain the technology they used.


Not that I disagree with all of your points, but I can't say I've seen any evidence that there is such a thing as the Standard Developer Interview™. Over the past month I've been interviewing with 8 different companies, 3 of them in the game development industry. They have all had substantially different approaches to their recruitment process and interviewing.

I've identified some common elements, but each of the companies mixed and matched them in different ways and degrees. The techniques used were:

- timed online test (e.g. HackerRank or Codility), reviewed later by the team

- online, real-time coding during the phone screen (e.g. HackerRank or CollabEdit)

- take-home project with a deadline (usually up to 2 weeks), reviewed later by the team

- phone, video-conference or on-site discussion about employment history and experience

- phone, video-conference or on-site discussion about technical problems

- on-site whiteboard coding

- on-site coding on company-provided box

- on-site coding on candidate's own box

- coffee-shop conversation about behavioral topics

The topics for technical discussions were:

- algorithms, data structures and complexity (big-O notation, trees, hash tables, lists, circular buffers, breadth-first and depth-first traversal, binary search, recursion)

- scalability (caching, load-balancing, sharding)

- concurrency (locks, conditional variables, contention, memory fences)

- mathematics (vectors and matrices, linear algebra, computational geometry, combinatorics and probability)

- low-level concepts (manual memory management, cache locality, specifics of garbage collection, VMTs in C++)

- object-oriented design (inheritance, polymorphism, fragile base class, composition over inheritance)

- API design (specifying constraints, changing API based on requirement changes, paging results, increasing or decreasing operation granularity)

Now, I know that this is just 8 companies. I know that this is a drop in the ocean. I know that the plural of "anecdote" is not "data". But over and over again, I've seen the same claim on HN: our industry focuses too much on computer science "trivia" questions. For an audience that usually requires quality, sourced data to support arguments, we seem to be awfully quick to draw generalized conclusions about this specific topic.


I wish all interviews were done this way. It solves so many problems with the traditional coding interview process.

A couple of years back, I was hiring some vendor developers for my team, and since I had some flexibility in the interviews that wouldn't be allowed for full-time employees, I tried an experiment: for one of the vendor candidates, I told him a day ahead of time that I'd be asking him to implement System.Collections.Hashtable in C#, with behavior equivalent to the one in .NET. The day of the interview came, and he whiteboarded it flawlessly, to a degree I'd never seen any other candidate accomplish, and he was able to have a deep conversation about the implementation and all of its nuances. He then proceeded to bomb the rest of his interviews and didn't get the position.

What this illustrates to me is that the typical programmer interview, where we come up with some weird problem and toss it to the candidate while they have no references to look at, no IDE to type into, and a 45-minute deadline looming over their head, is totally divorced from the work we actually want them to do. If I need someone to implement a widget, I'd rather they took some time to research it and do it right, versus trying to crap out a solution in five minutes on a whiteboard.

I've interviewed lots of developers and recommended 'hire' for about 15 people in the last several years, and not once did we get it wrong.

All it takes is a short problem to solve at home, which can easily be googled, and a one-hour architecture interview, where we discuss the technical architecture of a hypothetical 'real world' service.

It takes about 20 minutes to determine if the candidate has the experience with the technologies listed in the resume. Little experience is not necessarily a show stopper.

True, the language we hire for (Clojure) filters out a lot of people ahead of time, but knowing the language is not exactly what I look for.

What I look for is 'passion': does the candidate love programming, and can the candidate articulate technical issues with ease?

The other question I try to answer: will the candidate enjoy working in our team, and will we enjoy working with him/her?

Smart people will shine in a certain way, even if they bomb specific questions - they have an opinion, they try, they ask the right questions.

I'd be sad if we had missed out on some of the people in our team because of automated (inhuman) puzzle interviews.

We're not looking for a problem solving machine, we're looking for a partner to create something great together, someone who shares our passion for hacking and someone who we'd love to work with.

I share TFA's opinion that leetcode-style interviews are not the way to go and I hope the industry comes back around and focuses on the human side more.


Here's what they are doing, from the linked blog post:

> Our dev teams had taken to working with candidates to solve a bug or feature as part of the interview process. It was a collaborative effort with the candidate and the team working together to solve a real problem.

Sounds like a great idea.

Except it's so variable. What if you get lucky and get an easy bug, whereas someone else gets something much harder? Let's standardize that by giving everyone the same task to work on.

And what if we want to hire people who aren't fluent in our project's programming language? That would be a big handicap for them. Let's deal with that by having minimally sized "projects" in all the languages, and letting the candidate choose their preferred language.

And we don't want the interviews to be biased against people who don't already have domain knowledge in a specific field. So let's make the task generic enough to be approachable by any good programmer.

Congrats, if you did all of the above, you've reinvented leetcode style interviews.

There's some cool stuff that MS has put into practice. I like the focus on reading/understanding existing code, instead of just writing code. I also like the relaxed pacing and "open book" approach.

But for the most part, I don't think this is really as revolutionary as people think it is.


Recently, I tried something similar with my interviews.

In the past, I'd had lackluster success with brain-teaser, CS-theory, and 'FizzBuzz' interviews. Plus, most motivated candidates had already mastered and memorized the 'Google interview secrets'. So I wasn't getting the right people.

My goal is to recruit productive, engaged software engineers who care about the craft of software, learn things relatively well, and are resourceful. I want collaborative team players with a passion for software, not necessarily geniuses.

So I concocted an interview process that required candidates to develop a small program using all of our current tools. I also gave them an existing, relatively poor codebase to work with and asked them to recommend refactorings, and I gave them access to some of my current team members to ask questions.

It was a great process. But here was the major problem: 75% of the candidates dropped out of the interview process instantly. I'm guessing they had better opportunities with a quicker yield.


Hi HN, I’m Gary from Litebulb (https://litebulb.io). We automate technical onsite interviews for remote teams. When I say “automate”, I should add “as much as possible”. Our software doesn’t decide who you should hire! But we set up dev environments for interviews, ask questions on real codebases, track candidates, run tests to verify correctness, and analyze the code submitted. On the roadmap are things like scheduling, tracking timing, and customizing questions.

I've been a software engineer at 11 companies and have gone through well over a hundred interviewing funnels. Tech interviews suck. Engineers grind LeetCode for months just so they can write the optimal quicksort solution in 15 minutes, but on the job you just import it from some library like you're supposed to. My friends and I memorized half of HackerRank just to stack up job offers, but none of these recruiting teams actually knew whether or not we were good fits for the roles. In some cases we weren't.

After I went to the other side of the interviewing table, it got worse. It takes days to create a good interview, and engineers hate running repetitive, multi-hour interviews for people they likely won't ever see again. They get pulled away from dev work to do interviews, then have to sync up with the rest of the team to decide what everyone thinks and come to an often arbitrary decision. At some point, HR comes back to eng and asks them to fix or upgrade a 2 year old interview question, and nobody wants to or has the time. Having talked with hundreds of hiring managers, VPs of eng, heads of HR, and CTOs, I know how common this problem is. Common enough to warrant starting a startup, hence Litebulb.

We don’t do LeetCode—our interviews are like regular dev work. Candidates get access to an existing codebase on Github complete with a DB, server, and client. Environments are Dockerized, and every interview's setup is boiled down to a single "make" command (DB init, migration, seed, server, client, tunnelling, etc), so a candidate can get started on coding within minutes of accepting the invite. Candidates code on Codespaces (browser-based VSCode IDE), but can choose to set up locally, though we don't guarantee there won't be package versioning conflicts or environment problems. Candidates are given a set of specs and Figma mockups (if it's a frontend/fullstack interview) and asked to build out a real feature on top of this existing codebase. When candidates submit their solution, it's in the form of a Github pull request. The experience is meant to feel the same as building a feature on the job. Right now, we support a few popular stacks: Node + Express, React, GraphQL, Golang, Ruby on Rails, Python/Django and Flask, and Bootstrap, and we’re growing support by popular demand.

We then take that PR, run a bunch of automated analysis on it, and produce a report for the employer. Of course there’s a limit to what an automated analysis can reveal, but standardized metrics are useful. Metrics we collect include linter output, integration testing, visual regression testing, performance (using load testing), cyclomatic/halstead complexity, identifier naming convention testing, event logs, edge case handling, code coverage. And of course all our interview projects come with automated tests that run automatically to verify the correctness of the candidate’s code (as much as unit and integration tests can do, at least—we’re not into formal verification at this stage!)
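As a toy illustration of one such metric (not Litebulb's actual analysis), a crude cyclomatic-complexity count can be derived from Python's stdlib `ast` module by counting branching constructs per function:

```python
# Toy cyclomatic-complexity estimator: 1 + number of branching
# constructs (if/for/while/except/boolean operators) per function.
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp)

def cyclomatic_complexity(source: str) -> dict:
    """Return {function_name: complexity} for each function in source."""
    tree = ast.parse(source)
    report = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            branches = sum(isinstance(n, BRANCH_NODES)
                           for n in ast.walk(node))
            report[node.name] = 1 + branches
    return report

code = """
def classify(x):
    if x < 0:
        return "neg"
    elif x == 0:
        return "zero"
    return "pos"
"""
print(cyclomatic_complexity(code))  # {'classify': 3}
```

A real pipeline would run many such analyzers (linters, coverage, load tests) against the submitted PR and aggregate the results into a report.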

Right now, Litebulb compiles the report, but we're building a way for employers to do it themselves using the data collected. Litebulb is still early, so we're still manually verifying all results (24 hour turnaround policy).

There are a lot of interview service providers and automated screening platforms, but they tend to either not be automated (i.e. you still need engineers to do the interviews) or are early-funnel, meaning they test for basic programming or brainteasers, but not regular dev work. Litebulb is different because we're late-funnel and automated. We can get the depth of a service like Karat but at the scale and price point of a tool like HackerRank. Longer term, we're hoping to become something like Webflow for interviews.

Here's a Loom demo: https://www.loom.com/share/bdca5f77379140ecb69f7c1917663ae5, it's a bit informal but gets the idea across. There’s a trial mode too, for which you can sign up here: https://litebulb.typeform.com/to/J7mQ5KZI. Be warned that it’s still unpolished—we're probably going to still be in beta for another 3 months at least. That said, the product is usable and people have been paying and getting substantial value out of it, which is why we thought an HN launch might be a good idea.

We’d love to hear your feedback, your interview experiences or ideas for building better tech interviews. If you have thoughts, want to try out Litebulb, or just want to chat, you can always reach me directly at gary@litebulb.io. Thanks everyone!


Interviews are largely broken today. People don't write code in front of other people in real time at their jobs. They write it for an interpreter or compiler, and huge amounts of time and resources are spent building and maintaining development environments and pipelines. Trial and error is ingrained into the normal workflow. It baffles me that a few minutes in a shared notepad is commonly considered an accurate gauge of skill.

I think a much more accurate way is to discuss the role and the candidate's background and experience, and if there's a fit, invite them to participate in code reviews and/or a small project.

Companies are cheap though, so they would rather do quick and crappy trivia interviews than something more time consuming and thorough.


> Likewise, I've had some novel interview processes that more closely approximate real working conditions. One company's interview was conducted over GitHub. It was asynchronous, with tasks spread out over a week. After building the first solution, the interviewer came back with further feature requests and comments on the first iteration. This tested the candidate's ability to refactor existing solutions to meet changing tasks, and the ability to integrate feedback. These are things that are rarely captured by whiteboard interviews, but are arguably some of the more important skills in software development. But on the flip side, it was much more time-consuming in aggregate than 4 hour-long interviews.

You can afford to do these if you are Google. If you aren't, candidates sort the places they want to work at in descending order; by the time they get to that take-home, they might already be further along in the interview stages at better companies.


This sounds like a reasonable approach, and that is why software interviews remain broken. I'm sure it works for your organization (not saying you are bad at hiring or anything), but it still smacks of the kind of hoop-jumping that turned me off so much from the process the last time I was interviewing. That included on-the-spot coding exercises, massive take-home projects that required many hours of undifferentiated grunt work, totally useless whiteboard sketching and pseudo-code sessions, and obscure Google-style quiz questions.

The interview process for the job I have now was a massive breath of fresh air.

The application asked for code samples and a CV. It was a small company, and the CEO, CTO, and direct co-workers all drove the interview process, which was entirely conversational. First an intro phone call with the CTO, then a questionnaire via email in which I answered about 30 questions on various topics, all very practical, day-to-day software development stuff; it was painless to respond to each with about a paragraph of detail. Then came a call with a direct co-worker about the questionnaire, which was my opportunity to ask him questions about the company. Before the interview they actually read my code samples and reviewed my GitHub account. The interview itself included a ton of discussion about the company, its culture, and all of the above. Following this was compensation negotiation with the CEO.

Everything about the hiring process said to me yes this is the place, they get it!
