Disclaimer/Foreword: The aim of writing this series of posts is to allow our modern interview processes to be viewed through the lens of history (chronicling the process) and science (identifying problems and inferring improvements from process changes). This series will contain 3 parts: the first one (this post) describes how interviews got to be the way they are today, the second will describe how you can maximize your chance of success at these interviews, and the third will talk about suggestions to improve the process and fix some of the issues plaguing tech interviews.
I’ve been interviewing in the “tech industry” for around ten years now. In that time, the interview process has undergone some interesting changes. Presently, I’d assume almost every engineer in the industry is all too familiar with the onerous and oft-frustrating process that is tech interviews. Yet, except for a few minor differences, the process has been standardized* throughout the industry. How did it get this way?
The beginnings (prehistory to 2006)
We start our story in the early 2000s, when companies (Google) would basically ask riddles as interview questions (riddle process). Questions like, “How would you escape if you were stuck in a blender?” or ones about pirates dividing gold coins on ships. While “smart” people would (theoretically) clear these interviews, I presume this process led to complaints that these riddles had nothing to do with on-the-job skills. This was then “fixed” in the mid-2000s by having questions that were still puzzles but involved algorithms, data structures and coding as well (algorithmic puzzle process). For what it’s worth, I did some basic googling to see if I could verify my theory and it seems to hold true, at least for a few cases.
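To give a flavor of the genre (this is an illustrative example of an algorithmic-puzzle-style question, not one cited in any particular company’s loop): given a list of numbers and a target, find the indices of two numbers that sum to the target. The expected answer usually exercises a data structure, here a hash map for one-pass lookup:

```python
def two_sum(nums, target):
    """Return indices (i, j) of two entries summing to target, or None."""
    seen = {}  # maps value -> index where it was first encountered
    for i, n in enumerate(nums):
        complement = target - n
        if complement in seen:
            # the complement was seen earlier, so we have our pair
            return seen[complement], i
        seen[n] = i
    return None

print(two_sum([2, 7, 11, 15], 9))  # -> (0, 1)
```

The point of such questions was that the brute-force nested loop is obvious, and the interviewer watches whether the candidate finds the O(n) refinement.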
In the initial years after the change, I’d assume it worked really well! Most candidates were only able to clear the interview if they could show how they grappled with a logical problem, using their toolbox of algorithms and data structures to stitch a solution together. Even if they didn’t arrive at a working solution, as they worked through the question, their thought process could be “judged” to see if they were “smart” based on how they attacked the problem. Hence, you’d frequently see interviewers mention that you didn’t need to solve the problem to get through - it’s about how you think about the problem, how you work your way through it. I presume current stalwarts of the industry (like Jeff Dean and Peter Norvig) would have had little-to-no concerns with the process, as they seem to like these kinds of puzzles and would have fared well in this kind of interview. However, these are still puzzles and do not correlate very highly with the job. Whether “intelligence” as judged by puzzle solving transfers to on-the-job skills (or more generally, to anything) is an entire blog post by itself, but I’d venture a guess that it doesn’t.
The interview process at this stage (2007-2015ish) fulfilled a couple of criteria. Firstly, it filtered out many candidates, optimizing to minimize false positives, i.e. candidates who got through were almost certain to be “smart” (although false negatives could be fairly high). Secondly, the process allowed interviewers to analyze how candidates approached problems, and get an insight into their thought process.
However, the ‘algorithmic puzzle’ interview process fails on multiple counts. Firstly, I’m not sure how well it translates to on-the-job problems. On-the-job problems are usually about strategic planning or interpersonal interactions, or - if technical - are much harder, but allow for multiple attempts and collaboration (with others and the internet), and do not have a 40-minute time limit. Secondly, a candidate who is a strong ‘no’ at the first interview will still be asked to sit through the remaining three interviews, wasting both the candidate’s and the interviewers’ time. A phone screen is an attempt at remedying this, but it can fail for a very “omnipresent” reason - bias. Bias is the third main problem with the interview process, stemming from the fact that a single interviewer is the be-all and end-all of your interview performance: it matters less how you actually did than how the interviewer thought you did. Some measures were taken to reduce this bias, like having interviewers note down the questions asked and the answers given, and in some cases having a hiring committee make the decision to move forward once the interviewers had agreed the candidate was a potential hire (a single interviewer’s negative bias can still sink a candidate, but a positive bias alone can no longer carry one through).
One major pitfall, which would soon be exposed, was that this only tested anything meaningful (see caveat) if the candidate had not seen the question before. If the candidate had, it was effectively a test of how well they could memorize a solution, recall it, and pretend they came up with it on the spot.
The moment of reckoning
Around 2016, there started to be a distinct change in the interview process. I’d like to think it was a direct result of Max Howell airing his frustrations publicly. Max is the creator of Homebrew - a package manager that lets Mac users install other pieces of software. This created a moment of reckoning - it looked like the number of false negatives might be far too high with this process, even if the number of false positives was low. Companies had a number of different paths they could take at this stage. Would they take a few months to rethink and redesign the process? Unfortunately, the path chosen was a different one. Most companies simply doubled down on the existing process by deciding that the reason Mr. Howell was not able to solve the interview question was not that the skills being tested were irrelevant, but rather that he was not aware these were the kinds of questions that were going to be asked! So, companies started providing candidates with preparation materials and links to websites like TopCoder, HackerRank, etc. so that candidates could spend hours and days and weeks poring over “leetcode problems”. This was a very different point of view from before, as evidenced by some advice given in “Cracking the Coding Interview” - the book notably advises that you should let the interviewer know if you have seen the question before. However, if the candidate is supposed to prepare, and they prepare well, then it should be expected that they will have already seen most questions! And, given that the chance an interviewer can reliably detect whether you have done a question before is pretty low (especially if you throw in a few “missteps”), it makes sense not to volunteer that you have seen the question before, as doing so only puts you at a disadvantage relative to candidates who don’t divulge this information.
In addition, if the interviewer proceeds with the question anyway, having revealed that you knew it can make it seem like you didn’t have to think, and lead to a rejection.
The hardening (2016+)
The lack of a significant change in the interview process meant that soon many candidates had prepared extensively for the questions.
Reading Steve Yegge’s blog post from the mid-00s shows how far we have strayed from the original vision of the process:
Long-term warming up means: study and practice for a week or two before the interview. You want your mind to be in the general “mode” of problem solving on whiteboards.
A week is no longer nearly enough time. I’d expect it would take at least a couple of months if you wanted to clear multiple interview loops, and ideally a year if you haven’t interviewed in some time.
Feel free to ask for help or hints if you’re stuck. Some interviewers take points off for that…
This, again, no longer holds true. From experience, interviewers are now loath to give hints, and I would assume this is because most candidates breeze through “hard” questions in under 15 minutes. It would be unbelievable for someone to genuinely work out some of these problems that quickly (unless they were a genius - which should be rare), and yet this is now the expectation.
For an example of how much some candidates prepare, and how meticulously, look no further than Mauricio Poppe’s page. It’s gotten so bad that there are entire businesses built around preparing candidates for interviews (LeetCode, “Cracking the Coding Interview”, etc.) - not teaching skills for the job, just how to get the job. To truly grasp the absurdity, imagine a dentist with 10+ years of experience needing to take a course to find a job as a dentist.
Over time, instead of the interview process actually getting better, it has gotten incredibly hard, with an increase both in the number of questions you are expected to solve and in their difficulty. All it would take is for one company to look at the interview process and revamp it successfully (à la a certain search company in the early ’00s). A successful revamp means better hires, better hires mean a more productive company, and that means a better-run business. The fact that this has not happened yet may mean that hiring is overrated: interviews are overrated (since the questions don’t reflect on-the-job skills), and anyone even slightly capable is able to do the job (usually by learning on the job).
Or it could mean that fixing hiring is a really hard problem.