Wacky interview questions: An exploration into the nature of evidence on the internet

Gayle Laakmann reports (link from Felix Salmon) that Microsoft, Google, etc. don’t actually ask brain-teasers in their job interviews. They actually ask a lot of questions about programming. (I looked here and was relieved to see that the questions aren’t very hard. I could probably get a job as an entry-level programmer if I needed to.)

Laakmann writes:

Let’s look at the very widely circulated “15 Google Interview Questions that will make you feel stupid” list [here’s the original list, I think, from Lewis Lin] . . . these questions are fake. Fake fake fake. How can you tell that they’re fake? Because one of them is “Why are manhole covers round?” This is an infamous Microsoft interview question that has since been so very, very banned at both companies. I find it very hard to believe that a Google interviewer asked such a question.

We’ll get back to the manhole question in a bit.

Laakmann reports that she never saw any IQ tests in three years of interviewing at Google and that “brain teasers” are banned. But . . . if brain teasers are banned, somebody must be using them, right? Otherwise, why bother to ban them? For example, one of her commenters writes:

I [the commenter] have been phone screened by Google and so have several colleagues. I can say that the questions are different depending on who is asking them. I went in expecting a lot of technical questions, and instead they asked me one question:

“If I were to give you $1000 to count all the manholes in San Francisco, how would you do it?”

I don’t think you can count on one type of phone screen or interview from Google. Each hiring team probably has their own style of screening.
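That screening question, incidentally, is a classic Fermi-estimation exercise: the interviewer wants to see you decompose the problem into quantities you can guess. Here's a minimal sketch in Python, where every input is an invented assumption for illustration, not a researched fact:

```python
# Fermi estimate: rough count of manholes in San Francisco.
# All three numbers below are guesses chosen for illustration.

city_area_sq_miles = 47     # approximate land area of San Francisco
blocks_per_sq_mile = 100    # assumed density of city blocks
manholes_per_block = 3      # assumed: sewer, storm drain, utilities

estimate = city_area_sq_miles * blocks_per_sq_mile * manholes_per_block
print(f"~{estimate:,} manholes")  # on these assumptions, ~14,100
```

Swap in your own guesses; the point of the exercise is the decomposition, not the final number.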

And commenter Bjorn Borud writes:

Though your effort to demystify the interview process is laudable you should know better than to present assumptions as facts. At least a couple of the questions you listed as “fake” were used in interviews when I worked for google. No, I can’t remember ever using any of them (not my style), but I interviewed several candidates who had other interviewers ask some of these. Specifically I know a lot of people were given the two eggs problem, which is not an entirely unreasonable problem for observing problem-solving skills.

And commenter Tim writes:

I was asked the manhole cover question verbatim during a Google interview for a Datacenter Ops position.

What we seem to have here is a debunking of a debunking of an expose.

Who do we believe?

You’ll be unsurprised to hear that I think there’s an interesting statistical question underlying all this mess. The question is: Who should we believe, and what evidence are we using or should be using to make this judgment?

What do we have so far?

– Felix Salmon implicitly endorses the analysis of Laakmann (whom he labels as “Technology Woman”). I like Salmon; he seems reasonable and I’m inclined to trust him (even if I still don’t know who this Nouriel Roubini person is whom Salmon keeps mocking for buying a 5 million dollar house).

– Salmon associated the “fake” interview questions with “Business Insider,” an unprofessional-looking website of the sort that clogs the web with recycled content and crappy ads.

– Laakmann’s website looks professional (unlike that of Business Insider) and reports her direct experiences at Google. After reading her story, I was convinced.

– There was one thing that bugged me about Laakmann’s article, though. It was the very last sentence:

Want to see real Google interview questions, Microsoft interview questions, and more? Check CareerCup.

I followed the link, and CareerCup is Laakmann’s commercial website. That’s fine–we all have to earn a living. But what bothered me was that the sentence above contained three links (on “Google interview questions,” “Microsoft interview questions,” and “CareerCup”)–and they all linked to the exact same site. That’s the kind of thing that spammers do.

Add +1 to Laakmann’s spam score.

– I didn’t think much of this at first, but then there are the commenters, who report direct experiences of their own that contradict the blog’s claims. And I couldn’t see why someone would bother to write in with fake stories. It’s not like they have something to sell.

– Laakmann has a persuasive writing style, not in the mellow manner of Salmon (or myself) but more in the in-your-face style of Seth Godin, Clay Shirky, Philip Greenspun, Jeff Jarvis, and other internet business gurus. This ends up being neutral for me: the persuasiveness persuades me, then I resist the pushiness, and the net effect is that the article is neither more nor less convincing than if it were written in a flatter style.

What do I think? I’m guessing that Laakmann is sincere but is overconfident: she’s taking the part of the world she knows and is generalizing with too much certainty. On the other hand, she may be capturing much of the truth: even if these wacky interview questions are used occasionally, maybe they’re not asked most of the time.

My own story

As part of my application to MIT many years ago, I was interviewed by an alumnus in the area. We talked for a while–I don’t remember what about–and then he said he had to go off and do something in the other room, and while I was waiting I could play with these four colored cubes he had, that you were supposed to line up so that the colors on the outside lined up. It was a puzzle called Instant Insanity, I think. Anyway, he left the room to do whatever, and I started playing with the cubes. After a couple of minutes I realized he’d given me an impossible problem: there was no possible way to line up the cubes to get the configuration he’d described. When he returned, I told him the puzzle was impossible, and he gave some sort of reply like, Yeah, I can’t figure out what happened–maybe we had two sets and lost a couple of cubes? I still have no idea if he was giving this to me as some kind of test or whether he was just giving me something to amuse myself while he got some work done. He was an MIT grad, after all.

13 thoughts on “Wacky interview questions: An exploration into the nature of evidence on the internet”

  1. I find it pretty amusing that reasonable programming questions and silly brain teasers aren't always distinguishable. She uses "Explain the significance of 'dead beef'" as an example of a wacky question Google would never really ask a programmer. But 0xDEADBEEF is a hexadecimal number lots of compilers use to fill up newly freed memory. If you see a bunch of 0's in a memory dump you don't know if that's uninitialized memory or if it's supposed to have 0's. But 0xDEADBEEF is pretty much a giveaway that something is wrong here.

    On your original point, though, "X doesn't happen" requires a lot more evidence than "X happens". If she says she never heard of any of these questions being asked, and commenters say they have, I'd be inclined to believe both of them, thus "X happens". This is one of those cases where you can derive the more likely truth value from first principles rather than from how trustworthy the website looks.
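    To make the "dead beef" point concrete: the sentinel is trivial to spot in a dump mechanically. A small Python sketch, where the dump bytes are made up for illustration:

    ```python
    # Scan a (made-up) memory dump for the 0xDEADBEEF sentinel that some
    # debug allocators use to mark freed or uninitialized memory.
    SENTINEL = bytes.fromhex("deadbeef")

    dump = bytes.fromhex("00000000deadbeefdeadbeef00000000")

    # Byte offsets at which the 4-byte sentinel pattern appears.
    offsets = [i for i in range(len(dump) - 3) if dump[i:i+4] == SENTINEL]
    print(offsets)  # [4, 8]
    ```

    Runs of zeros are ambiguous; a run of 0xDEADBEEF words is a loud signal that the memory was freed and should not be in use.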

  2. Let me clarify a few things here:

    (1) The *list* is fake, although some of the questions might happen to be asked. To clarify the difference, suppose I made up and published a list of things you said to your coworker yesterday, including (a) "you're a horrible person", (b) "you're stupid" and (c) "I'm not sure about that." If I created that list out of nowhere and you never said (a) or (b), it would be correct for you to say that the list is fake even if you did just so happen to say (c). Your post (and some of the commenters on mine) generally miss this distinction.

    (2) Business Insider selected 15 questions from a blogger / career coach's list of 140 Google interview questions. Business Insider reported them as "questions from the coach's clients" although the coach never actually said that. In fact, you can find the same questions *word for word* from other sources, posted before this coach ever posted his. They weren't from his clients – he pulled them from elsewhere on the web.

    Now, on to your specific points:

    >>> "But . . . if brain teasers are banned, somebody must be using them, right? Otherwise, why bother to ban them?"

    Because they used to be quite common at many companies, perhaps most notably Microsoft. They are no longer allowed, at least for software engineers. However, everyone's definition of "brain teaser" varies. For example, which of these are brain teasers:
    (a) A man pushed his car to a hotel and lost his fortune. What happened?
    (b) You have access to a 100-story building and you only have two eggs. Find the maximum height you can drop an egg from without it breaking.
    (c) Describe how you would select a random node from a binary search tree.
    Most people would say that (a) is a brain teaser, many would call (b) a brain teaser, and some would extend that to (c).

    >>> "If I were to give you $1000 to count all the manholes in San Francisco, how would you do it?"

    That's not a brain teaser (at least in my opinion). That's in the same style as "How many pizzas are ordered in Chicago?" It would be very unlikely to be asked in a software engineering interview, but for other positions, yes. Problems like that are quite common for consulting positions at the big three (Bain, BCG, McKinsey).

    >>> "At least a couple of the questions you listed as 'fake' were used in interviews when I worked for google… Specifically I know a lot of people were given the two eggs problem"

    Indeed, the egg drop question was common. Again, fake list, some fake questions, and some real questions. It sounds like the commenter was only saying that some of the Business Insider questions are actually real (which I agree with).

    >>> "I was asked the manhole cover question verbatim during a Google interview for a Datacenter Ops position."

    This is actually the first comment that disagrees with me. Perhaps this person had an awful interviewer, or perhaps they're lying. Regardless, it's far from normal, and is banned under almost anyone's definition of brain teaser. I'm not sure it's really fair to say "Google asks this question" any more than you can say that Google asks about your marital status. Has someone asked that? Probably. But it's very far from representative.

    >>> "She's taking the part of the world she knows and is generalizing with too much certainty."

    That's fair. I admit that I obviously don't know what everyone has ever asked at Google. Perhaps I should be clearer in saying "in the thousands of Google interviews I've seen, I have never seen someone ask this type of question. However, some interviewers may have asked this type of question before." But then, I also think that that's obvious.

    Analogy: "Doctors will ask for your consent before performing an operation, as this is required under law."
    * Interpretation A: "Doctors will almost always…"
    * Interpretation B: "There has never once been a single doctor who did not…"

    My guess is that most people will interpret it as A, even if the statement literally means B.

    Really, I think it's less that I'm generalizing from my own experiences and more that I'm taking "this is very unlikely" and shortening it to "this doesn't happen."
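    Since the two-egg problem keeps coming up: it has a tidy dynamic-programming solution. A sketch using the standard textbook recurrence (nothing specific to any company's version of the question):

    ```python
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def floors_covered(drops, eggs):
        """Max number of floors distinguishable with this budget of drops/eggs."""
        if drops == 0 or eggs == 0:
            return 0
        # One drop: if the egg breaks, search the floors below with eggs-1;
        # if it survives, search the floors above with the same eggs.
        return 1 + floors_covered(drops - 1, eggs - 1) + floors_covered(drops - 1, eggs)

    def min_drops(floors, eggs):
        """Smallest worst-case number of drops needed to find the threshold floor."""
        drops = 0
        while floors_covered(drops, eggs) < floors:
            drops += 1
        return drops

    print(min_drops(100, 2))  # 14
    ```

    The trick is to invert the question: instead of "how many drops for 100 floors," ask "how many floors can a budget of k drops cover," then find the smallest sufficient budget. With two eggs, k drops cover k(k+1)/2 floors, so 14 drops suffice for 100.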

  3. A very early Egyptian document addresses this

    How one should contrast and appropriately combine discrepant evidence from different witnesses – see The Science of Conjecture: Evidence and Probability Before Pascal by James Franklin.

    Would be interested to conjecture how much progress has been made ;-)

    For my own story, when training a replacement, I wanted to give them some safe experience in consulting – so I would purposely show up late for the appointment or have a fake telephone call placed about an emergency I had to leave for. Seemed to work and I don’t think I got caught.

    And of course some of my training in MBA school was preparing and practicing responses to the well-circulated interview questions at the time. People get better at answering them until they finally get hired.

    K?

  4. Gayle:

    Thanks for the additional details. I linked to Lewis Lin's site because it appeared to be the original list (even if not an original set of questions). The Business Insider site was crap. It had only 15 questions (rather than the full list), was full of irrelevant interviews, required clicking through and waiting for each page to load, and didn't include the answers (at least anywhere I could see). I didn't see the point in wasting my readers' time by linking to a site like that.

    I don't see why it's a problem that Lin repeated questions from other sites (although I agree that he should credit the sources where he found them). If you're putting together a set of questions, it makes sense to grab them where you can.

    Regarding generalizations: I had the impression from your original post that you said these sorts of questions were never asked at Google. You have since qualified that you don't think they ask them of software engineers, but they might ask them elsewhere at Google.

    Regarding the list of questions, you wrote:

    These questions are fake. Fake fake fake. How can you tell that they're fake? Because one of them is "Why are manhole covers round?" This is an infamous Microsoft interview question that has since been so very, very banned at both companies. I find it very hard to believe that a Google interviewer asked such a question.

    But I don't think anyone is claiming that this is a list of questions that are always asked, only that they are sometimes asked. Based on your paragraph above, I'd guess that the round manhole question is the "fakest" question on the list. Now it's true that the Tim who commented on your website might be lying or confused. But, if he is remembering correctly, then the fakest item on the list is actually real. And it sounds like you have no problem with the programming questions on the list, and you agree that questions like the manholes in San Francisco and the egg drop are real. So how can you be so sure that "these questions are fake"?

    I see your point, that anyone can make up questions and put them on the web, and in some sense the burden of proof is on the Lewis Lins of the world to convince people that the questions on their lists are real. I'm just surprised that you're so sure the list is fake fake fake, given that all the questions in dispute seem to have actually been asked in Google interviews (at least according to your commenters).

    In any case, I imagine that Google interviewers, having seen such lists floating around on the web, would start changing their questions pretty quickly. In practice I'd guess that this whole conversation is already out of date.

  5. I pretty much disagree with everything Gayle said, and I don't know anything about Google interviews so I'll just make some comments about brainteaser-type interview questions in general. I have been subjected to such interview questions, and vehemently disapprove of them.

    1) Anyone who has interviewed with any one of the consulting/i-bank outfits can smell a "brainteaser" from a hundred miles out. There is no controversy about what a brainteaser is.

    2) Brainteasers do not reveal a candidate's analytical reasoning. They reveal whether the candidate has heard of the particular brainteaser before.

    3) Brainteasers reveal little to nothing of a candidate's ability to perform the job he/she is interviewing for. (For my first job, I was asked one that requires the use of the pigeonhole principle; I am not aware of a circumstance under which a consultant or banker would need to call upon the pigeonhole principle.)

    4) Brainteasers are used to amuse interviewers and prove to them that they are smarter than the job seekers.

    5) Contrary to the spiel, brainteasers have answer keys. See point 4.

    6) Some brainteasers require unconventional solution techniques. For example, you may need to use an inelegant brute-force technique because the data is arranged in such a way as to stump any attempt at a general solution – hardly the kind of problem-solving skill you want colleges to teach.

    7) Contrary to belief, most business jobs do not require Mensa-level IQ. This applies to Google jobs, consulting jobs and banking jobs.

    8) Brainteasers can be used to filter out candidates who are not good "fits", which is to say, do not look like other employees of the company. (Take this "fake" question: how many golf balls can one fit into a Ferrari 360?)

  6. Why are these questions banned?

    Anything involving hiring by deep-pocketed firms has public policy implications due to the legal doctrines of disparate impact and business necessity. For example, when I worked for Dun & Bradstreet and needed to hire a programmer, I asked HR for their written programmer's test. They said they had no such thing and that I was forbidden to put any questions in writing to job applicants. (Written tests leave a paper trail.) However, I was informed, I was free to ask anybody anything orally about programming.

  7. I have to disagree with you, Kaiser. As an MIT student studying Course 6 (EECS) who has taken a couple of algorithms classes, I'd say your ability to solve "brain teasers" determines how creative you can be in solving complex problems.

    Just about every interview I've done has involved some kind of brain teaser. And the brain teaser usually involves implementing some kind of data structure such that it minimizes the number of operations in order to solve the problem (or minimizes the space required).

    I agree that the manhole teaser is an extreme example, but most brain teasers do test the creative intelligence of a candidate.

    I also agree that some brain teasers are over-used. Most interviewers tell me that if I've seen the problem before, then I should let them know. And yes, I am truthful and tell them, but there are those who beat the system and solve brain teasers that they've previously solved.

  8. I see (at least) three different types of question in this list:

    Gotcha! questions. You either know the answer or you don't, and in the latter case there is no chance of working out the right answer in an interview. This is fine, as long as the question has something to do with the job. You might expect a computer scientist to know what "dead beef" means, for example, and know the complexity of a sorting algorithm. If the job requires some understanding of probability and statistics, you might expect the candidate to know how to work with a Poisson process and understand the optional stopping theorem. These questions only seem bafflingly hard to a lay person because they rely on years of training. On the other hand, Gotcha! questions about manhole covers and Monopoly surely don't discriminate between those who can do the job and those who can't.
    Guesstimation questions. These present very open ended problems with insufficient information to come up with a precise answer, like filling a bus with golf balls or counting the number of piano tuners in a town. Such questions are not bad interview questions. It's not about getting the "right" answer but about exposing the way you think about new problems. You need to identify the required inputs and explain clearly how you would put them together to find the solution.
    Logic puzzles where you apparently have insufficient information to solve the problem, or where the obvious strategy is really sub-optimal. Unfortunately there are relatively few original logic problems and you tend to keep coming back to the same problem dressed up in different ways. The lists given by Business Insider are full of examples. They give two questions about clocks (one and two) that rely on the same observation (the hour hand moves in continuous time, not in discrete one-hour jumps). The question about weighing balls is a variation of the "nine coins" problem that I first read about in a Martin Gardner book when I was a boy. The question about philandering husbands and their vengeful wives is essentially the same as a problem I read in a book that my dad had when he was a boy, although it was presented in a completely different way (3 men have to guess the colour of a sticker on their own forehead). If you have ever heard of the old chestnut about rowing a fox, a chicken and a bag of grain over a river, you can probably get the key to answering the question about getting fast and slow campers over a rickety bridge. So these questions only discriminate between people who have spent a lifetime studying brain teasers and people who haven't. If your aim is to fill your organization with "people like us" maybe you can do that, but it does not make a healthy organization.

    By the way, the "Instant Insanity" problem did indeed drive me crazy when I was a boy. Then at university I discovered how it could be presented as a problem in graph theory, rendering it tractable.
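    For the curious, the puzzle also yields to plain brute force over cube orientations (a different technique from the graph-theory approach mentioned above). A Python sketch; note the cube set here is constructed so that a solution certainly exists, rather than being the commercial puzzle's actual cubes:

    ```python
    from itertools import product

    # A cube is a tuple of face colors: (Up, Down, Front, Back, Left, Right).

    def roll(c):   # tip the cube backward: the front face rotates to the top
        U, D, F, B, L, R = c
        return (F, B, D, U, L, R)

    def turn(c):   # spin the cube about its vertical axis
        U, D, F, B, L, R = c
        return (U, D, L, R, B, F)

    def orientations(c):
        """The 24 orientations of a cube, via the standard roll/turn enumeration."""
        out = []
        for _ in range(2):
            for _ in range(3):
                c = roll(c)
                out.append(c)
                for _ in range(3):
                    c = turn(c)
                    out.append(c)
            c = roll(turn(roll(c)))
        return out

    def solved(stack):
        # Cubes lined up in a row: the Up, Down, Front and Back strips are
        # visible, and each strip must show four different colors.
        return all(len({c[i] for c in stack}) == 4 for i in range(4))

    def solve(cubes):
        for stack in product(*(orientations(c) for c in cubes)):
            if solved(stack):
                return stack
        return None

    # Constructed-to-be-solvable instance: cube i shows colors i..i+3 (mod 4)
    # on Up/Down/Front/Back, then each cube is scrambled.
    COLORS = "RGBY"
    cubes = [tuple(COLORS[(i + k) % 4] for k in range(4)) + (COLORS[i], COLORS[i])
             for i in range(4)]
    cubes = [orientations(c)[7] for c in cubes]  # index 7 is an arbitrary scramble

    print(solved(solve(cubes)))  # True
    ```

    At 24^4 = 331,776 candidate stacks the search is instant on a modern machine, which is exactly why brute force is a reasonable fallback when you can't spot the elegant structure.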

  9. @Andrew "I don't see why it's a problem that Lin repeated questions from other sites… If you're putting together a set of questions, it makes sense to grab them where you can."

    That's fine that he did. I've never asserted that Lewis is lying. However, the more degrees of separation away from the original source, the higher the likelihood that someone lied about or misinterpreted the question.

    Given that the manhole question:
    (1) does not match up with typical Google questions
    (2) should be banned at Google
    (3) is very well known for being a Microsoft question
    (4) was copied and pasted across many sites, with no clear attribution to a specific candidate
    I think it's far more likely that it's fake.

    @Kaiser "Anyone who has interviewed with any one of the consulting/i-bank outfits can smell a "brainteaser" from a hundred miles out. There is no controversy about what a brainteaser is."

    In that case, I wish you had been there when the Google Hiring Committee I was on used to debate whether or not certain questions were brainteasers. We could never agree.

  10. Ok, I did some digging and actually have some pretty solid evidence that the list was faked (at least as far as attributing all the questions to Google).

    Read on below, but the gist is this:
    (1) I found Lewis' original source for several questions (it was a CNN article).
    (2) On the original source, the questions are not attributed to Google.

    See below.

    Lewis lists these Google questions on his post (http://blog.seattleinterviewcoach.com/2009/02/140-google-interview-questions.html):
    (1) "How much should you charge to wash all the windows in Seattle?"
    (2) "Why are manhole covers round?"
    (3) "You have five pirates, ranked from 5 to 1 in descending order. The top pirate has the right to propose how 100 gold coins should be divided among them. But the others get to vote on his plan, and if fewer than half agree with him, he gets killed. How should he allocate the gold in order to maximize his share but live to enjoy it? (Hint: One pirate ends up with 98 percent of the gold.)"

    One of many links he cites at the bottom of his post is this CNN article: http://money.cnn.com/2007/08/29/technology/brain_

    Go there, and you'll see all three of those questions listed – the lengthy pirate one is even listed word for word. I think we can assume that this was his original source, what with the word-for-word copying and his citing the link.

    Now, let's read that CNN article.
    (1) The manhole question is attributed to Microsoft.
    >> "'We want to gauge people's creativity,' says Warren Ashton, recruiting manager at Microsoft. The manhole cover problem is Ashton's personal favorite."

    (2) The window washing question is attributed to Amazon.
    >> "That's why Amazon.com interviewers, for example, have been known to ask job candidates to … ballpark that bill for washing all of Seattle's windows."

    (3) The pirate coin question is attributed to eBay.
    >> "eBay often hits candidates with a word problem that goes like this: You have five pirates, ranked from 5 to 1 in descending order."

  11. Gayle:

    Interesting; thanks for the link. The only thing that puzzles me is that now you're saying that Lin's list is evidently fake fake fake, it includes questions that you know Google would never ask, etc., but there's a comment from you on Lin's site from 1 Dec 2009 writing "Great list!"

    "but there's a comment from you on Lin's site from 1 Dec 2009 writing 'Great list!'"

    And then I looked through the list more thoroughly and noticed that a number of the questions didn't seem real.

    Anyway, now that you do (even if begrudgingly) agree that Lewis' list is fake, can I expect a clarification from you that this is not "a debunking of a debunking of an expose"? I believe I have shown pretty conclusively that, in fact, Lewis' list is fake (in that it labels non-Google questions as Google questions).

  13. Gayle:

    Thanks for corresponding. I am not intending to be "begrudging" here. This is not an area I know anything about; I'm just reporting what I see. It seems perfectly reasonable to me that (a) Lin's list is fake (as evidenced by your links) but that (b) it's not as obviously fake as you originally stated (as evidenced by your earlier comment on Lin's site). Add to this the diversity of people's interviews in different segments of a large company, and confusion reigns.

    If nothing else, perhaps the dissemination of these lists will make interviewers at Google and elsewhere less likely to rely on trick questions.
