>The LC interviews are like testing how fast people can run 100m after practice
Ah, but the road to becoming good at Leetcode/the 100m sprint is:
>a slow arduous never ending jog with multiple detours and stops along the way
Hence Leetcode is a reasonably good test for the job. If it didn't actually work, it would've been discarded by companies long ago.
Barring a few core library teams, companies don't really care if you're any good at algorithms. They care whether you can learn something well enough to compete at a world-class level. If you can show that you can become excellent at one thing, there's a good chance you can become excellent at another.
That's basically also why many law and med programs don't care what your undergrad major was, just that you earned a very high GPA in whatever you studied. A decent number of music majors become MDs, for example.
But why stop there? Why not test candidates with problems they have never seen before? Or with problems similar to the ones the hiring organization actually faces? Leetcode mostly rewards memorizing patterns with a shallow understanding; it shows only that candidates can game the test. Does that imply quality in any way? Some people argue that being willing to study for Leetcode demonstrates some virtue. I very much disagree with that.
> If it didn't actually work, it would've been discarded by companies long ago.
The sentence I've singled out above is a very confident statement, considering that inertia in large companies is a byword at this point. Further, "work" could conceivably mean many things in this context, from "merely narrows our massive applicant pool" to "selects for factor X," X being clear only to certain management in certain sectors. Regardless, I agree with those who find it obvious that LC does not ensure fitness for almost any real-world job.
> If it didn't actually work, it would've been discarded by companies long ago
You're assuming that something else works better. Imagine if we were in a world where all interviewing techniques had a ton of false positives and negatives without a clear best choice. Do you expect that companies would just give up, and not hire at all, or would they pick based on other factors (e.g. minimizing the amount of effort needed on the company side to do the interviews)? Assuming you accept the premise that companies would still be trying to hire in that situation, how can you tell the difference between the world we're in now and that (maybe not-so) hypothetical one?
Does it work though?
When I look at the messy Android code, Fuchsia's commercial failure, Dart being almost killed by politics, Go's marvellous design, WinUI/UWP's catastrophic failure, how C++/CX got replaced with C++/WinRT, the ongoing issues with macOS Tahoe, ...
I am glad that apparently I am not good enough for such projects.
It's also a filter for people who are ok with working hard on something completely pointless for many months in order to get a job.
> Hence Leetcode is a reasonably good test for the job. If it didn't actually work, it would've been discarded by companies long ago.
I see it differently. I wouldn't say it's reasonably good; I'd say it's a terrible metric that's very tenuously correlated with on-the-job success, but most of the other metrics for evaluating fresh grads are even worse. In the land of the blind, the one-eyed man is king.
> If you can show that you can become excellent at one thing, there's a good chance you can become excellent at another thing.
Eh. As someone who did tech and then medicine: a lot of great doctors would make terrible software engineers, and vice versa. Some things, like work ethic and organization, will increase your odds of success at nearly any task, but there are plenty of other skills that are not nearly as transferable. For example, being good at memorizing long lists of obscure facts is a great skill for a doctor, not so much for a software engineer. Strong spatial reasoning is helpful for a software developer specializing in algorithms, but largely useless for, say, an oncologist.
> Hence Leetcode is a reasonably good test for the job. If it didn't actually work, it would've been discarded by companies long ago.
This is an appeal to tradition and a form of survivorship bias. Many successful companies have ditched LeetCode and have found other ways to effectively hire.
> If you can show that you can become excellent at one thing, there's a good chance you can become excellent at another thing.
My company uses LeetCode. All I want is sane interfaces and good documentation. We are far more likely to get something clever, broken, and poorly documented than something "excellent", so something is missing from this correlation.
> If it didn't actually work, it would've been discarded by companies long ago
That makes the assumption that company hiring practices are evidence based.
How many companies continue to use pseudo-scientific Myers-Briggs-style tests?
LC interviews were made popular by companies that were started by CS students because they like feeling that this stuff is important. They're also useful when you have massive numbers of applicants to sift through because they can be automated and are an objective-seeming way to discard loads of applicants.
Startups that wanted to emulate FAANGs then cargo-culted them, particularly if they were also founded by CS students or ex-FAANG (which describes a lot of them). Very, very few of these actually try other ways of hiring and compare the results.
Being able to study hard and learn something well is certainly a great skill to have, but Leetcode is a really poor way to test for it. It's not a skill you can acquire on the job, so it rules out anyone who doesn't have months of their own time to spend studying something inherently not very useful. If companies instead tested skills that are hard and take effort to learn, but are also relevant to the job, they could also find people who are good at learning on the job, which is what they are actually looking for.