The title is not merely clickbait; it is my current opinion, formed after attending a programming competition for the first time. This post expresses my opinions on the hiring processes of some of the new-age companies that rely on programming competitions and algorithms-focused interviews.
I believe that a senior/architect-level programmer should be assessed by how co-operative [s]he is with others in creating interesting products, and by that track record, rather than by how competitive [s]he is in a contest.
In my lone programming competition experience (on HackerRank), the focus of the challenges was on algorithms (discrete math, combinatorics, etc.).
In real life, where products have to last a long time and survive changing programmers, using standard, simple algorithms is a better idea than using fancy, non-standard ones. Fancy algorithms are usually less well tested and harder for a maintenance programmer to understand.
Often, it is more efficient to use the APIs provided by the standard library or by ubiquitously popular libraries (say, jQuery). Unless you are working in specific areas (say, compilers or memory management), in-depth knowledge of a wide range of algorithms may not be very beneficial (imo) in day-to-day work, as elaborated in the next section.
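As a small illustrative sketch (hypothetical, in Python): the standard library often already ships a well-tested version of the algorithm you were about to hand-roll.

```python
# Binary search: instead of hand-rolling one (and risking the classic
# off-by-one bugs), use the standard library's well-tested bisect module.
import bisect

sorted_scores = [10, 20, 30, 40, 50]

# Find the insertion point for a new score, keeping the list sorted.
index = bisect.bisect_left(sorted_scores, 35)
print(index)  # 3

# Insert while preserving order, again via the standard library.
bisect.insort(sorted_scores, 35)
print(sorted_scores)  # [10, 20, 30, 35, 40, 50]
```

The maintenance programmer who inherits this code gets documented, battle-tested behavior rather than a custom loop to decipher.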
There are various factors that decide runtime performance: disk accesses, caches, scalable designs, pluggable architectures, and points of failure.
Algorithms mostly optimize one aspect: CPU cycles. Other aspects (say, the choice of data structures, databases, frameworks, memory maps, indexes, or how much to cache) have a bigger impact on overall performance. CPU cycles are comparatively cheap, and we can afford to waste some of them rather than do bad I/O or build a non-scalable design.
Most of the time, if you choose proper data structures and get your API design correct, you can plug in the most efficient algorithm without affecting the other parts of the system, if and only if your algorithm proves to be a real bottleneck. A good example is the evolution of filesystems and schedulers in the Linux kernel. Remember that the Intelligent Design school of software development is a myth.
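A minimal Python sketch of this idea (the function name and data are hypothetical): keep the API stable, so the implementation can be swapped once profiling justifies it, without any caller changing.

```python
import heapq

def top_k(items, k):
    """Return the k largest items. Callers depend only on this signature."""
    # First skeleton: simple and obviously correct.
    #   return sorted(items, reverse=True)[:k]
    # Suppose profiling later showed this to be a bottleneck on large
    # inputs; the body can be swapped for a heap-based O(n log k)
    # version -- and no caller has to change.
    return heapq.nlargest(k, items)

print(top_k([5, 1, 9, 3, 7], 2))  # [9, 7]
```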
In my decade of experience, I have seen more performance problems caused by a poor choice of data structures or by unnecessary I/O than by a poor selection of algorithms. Remember what Ken Thompson said: "When in doubt, use brute force."
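To make the data-structure point concrete, here is a hypothetical Python sketch: the same brute-force membership test, where only the container type changes.

```python
import timeit

n = 10_000
as_list = list(range(n))
as_set = set(as_list)

# Identical "is x in the collection?" logic; only the data structure differs.
list_time = timeit.timeit(lambda: (n - 1) in as_list, number=1_000)
set_time = timeit.timeit(lambda: (n - 1) in as_set, number=1_000)

print(f"list: {list_time:.4f}s  set: {set_time:.4f}s")
# The set lookup is typically orders of magnitude faster: O(1) average
# hashing versus an O(n) scan -- same brute force, different container.
```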
It is not important to get the right algorithm on the first try; getting the skeleton right is more important. The individual algorithms can be changed later, after profiling.
At the same time, this should not be misconstrued as an argument to use bubblesort.
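A hypothetical sketch of that workflow using Python's built-in cProfile: ship the simple version first, and let the profile tell you whether it is worth replacing.

```python
import cProfile
import io
import pstats

def find_duplicates(items):
    # Deliberately naive O(n^2) first version -- easy to verify.
    seen, dupes = [], []
    for x in items:
        if x in seen and x not in dupes:
            dupes.append(x)
        seen.append(x)
    return dupes

profiler = cProfile.Profile()
profiler.enable()
result = find_duplicates(list(range(300)) + [7, 8])
profiler.disable()

buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(3)
print(result)  # [7, 8]
# Only if find_duplicates dominates the profile output do we swap the
# lists for sets -- behind the same function signature.
```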
The 10,000 hour rule
Doing well in online programming competitions is mostly the 10,000-hour rule in action. If you spend enough time in competitions and solve enough problems, you will quickly know which algorithm or programming technique (say, dynamic programming or greedy) to employ when you see a problem.
Being an expert at online programming competitions does not guarantee that [s]he can be trusted with building or maintaining a large-scale system …