Only a Sith deals in absolutes.
Obi-Wan Kenobi, Star Wars Episode III: Revenge of the Sith
It might be due to my age (I recently turned thirty-six), but lately I’ve been noticing a certain aversion in myself to tech holy wars. It’s something between “C’mon, you can’t be serious” and “Not again”. I know they have a decades-old tradition in computer science, but still, I find them somewhat irritating.
Discussions about programming languages are full of these. For example, there is the dynamic vs. static typing debate, functional vs. object-oriented programming, the question of the right abstraction for concurrency (locks, actors, STM, or something entirely different), immutable vs. mutable data structures, and so on. Of course, most of this discussion is conducted in a reasonably rational tone, but every now and then you meet people who categorically reject anything which isn’t X.
In database systems, there is the NoSQL vs. SQL debate, which breaks down to strong consistency vs. eventual consistency, “scaling up” vs. “scaling out”, and so on.
In machine learning, we have the frequentist vs. Bayesian divide. I often forget that people take this seriously, but then I run into people rejecting ideas simply because they come from the other camp.
Note that most of these questions are pretty big ones: programming languages, databases, notions of probability and inference under uncertainty. There are few holy wars over smaller things.
I had an interesting exchange on Twitter today with David MacIver about this, and we agreed that in many cases there are clearly arguments in favor of each side, depending on the context. Different programming languages are fit for different things, and no language or programming paradigm is universally superior. At the same time, the cost of mastering both alternatives is often quite large. You tend to grow attached to the programming paradigm you do most of your work in, and also to the tools, editors, libraries, the community, etc., and might over time become reluctant to switch.
David added that in many cases, however, it’s actually not that difficult to do both, but people suffer from the sunk cost fallacy: they are averse to writing off investments they have already made. Basically, if you spent a lot of time learning a certain technology, you would feel you had wasted all that time if you switched to a different one.
Of course, and this is why it’s called a fallacy, this usually has nothing to do with which technology is actually better for the problem at hand.
As I said, holy wars are often about complex, high-level matters. And unfortunately, things are never that easy. Both sides have areas where they shine, and others where they fail. At the same time, problems usually have rather complex requirements of their own, which seldom align neatly with the available solutions.
I think that for many, holy wars are also rooted in a wish to simplify life, to get easy answers. “Which programming language should I learn?” For which problem? Number crunching? Distributed programming? Building a web site? “Which database technology is the best one?” Which will scale with your demands for the next ten years? There are no simple answers. But that’s life, basically.
I think what irritates me most is when smart people, who otherwise seem able to take in a huge amount of detail, give in to holy wars. In particular when they’re scientists, because our professionalism demands that we be open to new things, and always suspicious of what we think we already know.
To close this rant, and to counter the argument that I’m basically waging a holy war against holy wars, I admit that holy wars (at least in tech) have their merits. They often force both sides to focus on what makes them special, and possibly to grow in the process. Given the huge number of technologies to learn today, it’s also a good thing to just focus on one thing for a while, and nothing helps like believing you’ve found the greatest thing there is. At the end of the day, however, you have to put it all into perspective and admit that perfect solutions exist only in fairy tales.
Posted by Mikio L. Braun at 2011-06-26 20:30:00 +0200