Cultists inbound

This is a brief post on a phenomenon that is easy to detect, hard to combat, and even harder to eradicate. Sadly, the following happens more often than not. You are in a discussion with a colleague about what kind of tech to use, and it got started by that colleague saying "we should use this technology." We shall call the technology Starlit Snowflakes. I hope you have learned by now that the first and only relevant question, the one that needs a satisfying answer, is of course:

Why?

The answer you get is that tech company Megacorp is using it, therefore we need to use it too. This is not a valid reason, of course. This line of reasoning is composed of the following effects and biases:

Bandwagon effect

There is an effect whereby people generally accept a certain action or belief as true because a lot of other people do or believe it. This tendency is very strong and is part of our group mentality. We as human beings like fitting in more than we like standing out.

Of course, the more people use Starlit Snowflakes, the more it will proliferate: more people are using it, therefore it must be good. You can see the potential for viral marketing and the value of a well-known name and image. Also, Kubernetes anyone? You did not think I had forgotten about it, did you?

Seriously though, I think the bandwagon effect and congruence bias are at work in most cases where a piece of tech that solved a specific issue is made out to be better than it is. For example, Golang was created to run on multiple platforms, like Java, but with an easier way to do concurrency. Fair enough; that is a specific use case and solution. But how many of us have this need? Maybe you only need to deploy the application to one server (or a bunch of servers) running one type of operating system, and the concurrency only needs to be written once over a fairly normal period. In that case there is no need for fast development with easier concurrency, and no need to run on multiple platforms at the same time. From that point of view you can write it just as easily in Elixir, Python or Java.

There are more technologies that fit this bill, among others:

- React (by Facebook)
- Bazel (by Google)
- Kraken (by Uber)

There are also examples of tech that can be used by anyone because it is a better solution (or the best solution) to a well-known problem. An example I can give is H3 by Uber. They built a better solution for geospatial indexing based on hexagons: hexagons tile together more evenly, and used at different resolutions they can describe regions of the Earth's surface without overlapping areas, without the overly complex shapes created by mankind, and without being affected by new postal codes or redefined existing areas.

Confirmation bias

This is linked to conservatism, explained a little later. People tend to look for things that confirm their opinion and weigh those higher and stronger than things that oppose their view, which they either deny or weigh weaker.

So in this case the engineer will only look for stories confirming that this is a good piece of tech to use.

Congruence bias

This has to do with testing only one's own hypothesis directly and not testing the alternatives. In our case, the engineer will just make a case for the one tech they want to use and state it is the best, instead of actually looking around for other solutions and making sure there is no better alternative.

Starlit Snowflakes is the best at solving Megacorp's problems, and there is only one party who suffers from them: Megacorp. Take this into account when reasoning about a piece of tech.

Conservatism

This is the most important one, and I think it is the basis for a lot of the others. It is also linked to the continued influence effect. Conservatism is also called belief revision, or more accurately, resistance to belief revision. It is the fact that people do not correct their beliefs as much as they should when presented with new information that should change them.

I liken this to a self-correcting mechanism each one of us has. At one end of the spectrum are people in whom this mechanism is so powerful that they are always right; nothing can make them see they are wrong, and, more importantly, these people will convince others that they are the single source of truth.

At the other end of the spectrum are people whose self-correcting mechanism does not tell them they are right but consistently tells them they might be wrong. They will never say they are right, because they might not be. It is an engine of doubt that makes them explore other avenues of thought.

The rest of us are in the middle of the spectrum, with a mild version of the mechanism. On core beliefs it corrects towards "I am right," but on peripheral items we do not really care and will adopt whatever seems reasonable.

The problem is that this mechanism makes it damn near impossible to make people see the error of their ways; it is simply too strong. I think it is wise to adopt a position near the doubt-engine end of the spectrum, mixed with certain fundamental truths you know to be true.

Continued influence effect

This effect is interesting because it happens when something you learned in the past later turns out to be false. It fuels the self-correcting mechanism with ammunition to state that you should not change just because of the new information. The information already in your mind is worth more than the new, contradicting information, and the mechanism also devalues that new information, so the two sustain each other nicely.

This might mean the engineer learned something in the past that no longer holds, or was never true in the first place, but it is hard to get rid of that information.

Default effect

This is when, given a bunch of options to choose from, people tend to go for whatever is the default more often than not. Best practices and community preferences cut the same way. It might be that for a specific case the best practice or community preference does not fit the bill completely and a slight deviation is needed, but that deviation will not be made because it is not the default.

It might be that the default option does not fit the case at all and something far off the beaten path is better suited; it will still not be chosen.

It might also be that the default is chosen blindly, with the bandwagon effect and congruence bias supporting the blind choice.

Solutions

The moment I have one or more, I will share them. For now I think it is enough to identify the biases and effects and see if you can address them one at a time. For all the biases listed and more, go here.

#devlife #thoughts