Manage cognitive biases in software development
Cognitive biases help us think faster, but they also make us less rational than we think. Being able to recognize and overcome biases can prevent problems and increase the performance of software teams.
João Proença, Senior Quality Engineer, and Michael Kutz, Quality Engineer, discussed the impact of cognitive biases on software development at Agile Testing Days 2021.
Groupthink is the tendency to agree with the rest of the group regardless of the consequences, Kutz said. In this way, the group avoids conflict and preserves harmony, but arrives at sub-optimal, sometimes even catastrophic, decisions. Individual concerns are not raised, and the group as a whole becomes deaf to criticism from outside:
We once tried to create a new testing strategy for a large number of teams. Obviously, we were the only ones thinking about it at this level, so it was easy to dismiss outside opinions as uninformed or incompetent. We felt we were only thinking about the greater good and, as a result, that our morals were unquestionable. At first we had a lot of heated and fruitless discussions, but at some point that stopped. I thought it was because our ideas were maturing. In the end, no one opposed our ideas, but the strategy failed anyway.
Looking back, the strategy really wasn't good, as Kutz explained:
We tried to mix up all the ideas in our group, reaching a compromise that just didn’t work.
Proença mentioned that there are many misconceptions in the software industry regarding cognitive biases. He gave an example of affinity and diversity bias:
In tech, too often I’ve seen leaders make statements like, “I want to promote diversity because it’s the right thing to do, but at the end of the day I have a business to run.” In my opinion, this is a very bad way to approach it. Diversity is not only the right thing to do; it is also good for business, as diverse teams are more likely to be highly successful.
InfoQ: What are cognitive biases?
Michael Kutz: Cognitive biases are systematic tendencies in human thinking. They mainly occur when we think quickly, and less so when we make conscious, well-considered decisions. Most likely, evolution equipped us with these shortcuts to give us an edge in forming successful hunter-gatherer groups. Today, these shortcuts can still be useful, but they often result in very sub-optimal decisions and cause huge social problems.
João Proença: Other reasons why we humans probably evolved to have “two systems of thought” (as Daniel Kahneman calls them) are speed (fast vs. slow thinking) and the energy expended in cognitive processes.
InfoQ: Can you give some examples of biases and the effect they can have on our professional life?
Proença: Affinity bias is a tendency to gravitate towards people like us in appearance, beliefs, and background. It can even make us (subconsciously) avoid or dislike people different from us. This is usually why you find teams within organizations that are not at all diverse, which has consequences for both team performance and inequality.
The “statistical biases” are also interesting: the conjunction fallacy, the anchoring effect, and the availability bias. All of them tell me that we are really bad at estimating probability, size, or time when we don’t root those estimates in real, objective data.
Kutz: Another good example is present bias, which makes us prefer small, short-term rewards over larger long-term rewards. You may be familiar with the term procrastination; this behavior is a direct result of present bias.
For example, I found that spending two hours cleaning up the test environment became oddly appealing when the alternative was to read a 500-page book on cognitive biases.
InfoQ: What impact does bias have on the software industry, and how does the industry deal with it?
Kutz: Well, effects like those described above naturally have a big impact on software development. Groupthink particularly influences product planning and other pre-development processes; when the planning group begins to ignore objections from outside the group, really bad assumptions flourish and become the basis of the product.
During development, we suffer from the IKEA effect, which makes us stick with bad frameworks that we put together ourselves.
Ultimately, we can fall prey to confirmation bias, ignoring negative market feedback.
The industry as a whole is not very aware of biases, but there are a few good practices that can mitigate their effects. For example, planning poker – when done well – can minimize the anchoring effect during estimation. The 1-2-4-All liberating structure addresses the dangers of groupthink: individuals think first, then share their ideas with another member of the group, then these pairs share their merged ideas with another pair, and only then with the group as a whole. That way, individual concerns cannot be hidden in silent agreement.
InfoQ: What are your tips for recognizing and overcoming biases?
Proença: I believe a lot of misconceptions around cognitive biases should be clarified in the industry. A misconception around affinity bias in organizations is that the only way to address it is to put quotas (gender, racial, etc.) in place, which are very controversial and not necessarily needed. There are many other effective things you can do when hiring people into teams or leadership roles, such as targeting job vacancies at people from under-represented groups or setting clear, objective criteria to assess candidates.
Kutz: Getting to know the biases and their effects certainly helps a lot. They are there anyway; everyone is influenced by them. It helps to name things and know precisely what steps can mitigate a specific influence.
There is no overcoming biases. Mitigation is the best we can do, unless we leave all of our decisions to an AI (but then who should choose the training data for that – oh my).
Understanding a set of biases has personally helped me recognize problematic decision-making processes and understand why some techniques are useful while others merely complicate the process.
For example, I have known about several liberating structures for a while. I tried them and they felt great. Yet facilitating them can be an effort, so I didn’t suggest them often. Now that I know about groupthink, the bandwagon effect, and other biases, I can see the need for such measures sooner and apply them in a more targeted way, without inflating the process.
Proença: One of the things we tried to do was not just ‘talk’ to people about cognitive biases, but rather get them to experience the biases firsthand in the workshop we held at Agile Tour Vilnius 2021. It helps them understand each bias a little more, and at the same time shows more clearly that most of us “suffer” from them; it is not about being ashamed, but rather about dealing with them personally. Like Michael says, I feel I’ve become a lot more ‘experienced’ at noticing a bias happening in front of me, and that’s usually the hardest part!