
Common biases that can taint analytics analysis

Analytics can help businesses make data-driven decisions, but common cognitive biases can skew how the information is interpreted.

Businesses are increasingly looking to make data-driven decisions, and analytics can be a key tool for reaching that goal. But the effort can be wasted if the analytics analysis is skewed by human biases.

Analytics can assist business strategy and decision making by providing a fresh perspective on company data or by identifying correlations that humans miss. It's a highly technical process, but the human element is still very much a factor when it comes to developing query hypotheses and converting data reports into an actionable business plan. Letting the numbers speak for themselves is an alluring aspect of analytics, but the reality is that human bias can creep into the process if practitioners aren't vigilant.

Business Insider recently called attention to 20 cognitive biases that affect decision making, and it's worthwhile to think about how some of them could specifically taint analytics analysis.

Here are some of the big ones, with emphasis on how they affect analytics-based conclusions.

Clustering illusion. Wishful thinking is the most common of human tendencies. In all areas of life and thought, we have the potential to see what we want to see and miss what's really there. In an analytical result, that's a particularly strong risk, as analytics are all about finding patterns in noisy data. It's the most natural thing in the world to see patterns that tell us what we want to hear. Avoid this mistake by scoring results scrupulously and maintaining a commitment to accepting the numbers honestly, for the ultimate good of the enterprise.
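One practical guard against the clustering illusion is to check whether an apparent pattern survives a basic significance test before acting on it. The sketch below is purely illustrative: the spend and sales figures are made up, and a simple permutation test asks how often equally "strong" correlations appear in shuffled, pattern-free versions of the same data.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical example: 26 weeks of marketing spend vs. weekly sales.
# Here the two series are generated independently, so any apparent
# relationship is pure noise -- exactly the trap the clustering illusion sets.
spend = rng.normal(100, 15, size=26)
sales = rng.normal(500, 60, size=26)

observed_r = np.corrcoef(spend, sales)[0, 1]  # the "pattern" we think we see

# Permutation test: shuffle sales many times and see how often a
# correlation at least this strong shows up by chance alone.
n_perm = 10_000
perm_r = np.empty(n_perm)
for i in range(n_perm):
    perm_r[i] = np.corrcoef(spend, rng.permutation(sales))[0, 1]

p_value = np.mean(np.abs(perm_r) >= abs(observed_r))
print(f"observed r = {observed_r:.3f}, permutation p-value = {p_value:.3f}")

A high p-value is a signal to treat the "pattern" as noise, however appealing the story it seems to tell.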

Selective perception. The sibling of the clustering illusion, selective perception is the interpretation of analytical outcomes in ways that confirm what we expected them to say. Expectations in general are precarious in the human experience; the world seldom turns out the way we want it to. But buying into expectations defeats the entire purpose of analytics: if we uncritically allow ourselves to simply see what we expect to see, what's the point?

Confirmation bias. This old favorite is familiar to almost everyone who has ever argued politics and/or religion. We tend to accept only new information that supports our old ideas. This is just as likely -- and even more dangerous -- in the realm of analytics, where outcomes can influence decisions at the highest levels. The entire point of analytics as a strategic tool is to push beyond old ideas into more effective choices and policy: why poison the waters?

Survivorship bias. We like good news; we hate bad news. It's a natural tendency to focus on the positive outcomes projected in our analytical results while ignoring the negatives. But this truncates the power of analytics. We have to give positive and negative outcomes equal weight if we're to be fully informed by the data.

Ostrich effect. And when the news is bad, we cover our ears. Often we try to ignore or argue against conclusions that aren't what we want to hear. This is especially problematic in analytics, where the results are generally culled from several different sources and objectively scored; the ideal is to embrace even bad news delivered by analytics, because we can be reasonably certain that addressing that bad news will be effective.

Bandwagon effect. When we find ourselves riding a wave, we tend to go with the wave. If an analytical result seems to put us into a positive industry trend, we may be inclined to buy in all the way, even if the result is weak. It's important in that circumstance to stay off the bandwagon and give the result the weight it deserves, no more, no less.

Outcome bias. A "little brother" of bandwagon effect is the outcome bias: over-trusting a process because it worked well once. The danger here should not be underestimated; analytical processes must be constantly fine-tuned to remain effective. To lock in on an analytical process because it delivered a positive result, forgoing continuous fine-tuning and critical scrutiny of the results, is asking for trouble.

Pro-innovation bias. Cognitive studies have shown that human beings tend to over-emphasize both similarities and differences between things that fall into different categories. We do this all the more when evaluating a new idea. We tend to overvalue a new idea's (unproven) usefulness and undervalue its (probable) shortcomings. This is a natural consequence of human enthusiasm, and we see it all around us (especially if you live in Silicon Valley). But if analytics are being applied to evaluate some new product, process or service, realism is always the best choice. Money and effort dumped into a mediocre new idea benefit no one.

Information bias. This one is somewhat obscure, but it happens in the best of us. We can become focused on information or trends or details that really have no effect on the outcome we are pursuing. This bias is hard to root out because it's often a good thing to toss unknowns into the input of a data mining operation, and we often can't know what was really important until the results are in and the outcomes are apparent. The second time around, though, it's easier to prune the information that does and doesn't matter -- and emotional attachment to details that don't have a substantive effect is an unnecessary distraction.
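When the results are in and it's time to prune, one concrete way to separate details that matter from those that don't is to measure how much each input actually contributes to the model's predictions. The following is a hypothetical sketch using scikit-learn's permutation importance on synthetic data, where two of five features drive the target and the rest are noise; features with near-zero scores are candidates for pruning on the next pass.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical inputs: only the first two features drive the target,
# the remaining three are pure noise.
n = 500
X = rng.normal(size=(n, 5))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Permutation importance: how much does shuffling each column hurt the score?
result = permutation_importance(model, X_test, y_test, n_repeats=20, random_state=0)
for i, importance in enumerate(result.importances_mean):
    print(f"feature_{i}: importance = {importance:.3f}")

The noise features should score close to zero, giving an objective reason to drop them rather than an emotional attachment to keep them.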

Blind-spot bias. Finally, there's blind-spot bias: the failure to recognize and allow for our own biases. Of all these biased tendencies, this one is both the most obvious and the hardest to overcome. For this reason, it's a wise practice to employ analytics scientifically; that is, subject them to peer review. The fix for blind-spot bias is to put more than one set of eyes on every analytical result to ensure that no one person's biases distort an important outcome.

Conclusion

Committing to a business analytics platform and practices is a major investment of time, energy and even culture change. It isn't as easy as installing new software or handling new reports: it can change the enterprise from the inside out, bringing about real change in both policy and process.

With so much invested and so much at stake, it makes sense to be mindful of how the results of analytics are gathered, perceived and interpreted. Being aware of how human bias can cloud analytics analysis is an important first step toward preventing it from happening.

