

Analytics initiative success can turn on quality of data

Data analytics is the key to business decision making these days, but faulty assumptions and poor data quality can hamper those efforts.


It may go without saying that data analytics has become a buzz phrase in IT. From human resources to supply chain management, from marketing to finance, analytics is now a key tool in business decision making. Companies today know that, without this data insight, they will fall behind.

Data analytics enables companies to arm themselves with data to reduce costs, boost sales and make operations more efficient. They may also use analytics to predict future events. While previously executives may have based decision making on gut, instinct or tradition, today they ask, "What do the numbers tell us?"

From this question, data can tell companies quite a bit about their customers and operations. Analytics may indicate how many calls a salesperson has to make before getting an interested prospect to review services or whether a certain product might take off in a few months. But analytics is useless if the quality of data is poor.

The challenges of data-driven companies

Companies are benefiting from data-driven decision making, but there's a steep learning curve. Contending with large volumes of data, coming from multiple data silos and in different formats, is challenging. The ability to handle massive amounts of information, integrate it from different areas of the business and combine it to derive actionable data in real time is easier said than done.


One of the principal challenges is the quality of data: Without high-quality data as the foundation, decision making will likely falter. As with any data-dependent process, decision making hinges on the quality of the information. As the saying goes, "garbage in, garbage out." Bad or incomplete information will lead to incorrect predictions and misleading descriptions.

Where do data quality issues originate? One problem concerns the initial assumptions on which an analytical model is built. In marketing, a predictive model might be applied to next year's marketing budget. It might strive to make marketing expenditures more efficient by parsing customers into new groups: those who will buy regardless of advertising; those who will buy only after seeing compelling advertising; and those who won't buy. The idea is to spend resources only on that middle group because the other two are a waste of money.

But what happens if the customer profiles in those other groups aren't correct? What if the demographic definition of the "will buy it anyway" category of customers is based on faulty information, such as brand loyalty that doesn't account for competing technology? A mistake like this can wreck a marketing campaign, no matter the quality of the predictive analysis.
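The three-way segmentation described above can be sketched in code. Everything here is hypothetical for illustration: the rules, the feature names (loyalty_score, engagement) and the thresholds are invented, and each one is exactly the kind of assumption the article warns must be tested against real data before the model is trusted.

```python
# Toy sketch of a three-way customer segmentation.
# The rules and thresholds below are assumptions, not facts --
# e.g. "high loyalty implies they will buy anyway" is precisely
# the sort of belief that can be wrong (brand loyalty that ignores
# competing technology) and must be validated before modeling.

def segment(customer: dict) -> str:
    """Assign a customer to one of the three spending groups."""
    if customer["loyalty_score"] > 0.8:   # assumed: loyal customers buy regardless
        return "will buy anyway"
    if customer["engagement"] > 0.5:      # assumed: engaged but undecided
        return "persuadable"              # the only group worth ad spend
    return "won't buy"

customers = [
    {"id": 1, "loyalty_score": 0.9, "engagement": 0.7},
    {"id": 2, "loyalty_score": 0.3, "engagement": 0.6},
    {"id": 3, "loyalty_score": 0.2, "engagement": 0.1},
]
for c in customers:
    print(c["id"], segment(c))
```

If the loyalty threshold is wrong, customers who actually need persuading get classified as "will buy anyway" and receive no ad spend, wrecking the campaign regardless of how good the model's math is.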

What's the fix? Test every assumption before incorporating it into the model. Be certain going in: even business truths that are taken for granted might be off base.

Did we score?

The previous example involves an error at the front end while building the model. But another significant mistake often happens at the end of the process: failure to do a postmortem on the results of a predictive model. In the example I just laid out, a more effective marketing campaign might emerge from the raw numbers: sales improved, while marketing spent less.

But it isn't that simple. A measurable improvement is all well and good, but success in analytics is measured by how much a process improved relative to its potential. One number going up while another goes down really says only one thing: that a process is moving in the right direction. If left at that, the enterprise still can't rate the effectiveness of the analytics process.

What's missing? First, specific objectives need to define the modeling process: optimum sales targets that can be compared with actual sales, for example. Having those numbers makes it possible to score an analytics process's success, not only against past performance, but against future potential.

So, for example, if a food distributor wants to boost sales by 8% over the coming year, it first needs to look at its current sales number and compare that number with growth over the past, say, five years to see whether this predictive goal has merit.

With this kind of thinking in place, analytics can improve a process today, with potential to improve continuously, fine-tuning not only the outcomes, but the quality of inputs to the process.


This was last published in November 2015
