
Big Data is Changing Rapidly, But Study Shows the Focus on ‘Bad Data’ is Not


Infogix Blog 8.27.15

Despite Big Data being a growing concern and an increasing focus for organizations, an industry study finds that the amount of “Bad Data” has remained constant since 2007.

The term big data has turned into this decade’s biggest buzzword. It’s touted as the magical key that will supposedly unlock previously unseen business advantages, or the potion that can take companies from “good” to “great.”

The truth is, those ambitions are not misplaced, but companies are still failing to achieve the desired results from their terabytes and petabytes of data. In the rush to jump on the big data bandwagon, major organizations have neglected to prioritize an enterprise initiative to fix the bad data that leads to flawed analytical results.

According to research conducted by Experian [1], a quarter of the information within U.S. organizations is believed to be inaccurate. What’s more surprising, however, is that despite the recent hyper-focus on the benefits of big data, the amount of money wasted annually due to bad data hasn’t changed in nearly a decade.

It’s reported that 77 percent of companies surveyed believe their bottom line is affected by inaccurate and incomplete data, with respondents estimating that 12 percent of revenue is wasted as a result. Yet, despite growing awareness among these companies of the benefits of data quality and data-driven decision making, the average percentage of wasted revenue hasn’t changed since 2007.

The pervasive problem comes down to data integrity. All collected data should be good data; otherwise, what is the point of collecting it in the first place?

According to the Experian report, just 38 percent of organizations surveyed use software to check data accuracy at the point of capture, and only 34 percent make any attempt to correct data after it has been collected. The study also finds that nearly a quarter of enterprises still rely solely on manual checks rather than an automated controls solution, adding a layer of human error to the data quality equation.
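To give a sense of what an automated check at the point of capture can look like, here is a minimal sketch in Python. It is purely illustrative: the field names (email, customer_id, amount, txn_date) and the rules are assumptions for the example, not a description of any particular product, but it shows how a record that fails basic validation can be flagged before it ever reaches the warehouse.

import re
from datetime import datetime

# Hypothetical validation rules for an inbound record (illustrative only).
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_record(record):
    """Return a list of data quality issues found in a single inbound record."""
    issues = []
    if not EMAIL_RE.match(record.get("email", "")):
        issues.append("invalid or missing email")
    if not record.get("customer_id"):
        issues.append("missing customer_id")
    try:
        if float(record.get("amount", "")) < 0:
            issues.append("negative amount")
    except ValueError:
        issues.append("missing or non-numeric amount")
    try:
        datetime.strptime(record.get("txn_date", ""), "%Y-%m-%d")
    except ValueError:
        issues.append("missing or malformed txn_date")
    return issues

# Records with a non-empty issue list would be quarantined rather than loaded.
record = {"email": "jane@example.com", "customer_id": "C-104",
          "amount": "49.95", "txn_date": "2015-08-27"}
print(validate_record(record))  # [] means the record passes the capture check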

Understanding which data is inaccurate, incomplete, duplicated or otherwise incorrect is as important as having big data at all. Executives in the C-suite should take more pride in the validity of their databases than in their size. Without clean data, it won’t matter how many data scientists you hire – poor data quality leads to poor analytics and results in flawed business decisions.

All of this, of course, boils down to the bottom line. When marketing executives have incorrect contact information, they waste millions on outreach to the wrong people. When business development professionals have inaccurate sales and demand data, they put less emphasis on the markets that would help the company grow. And when financial directors can’t easily reconcile duplicate transactions, they can overlook payments and even signs of money laundering.
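To make the reconciliation point concrete, here is a minimal sketch of how duplicate transactions might be flagged automatically. The matching key used here (account, amount, transaction date and reference) is an assumption chosen for illustration; a real control would match on whatever fields define a unique transaction in your own systems.

from collections import defaultdict

def find_duplicates(transactions):
    """Group transactions that share the same account, amount, date and reference."""
    groups = defaultdict(list)
    for txn in transactions:
        key = (txn["account"], txn["amount"], txn["txn_date"], txn["reference"])
        groups[key].append(txn)
    # Any group with more than one entry is a candidate duplicate for review.
    return [group for group in groups.values() if len(group) > 1]

txns = [
    {"account": "A1", "amount": 250.00, "txn_date": "2015-08-01", "reference": "INV-77"},
    {"account": "A1", "amount": 250.00, "txn_date": "2015-08-01", "reference": "INV-77"},
    {"account": "B2", "amount": 99.00, "txn_date": "2015-08-02", "reference": "INV-80"},
]
print(find_duplicates(txns))  # the two A1 / INV-77 entries are flagged for review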

If big data is supposed to be the magical element that drives an organization’s advanced analytics (and ultimately its business decisions), data integrity is the foundation. Now ask yourself, how much of your big data is bad data?

[1] Thomas Shultz, “The State of Data Quality,” Experian Data Quality (September 2013)

