The intersection of statistically based insight and the realization that information can be an asset has had and will continue to have serious reverberations in the business world. Being smarter has always meant being successful; as far back as the 19th century, analytics was already generating competitive advantage.
Success requires more than just knowledge of statistics or ways of dealing with “big data.” Execution is essential, but without a plan and commitment, little happens. Success also requires an understanding of how analytics translates to competitive advantage.
Possibly the best-guarded secret in business analytics is that, in practice, its success comes down not only to organizational culture but also to the ability of managers to successfully sell the value of analytics. As researchers such as Thomas Davenport and Jeanne Harris have rightly pointed out, overall success can often be linked to a variety of factors including organizational structure, management commitment and successful strategic planning. However, it’s often “where the rubber hits the road” that the greatest impact can occur.
Analytics is a multi-disciplinary activity: the value from insight comes not from the activity but from the execution. Often, this crosses a variety of departments within an organization – few analytics groups have responsibility for both the insight creation and the execution of that insight. Because of this, selling the value of analytics isn’t just a goal for managers; it’s a necessary criterion for success.
For many managers, this can be challenging. Despite broad interest in business analytics as a discipline, most organizations have a relatively immature understanding of what “business analytics” is, let alone how it creates value. When projects fail, it’s all too easy to point to the organization as the reason success wasn’t achieved. Unfortunately, this is only half the picture – the harsh reality is that we, as managers and analysts, all too often carry a large portion of the blame. In an ongoing straw poll the author has run across more than 1,500 people, only a handful have ever felt they could quantify the value they were creating through applying business analytics. When even the experts can’t explain why analytics is important to the organization, what hope does a layperson have?
The value of analytics lies in its ability to deliver better outcomes. And, when it comes to selling the value of business analytics, four mistakes stand out above all others as the biggest blockers of success.
Business analytics solutions are intended to compile data from across the organization and make it available at all levels. In doing so, business analytics acts as a mirror, exposing whatever data quality issues exist. A properly implemented business analytics solution can resolve some quality issues, but it is not feasible to retroactively resolve every data quality issue for use in dashboards and reports.
Resolving data quality issues can be difficult. Responsibility for data quality resides across the entire organization, and therefore any action plan must include the entire organization. Here are some tips you can use to begin to get a handle on your organization’s data quality.
There are too many potential causes to list them all, but here are a few common problems to look for.
- Conflicting Source Systems – Is your data sourced from a single line of business system? Or are there multiple systems with potentially conflicting information? Many of the clients I’ve worked with use multiple ERP systems for different business divisions. This can lead to a number of difficulties when combining data within your business analytics solution.
- Manual/Duplicate Entry – Does your data have a single point of origin? If not, you can bet there will be inconsistencies in how the data is entered. In a recent webinar the moderator typed ‘Hello’ and received no fewer than six variations in return (‘Hello’, ‘hello’, ‘Hello!’, ‘hi’, ‘Hi’, ‘Good Morning!’).
- Inconsistently Applied Business Rules – Remember all those spreadsheets your team used to pull together every month? All the business logic applied while ‘massaging’ the data was being recreated every month. Where in the organization are those rules captured, other than in someone’s head or in a collection of outdated spreadsheets? Are you sure they were applied consistently to every report each month? And whose version of the spreadsheet contains the ‘correct’ rules, anyway?
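Manual-entry variants like the greetings above can often be collapsed with a small normalization pass. Here is a minimal sketch in Python, assuming a simple lowercase-and-strip rule plus a synonym map – both the rule and the map are illustrative assumptions, not a general solution:

```python
def normalize(value: str) -> str:
    """Lowercase, trim whitespace, and strip trailing punctuation."""
    return value.strip().lower().rstrip("!.?")

# Synonym map for variants that normalization alone can't catch (assumed values).
SYNONYMS = {"hi": "hello", "good morning": "hello"}

def canonicalize(value: str) -> str:
    """Map a raw entry to its canonical form."""
    cleaned = normalize(value)
    return SYNONYMS.get(cleaned, cleaned)

raw = ["Hello", "hello", "Hello!", "hi", "Hi", "Good Morning!"]
print({canonicalize(v) for v in raw})  # all six variants collapse to one value
```

In practice the synonym map is exactly the kind of business rule that should live in a governed reference table rather than in someone’s head.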
There are many definitions of data quality, but data is generally considered high quality if, as one common definition puts it, “they are fit for their intended uses in operations, decision making and planning.” Alternatively, data is deemed of high quality if it correctly represents the real-world construct to which it refers. Beyond these definitions, as data volume increases, the internal consistency of data becomes significant regardless of fitness for any particular external purpose. People’s views on data quality can often be in disagreement, even when discussing the same set of data used for the same purpose. Common definitions include:
- Degree of excellence exhibited by the data in relation to the portrayal of the actual scenario.
- The state of completeness, validity, consistency, timeliness and accuracy that makes data appropriate for a specific use.
- The totality of features and characteristics of data that bears on its ability to satisfy a given purpose; the sum of the degrees of excellence for factors related to data.
- The processes and technologies involved in ensuring the conformance of data values to business requirements and acceptance criteria.
- Complete, standards-based, consistent, accurate and time-stamped.
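Definitions like these only become actionable once the dimensions they name – completeness, validity and so on – are measured. A minimal sketch, where the record layout, field names and business rules are all illustrative assumptions:

```python
from datetime import date

# Illustrative records; field names and values are assumptions for the sketch.
records = [
    {"id": 1, "country": "US", "amount": 100.0, "updated": date(2024, 1, 5)},
    {"id": 2, "country": "",   "amount": -5.0,  "updated": date(2023, 6, 1)},
    {"id": 3, "country": "CA", "amount": 42.0,  "updated": None},
]

def completeness(rows, field):
    """Share of rows where the field is present and non-empty."""
    return sum(1 for r in rows if r.get(field) not in (None, "")) / len(rows)

def validity(rows, field, rule):
    """Share of rows whose field value passes an assumed business rule."""
    return sum(1 for r in rows if r.get(field) is not None and rule(r[field])) / len(rows)

print(completeness(records, "country"))               # 2 of 3 rows populated
print(validity(records, "amount", lambda a: a >= 0))  # 2 of 3 rows pass the rule
```

Scores like these give the business concrete numbers to set thresholds against, which is the subject of the steps below.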
Steps to address data quality – To ensure data quality, the business intelligence project team has to address it from the very beginning. Here are several significant steps to consider:
- Require the business to define data quality in a broad sense, establish metrics to monitor and measure it, and determine what should be done if the data fails to meet these metrics.
- Undertake a comprehensive data profiling effort when performing a source systems analysis. Identifying data anomalies across source systems and across time (historical data does not always age well!) is needed so that the team can address them with the business early on.
- Incorporate data quality into all data integration and business intelligence processes from data sourcing to information consumption by the business user. Data quality issues need to be detected as early in the processes as possible and dealt with as defined in the business requirements.
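The profiling step above can be sketched as a simple column summary – row counts, null rates and distinct values are often enough to surface anomalies such as conflicting casing across source systems. Column name and sample values here are illustrative assumptions:

```python
from collections import Counter

def profile(values):
    """Summarize one column: row count, null rate, distinct count, top values."""
    non_null = [v for v in values if v is not None]
    counts = Counter(non_null)
    return {
        "rows": len(values),
        "null_rate": (len(values) - len(non_null)) / len(values),
        "distinct": len(counts),
        "top": counts.most_common(3),
    }

# Assumed sample from two source systems with inconsistent casing.
status = ["OPEN", "open", "CLOSED", None, "Open", "CLOSED"]
report = profile(status)
print(report)  # casing variants inflate the distinct count – a flag for the team
```

A report like this, run per column per source system, gives the team an early, objective picture to take back to the business before the data reaches dashboards.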
Enterprises must present data that meets very stringent data quality levels, especially in light of recent compliance regulations and demands. The level of data transparency needed can only result from establishing a strong commitment to data quality and building the processes to ensure it.