Master data management (MDM) is the practice of acquiring, improving, and sharing master data. MDM involves creating consistent definitions of business entities via integration techniques across multiple internal IT systems, often extending to partners or customers. MDM is enabled by integration tools and techniques for ETL, EAI, EII, and replication. MDM is related to data governance, which aims to improve data’s quality, share it broadly, leverage it for competitive advantage, manage change, and comply with regulations and standards.
Master Data Management (MDM) comprises a set of processes, governance, policies, standards and tools that consistently defines and manages the master data (i.e. non-transactional data entities) of an organization (which may include reference data).
Both metadata management and master data management are data management initiatives. They are closely linked with BI, but BI is not their sole purpose: they serve BI and also benefit from it. Transactional systems, moreover, have an equal stake in MDM and metadata.
An MDM tool supports master data management by removing duplicates, standardizing data (mass maintenance), and incorporating rules that keep incorrect data from entering the system, in order to create an authoritative source of master data. Master data are the products, accounts, and parties for which business transactions are completed. The root problem stems from business-unit and product-line segmentation, in which the same customer is serviced by different product lines, with redundant data entered about the customer (i.e., the party in the role of customer) and account in order to process the transaction. This redundancy of party and account data is compounded across the front-to-back-office life cycle, where an authoritative single source for party, account, and product data is needed but the data is often, once again, redundantly entered or augmented.
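As a rough illustration of the de-duplication step (the record fields, normalization rules, and matching threshold here are all hypothetical), duplicate party records can be detected by standardizing key fields and then comparing records pairwise:

```python
from difflib import SequenceMatcher

def normalize(record):
    """Standardize key fields so equivalent values compare equal."""
    return {
        "name": record["name"].strip().lower(),
        "email": record["email"].strip().lower(),
    }

def name_similarity(a, b):
    """Fuzzy-match two normalized records on the name field."""
    return SequenceMatcher(None, a["name"], b["name"]).ratio()

def find_duplicates(records, threshold=0.9):
    """Flag index pairs that share an email or have very similar names."""
    norm = [normalize(r) for r in records]
    pairs = []
    for i in range(len(norm)):
        for j in range(i + 1, len(norm)):
            if (norm[i]["email"] == norm[j]["email"]
                    or name_similarity(norm[i], norm[j]) >= threshold):
                pairs.append((i, j))
    return pairs

customers = [
    {"name": "Acme Corp.", "email": "billing@acme.example"},
    {"name": "ACME Corp ", "email": "billing@acme.example"},
    {"name": "Globex Ltd", "email": "ap@globex.example"},
]
print(find_duplicates(customers))  # records 0 and 1 are likely the same party
```

A real MDM tool would use far richer matching (phonetic keys, address parsing, survivorship rules), but the shape is the same: normalize first, then match.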
MDM has the objective of providing processes for collecting, aggregating, matching, consolidating, quality-assuring, persisting and distributing such data throughout an organization to ensure consistency and control in the ongoing maintenance and application use of this information.
Master data management is one kind of data management initiative. MDM is a set of processes, infrastructure, and tools to create and maintain a unified (though not necessarily physically single) reference for all non-transactional entities, ensuring a consistent, standard structure and data for these entities.
MDM includes a host of data integration techniques as well as the establishment of standards, which are enforced either manually or in an automated way. For example, customer MDM will ensure that customer data either exists in a single place or is synchronized so that all copies of the customer master data remain congruent.
The benefits of master data management (MDM) can only be achieved if the strategy is broad-reaching and comprehensive. Many organizations, however, make the mistake of viewing MDM as a purely technical issue – a misconception that can lead to project failure.
The premise is quite simple. In order for MDM to solve business problems, it must be tightly aligned with business activities.
The truth is, MDM encompasses both business processes and technologies. The technical solution will ultimately support the strategy – but the strategy itself must take into account the various people, policies, and procedures that play a role in data governance, access, sharing, and administration. In order for any MDM plan to work, it must call for the needed process adjustments and re-alignments.
The formalization and enforcement of data collection and management standards across the enterprise is what truly will enable the effective execution of MDM. For example, if the accounting department archives information in one way, yet sales and marketing handle the storage of historical data in a completely different manner, the results of an MDM program will be seriously hindered. In other words, consistent workflows must be implemented to promote the accuracy, timeliness, and integrity of corporate information.
The following steps, needed to reach a stage of MDM readiness, are defined as critical:
• Document business processes and how they map to application functionality
• Define and use common information concepts
• Assess the organization’s capabilities as they relate to data quality and governance
The purpose of this exercise? Determine where procedures are lacking, and re-structure in a way that will most effectively support the new MDM strategy.
And, perhaps most importantly, which processes should be reviewed and assessed? Those that relate to information creation, updating, deletion, or archiving are the activities that have the greatest impact on data quality. Therefore, it is these procedures that must be effectively controlled in order for MDM to deliver desired returns.
Data Validation and MDM
While the creation of master data is important, and the seamless dissemination of it to end users is even more important, it is the accuracy and quality of that data that is crucial to the success of your master data management (MDM) strategy.
Yet few companies carefully consider data quality while developing their MDM plan, and many fail to put proper validation mechanisms in place when executing it. This can not only seriously hinder MDM success; it can also have a severe impact on core business operations.
Why are validation and quality control so vital? Because information is generated from many sources. There is application data, which is maintained in various back-end business systems, as well as the metadata that describes its attributes. There is transaction data, which is created in the course of “live” events or automated messages, and the reference data that provides detail about it. Then finally, there is master data, which links these together to facilitate the creation and centralization of a single, consistent set of values across all sources.
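The linking role of master data described above can be sketched in a few lines. In this minimal model (the class and field names are hypothetical), a master record attaches per-system source records and folds their attributes into one "golden" set of values:

```python
from dataclasses import dataclass, field

@dataclass
class SourceRecord:
    system: str        # e.g. "crm", "erp"
    local_id: str      # the record's key in that source system
    attributes: dict   # values as that system stores them

@dataclass
class MasterRecord:
    master_id: str
    sources: list = field(default_factory=list)
    golden: dict = field(default_factory=dict)  # the agreed values

    def link(self, rec: SourceRecord):
        """Attach a source record and fold its attributes into the
        golden copy (first value wins in this naive sketch)."""
        self.sources.append(rec)
        for key, value in rec.attributes.items():
            self.golden.setdefault(key, value)

m = MasterRecord("M-42")
m.link(SourceRecord("crm", "C-1001", {"name": "Acme Corp", "city": "Boston"}))
m.link(SourceRecord("erp", "V-88", {"name": "Acme Corp.", "vat": "US123"}))
print(m.golden)  # one consolidated set of values across both sources
```

Production systems replace "first value wins" with survivorship rules (most recent, most trusted source, and so on), but the linkage structure is the essential idea.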
Take, for example, a client’s location. While a customer relationship management (CRM) system may display one address, an accounting package may show another. Yet a third address may be included in an electronic document, such as a purchase order, transferred during the course of a business-to-business transaction. These types of inconsistencies, if not detected and corrected in a timely manner, can cause major setbacks in MDM projects. In other words, bad data will ultimately lead to bad master data.
And, when master data is poor, businesses won’t achieve the levels of flexibility and agility they set out to reach, since they’ll be basing both tactical and strategic decisions on information of sub-par quality.
How does validation work? Automated validation can work in several ways. It can scan the environment to uncover inaccuracies, such as those mentioned in the above example, across multiple data sets, and flag them for review. An IT staff member can then manually take a look, and make any needed corrections to promote accuracy throughout the business.
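A scan of this kind can be sketched as follows (the system names and data layout are illustrative): cross-check each customer's address across source systems and queue any conflicts for manual review.

```python
def scan_for_review(sources):
    """Cross-check each customer's address across source systems;
    return a review queue of records with conflicting values."""
    review_queue = []
    all_ids = set().union(*(s.keys() for s in sources.values()))
    for cust_id in sorted(all_ids):
        addresses = {name: s[cust_id]
                     for name, s in sources.items() if cust_id in s}
        if len(set(addresses.values())) > 1:  # the systems disagree
            review_queue.append({"customer": cust_id, "values": addresses})
    return review_queue

sources = {
    "crm":        {"C-1001": "12 Main St, Boston", "C-1002": "9 Oak Ave"},
    "accounting": {"C-1001": "12 Main Street, Boston", "C-1002": "9 Oak Ave"},
}
print(scan_for_review(sources))  # only C-1001 is flagged for review
```

Note that even this toy example flags "12 Main St" versus "12 Main Street" as a conflict; a real tool would normalize addresses before comparing, so that only genuine discrepancies reach the review queue.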
The more advanced quality control techniques allow for the use of dynamic business rules. These rules can be proactively applied to back-end systems to ensure that bad information doesn’t enter the environment in the first place. For example, they can prevent end users from entering client last names that include numbers, or mailing addresses that don’t have enough characters. These business rules can also be used to automatically “cleanse” bad data after the fact, instantly reformatting or altering it, based on pre-set guidelines, once it has been discovered.
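The two rules mentioned above can be expressed directly as code. This sketch (rule wording, field names, and the minimum address length are all assumptions for illustration) shows both modes: rejecting bad input up front, and cleansing it after the fact:

```python
import re

# Hypothetical rule set mirroring the examples in the text
RULES = [
    ("last name must not contain digits",
     lambda r: not re.search(r"\d", r["last_name"])),
    ("mailing address must be at least 5 characters",
     lambda r: len(r["address"]) >= 5),
]

def validate(record):
    """Return the rules a record violates; an empty list means
    the record may enter the system."""
    return [msg for msg, ok in RULES if not ok(record)]

def cleanse(record):
    """After-the-fact cleanup: strip stray digits from the last name."""
    fixed = dict(record)
    fixed["last_name"] = re.sub(r"\d", "", fixed["last_name"])
    return fixed

bad = {"last_name": "Sm1th", "address": "9 Oak Ave"}
print(validate(bad))           # one violated rule
print(validate(cleanse(bad)))  # clean after cleansing
```

Keeping rules as data (a list of message/predicate pairs) rather than hard-coded checks is what makes them "dynamic": business users can add or retire rules without touching the validation engine.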
In order for an MDM initiative to deliver optimum returns, fully automated controls and validation must be put in place to ensure that master data is accurate and up-to-date at all times. However, these controls must be broad-reaching, governing not only how data is handled once it has been created, but how it is generated and updated throughout its lifecycle.