Item Master Data Rationalization
Laying the Foundation for Continuous Business Process Improvement

A META Group White Paper
Sponsored by Zycus
October 2004

"Bad master data (that is, master data that is inaccurate, duplicated, incomplete, or out-of-date) hampers the accuracy of analysis, causes expensive exceptions that must be resolved, and prevents refinement of processes. Moreover, when bad data or flawed analysis is shared with partners, not only are the associated processes affected, but also the level of trust is undermined. Under these conditions, frustrated employees tend to continue their manual processes, and future efforts in collaboration, integration, and automation become more difficult, due to employee resistance. In short, bad master data will destroy the best-designed business processes."

About META Group
Return On Intelligence (SM)
META Group is a leading provider of information technology research, advisory services, and strategic consulting. Delivering objective and actionable guidance, META Group's experienced analysts and consultants are trusted advisors to IT and business executives around the world. Our unique collaborative models and dedicated customer service help clients be more efficient, effective, and timely in their use of IT to achieve their business goals. Visit metagroup.com for more details on our high-value approach.

208 Harbor Drive, Stamford, CT 06902
(203) 973-6700, Fax (203) 359-8066
800-945-META [6382]
metagroup.com
Copyright © 2004 META Group, Inc. All rights reserved.
Contents

Executive Summary
  Master Data Has a Material Impact on the Financial and Operational Health of an Organization
  Item Master Data Requires Specialized Attention
  Clean Item Master Data Enables a Wide Range of Business Initiatives
  Master Data Rationalization Is the Foundation for Leveraging Existing ERP Investment
  Master Data Rationalization Protects the SAP Master Data Management Investment
  Successful Sourcing and Procurement Initiatives Depend on Clean, Reliable Master Data
  Optimum Master Data Maturity Enables Real-Time Analysis and Control of Business Processes
Introduction
  ERP Systems Are Indispensable to the Business Operations of Large Organizations
  Business Process Configuration in ERP Is Important, But Master Data Quality Affects the Accuracy, Efficiency, and Reliability of the Process
  Keeping Enterprise Applications in Shape Requires Constant Master Data Maintenance
  Successful Business Initiatives Depend on Clean, Organized, and Reliable Master Data
  CEOs and CFOs Who Are Accountable Under Sarbanes-Oxley Need Good Data
The Role of Master Data in the Enterprise
  Master Data Quality Issues Ripple Across the Enterprise
  The Difference Between Primary and Derived Master Data Records
  A Disorganized Approach Toward Maintaining Master Data Is Common
  Item Master Records Present Particular Challenges
  Item Master Record Quality Problems Have Numerous Root Causes
  The Effect of Bad Item Master Data on Business Initiatives Is Profound
  Master Data Rationalization Is a Prerequisite for Successful Business Initiatives
Understanding the Process of Master Data Rationalization
  Step 1: Extraction and Aggregation
  Step 2: Cleansing
  Step 3: Classification
  Step 4: Attribute Extraction and Enrichment
  Step 5: Final Duplicate Record Identification
Automation Is Not an Option
Integrating Master Data Rationalization Into ERP Consolidation or Upgrade Planning
Moving Your Organization Through the Data Quality Maturity Model
  Level 1: Aware
  Level 2: Reactive
  Level 3: Proactive
  Level 4: Managed
  Level 5: Optimized
Bottom Line
  Clean, Reliable Master Data Enables Successful Enterprise Initiatives
  Master Data Rationalization Is Required to Ensure Master Data Quality
  Building Master Data Rationalization Into ERP Consolidation Planning
  Master Data Rationalization Is a Key Component in Achieving Data Quality Maturity
Executive Summary

Master Data Has a Material Impact on the Financial and Operational Health of an Organization
Business executives depend on reliable reporting of operational and financial activities to guide their decisions. The US government even mandates reliable and accurate reporting under the Sarbanes-Oxley Act (SOX). The underlying enabler to meet the demands of business executives and the government is the master data found in enterprise software systems. Master data represents the items a company buys, the products it sells, the suppliers it manages, and the customers it has. When the master data is inaccurate, out-of-date, or duplicated, business processes magnify and propagate these errors, and the company's financial and operational results are affected.

The results are profound. Shareholders lose their confidence and market capitalization falls. Executives begin to manage by instinct rather than from facts, and results suffer. Suppliers lose faith in the collaborative processes and build in safety stock. All these scenarios are likely and have a direct effect on the financial and operational health of the enterprise.

Item Master Data Requires Specialized Attention
Customer relationship management (CRM) projects have long focused on the quality of customer master records managed by CRM systems. Item master records, on the other hand, often have no clear owner to champion the cause of clean, reliable item master data, because the data often resides in various systems and is used by different departments. However, these records require special attention, because they contain the most pervasive master data in the enterprise and form the basis for many other dependent master records and business objects such as purchase orders and pricing records.

Moreover, item master records often have hundreds of attributes that are used by various systems and business processes. It is critical that item master records be properly classified and have complete and accurate attributes, because they form the foundation for accuracy and efficiency in enterprise software systems.

Clean Item Master Data Enables a Wide Range of Business Initiatives
There are numerous business initiatives underway in an organization at any given time that are focused on cost reductions, operational efficiencies, or strategic synergies. A company's supply organization may engage in strategic sourcing or enterprise spend management, while the product management group may focus on part reuse. The merger-and-acquisition team may be evaluating potential targets based partially on synergies to be won in the consolidation of operations, supply chains, or product lines. The successful ongoing operation of such initiatives rests on reliable reporting: What do we spend? What do we buy, and from whom? What parts do products have in common? What can be substituted? When item master data is not clean, managers do not have reliable data for the reporting needed to drive these initiatives forward.

Master Data Rationalization Is the Foundation for Leveraging Existing ERP Investment
Most IT organizations are challenged in driving continuing positive return on investment from their ERP systems. Many are consolidating their various ERP and other enterprise software systems to meet that challenge. In particular, many SAP customers facing the need to upgrade as SAP ends support of R/3 4.6c in 2006 in favor of R/3 Enterprise or mySAP ERP are using this opportunity to consolidate and upgrade.

This is the ideal time to launch a master data rationalization initiative. Indeed, an item master record format and classification scheme in SAP system #1 is typically not the same as in SAP system #2. Before the systems can be consolidated, the master data must be rationalized according to an agreed-upon format, classification scheme, and attribute definitions. Otherwise, companies risk contaminating their upgraded and consolidated ERP systems with even more bad data.

Master Data Rationalization Protects the SAP Master Data Management Investment
We also note that a large number of SAP customers are preparing to implement SAP's Master Data Management (MDM) functionality found in the NetWeaver platform. Implementing SAP MDM does not eliminate the need for master data rationalization. To the contrary, it emphasizes the need for master data rationalization, because its function is the syndication and management of the various master data objects in enterprise software systems. SAP customers should protect their investment and undertake master data rationalization before implementing MDM, to ensure that only clean master data is managed by SAP MDM.

Successful Sourcing and Procurement Initiatives Depend on Clean, Reliable Master Data
Companies implementing enterprise spend management learn very quickly that the quality of their master data holds the key to unlocking the promised value. Master data such as vendor and item master records forms the basis for all other associated spend data and business objects such as purchase orders and goods receipts. The ugly reality is that this master data exists in many systems and is often incomplete, duplicated, and wrongly classified or unclassified.

Extracting, organizing, enriching, and analyzing this data potpourri is a major challenge for any organization, but it must be done. Without clean, reliable master data, a spend management initiative will fail. Master data rationalization (that is, the process of extracting, normalizing, classifying, enriching, and staging data for analysis) is fundamental to the spend management process. Organizations should invest in processes and tools that automate the master data rationalization process to the greatest extent possible. The goal is to establish a repeatable, reliable process that enables confident spend data analysis on an ongoing basis.

Optimum Master Data Maturity Enables Real-Time Analysis and Control of Business Processes
Our research shows that the maturity of organizational master data quality practices varies greatly, from the most basic but not uncommon state of master data chaos, to the rare case of pervasive, real-time, high-quality master data. Organizations should understand where they are in the master data maturity model and chart a path to achieving an optimized level of master data quality maturity: a level where they will be able to exploit spend data on a real-time basis to drive continual improvements in supply-side processes. Key to this evolution is the implementation of automated processes for the cleansing, enrichment, and maintenance of master data.
Introduction

ERP Systems Are Indispensable to the Business Operations of Large Organizations
Enterprise software applications have become so indispensable that they have a material effect on company valuations. Over the years, we have seen companies incur charges totaling hundreds of millions of dollars because of ERP problems, companies miss the market with their products because of ERP problems, and mergers fail to deliver intended results because of ERP problems. The health and continuing welfare of a company's ERP system is clearly an issue for the CEO.

ERP systems, once a transformational investment where companies invested enormous sums without a clear understanding of the outcome, have dropped down the stack to become a true backbone of the organization. Accordingly, the focus surrounding their maintenance and economic performance has shifted, from a mindset of "I'll pay whatever it takes to get it in and beat my competition" to one of "I want Six Sigma quality, and I want to minimize my operational costs," as described by META Group's IT Application Portfolio Management theory. Chief information officers not only are tasked with the responsibility for improving the performance of their ERP systems, but they also face the challenge of continuing to mine return from their ERP investment.

Business Process Configuration in ERP Is Important, But Master Data Quality Affects the Accuracy, Efficiency, and Reliability of the Process
Organizations dedicate much attention and many resources to improving their business processes. The focus of many ERP efforts revolves around process optimization and process extension to other enterprise systems such as CRM or supplier relationship management (SRM). As the process broadens to involve other organizational units or enterprise applications, many organizations discover that process efficiency and reliability suffer. Accurate reporting is no longer possible, and confidence in the systems drops. Investigation into these problems reveals that bad master data is often the root cause of these process degradations.

Entropy: The Cause of Diminishing Returns
Entropy (noun): a process of degradation or running down, or a trend to disorder. (Source: Merriam-Webster)
Entropy affects spend data as well as all other elements in the universe. Cleaning and organizing spend data once is not sufficient to win continued savings and efficiencies. Organizations must implement an automated, repeatable, scalable process to ensure the completeness, accuracy, and integrity of spend data.

Keeping Enterprise Applications in Shape Requires Constant Master Data Maintenance
Master data in enterprise applications such as ERP, SRM, or CRM is subjected to data entropy from the first moment after go-live. Entropy, the universal trend toward disorder, takes many forms. In the application itself, incomplete validation routines, poor master data maintenance policies, or subsequent master data loads can contaminate the system. Across a business process that spans more than one application, master data record formats and contents can vary, leading to inaccurate transactions and analysis. In the fight against master data disorder, organizations must institute master data quality tools, policies, and procedures. Master data requires continuous maintenance, from the time it is created or loaded to the time it is archived, or business results will suffer.

Essential to master data quality is the process of master data rationalization. A typical enterprise IT architecture comprises several enterprise applications and many sources of master data. Integrated business processes that tap these sources as they wind their way through the various systems suffer when there is no agreement among systems on something as fundamental as an item master record. Master data rationalization is the process that ensures that master data is properly classified, with complete and normalized attributes, and that it is fully suitable for use throughout the enterprise IT landscape.

Successful Business Initiatives Depend on Clean, Organized, and Reliable Master Data
Business initiatives such as ERP system consolidation, enterprise spend management, total inventory visibility, or component reuse promise high returns, whether from reduced IT expenditures, as in the case of an ERP consolidation, or from more cost-effective designs and faster time to market, as in the case of component reuse in the product design cycle.

All of these business initiatives have one thing in common, though, and that is a dependency on clean, organized, and reliable master data. Master data that is correctly classified with a common taxonomy and that has normalized and enriched attributes yields a granular level of visibility that is critical to search and reporting functions. Before undertaking any of these efforts and similar business initiatives, organizations must ensure that they have instituted the policies, procedures, and tools to ensure master data quality.

CEOs and CFOs Who Are Accountable Under Sarbanes-Oxley Need Good Data
The Sarbanes-Oxley Act, passed in 2002, underscores the importance of master data quality for the CEO and CFO. This broad act addresses financial reporting and the business processes that have an effect on financial reporting. Under Sarbanes-Oxley, company officers must certify the compliance of their financial reports with the act. As companies work toward compliance, many discover that the quality of their master data has a direct and material impact on their financial reporting, making the state of master data a Sarbanes-Oxley issue (see Figure 1).

Accordingly, CEOs and CFOs are using the Sarbanes-Oxley Act as the impetus for consolidating ERP systems, for driving visibility in corporate spending, and for visibility in inventories. Surveys within our client base confirm an increase in all these activities.
Figure 1 — SOX Sections Impacted by Master Data

Organizations must also assess readiness, requirements, and controls across individual sections of the Sarbanes-Oxley Act:

- Section 404: Internal Controls
  - Capability to comprehensively aggregate financial data
  - Accessibility of financial reporting details to executives
  - Availability to management of tools for drill-down analysis of accounting reports
  - Capability to routinely highlight key analysis areas based on tolerances and financial metrics
  - Capability to segment reporting into material or significant elements
  - Adequacy of visibility into any outsourced processes that impact SOX compliance
  - Support for frequent "flash" reporting
- Sections 302 and 906: CEO/CFO Sign-Off
  - Degree and efficiency of financial/ERP consolidation and integration
  - Availability and quality of financial data marts/data warehouses
  - Quality of financial reporting/OLAP capabilities
  - Consistency of defined financial and related metadata
  - Availability to management of compliance dashboards and related tools
  - Support for frequent flash reporting
  - Quality of ERP, best-of-breed, and legacy system controls

Source: META Group
The Role of Master Data in the Enterprise

Master Data Quality Issues Ripple Across the Enterprise
Master data represents the fundamental building blocks of operational enterprise software systems and the key components of the company, including:
- The items it makes
- The items it buys
- The employees who work there
- The customers to whom it sells
- The suppliers it buys from

When any of these records becomes inaccurate, other dependent master records also become corrupt. The ripple effect is pronounced as these records feed transactions and business processes. Reporting becomes inaccurate and suspect, managers lose visibility of actual operational results, and the company and its shareholders suffer.

The Difference Between Primary and Derived Master Data Records
Master data records can be classified into two main categories:
- Primary master data records: These records are like prime numbers; they cannot be reduced further. Employee, customer, vendor, and item master records are all examples of primary master data records.
- Derived master data records: Derived master data records are created by linking primary master data records together. Linking a customer record with an item record, for example, creates the basis for a specific pricing record that is used in sales and trade management applications (see the sketch below).

The number of derived master data records is an order of magnitude greater than that of primary master data records, and managing them is a challenge in itself. However, if the primary master data records are bad, the challenge becomes insurmountable.
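A minimal sketch of this distinction in Python. The record shapes and field names are illustrative assumptions, not any specific system's data model; the point is that a derived pricing record carries keys from two primary records, so an error in either primary record propagates into every record derived from it.

```python
# Primary master records stand alone; a derived record links them together.
# Field names here are hypothetical, chosen only to illustrate the structure.
from dataclasses import dataclass

@dataclass
class Customer:            # primary master record
    customer_id: str
    name: str

@dataclass
class Item:                # primary master record
    part_number: str
    description: str

@dataclass
class PricingRecord:       # derived master record: links customer and item
    customer_id: str       # key of the Customer primary record
    part_number: str       # key of the Item primary record
    unit_price: float

acme = Customer("C-100", "Acme Corp")
paper = Item("751381", "Inkjet printer paper, US letter, 24lb.")
price = PricingRecord(acme.customer_id, paper.part_number, unit_price=4.15)
# If the Item record is a duplicate or is misclassified, this pricing record
# (and every other record derived from it) inherits the defect.
```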
A Disorganized Approach Toward Maintaining Master Data Is Common
Organizations rarely have a unified approach toward managing primary master data. Customer records typically fall under the purview of the CRM team, and customer data is maintained as part of that initiative. Vendor master records normally belong to procurement or accounts payable, and their maintenance is administered by these departments. Item master data records, on the other hand, often have no clear owner.

Item Master Records Present Particular Challenges
Item master records have numerous sources. Engineers and designers can create parts, procurement can source new parts, and suppliers can load their part masters into the organization's systems. Compounding the complexity surrounding the item master record is the number of systems in which these records reside.

Consider the simple example of a product as it moves from design to manufacturing:
- The design engineer creates a product using prototype parts that are supplied by a prototype supplier. These parts have unique part numbers and often are procured by the engineer. This normally takes place in the engineer's own product life-cycle management software application.
- After winning approval, the design is released to manufacturing, where the manufacturing bill of materials calls out series production parts that must be sourced by procurement. This takes place in another application, typically an ERP system.
- The service parts management group creates item master records for its service parts and works through an after-market sales organization in yet another system.

Item master records abound, yet rarely will they have complete and accurate information, since they are remade in independent applications as new parts records. The organization has no single version of the truth and has lost its ability to effectively manage its resources.

Item Master Record Quality Problems Have Numerous Root Causes
This simple scenario highlights a few of the root causes of item master data quality problems, which include:
- Various master record formats: As a rule, no two software systems share the same master record format. Therefore, a one-to-one correspondence between fields is not possible, and any data migration between systems will result in incomplete and inaccurate records.
- Various systems of record: The vast majority of organizations use more than one software application. Product organizations may have many applications, including computer-aided design (CAD), sourcing, manufacturing execution, ERP, warehousing and logistics, and CRM applications. Integration of all of these applications is not a guarantee of data integrity.
- Incongruent naming standards or no naming standards: Item codes and descriptions are too often a window into the creativity of those who created the master record. Consequently, abbreviations proliferate. Deciphering units of measure or part names becomes an IQ test, and search engines fail to find the right part. Duplicate records are created when existing parts cannot be found.
- Lack of a standardized classification convention: As an aid to finding items and for reporting purposes, item master records are classified, but all too often, organizations use proprietary or incomplete classification systems, which leads to items being wrongly classified or not classified at all. Consequently, reports are incomplete and inaccurate, which has an impact on decision making.
- Incomplete fields: This is a simple yet effective way that master record quality is reduced. Inadequate validation routines often are the cause of incomplete fields being passed on. Imprecise validation routines also affect a related issue: incorrect entries.
The Effect of Bad Item Master Data on Business Initiatives Is Profound
Bad master data is not an IT problem, though the IT organization is often called upon to solve it. The success and measurable impact of business initiatives depend on consistent, high-quality master data.

Merger and Acquisition Activities
The synergies driving M&A activity often are dependent on consolidating operations and inventory as well as sharing and integrating designs and leveraging use of common parts. Realizing these synergies depends on the ability to merge item master data files and to accurately report on the status of these initiatives. Failure to gain a common view of the item master data of both companies not only diminishes the synergies and drags out the integration process, but also threatens the success of the merger or acquisition itself: a business event typically far more expensive than the cost of the required data maintenance.

ERP System Consolidation
More and more organizations are consolidating their ERP instances, targeting savings and efficiencies. Business drivers for these consolidations include SOX compliance pressures, the end of SAP R/3 version support, system harmonization across business units or geographies, and architectural upgrades that allow companies to leverage service-oriented architectures. However, attempting consolidation before the master data is rationalized will lead to a contaminated single instance. Cleansing the data once it lands in the new system is enormously expensive and time consuming.

Enterprise Spend Management
The initial business benefits from enterprise spend management are substantial. Organizations routinely report cost savings of 5%-30% after aggregating spending and reducing the number of suppliers for a given commodity. However, many companies find that they "hit the wall" after a first round of spend management and that incremental gains afterward are small to nonexistent. These organizations are discovering that the familiar 80/20 rule has been turned on its head. The first 20% of savings and efficiency gains is the easy part. The remaining 80% presents a formidable challenge that most organizations and software solutions are not currently equipped to tackle. Bad master data is a major culprit.

Sourcing
Sourcing projects, and the make-versus-buy decision process in general, require a view of what already exists in the approved parts lists and approved vendor lists. Bad item master data can result in supplier proliferation, part proliferation, and a failure to leverage existing contracts.

Inventory Visibility
Warehouse management systems, ERP systems, and third-party logistics service providers each manage aspects of parts and finished goods inventories. This fragmented system landscape clouds inventory visibility and leads to overpurchasing, stock-outs, inventory write-offs, and disruptions of manufacturing operations. This impact can be measured in lost customers, missed deadlines, and financial losses.

Part Reuse in Design
An engineer's design decisions can have lasting financial impacts on product margin as well as on the organization. Part reuse is dependent on the engineer's ability to find the right part based on attributes. When existing parts are incompletely or wrongly classified and attributes are missing, frustrated engineers find it easier to create a new part than to perform an extended manual search. This undermines sourcing strategies and merger-and-acquisition synergies, and further bloats inventories.

Master Data Rationalization Is a Prerequisite for Successful Business Initiatives
The pervasive nature of item master data affects the success, efficiency, and material impact of many business processes and initiatives, as we have described above. Organizations must establish a strategy for item master data that addresses data quality across the master data life cycle, from inception or introduction to archiving. The first step in this process is master data rationalization.
Understanding the Process of Master Data Rationalization
The case for clean, reliable master data is clear, and we have seen that it is essential for master data to be clean from its inception or its introduction into a business application. Master data rationalization is the first step that organizations should undertake in their drive for master data quality.

Master data rationalization is a multistep, iterative process that involves the extraction, aggregation, cleansing, classification, and attribute enrichment of item master data. Key to this process is the proper classification and attribute enrichment of the item master record. Most systems use some sort of taxonomy to classify items. However, for use throughout the enterprise and with external partners, organizations should select a taxonomy that delivers depth and breadth, such as UNSPSC (the United Nations Standard Products and Services Code), and that allows granular visibility of the item. Rarely do organizations have the resources in-house to evaluate and select the proper taxonomies. Accordingly, organizations should ensure that their consulting partners demonstrate experience with taxonomy selection and deployment. Item record attributes play a similarly important role.

Attributes define the item and are important for successful parametric searches. Incomplete or incorrect attributes prevent items from being found in the systems, resulting in proliferation of parts and bloated inventories. Before the development of sophisticated automated tools to perform these functions, this was an expensive and cumbersome process, and rarely a successful undertaking.
Step 1: Extraction and Aggregation
The master data rationalization process begins with extraction of the master data from the various systems of record, whether they are internal systems such as ERP, SRM, or legacy applications, or external systems such as purchasing card suppliers. These records are aggregated in a database that serves as the source for the follow-on processing. Initial validation can take place at this point to send bad records back for repair (see Figure 2).

Figure 2 — Extraction and Aggregation prior to duplicate identification
Data sources (ERP, AP, T&E, and PO systems) feed extraction templates and initial validation in the master data rationalization environment; validated records are aggregated in a data warehouse/consolidated database, and corrupt records are returned to the source for repair.
Source: META Group
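To make Step 1 concrete, below is a minimal Python sketch of extraction, aggregation, and initial validation. The source systems, field names, and validation rule are illustrative assumptions, not a reference to any specific product.

```python
# A minimal sketch of Step 1: aggregate item records from several systems of
# record into one staging list, validating on the way in. Field names and the
# required-field rule are assumptions for illustration.
from typing import Dict, List, Tuple

REQUIRED_FIELDS = ("part_number", "description", "supplier")

def is_valid(record: Dict[str, str]) -> bool:
    """Initial validation: every required field must be present and non-empty."""
    return all(record.get(field, "").strip() for field in REQUIRED_FIELDS)

def aggregate(sources: Dict[str, List[Dict[str, str]]]) -> Tuple[List[dict], List[dict]]:
    """Stage valid records with source lineage; return corrupt ones for repair."""
    staged, rejected = [], []
    for system, records in sources.items():
        for rec in records:
            rec = dict(rec, source_system=system)  # keep lineage for follow-on steps
            (staged if is_valid(rec) else rejected).append(rec)
    return staged, rejected

# Hypothetical feeds from an ERP system and a legacy application
sources = {
    "ERP":    [{"part_number": "75A01",  "description": "PRNTR PPR LTR", "supplier": "GE"}],
    "Legacy": [{"part_number": "75A-01", "description": "",              "supplier": "Gen. Elec."}],
}
staged, rejected = aggregate(sources)  # the record with the empty description is rejected
```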
Step 2: Cleansing
Once aggregated, the data is subjected to an initial screening to identify duplicate records (see Figure 3). Part numbers, descriptions, and attributes (e.g., supplier names) are parsed using predefined rules. Exact matches and probable matches are identified and published. Weeding out duplicate records is an iterative process that requires subject-matter experts to identify those records that cannot be culled in the first round. In this process, rule-based processing alone is inadequate to manage the volume of data. Statistical processing and artificial intelligence are needed to ensure the maximum level of automation and accuracy.

Figure 3 — Initial duplicate identification based on part number and supplier name
Part numbers 75A01, 75AO1, and 75A-01 are recognized as variants of the single part number 75AO1; supplier names General Electric, GE, and Gen. Elec. are recognized as variants of the single supplier General Electric Inc.
Source: META Group
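A minimal sketch of this screening, assuming simple normalization rules and the standard-library SequenceMatcher as a stand-in for the statistical matching the text describes. The alias table and the 0.8 similarity cutoff are illustrative assumptions.

```python
# A minimal sketch of Step 2: normalize part numbers and supplier names with
# predefined rules, then split candidate pairs into exact and probable
# matches. Real solutions layer statistical and AI matching on top of this.
import re
from difflib import SequenceMatcher
from itertools import combinations

# Assumed alias table; production rule sets are far larger.
SUPPLIER_ALIASES = {
    "ge": "general electric inc",
    "gen elec": "general electric inc",
    "general electric": "general electric inc",
}

def norm_part(pn: str) -> str:
    """75A01, 75a-01, and '75A 01' all normalize to the same key."""
    return re.sub(r"[\s\-._]", "", pn).upper()

def norm_supplier(name: str) -> str:
    key = re.sub(r"[^a-z ]", "", name.lower()).strip()
    return SUPPLIER_ALIASES.get(key, key)

def screen(records):
    exact, probable = [], []
    for a, b in combinations(records, 2):
        if norm_supplier(a["supplier"]) != norm_supplier(b["supplier"]):
            continue
        pa, pb = norm_part(a["part_number"]), norm_part(b["part_number"])
        if pa == pb:
            exact.append((a, b))
        elif SequenceMatcher(None, pa, pb).ratio() >= 0.8:
            probable.append((a, b))  # e.g., 75A01 vs. 75AO1 (zero vs. letter O);
                                     # published for subject-matter expert review
    return exact, probable
```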
Step 2: Cleansing

Once aggregated, the data is subjected to an initial screening to identify duplicate records (see Figure 3). Part numbers, descriptions, and attributes (e.g., supplier names) are parsed using predefined rules. Exact matches and probable matches are identified and published. Weeding out duplicate records is an iterative process that requires subject-matter experts to identify those records that cannot be culled in the first round. In this process, rule-based processing alone is inadequate to manage the volume of data; statistical processing and artificial intelligence are needed to ensure the maximum level of automation and accuracy.
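The screening logic can be sketched in a few lines. The following illustration, with assumed thresholds, shows the split among exact matches, probable matches, and pairs needing expert review; a production system would add the statistical and AI techniques the text describes.

    # Illustrative sketch of the initial screening: exact duplicates are found
    # on a normalized key, and near matches are scored so that borderline
    # pairs can be routed to a subject-matter expert.
    import difflib
    import re

    def norm(value):
        # Strip punctuation, spacing, and case so 75A01 and 75a-01 collide.
        return re.sub(r"[^A-Z0-9]", "", value.upper())

    def compare(a, b):
        if norm(a["part_number"]) == norm(b["part_number"]):
            return "exact match"
        score = difflib.SequenceMatcher(
            None, norm(a["description"]), norm(b["description"])).ratio()
        if score >= 0.9:
            return "probable match"
        return "distinct" if score < 0.6 else "expert review"

    print(compare({"part_number": "75A01", "description": "hex nut 1/4-20"},
                  {"part_number": "75A-01", "description": "HEX NUT, 1/4-20"}))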
Figure 2 — Extraction and Aggregation prior to duplicate identification
[Figure: item records from the data sources (ERP, AP, T&E, and PO systems, plus templates) pass through an initial validation step into the master data rationalization environment, a data warehouse/consolidated database. Corrupt records are returned to the source for repair.]
Source: META Group
The Effect of Bad Item Master Data on Business Initiatives Is Profound

Bad master data is not an IT problem, though the IT organization is often called upon to solve it. The success and measurable impact of business initiatives depend on consistent, high-quality master data.

Merger and Acquisition Activities

The synergies driving M&A activity often are dependent on consolidating operations and inventory as well as sharing and integrating designs and leveraging use of common parts. Realizing these synergies depends on the ability to merge item master data files and to accurately report on the status of these initiatives. Failure to gain a common view of the item master data of both companies not only diminishes the synergies and drags out the integration process, but also threatens the success of the merger or acquisition itself, a business event typically far more expensive than the cost of the required data maintenance.

ERP System Consolidation

More and more organizations are consolidating their ERP instances, targeting savings and efficiencies. Business drivers for these consolidations include SOX compliance pressures, the end of SAP R/3 version support, system harmonization across business units or geographies, and architectural upgrades that allow companies to leverage service-oriented architectures. However, attempting consolidation before the master data is rationalized will lead to a contaminated single instance. Cleansing the data once it lands in the new system is enormously expensive and time consuming.

Enterprise Spend Management

The initial business benefits from enterprise spend management are substantial. Organizations routinely report cost savings of 5%-30% after aggregating spending and reducing the number of suppliers for a given commodity. However, many companies find that they "hit the wall" after a first round of spend management and that incremental gains afterward are small to nonexistent. These organizations are discovering that the familiar 80/20 rule has been turned on its head: the first 20% of savings and efficiency gains is the easy part. The remaining 80% presents a formidable challenge that most organizations and software solutions are not currently equipped to tackle. Bad master data is a major culprit.

Sourcing

Sourcing projects, and the make-versus-buy decision process in general, require a view of what already exists in the approved parts lists and approved vendor lists. Bad item master data can result in supplier proliferation, part proliferation, and a failure to leverage existing contracts.
Figure 3 — Initial duplicate identification based on part number & supplier name
[Figure: the part-number variants 75A01, 75AO1, and 75A-01 are collapsed into the single entry 75AO1, and the supplier-name variants General Electric, GE, and Gen. Elec. are collapsed into General Electric Inc.]
Source: META Group
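A hypothetical illustration of the normalization behind Figure 3: variant supplier spellings are collapsed onto one canonical name through an alias table. The aliases and the canonical name are invented for the example.

    # Variant supplier spellings mapped to one canonical record key.
    import re

    SUPPLIER_ALIASES = {
        "GENERALELECTRIC": "General Electric Inc.",
        "GENERALELECTRICINC": "General Electric Inc.",
        "GE": "General Electric Inc.",
        "GENELEC": "General Electric Inc.",
    }

    def canonical_supplier(name):
        key = re.sub(r"[^A-Z]", "", name.upper())  # "Gen. Elec." -> "GENELEC"
        return SUPPLIER_ALIASES.get(key, name)

    for raw in ("General Electric", "GE", "Gen. Elec."):
        print(raw, "->", canonical_supplier(raw))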
Step 3: Classification

Classification is a critical step. The master records must be classified correctly, completely, and to a level of detail that makes the record easy to identify for search and reporting functions. Organizations often have multiple classification schemas. Although it is not necessary to choose one particular taxonomy, since taxonomies can coexist, it is necessary to have a taxonomy that supports the enterprise's business initiatives. Our research confirms that the use of widely adopted taxonomies such as UNSPSC, NATO, or eClass improves the performance of enterprise spend management strategies significantly over legacy taxonomies. This step is best executed with the help of a partner that has deep experience in taxonomy deployment (see Figure 4).

Figure 4 — An Example of Hierarchical Taxonomy

    UNSPSC       Description
    26 00 00 00  Power generation distribution machinery and accessories
    26 10 00 00  Power motors
    26 10 16 00  Motors
    26 10 16 01  Induction motors
    26 10 16 02  Alternating current (A/C) motors
    26 10 16 09  Synchronous motors
    26 10 16 11  Single-phase motors
    26 10 16 12  Multi-phase motors

Source: META Group
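The figure's four-level structure (segment, family, class, commodity) lends itself to simple programmatic lookup. The sketch below, seeded only with the rows shown in Figure 4, walks a commodity code up its hierarchy.

    # Walking a UNSPSC commodity code up through class, family, and segment.
    UNSPSC = {
        "26000000": "Power generation distribution machinery and accessories",
        "26100000": "Power motors",
        "26101600": "Motors",
        "26101601": "Induction motors",
        "26101602": "Alternating current (A/C) motors",
        "26101609": "Synchronous motors",
        "26101611": "Single-phase motors",
        "26101612": "Multi-phase motors",
    }

    def hierarchy(code):
        """Yield descriptions from segment down to the given commodity."""
        levels = [code[:2] + "000000", code[:4] + "0000", code[:6] + "00", code]
        seen = []
        for level in levels:
            if level in UNSPSC and level not in seen:
                seen.append(level)
                yield UNSPSC[level]

    print(" > ".join(hierarchy("26101601")))
    # Power generation ... > Power motors > Motors > Induction motors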
Step 4: Attribute Extraction and Enrichment

Although classification helps determine what an item is and how it relates to other items, attributes define the characteristics of the item and can run into the hundreds per item. Unfortunately, attributes in the item record may be left blank, be cryptic, or be inaccurate.
In particular, ERP master records are full of cryptic attributes, due to poor validations and limited text-field lengths. In this step, attributes are extracted, normalized, and completed as part of record enrichment (see Figure 5). This establishes the difference between the discovery of a metal nut and the discovery of a ¼-20 hex nut made of 316 stainless steel. Because of the sheer volume of attributes to be extracted and enriched, an automated approach is the only practical way to execute this step.

Figure 5 — Attribute Extraction and Enrichment
[Figure: an item record after initial normalization and classification (UNSPSC classification 14 11 15 07, UNSPSC description "Printer or copier paper", part number 751381, item description "Printer paper 8 1/2 x 11, 24lb., 500ct.", supplier Office Depot) passes through an attribute extraction and enrichment engine that draws on Web cross-referencing. The enriched record carries discrete attributes: item description "Inkjet printer paper", size "US letter", weight "24lb.", brightness 104, unit of sale "Ream", quantity 500, supplier Office Depot.]
Source: META Group
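A toy version of this step, using the Figure 5 record: the free-text description is parsed into discrete, searchable attributes. The patterns cover only this single example; a real engine would need far richer rules plus the Web cross-referencing shown in the figure.

    # Parsing a free-text description into discrete attributes.
    import re

    def extract_attributes(description):
        attrs = {}
        if m := re.search(r"(\d+)\s*lb\b", description):
            attrs["weight"] = m.group(1) + "lb."
        if m := re.search(r"(\d+)\s*ct\b", description):
            attrs["quantity"] = int(m.group(1))
        if re.search(r"8\s*1/2\s*x\s*11", description):
            attrs["size"] = "US letter"
        return attrs

    print(extract_attributes("Printer paper 8 1/2 x 11, 24lb., 500ct."))
    # {'weight': '24lb.', 'quantity': 500, 'size': 'US letter'}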
Item Master Records Present Particular Challenges

Item master records have numerous sources. Engineers and designers can create parts, procurement can source new parts, and suppliers can load their part masters into the organization's systems. Compounding the complexity surrounding the item master record is the number of systems in which these records reside. Consider the simple example of a product as it moves from design to manufacturing:

- The design engineer creates a product using prototype parts that are supplied by a prototype supplier. These parts have unique part numbers and often are procured by the engineer. This normally takes place in the engineer's own product life-cycle management software application.
- After winning approval, the design is released to manufacturing, where the manufacturing bill of materials calls out for series production parts that must be sourced by procurement. This takes place in another application, typically an ERP system.
- The service parts management group creates item master records for its service parts and works through an after-market sales organization in yet another system.

Item master records abound, yet rarely will they have complete and accurate information, since they are remade in independent applications as new parts records. The organization has no single version of the truth and has lost its ability to effectively manage its resources.

Item Master Record Quality Problems Have Numerous Root Causes

This simple scenario highlights a few of the root causes of item master data quality problems, which include:

- Various master record formats: As a rule, no two software systems share the same master record format. Therefore, a one-to-one correspondence between fields is not possible, and any data migration between systems will result in incomplete and inaccurate records.
- Various systems of record: The vast majority of organizations use more than one software application. Product organizations may have many applications, including computer-aided design (CAD), sourcing, manufacturing execution, ERP, warehousing and logistics, and CRM applications. Integration of all of these applications is not a guarantee of data integrity.
- Incongruent naming standards or no naming standards: Item codes and descriptions are too often a window to the creativity of those who created the master record. Consequently, abbreviations proliferate, deciphering units of measure or part names becomes an IQ test, and search engines fail to find the right part. Duplicate records are created when existing parts cannot be found.
- Lack of a standardized classification convention: As an aid to finding items and for reporting purposes, item master records are classified, but all too often organizations use proprietary or incomplete classification systems, which leads to items being wrongly classified or not classified at all. Consequently, reports are incomplete and inaccurate, which has an impact on decision making.
- Incomplete fields: This is a simple yet effective way that master record quality is reduced. Inadequate validation routines often are the cause of incomplete fields being passed on. Imprecise validation routines also affect a related issue: incorrect entries.
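As the last bullet suggests, even simple field-level validation catches both problems. The following sketch is illustrative; the field rules and the unit-of-measure codes are assumptions, not a standard.

    # A field-level validation routine catching incomplete and incorrect entries.
    import re

    RULES = {
        "part_number": re.compile(r"^[A-Z0-9][A-Z0-9-]{2,}$"),
        "unit_of_measure": re.compile(r"^(EA|BX|RM|CS)$"),  # assumed codes
    }

    def field_errors(record):
        errors = []
        for field, pattern in RULES.items():
            value = record.get(field, "")
            if not value:
                errors.append(field + ": missing")           # incomplete field
            elif not pattern.match(value):
                errors.append(field + ": malformed " + repr(value))
        return errors

    print(field_errors({"part_number": "75A-01", "unit_of_measure": "ream"}))
    # ["unit_of_measure: malformed 'ream'"]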
The Difference Between Primary and Derived Master Data Records

Master data records can be classified into two main categories:

- Primary master data records: These records are like prime numbers; they cannot be reduced further. Employee, customer, vendor, and item master records are all examples of primary master data records.
- Derived master data records: Derived master data records are created by linking primary master data records together. Linking a customer record with an item record, for example, creates the basis for a specific pricing record that is used in sales and trade management applications.

The number of derived master data records is an order of magnitude greater than that of primary master data records, and managing them is a challenge in itself. However, if the primary master data records are bad, the challenge becomes insurmountable.
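The distinction can be sketched with a few record types; the field names and values below are invented for illustration.

    # A pricing record derived by linking two primary master data records.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Customer:          # primary master data
        customer_id: str
        name: str

    @dataclass(frozen=True)
    class Item:              # primary master data
        part_number: str
        description: str

    @dataclass(frozen=True)
    class PriceRecord:       # derived: links two primary records
        customer_id: str
        part_number: str
        unit_price: float

    acme = Customer("C-100", "Acme Corp.")
    nut = Item("75A01", "1/4-20 hex nut, 316 stainless steel")
    print(PriceRecord(acme.customer_id, nut.part_number, 0.12))

If a primary record is duplicated or wrong, every derived record built on it inherits the defect, which is why bad primary data makes the derived-data challenge insurmountable.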
A Disorganized Approach Toward Maintaining Master Data Is Common

Organizations rarely have a unified approach toward managing primary master data. Customer records typically fall under the purview of the CRM team, and customer data is maintained as part of that initiative. Vendor master records normally belong to procurement or accounts payable, and their maintenance is administered by these departments. Item master data records, on the other hand, often have no clear owner.
Step 5: Final Duplicate Record Identification

Once the records have been classified and their attributes enriched, the records undergo a second round of duplicate identification (see Figure 6). With much more record information normalized, enriched, and complete, most of the duplicates are automatically identified during this step. Although this may vary by category, there are usually a small number of records that still must be evaluated by subject-matter experts to determine their status.

Figure 6 — Final Duplicate Record Identification
[Figure: two item records that are identical after attribute enrichment, each carrying the same UNSPSC classification (14 11 15 07), part number (751381), item description, size, weight, brightness, unit of sale, quantity, and supplier. Records such as these are routed to the subject-matter expert for duplicate identification.]
Source: META Group
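A minimal sketch of how enriched records make this second round more deterministic: records are compared attribute by attribute rather than by fuzzy text alone. The attribute list and the threshold are assumptions for the example.

    # Comparing two enriched records attribute by attribute.
    ATTRS = ("unspsc", "size", "weight", "brightness", "unit_of_sale",
             "quantity")

    def attribute_agreement(rec1, rec2):
        """Fraction of attributes on which both records agree and are filled."""
        hits = sum(1 for a in ATTRS if rec1.get(a) == rec2.get(a) is not None)
        return hits / len(ATTRS)

    r1 = {"unspsc": "14111507", "size": "US letter", "weight": "24lb.",
          "brightness": 104, "unit_of_sale": "Ream", "quantity": 500}
    r2 = dict(r1)                        # identical after enrichment
    if attribute_agreement(r1, r2) >= 0.85:
        print("flag as automatic duplicate")
    else:
        print("route to subject-matter expert")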
Automation Is Not an Option

Applying this master data rationalization methodology requires deployment of an automated solution. Without automation, it will be impossible to process the volume of records required to make an impact on the overall performance of the enterprise initiatives that depend on item master data. In particular, automating the classification and attribute enrichment steps is crucial to the success of the overall process.

Organizations should examine available solutions based on a number of criteria, including:

- How strong are the algorithms used for the automated classification? Organizations should note the percentage of records that make it through screening with an 80% confidence level that the classification is correct.
- Can the system learn? The strength of artificial intelligence is that self-learning systems require less support over time, saving users money and resources.
- How repeatable is the process? Investing in a process that is not repeatable is a waste of money and resources.
- What is the throughput of the system? Data loads must be accomplished in short order; business will not wait. High throughput with high accuracy is a sign of a strong system.
- How effective is the human support? Service providers offer expertise in setting up taxonomies and classification of materials. Users should look for experience with their particular industry as well as with the toolset they have chosen to use. Systems integrators should have experience with both the master data rationalization tools and the ERP systems.
- Can the process be integrated into daily operations? Users should look for tools that support the classification of master data at the source. An automated classification tool that is integrated into the business application ensures that any new part is automatically classified with the correct codes before that part record is used. Data quality is thereby maintained.
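The 80% confidence criterion implies a simple routing rule, sketched below with a stub classifier standing in for a real engine.

    # Routing records by classification confidence; the classifier is a stub.
    THRESHOLD = 0.80

    def classify(description):
        # Stub: a real engine returns a UNSPSC code plus a confidence score.
        if "paper" in description.lower():
            return "14111507", 0.93
        return None, 0.40

    accepted, review = [], []
    for desc in ("Inkjet printer paper", "misc item 42"):
        code, confidence = classify(desc)
        (accepted if confidence >= THRESHOLD else review).append((desc, code))

    print(len(accepted), "auto-classified;", len(review), "queued for review")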
The Role of Master Data in the Enterprise

Master Data Quality Issues Ripple Across the Enterprise

Master data represents the fundamental building blocks of operational enterprise software systems and the key components of the company, including:

- The items it makes
- The items it buys
- The employees who work there
- The customers to whom it sells
- The suppliers it buys from

When any of these records becomes inaccurate, other dependent master records also become corrupt. The ripple effect is pronounced as these records feed transactions and business processes. Reporting becomes inaccurate and suspect, managers lose visibility of actual operational results, and the company and its shareholders suffer.
Currently, there are several avenues that organizations can take to attain master data quality (see Figure 7). One approach is to limit the scope of the data to those records used by the asset management system. Typically, there is a large amount of manual intervention, because asset management solution vendors believe that the low volume of data does not require significant automation. Needless to say, this approach fails because of its narrow focus and lack of scalability. Catalog content management providers also offer aspects of master data rationalization, though their focus still remains primarily on the commerce side, rather than on the procurement and supply sides of the organization. Finally, there are service providers that offer onetime cleansings using manual content factories to screen the data. Again, this approach is neither scalable nor repeatable.

Figure 7 — Incomplete approaches to Item Master Rationalization

ETL (extract, transform, load) solutions: These solutions are too generic in functionality to deal with the complexities of item master records. ETL solutions do not perform classification and attribute enrichment. Moreover, there is considerable effort and expense in setting up these solutions for repeated use.

Asset management solutions: Asset management solutions typically target only a subset of item master data, namely MRO (maintenance, repair, and operations) items. This is not sufficient for ERP consolidation or for comprehensive spend analysis. In addition, there is significant manual effort involved.

Commerce catalog solutions: Commerce catalog solutions tend to focus only on the items sold, rather than those procured. These solutions are less experienced in tapping the various internal and external sources of item data and fail in the subject-matter expert department. Furthermore, they do not automate the attribute enrichment, automating instead only the workflow.

Manual content factories: Manual content factories, or manual approaches in general, were common before the advent of artificial intelligence tools for master data rationalization. The manual approach cannot scale, nor can it meet the throughput demands of large projects.

Source: META Group

Organizations should instead evaluate their prospective solution providers on their ability to deliver an approach toward master data rationalization that automates as much of the classification, cleansing, attribute extraction, and attribute enrichment as possible on a repeatable basis. In addition, the solution provider should bring to the table experience in taxonomies and specific industry verticals along with the automated solution.
Essential to master data quality is the process of master data rationalization. A typical enterprise IT architecture comprises several enterprise applications and many sources of master data. Integrated business processes that tap these sources as they wind their way through the various systems suffer when there is no agreement among systems on something as fundamental as an item master record. Master data rationalization is the process that ensures that master data is properly classified, with complete and normalized attributes, and that it is fully suitable for use throughout the enterprise IT landscape.

Successful Business Initiatives Depend on Clean, Organized, and Reliable Master Data

Business initiatives such as ERP system consolidation, enterprise spend management, total inventory visibility, or component reuse promise high returns, whether from reduced IT expenditures, as in the case of an ERP consolidation, or from more cost-effective designs and faster time to market, as in the case of component reuse in the product design cycle.

All of these business initiatives have one thing in common, though, and that is a dependency on clean, organized, and reliable master data. Master data that is correctly classified with a common taxonomy and that has normalized and enriched attributes yields a granular level of visibility that is critical to search and reporting functions. Before undertaking any of these efforts and similar business initiatives, organizations must ensure that they have instituted the policies, procedures, and tools to ensure master data quality.

CEOs and CFOs Who Are Accountable Under Sarbanes-Oxley Need Good Data

The Sarbanes-Oxley Act, passed in 2002, underscores the importance of master data quality for the CEO and CFO. This broad act addresses financial reporting and business processes that have an effect on financial reporting. Under Sarbanes-Oxley, company officers must certify compliance of their financial reports with the act. As companies work toward compliance, many discover that the quality of their master data has a direct and material impact on their financial reporting, making the state of master data a Sarbanes-Oxley issue (see Figure 1).

Figure 1 — SOX Sections Impacted by Master Data

Organizations must also assess readiness, requirements, and controls across individual sections of the Sarbanes-Oxley Act:

Section 404: Internal Controls
- Capability to comprehensively aggregate financial data
- Accessibility of financial reporting details to executives
- Availability to management of tools for drill-down analysis of accounting reports
- Capability to routinely highlight key analysis areas based on tolerances and financial metrics
- Capability to segment reporting into material or significant elements
- Adequacy of visibility into any outsourced processes that impact SOX compliance
- Support for frequent "flash" reporting

Sections 302 and 906: CEO/CFO Sign-Off
- Degree and efficiency of financial/ERP consolidation and integration
- Availability and quality of financial data marts/data warehouses
- Quality of financial reporting/OLAP capabilities
- Consistency of defined financial and related metadata
- Availability to management of compliance dashboards and related tools
- Support for frequent flash reporting
- Quality of ERP, best-of-breed, and legacy system controls

Source: META Group

Accordingly, CEOs and CFOs are using the Sarbanes-Oxley Act as the impetus for consolidating ERP systems, for driving visibility in corporate spending, and for visibility in inventories. Surveys within our client base confirm an increase in all these activities.
Business Process Configuration in ERP Is Important, But Master Data Quality Affects the Accuracy, Efficiency, and Reliability of the Process

Organizations dedicate much attention and many resources to improving their business processes. The focus of many ERP efforts revolves around process optimization and process extension to other enterprise systems such as CRM or supplier relationship management (SRM). As the process broadens to involve other organizational units or enterprise applications, many organizations discover that process efficiency and reliability suffer. Accurate reporting is no longer possible, and confidence in the systems drops. Investigation into these problems reveals that bad master data is often the root cause of these process degradations.

Entropy: The Cause of Diminishing Returns

Entropy (noun): a process of degradation or running down, or a trend to disorder. (Source: Merriam-Webster)

Entropy affects spend data as well as all other elements in the universe. Cleaning and organizing spend data once is not sufficient to win continued savings and efficiencies. Organizations must implement an automated, repeatable, scalable process to ensure the completeness, accuracy, and integrity of spend data.
Keeping Enterprise Applications in Shape Requires Constant Master Data Maintenance

Master data in enterprise applications such as ERP, SRM, or CRM is subjected to data entropy from the first moment after go-live. Entropy, the universal trend toward disorder, takes many forms. In the application itself, incomplete validation routines, poor master data maintenance policies, or subsequent master data loads can contaminate the system. Across a business process that spans more than one application, master data record formats and contents can vary, leading to inaccurate transactions and analysis. In the fight against master data disorder, organizations must institute master data quality tools, policies, and procedures. Master data requires continuous maintenance, from the time it is created or loaded to the time it is archived, or business results will suffer.

Integrating Master Data Rationalization Into ERP Consolidation or Upgrade Planning

An organization should not consider consolidating its enterprise business systems without building master data rationalization into the project. To do otherwise is to destroy the opportunity to leverage a single instance of clean data for business improvement. Users should ensure that their systems integrators understand the value and power of master data rationalization and that they have experience in laying the foundation for a successful ERP consolidation.

Master data rationalization is a significant step on the path toward achieving data quality maturity. Without this first step, further activities are like trying to plug holes in the dike with one's fingers.

Moving Your Organization Through the Data Quality Maturity Model

We have seen the extent to which bad data limits the success of enterprise initiatives, and we have examined the strong business case in support of a systematic approach to master data quality. The process of master data rationalization is straightforward. The next logical question involves where to start. Determining where to start a master data management project begins with identifying where the organization is in the data quality maturity model.

With spend data proving to be a true corporate asset, enterprises must adopt a method for gauging their "information maturity," that is, how well they manage and leverage information to achieve corporate goals. Only by measuring information maturity can organizations hope to put in place appropriate programs, policies, architecture, and infrastructure to manage and apply information better.

Figure 8 — The data quality maturity pyramid

Level 5, Optimized: Operate real-time data monitoring and enrichment to enable real-time business reporting.
Level 4, Managed: Measure data quality continually and analyze for impact on business operations.
Level 3, Proactive: Institute upstream data quality processes such as auto classification at the point of data entry.
Level 2, Reactive: Conduct a targeted data and process audit, avoiding onetime fixes, and begin master data rationalization.
Level 1, Aware: Create awareness, linking data quality to business initiatives, and get the CEO/CIO involved.

Source: META Group
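For illustration, the pyramid can be recast as a lookup table that a self-assessment script might use; this is purely a restatement of Figure 8, not a META Group tool.

    # The Figure 8 maturity pyramid as a lookup table.
    MATURITY_MODEL = {
        1: ("Aware", "Create awareness, linking data quality to business "
                     "initiatives, and get the CEO/CIO involved"),
        2: ("Reactive", "Conduct a targeted data and process audit, avoiding "
                        "onetime fixes, and begin master data rationalization"),
        3: ("Proactive", "Institute upstream data quality processes such as "
                         "auto classification at the point of data entry"),
        4: ("Managed", "Measure data quality continually and analyze for "
                       "impact on business operations"),
        5: ("Optimized", "Operate real-time data monitoring and enrichment "
                         "to enable real-time business reporting"),
    }

    def next_step(level):
        target = min(level + 1, 5)
        name, action = MATURITY_MODEL[target]
        return f"To reach level {target} ({name}): {action}"

    print(next_step(1))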
Our data quality maturity model comprises five levels of maturity, from awareness to optimization. Advancing from one level to the next delivers real value to the organization and its partners. This model should serve as a guide to aid organizations in understanding the necessary changes and associated impact on the organization, its business processes, its information technology infrastructure, and its applications (see Figure 8).

Level 1: Aware

These organizations live in master data chaos. They generally have some awareness that data quality problems are affecting business execution and decision making, but they have no formal initiatives to cleanse data. Individuals typically initiate data quality processes on an ad hoc basis as needs arise. A common example is that of suppliers needing to be identified for a particular commodity and efforts being focused on weeding out duplicate entries. We find that approximately 30% of Global 2000 enterprises currently fit this profile.

To move to the next level, these organizations should strive to improve internal awareness and communication about the impact of data quality and should link data quality to specific business initiatives and performance indicators. Chief financial officers and chief procurement officers are key players in driving the organization to understand that it is suffering because of bad data. This should set the stage for action.
Level 2: Reactive

Suspicion and mistrust abound at this level. Decisions and transactions are often questioned, due to suspicion or knowledge of data quality problems, and managers revert to instinct-driven decision making, rather than relying on reports. Some manual or homegrown batch cleansing is performed at a departmental or application level within the application database. At this level, data quality issues tend to most affect field or service personnel, who rely on access to correct operational data to perform their roles effectively. About 45% of enterprises fit this profile.

To avoid the organizational paralysis that accompanies thoughts of a sweeping overhaul of the company's master data, targeted data audits and process assessments should be the first order of business for these organizations. Spend data should be audited by experts who can identify remediation strategies, and business processes such as item master record maintenance should be assessed for impact on data quality. Limited-scope initiatives leveraging hosted data management solutions often deliver a quick return on investment and prove the business case for wider deployment. To exit this level permanently requires some investment and a commitment from line-of-business managers to improve data quality.
Level 3: Proactive

Moderate master data maturity can be ascribed to organizations that perceive master data as a genuine fuel for improved business performance. These organizations have incorporated data quality in the IT charter, and data cleansing is typically performed downstream by department-level IT shops or in a data warehouse by commercial data quality software. Processes include:

- Record-based batch cleansing (e.g., name/address)
- Identification
- Matching
- Weeding out duplicates
- Standardization

These processes mend data sufficiently for strategic and tactical decision making. Our research indicates that 15% to 20% of enterprises fit this profile.

To reach the next data quality echelon, these organizations should implement forms of data management policy enforcement to stem data quality problems at a business process level. In addition, they should concentrate on moving beyond the onetime repair of glaring data quality problems and simple edits to continuous monitoring and remediation of data closer to the source of input. For example, leading spend management organizations deploy automated solutions that automatically classify spend data as it is put into the system.

Level 4: Managed

Organizations in this penultimate data quality maturity level view data as a critical component of the IT portfolio. They consider data quality to be a principal IT function and one of their major responsibilities. Accordingly, data quality is regularly measured and monitored for accuracy, completeness, and integrity at an enterprise level, across systems. Data quality is concretely linked to business issues and process performance. Most cleansing and standardization functions are performed at the source (i.e., where data is generated, captured, or received), and item master record data quality monitoring is performed on an international level.

These organizations now have rigorous, yet flexible, data quality processes that make incorporating new data sources and snaring and repairing unforeseen errors straightforward, if not seamless. Data quality functions are built into major business applications, enabling confident operational decision making. Only 5% of enterprises have achieved this level of data quality-related information maturity. Evolving to the pinnacle of data quality excellence demands continued institutionalization of data quality practices.

Master Data Rationalization Is the Foundation for Leveraging Existing ERP Investment

Most IT organizations are challenged in driving continuing positive return on investment from their ERP systems. Many are consolidating their various ERP and other enterprise software systems to meet that challenge. In particular, many SAP customers facing the need to upgrade as SAP ends support of R/3 4.6c in 2006 in favor of R/3 Enterprise or mySAP ERP are using this opportunity to consolidate and upgrade.

This is the ideal time to launch a master data rationalization initiative. Indeed, an item master record format and classification scheme in SAP system #1 is typically not the same as in SAP system #2. Before the systems can be consolidated, the master data must be rationalized according to an agreed-upon format, classification scheme, and attribute definitions. Otherwise, companies risk contaminating their upgraded and consolidated ERP systems with even more bad data.

Master Data Rationalization Protects the SAP Master Data Management Investment

We also note that a large number of SAP customers are preparing to implement SAP's Master Data Management (MDM) functionality found in the NetWeaver platform. Implementing SAP MDM does not eliminate the need for master data rationalization. To the contrary, it emphasizes the need for master data rationalization, because its function is the syndication and management of the various master data objects in enterprise software systems. SAP customers should protect their investment and undertake master data rationalization before implementing MDM, to ensure that only clean master data is managed by SAP MDM.

Successful Sourcing and Procurement Initiatives Depend on Clean, Reliable Master Data

Companies implementing enterprise spend management learn very quickly that the quality of their master data holds the key to unlocking the promised value. Master data such as vendor and item master records forms the basis for all other associated spend data and business objects such as purchase orders and goods receipts. The ugly reality is that this master data exists in many systems and is often incomplete, duplicated, and wrongly classified or unclassified.

Extracting, organizing, enriching, and analyzing this data potpourri is a major challenge for any organization, but it must be done. Without clean, reliable master data, a spend management initiative will fail. Master data rationalization, that is, the process of extracting, normalizing, classifying, enriching, and staging data for analysis, is fundamental to the spend management process. Organizations should invest in processes and tools that automate the master data rationalization process to the greatest extent possible. The goal is to establish a repeatable, reliable process that enables confident spend data analysis on an ongoing basis.

Optimum Master Data Maturity Enables Real-Time Analysis and Control of Business Processes

Our research shows that the maturity of organizational master data quality practices varies greatly, from the most basic but not uncommon state of master data chaos, to the rare case of pervasive, real-time, high-quality master data. Organizations should understand where they are in the master data maturity model and chart a path to achieving an optimized level of master data quality maturity, a level where they will be able to exploit spend data on a real-time basis to drive continual improvements in supply-side processes. Key to this evolution is the implementation of automated processes for the cleansing, enrichment, and maintenance of master data.

Introduction

ERP Systems Are Indispensable to the Business Operations of Large Organizations

Enterprise software applications have become so indispensable that they have a material effect on company valuations. Over the years, we have seen companies incur charges totaling hundreds of millions of dollars because of ERP problems, miss the market with their products because of ERP problems, and watch mergers fail to deliver intended results because of ERP problems. The health and continuing welfare of a company's ERP system is clearly an issue for the CEO.

ERP systems, once a transformational investment where companies invested enormous sums without a clear understanding of the outcome, have dropped down the stack to become a true backbone of the organization. Accordingly, the focus surrounding their maintenance and economic performance has shifted from a mindset of "I'll pay whatever it takes to get it in and beat my competition" to one of "I want Six Sigma quality, and I want to minimize my operational costs," as described by META Group's IT Application Portfolio Management theory. Chief information officers not only are tasked with the responsibility for improving the performance of their ERP systems, but they also face the challenge of continuing to mine return from their ERP investment.
Figure 9 — Key Data Quality Characteristics

- Accuracy: A measure of information correctness
- Consistency: A measure of semantic standards being applied
- Completeness: A measure of gaps within a record
- Entirety: A measure of the quantity of entities or events captured versus those universally available
- Breadth: A measure of the amount of information captured about an entity or event
- Depth: A measure of the amount of entity or event history/versioning
- Precision: A measure of exactness
- Latency: A measure of how current a record is
- Scarcity: A measure of how rare an item of information is
- Redundancy: A measure of unnecessary information repetition

Source: META Group
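Two of these characteristics, completeness (gaps within a record) and redundancy (unnecessary repetition), are straightforward to score mechanically. The sketch below uses invented fields and records.

    # Scoring completeness and redundancy over a toy record set.
    FIELDS = ("part_number", "description", "unspsc", "supplier")

    def completeness(record):
        return sum(1 for f in FIELDS if record.get(f)) / len(FIELDS)

    def redundancy(records):
        keys = [r.get("part_number") for r in records]
        return 1 - len(set(keys)) / len(keys)

    records = [
        {"part_number": "75A01", "description": "hex nut",
         "unspsc": "31161700", "supplier": "Acme Fasteners"},
        {"part_number": "75A01", "description": "nut, hex"},  # gaps, duplicate
    ]
    print([completeness(r) for r in records])   # [1.0, 0.5]
    print(round(redundancy(records), 2))        # 0.5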
Executive Summary

Master Data Has a Material Impact on the Financial and Operational Health of an Organization

Business executives depend on reliable reporting of operational and financial activities to guide their decisions. The US government even mandates reliable and accurate reporting under the Sarbanes-Oxley Act (SOX). The underlying enabler to meet the demands of business executives and the government is the master data found in enterprise software systems. Master data represents the items a company buys, the products it sells, the suppliers it manages, and the customers it has. When the master data is inaccurate, out-of-date, or duplicated, business processes magnify and propagate these errors, and the company's financial and operational results are affected.

The results are profound. Shareholders lose their confidence, and market capitalization falls. Executives begin to manage by instinct rather than from facts, and results suffer. Suppliers lose faith in the collaborative processes and build in safety stock. All these scenarios are likely and have a direct effect on the financial and operational health of the enterprise.

Item Master Data Requires Specialized Attention

Customer relationship management (CRM) projects have long focused on the quality of customer master records managed by CRM systems. Item master records, on the other hand, often have no clear owner to champion the cause of clean, reliable item master data, because the data often resides in various systems and is used by different departments. However, these records require special attention, because they contain the most pervasive master data in the enterprise and form the basis for many other dependent master records and business objects such as purchase orders and pricing records.

Moreover, item master records often have hundreds of attributes that are used by various systems and business processes. It is critical that item master records be properly classified and have complete and accurate attributes, because they form the foundation for accuracy and efficiency in enterprise software systems.

Clean Item Master Data Enables a Wide Range of Business Initiatives

There are numerous business initiatives underway in an organization at any given time that are focused on cost reductions, operational efficiencies, or strategic synergies. A company's supply organization may engage in strategic sourcing or enterprise spend management, while the product management group may focus on part reuse. The merger-and-acquisition team may be evaluating potential targets based partially on synergies to be won in the consolidation of operations, supply chains, or product lines. The successful ongoing operation of such initiatives rests on reliable reporting: What do we spend? What do we buy, and from whom? What parts do products have in common? What can be substituted? When item master data is not clean, managers do not have reliable data for the reporting needed to drive these initiatives forward.
Building Master Data Rationalization Into ERP Consolidation Planning

Few organizations and systems integrators dedicate enough attention and resources to master data rationalization in their ERP consolidation planning. Successful organizations will plan far ahead of the small window in the schedule allotted to the master data load and will plan for master data rationalization with an experienced service provider. Once the data is loaded and go-live is reached, it is too late to rethink the impact of poor master data quality.
Master Data Rationalization Is a Key Component in Achieving Data Quality Maturity

Our research shows that maturity of organizational master data quality practices varies greatly, from the most basic but not uncommon state of master data chaos, to the rare case of pervasive, real-time, high-quality master data. Organizations should understand where they are in the master data maturity model and chart a path to achieving an optimized level of master data quality maturity, a level where they will be able to exploit spend data on a real-time basis to drive continual improvements in supply-side processes. Key to this evolution is the implementation of automated processes for the cleansing, enrichment, and maintenance of master data.

Bruce Hudson is a program director, Barry Wilderman is a senior vice president, and Carl Lehmann is a vice president with Enterprise Application Strategies, a META Group advisory service. For additional information on this topic or other META Group offerings, contact info@metagroup.com.