The Data Administration Newsletter (TDAN.com)
Robert S. Seiner - Publisher
THE TIME HAS COME FOR
ENTERPRISE DATA QUALITY MANAGEMENT
by Mark E. Atkins, Vality Technology, Inc.
In an increasingly competitive global economy, companies must be in the business of being informed.
Those that know the most about their customers, products, vendors, and locations will be able to formulate
the most successful strategies and tactics – for a jump-start in winning. This information has a significant
impact on the bottom line in both revenue-generating and cost administration initiatives.
Data quality is a prerequisite for being fully informed. Only accurate, complete data about customers,
products, vendors, and locations – and their relationships with each other – will yield the enterprise
intelligence needed to win. Yet, in today’s business world data moves constantly among diverse systems
– increasing data and system complexity, which can lead to degradation in data and system quality. As a
result, achieving and sustaining high data quality is a challenge.
That’s why the time has come for enterprise data quality management (EDQM). Why EDQM is now
necessary to achieve successful business results, and how it is possible, become more apparent if we
address the following questions:
- Why has the need for EDQM evolved?
- Why is EDQM important?
- What does EDQM involve?
Why Has the Need for Enterprise Data Quality Management Evolved?
Two mutually reinforcing trends – a technological revolution and a business evolution – have put a
premium on high-quality information as the key to success. The global infrastructure has been
transformed by fax machines, PCs, laptops, sophisticated networking, and the Internet with all the
handheld devices that link to it. The result is clear: as the new communications bridges and highways
expand, the volume of information increases.
In response to this new infrastructure and other technological advances, business theory and practice
have also been evolving. In the past, corporations valued three assets: their personnel, products, and
customer set. Now, they are realizing that they have a fourth asset, their information – on customers,
vendors, distributors, products, inventory, and locations. This information is now a necessity rather than a
luxury.
Where will companies find this information? They have it – in thousands, even millions, of records stored
in legacy systems – with new data coming in daily. But they need to clean and match data components
from diverse records to get more extended information. This is the valuable information – for formulating
strategies and tactics and producing business benefits – that they hope to get from the business intelligence
systems they’ve invested in. But for optimal results from these systems, they need to be loaded with
high-quality data.
Why Is EDQM Important?
What level of data quality is needed to leverage corporate information, and how and where does data
quality have an impact on business? Examining these topics reveals why data quality is an enterprise-wide
concern that demands to be managed.
A Closer Look at Data Quality
Data quality is the level of accuracy, consistency of format and data representation, and completeness that
permits matching and integration of all records that pertain to an entity, such as a customer or patient, a
product, or a location. Specifically, it requires domain integrity – ensuring each data value is in a discrete,
appropriate domain or field so that it is easily accessible to users and query tools – and data consistency –
conditioning and standardizing data so that it has a common structure.
Once there is a common data structure, you can match records from diverse systems accurately and
extensively. Matching, in turn, enables you to link all information relevant to a specific entity. Achieving the
highest data quality also includes finding non-exact matches – like John Doe and J.Q. Doe – so you can,
indeed, consolidate all relevant information about this entity and eliminate duplicates or redundancy in the
database.
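To make the conditioning step concrete, here is a minimal Python sketch that splits a free-form customer name into discrete fields with a consistent representation. The field names and parsing rules are illustrative assumptions, not a description of any particular data quality product:

```python
import re

def standardize_name(raw):
    """Condition a free-form personal name into discrete fields,
    illustrating domain integrity (each value in its own field) and
    consistent representation (uppercase, no punctuation)."""
    # Strip punctuation and collapse whitespace so variants such as
    # "John  Q. Doe" and "John Q Doe" condition to the same form.
    cleaned = re.sub(r"[^\w\s]", " ", raw)
    cleaned = re.sub(r"\s+", " ", cleaned).strip().upper()
    parts = cleaned.split(" ") if cleaned else []

    return {
        "first_name": parts[0] if parts else "",
        "middle_name": " ".join(parts[1:-1]) if len(parts) > 2 else "",
        "last_name": parts[-1] if len(parts) > 1 else "",
    }

print(standardize_name("John Q. Doe"))  # {'first_name': 'JOHN', 'middle_name': 'Q', 'last_name': 'DOE'}
print(standardize_name("J.Q. DOE"))     # {'first_name': 'J', 'middle_name': 'Q', 'last_name': 'DOE'}
```

A real conditioning engine also handles titles, suffixes, surname-first entries, and business names, but even this crude structure gives records from different systems a common shape to match on.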
This level of data quality – accurate, consistent, non-redundant data with a complete view of all
relationships to other data sets – is required for leveraging enterprise information for success. This is clear
if we look at its impact.
The Impact of Data Quality on Revenue Generation and Cost Administration
Within a company, two kinds of activities impact the bottom line: revenue-generating initiatives and cost-administering initiatives. Data quality is critical to the success of each.
For example, data quality is essential for CRM and other revenue-generating systems to deliver their
expected ROI. If you don’t know all the relationships that particular customers have with your products,
your marketing profiling and segmentation will be flawed, and your marketing campaigns will fall short of
their potential. Consider this: John Doe has purchased three products. But in the three line-of-business
systems containing his records, he is listed as John, Jon, and J.Q., and his Social Security number
contains transposed digits in one system. These inconsistencies and errors – or a lack of data quality – could prevent matching of these records.
Why is matching so important? If you think you have three customers, each with one product, rather than
one premium customer owning three products, you may miss the opportunity to offer special deals to
strengthen his loyalty and add revenues. What happens if you make these kinds of errors with ten percent
of your customers? Missed sales and even lost customers.
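As a rough illustration of how those three records could still be consolidated, the sketch below uses a simple string-similarity score to cluster records whose names or Social Security numbers nearly match. The records, field names, and thresholds are invented for illustration; a production matching engine uses far richer, field-aware comparisons:

```python
from difflib import SequenceMatcher

# Hypothetical records from three line-of-business systems; the third
# record carries the transposed Social Security digits.
records = [
    {"system": "orders",  "name": "JOHN DOE", "ssn": "123456789"},
    {"system": "billing", "name": "JON DOE",  "ssn": "123456789"},
    {"system": "support", "name": "J Q DOE",  "ssn": "123465789"},
]

def similar(a, b):
    """Crude similarity score in [0, 1] based on matching subsequences."""
    return SequenceMatcher(None, a, b).ratio()

def same_customer(r1, r2):
    # Treat two records as one customer when either the SSNs are nearly
    # identical or the names are highly similar (thresholds are guesses).
    return similar(r1["ssn"], r2["ssn"]) >= 0.85 or similar(r1["name"], r2["name"]) >= 0.8

# Naive single-pass clustering: add each record to the first cluster it matches.
clusters = []
for rec in records:
    for cluster in clusters:
        if any(same_customer(rec, member) for member in cluster):
            cluster.append(rec)
            break
    else:
        clusters.append([rec])

print(f"{len(clusters)} distinct customer(s) found")  # prints: 1 distinct customer(s) found
```

With the records linked, the three purchases roll up to one premium customer instead of three single-product customers.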
Data quality has a similar impact on administering costs, for example, in purchasing. Cost savings here can
have a major impact on managing the corporate budget and containing overall costs to ensure maximum
profitability.
For example, if you have accurate, standardized descriptions for inventory items, you can match
information from different locations and their systems – to discover costly and inefficient inventory
duplicates (listed in different systems under different product numbers) and save money by reducing
inventory. Similarly, if you have accurate, standardized vendor information across locations, you may see that
some “different” vendors that various locations buy from are actually divisions of the same company. So
you can negotiate a better price.
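A minimal sketch of how standardized descriptions surface the inventory duplicates described above: reduce each free-form item description to a normalized set of tokens and group inventory rows on that key. The item data and normalization rules are invented for illustration:

```python
import re
from collections import defaultdict

# Hypothetical stock records from two sites; the part numbers differ even
# though the first two rows describe the same item.
inventory = [
    {"site": "Dayton", "part_no": "A-1001", "descr": 'Bolt, hex, 1/2" x 2", steel'},
    {"site": "Tucson", "part_no": "778-B",  "descr": "STEEL HEX BOLT 1/2 X 2"},
    {"site": "Dayton", "part_no": "A-2040", "descr": 'Washer, flat, 1/2", zinc'},
]

def normalize_descr(descr):
    """Reduce a free-form description to a sorted bag of tokens so that
    case, punctuation, and word order no longer block a match."""
    tokens = re.sub(r"[^\w\s/]", " ", descr).upper().split()
    return " ".join(sorted(set(tokens)))

groups = defaultdict(list)
for item in inventory:
    groups[normalize_descr(item["descr"])].append(item)

for key, items in groups.items():
    if len(items) > 1:
        where = ", ".join(f"{i['site']} {i['part_no']}" for i in items)
        print(f"Possible duplicate stock ({where}): {key}")
```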
Challenges to Achieving and Sustaining Data Quality
These examples show the pervasive need for data quality. So what’s the problem? Typically, companies
have a subset of applications critical to running the business: customer sales order entry systems, financial
systems, product information systems, manufacturing or production systems. Data from these applications
moves constantly.
Information enters the enterprise through varied systems and portals. While you can train clerks to adhere
to a standard data entry format, information entering through Web forms and extranets is subject to the
user’s choices. That means your clean, standardized internal system environment is under threat of
degradation from impurities such as missing data, the “wrong” or “extra” data in a field, and misspellings and
typos; from non-standard data representation; and from duplicates.
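The kind of real-time filter implied here can be sketched as a function that checks an incoming web-form record for missing values, flags “extra” content in a field, and forces a standard representation before the record reaches internal systems. The field names and rules below are hypothetical:

```python
import re

REQUIRED_FIELDS = ("name", "email", "postal_code")  # hypothetical web-form schema

def filter_web_record(record):
    """Return a conditioned copy of a web-form record plus a list of
    problems that should be reviewed before it enters internal systems."""
    problems = []
    clean = {k: (v or "").strip() for k, v in record.items()}

    # Missing data
    for field in REQUIRED_FIELDS:
        if not clean.get(field):
            problems.append(f"missing {field}")

    # "Extra" data in a field: e.g. a phone number typed into the name box
    if re.search(r"\d{3}[-.\s]?\d{4}", clean.get("name", "")):
        problems.append("name field appears to contain a phone number")

    # Non-standard representation: force five-digit US ZIP codes
    zip_digits = re.sub(r"\D", "", clean.get("postal_code", ""))
    if len(zip_digits) >= 5:
        clean["postal_code"] = zip_digits[:5]
    elif clean.get("postal_code"):
        problems.append("postal_code not recognized")

    clean["email"] = clean.get("email", "").lower()
    return clean, problems

record, issues = filter_web_record(
    {"name": "Jon Doe 555-1234", "email": "JDOE@Example.com", "postal_code": "02110-1234"}
)
print(record)  # name kept as typed, email lowercased, postal_code standardized to '02110'
print(issues)  # ['name field appears to contain a phone number']
```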
Data also moves within the corporation, especially as companies extract data from their transaction
applications and route it in real-time to portals specific to functions and users’ roles. But different systems
and users require different data. So there’s a need for a re-engineering process – to enable the transaction
data to be integrated with data in these enterprise systems and in decision support.
Add all this together and what do you have? Enterprise information portals, “life blood” enterprise
application systems, decision support databases, extranets interacting with internal systems, and Web
sites. All these systems need data and information that conforms to enterprise standards and
demonstrates the comprehensive relationships critical to optimizing business operations. The point is
clear: it’s time for enterprise data quality management.
What Does EDQM Involve?
Enterprise data quality management (EDQM) is not a one-time, short-term fix. In the current, dynamic
business environment, everything – procedures, products, relationships, the organization itself – changes
on a short timetable. To remain competitive in the future will require quick reassessment and adjustment of
information systems to support these changes.
As a result, laying the groundwork for EDQM requires taking a top-level view: examining the entire enterprise to
understand what information systems and pools are critical, how information flows through them to support
business functions, and, consequently, where the enterprise needs to implement data quality procedures to
ensure optimal quality information. Once a company has this understanding, EDQM is a three-phase
process.
I: Loading systems with high-quality data. You need tools that enable you to re-engineer legacy data
before loading it into new business intelligence systems – as quickly as possible, in Internet time. In
addition, at points of uncontrolled data entry like the Web, you need real-time data quality filters to correct
data, condition it to corporate standards, and match it to internal information. Finally, at points where
transaction information is integrated into other systems, you also need filters – to standardize data for the
receiving systems and perform matching in order to link the new data with the appropriate entities in the
existing databases.
II: Maintaining the highest level of data quality in information systems. This phase has two important
components. First, you need to create and empower an internal information quality group to maintain the
quality of corporate data. Second, systems integrators need to come up with ways to perform regular data
quality audits on critical information systems and pools.
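In practice, a regular audit can start with a few simple metrics computed over each critical information pool, for example completeness and apparent duplication. The sketch below uses invented field names and records purely for illustration:

```python
from collections import Counter

def audit(records, required_fields, key_field):
    """Compute simple data quality metrics for one information pool."""
    total = len(records)
    report = {"record_count": total}

    # Completeness: share of records with every required field populated
    complete = sum(1 for r in records if all(r.get(f) for f in required_fields))
    report["completeness"] = complete / total if total else 0.0

    # Apparent duplication: extra records sharing the same key value
    keys = Counter(r.get(key_field) for r in records if r.get(key_field))
    dupes = sum(count - 1 for count in keys.values() if count > 1)
    report["duplicate_rate"] = dupes / total if total else 0.0

    return report

customers = [
    {"id": "C1", "name": "JOHN DOE", "email": "jdoe@example.com"},
    {"id": "C2", "name": "JON DOE",  "email": ""},                  # incomplete
    {"id": "C1", "name": "J Q DOE",  "email": "jdoe@example.com"},  # duplicate key
]
print(audit(customers, required_fields=["name", "email"], key_field="id"))
# record_count 3, completeness ~0.67, duplicate_rate ~0.33
```

Tracked over time, these figures tell the information quality group whether a critical system is holding the standard or degrading.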
In fact, auditing and enriching the information in systems can greatly enhance their function. With regular
data quality audits, companies will be able to accurately price mergers and acquisitions and accurately
forecast revenue potential – based on a true count of customers (in each organization and the combined
organizations) and inventories of all customer-product relationships. Enriching information adds value, too.
If field support systems have optimal data quality and have their address data integrated with spatial
information such as latitude and longitude, you can plan more efficient service routes, to cut service
delivery costs.
III: Information development – or proactively modifying and constructing information systems to
parallel the planning and implementation of new strategies. A top executive, such as the CIO, needs to
be looking at the impact of future business changes on information needs and quality – to proactively plan
and roll out systems to support new strategies.
For example, if a retail and a commercial bank merge, there will be new strategies and a new, hybrid set
of products. So certain information systems will have to be merged, modified, and consolidated. To hit the
ground running with the new plan and products would require consistency in and matching of data across
the merged enterprise – to optimally leverage resources for cross-selling all products to all customers and
get the quickest ROI.
In short, loading, maintaining, and planning for high-quality data – that’s what EDQM is all about. Right
now, we’re just scratching the surface. However, the tools to re-engineer data and ensure data quality are
available today. All that is needed is the commitment to do the top-level thinking about the enterprise’s data
quality needs. Then the practical EDQM implementation will follow: the information quality group, the
regular audits, the proactive planning for quality in future systems to support the strategic direction. The
time to begin is now – so that you meet the future of instant, ubiquitous information access prepared for success.
Mark E. Atkins is President and CEO of Vality Technology. Responsible for sales, marketing, and client
support and consulting, Mr. Atkins has been a key contributor in building the company from five employees
into a nationally recognized firm.
Mr. Atkins is now driving the next level of growth into enterprise information intelligence (EII), extending
Vality's products and services into business intelligence, ERP migration, enterprise application integration
(EAI), and e-commerce. He is also spearheading major partner initiatives, including successful reseller
agreements with IBM.
Previously, Mr. Atkins was Senior Vice President of Computer Solutions Inc., the precursor of Powersoft
Corporation, acquired by Sybase. Mr. Atkins also gained extensive business experience in both sales
management and financial management positions at companies that include Honeywell Information
Systems; Service Bureau Corporation, a former IBM subsidiary; Polaroid Corporation; and BayBank (now a
part of Fleet Boston Corporation).
AUTHOR CONTACT INFO:
Mark E. Atkins
Vality Technology
100 Summer Street, 15th Floor
Boston, MA 02110
P: 617-338-0300
F: 617-338-0368
matkins@vality.com
PR REPRESENTATIVE CONTACT INFO:
Emily Ellwood
Porter Novelli Convergence Group
855 Boylston St., 8th Floor
Boston, MA 02116
P: 617-450-4300
F: 617-450-4343