
Implementing a data quality improvement plan

Many companies see their ERP investments fall short due to the lack of high-quality integrated master data.

This article originally appeared on the BeyeNETWORK.

In Part 1 of this series, data quality and customer data integration in enterprise resource planning (ERP) initiatives were explored through a series of e-mail correspondences at Wexford Widgets, a fictional company. This article continues those correspondences, addressing the complications Wexford faces in going live with its ERP system without a true data quality improvement plan in place, and exploring best practices for incorporating customer data integration (CDI) into any ERP implementation.


E-MAIL CORRESPONDENCE

To: Tony Williams, CEO
From: Bill Evans, CIO
Subject: ERP Parallel Run Issues

Tony,
This is a follow-up to our discussion this morning on the problems we’re having during our parallel run period with the new ERP application. To date, we’ve rolled the new system out to six of our 17 plants. The rollout to the remaining plants should be complete within the next four months. Until then, we will have to continue to perform double entry of new account setups and account changes in both the ERP and the appropriate legacy system. I am aware that we have not had 100% accuracy lately in this process. To improve our accuracy, I have replaced our temporary staff provider, and I have allocated additional funding to procure higher-quality temporary staff. This should have a beneficial impact on the error rates as we try to keep the old and new systems in sync.
Bill

To: Bill Evans, CIO
From: Tony Williams, CEO
Subject: Re: ERP Parallel Run Issues

Bill,
I am surprised that we let ourselves get into this situation. I hope our decision not to expand the project budget to acquire a customer data integration solution doesn’t come back to haunt us. I keep thinking that if we had one of those “customer hubs” you explained to me, we might not be faced with all of this double entry during our rollout phase. Am I correct in thinking this? Is it too late?
Tony


The focus of most ERP projects is on implementing the software package rather than on the data, and project staffing and funding reflect this bias. Many companies recognize the importance of integrated data in the ERP ecosystem only when problems are encountered during data loading, system testing or even post-production. One of the scarier aspects of this is that bad master data will not necessarily prevent an ERP system from going live: the application can be put into production with bad data, and the problems it causes will not manifest themselves until the system is in production use. In fact, many companies see their ERP investments fall short of delivering on the package vendor’s promises because they lack high-quality, integrated master data.

In Wexford’s case, the ERP rollout was conducted in stages, with the system going live at different times at different manufacturing facilities. During this parallel run period, Wexford needed to keep data in sync between the new ERP application and the appropriate legacy application. The company elected to do this via double entry, hiring temporary staff to re-enter customer information from one system into the other. This is a common approach to supporting parallel runs, but it is error-prone and expensive. For many companies, however, it is the only way to ensure that customer information entered or updated in one system is accurately reflected in the other.

A customer data integration solution can provide a customer hub, which serves as a central integration point for customer data across applications. When customer master file data is created or updated in one system, the customer hub processes those updates and notifies other systems of the new/updated customer information. This process can be seamless and automated, removing the potential for human error created by the dual-entry process.
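To make the idea concrete, here is a minimal sketch of how a customer hub might propagate master-data changes to subscribing applications. The system names, record layout and callback mechanism are illustrative assumptions, not Wexford’s (or any vendor’s) actual implementation; a production hub would add matching, survivorship rules, error handling and durable messaging.

```python
# Minimal sketch of a customer hub propagating master-data changes.
# System names ("ERP", "LEGACY_PLANT_EAST") and the record layout are
# hypothetical, for illustration only.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class CustomerRecord:
    customer_id: str
    name: str
    address: str


class CustomerHub:
    """Central integration point: each create/update is applied once here,
    then pushed to every subscribed application instead of being re-keyed
    by hand into each system."""

    def __init__(self) -> None:
        self._golden_records: Dict[str, CustomerRecord] = {}
        self._subscribers: List[Callable[[CustomerRecord], None]] = []

    def subscribe(self, callback: Callable[[CustomerRecord], None]) -> None:
        self._subscribers.append(callback)

    def upsert(self, record: CustomerRecord) -> None:
        # Store (or overwrite) the golden record, then notify all systems.
        self._golden_records[record.customer_id] = record
        for notify in self._subscribers:
            notify(record)


if __name__ == "__main__":
    hub = CustomerHub()
    hub.subscribe(lambda r: print(f"ERP updated: {r.customer_id} -> {r.name}"))
    hub.subscribe(lambda r: print(f"LEGACY_PLANT_EAST updated: {r.customer_id} -> {r.name}"))

    # A single entry replaces the dual-entry step performed by temporary staff.
    hub.upsert(CustomerRecord("C-1001", "Acme Fabrication", "12 Mill Rd, Dayton OH"))
```

In this sketch, one update to the hub fans out to every subscribed system, which is what removes the human error inherent in re-keying the same change into each application.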


E-MAIL CORRESPONDENCE

To: Bill Evans, CIO
From: Tony Williams, CEO
Subject: Data Quality Improvements

Bill,
Congratulations on the success of our customer data integration program. It truly has helped us improve the quality of our data, and has helped us realize the full potential of our investment in the new ERP system. While I wish we had built this in from the start of the ERP program, I have to commend you and your team for recognizing that CDI was the missing component in our program, and acting quickly to implement a solution.

I’m sure I speak for many in our organization when I say that I truly thought that we already had excellent quality data. We’ve had automated systems for nearly 40 years, and our data quality has always been sufficient for those systems. I realize now that by integrating our business processes with the ERP application, we brought to light all sorts of inaccuracies and inadequacies that have long existed in our data. While there’s no magic bullet, I do feel that our CDI initiative has given us a tremendous understanding of our customers and a real improvement in the quality of our data. Again, on behalf of Wexford, thank you.
Tony

To: Tony Williams, CEO
From: Bill Evans, CIO
Subject: Re: Data Quality Improvements

Tony,
You’re welcome.
Bill

One of the traps that many organizations fall into is believing that because their existing data has been good enough to run the business so far, it will be of sufficient quality for the ERP system. Each legacy system had a limited scope, and the fact that it contained data that was redundant or inconsistent with data in other systems did not affect the operation of that single system. Had Wexford looked at these systems as a whole, it would have seen the data issues; but such issues can be very difficult to spot when the data is locked up in individual applications. By linking its business processes through a comprehensive ERP application, Wexford shined a bright light on these latent data issues. By implementing a CDI solution, organizations like Wexford can allow their ERP investments to realize their full potential.
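As a small, hypothetical illustration (not drawn from Wexford’s systems), the kind of cross-system comparison below is what exposes redundant and inconsistent customer data that looked perfectly acceptable inside each application on its own:

```python
# Hypothetical example: the same customers extracted from two legacy systems.
# Each system looked fine in isolation; comparing them exposes the conflicts.

billing_system = {
    "C-1001": {"name": "Acme Fabrication", "address": "12 Mill Rd, Dayton OH"},
    "C-1002": {"name": "Borden Metals",    "address": "88 Harbor St, Erie PA"},
}

shipping_system = {
    "C-1001": {"name": "ACME Fabrication Inc.", "address": "12 Mill Road, Dayton, OH"},
    "C-1002": {"name": "Borden Metals",         "address": "90 Harbor St, Erie PA"},
}

# Report every field that disagrees between the two systems.
for cust_id in sorted(set(billing_system) & set(shipping_system)):
    for fld in ("name", "address"):
        a, b = billing_system[cust_id][fld], shipping_system[cust_id][fld]
        if a != b:
            print(f"{cust_id}: {fld} conflict -> billing='{a}' vs shipping='{b}'")
```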

On a similar note, organizations also need to take their own goals into account when choosing a CDI solution. It is not simply a matter of picking one product over another, or of which will yield the most accurate matches. Processing performance of the matching engine is a key issue, and how quickly the system can process records directly affects the value you receive. CDI implementations use different styles of matching engines. Deterministic matching applies exact or rule-based comparisons, which makes it more clear-cut and faster to process. Probabilistic matching, by contrast, tolerates more variance between records and can surface additional likely matches. Most organizations end up blending both approaches in an implementation, but when considering a CDI vendor, I would advise that its solution be tested and evaluated against any unique and future requirements the organization may face. The sketch that follows illustrates the difference between the two matching styles.
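As a rough illustration only (a simplified sketch, not any vendor’s actual matching engine), deterministic matching might compare normalized keys exactly, while probabilistic matching scores similarity and accepts record pairs above a threshold. The field names, normalization rules and 0.85 threshold here are assumptions made for the example:

```python
# Simplified contrast between deterministic and probabilistic matching.
# Field names, normalization rules and the 0.85 threshold are illustrative
# assumptions, not a real CDI engine's configuration.

from difflib import SequenceMatcher


def normalize(value: str) -> str:
    """Uppercase and strip punctuation/whitespace so keys compare cleanly."""
    return "".join(ch for ch in value.upper() if ch.isalnum())


def deterministic_match(a: dict, b: dict) -> bool:
    # Exact match on normalized name + postal code: fast and clear-cut,
    # but it misses records with typos or formatting differences.
    return (normalize(a["name"]) == normalize(b["name"])
            and normalize(a["postal"]) == normalize(b["postal"]))


def probabilistic_match(a: dict, b: dict, threshold: float = 0.85) -> bool:
    # Score string similarity and accept pairs above the threshold:
    # slower, but tolerant of variance between records.
    score = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
    return score >= threshold


if __name__ == "__main__":
    r1 = {"name": "Acme Fabrication", "postal": "45402"}
    r2 = {"name": "ACME Fabrication Inc", "postal": "45402"}

    print("deterministic:", deterministic_match(r1, r2))   # False: names differ exactly
    print("probabilistic:", probabilistic_match(r1, r2))   # True: names are similar enough
```

The trade-off the sketch captures is the one described above: the deterministic rule is cheap and unambiguous but rejects the near-duplicate, while the probabilistic score accepts it at the cost of extra computation and the need to tune a threshold.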

When evaluating a CDI solution, test with as much of your own data as possible and compare the quality of the matching results. You will also need to make sure your employees can easily navigate the system, so an easy-to-search interface is a must. Organizations must plan their CDI solutions with the future in mind, and choosing the right matching engine is only one part of the overall solution. As long as organizations consider their current and future requirements, as well as the speed of their matching processes, they are more likely to partner with a CDI vendor that best suits their needs and delivers the ROI they expect.

  • David Gleason

    David brings 18 years of experience in the consulting field to his role, with expertise in data architecture, data management, business intelligence and data warehousing, as well as working with several Fortune 500 and mid-market manufacturing and high tech companies on their most pressing business intelligence-related needs. David served as VP of Marketing for Platinum Technology’s Global Consulting Organization, a position he was appointed to after serving as the company’s VP of Business Solutions for several years. He has also held leadership positions at Intelligent Solutions and Reltech Group. David holds a Bachelor of Science degree in Computer Science from The College of William & Mary.

 
