
The art of data mapping

Going through a corporate merger, or upgrading to a new application? Either way, it is inevitable that you'll have to convert, map, and validate many of your organization's data types.
Written by Brenda Hinthorn, Contributor
When organizations combine data from different systems, matching the data in one system with the equivalent data in another is usually less than straightforward. For instance, some development teams concentrate so many resources on trapping data exceptions that they fail to test “normal” data like names and addresses. In one data-mapping project I was involved with, failing to check “normal” data (in this case, ZIP codes) led to some serious problems. Corporate acquisitions and mergers raise the stakes on a data-mapping effort, since the slightest error can have a tremendous negative impact on a large number of customers.
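
“Normal” fields like ZIP codes are cheap to sanity-check programmatically, which makes skipping them all the more costly. Here is a minimal sketch in Python of the kind of check that was missed; the field names and sample records are hypothetical, not taken from any actual system:

```python
import re

# Five-digit U.S. ZIP, with an optional ZIP+4 extension.
ZIP_PATTERN = re.compile(r"^\d{5}(-\d{4})?$")

def find_bad_zip_codes(records):
    """Return (record_id, zip) pairs that fail the format check."""
    bad = []
    for record in records:
        zip_code = (record.get("zip") or "").strip()
        if not ZIP_PATTERN.match(zip_code):
            bad.append((record["id"], zip_code))
    return bad

# Hypothetical sample data: the four-digit value may be a ZIP that
# lost its leading zero somewhere in an extract.
customers = [
    {"id": 1001, "zip": "61820"},
    {"id": 1002, "zip": "6182"},
    {"id": 1003, "zip": "61820-0430"},
]

for record_id, zip_code in find_bad_zip_codes(customers):
    print(f"record {record_id}: suspect ZIP {zip_code!r}")
```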

You begin the process with careful analysis and gather as much information as possible about the data used by the systems. One method for data collection is the questionnaire. (I will drill down on the how-tos of an effective questionnaire in a future article.) Once you’ve collected the data, it’s time to validate the information and put it to use. I learned the data validation lesson the hard way as a systems analyst mapping data from disparate systems after the merger of two financial institutions. Here are some suggestions for handling the validation, conversion, and verification of data.

Working with the results
You should start by doing a comparison to see whether the information gathered during your analysis corresponds to the data in the files. You might uncover discrepancies that indicate serious problems. For example, during the merger of these two financial institutions, information provided in the deposit application questionnaire indicated that a large number of customers normally received combined account statements. Since the development team did not have a programmatic way to convert the data relationships, we weren’t sure how to continue mailing combined statements after the integration of the disparate systems.

Analysis of the input files did not support the information in answers to the questionnaire. The data-mapping and integration team had numerous phone conversations and exchanged many e-mails in an effort to determine what was missing from the files that would allow us to locate these relationships and continue producing combined account statements. Nothing worked. Finally, I asked the business analyst to describe in detail the combined statement process. She explained that during statement processing, the organization's staff looked for customer statements with the same name and address and stuffed these statements into one envelope. It was no longer a mystery as to why we couldn’t find the relationships needed to produce combined statements—they didn’t exist. Not only that, but the conversation also revealed that the “large” number of customers turned out to be 93 out of 5,200!
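
Had the organization wanted to keep the practice, the same name-and-address matching the staff did by hand could have been scripted. Below is a minimal sketch, assuming each statement record arrives as a dictionary with name and address fields; the sample records are invented for illustration:

```python
from collections import defaultdict

def find_combined_candidates(statements):
    """Group statement records that share a normalized name and address,
    mimicking the manual same-name-same-address envelope stuffing."""
    groups = defaultdict(list)
    for stmt in statements:
        key = (stmt["name"].strip().upper(), stmt["address"].strip().upper())
        groups[key].append(stmt["account"])
    # Only groups holding more than one account would be combined.
    return {key: accts for key, accts in groups.items() if len(accts) > 1}

# Hypothetical records; real matching would also normalize abbreviations.
statements = [
    {"account": "100-1", "name": "Pat Lee", "address": "12 Elm St"},
    {"account": "100-2", "name": "PAT LEE", "address": "12 elm st"},
    {"account": "200-1", "name": "Sam Roy", "address": "9 Oak Ave"},
]
print(find_combined_candidates(statements))
# {('PAT LEE', '12 ELM ST'): ['100-1', '100-2']}
```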

Discrepancies between the questionnaire and the files can be added to the file maintenance task list created at the beginning of the project. Remember, this list of exceptions will most likely change from one set of files to the next. Once the most critical file exceptions have been identified, you must decide how to correct them, either manually or programmatically. Factors to consider in a manual fix include staffing resources, reliability of staff, and the impact on the conversion if the maintenance is not completed on schedule. The cost of a programmatic solution is mainly a function of programming time.
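
Since volume often decides the manual-versus-programmatic question, it helps to count each exception type before committing either way. A rough sketch along those lines, with hypothetical validation rules and field names:

```python
def tally_exceptions(records, rules):
    """Count how many records fail each validation rule, so the team can
    judge which fixes justify programming time versus manual maintenance."""
    counts = {name: 0 for name in rules}
    for record in records:
        for name, passes in rules.items():
            if not passes(record):
                counts[name] += 1
    return counts

# Hypothetical rules: each returns True when a record is acceptable.
rules = {
    "missing_tax_id": lambda r: bool(r.get("tax_id")),
    "blank_branch_code": lambda r: bool(r.get("branch")),
}

accounts = [
    {"tax_id": "123-45-6789", "branch": "07"},
    {"tax_id": "", "branch": "07"},
    {"tax_id": "987-65-4321", "branch": ""},
]

for rule_name, failures in tally_exceptions(accounts, rules).items():
    print(f"{rule_name}: {failures} of {len(accounts)} records")
```

A handful of failures can usually be handled by hand; thousands tip the decision toward code.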

Putting it in motion
Once you’ve validated the information, the next step is to write the programming specifications for each application. Depending upon how you decide to handle the file exceptions, custom code may be necessary to complete the data mapping. Keep track of custom code specifications and make sure they make their way into the base programming. Proper testing of all data should make this effort a nonevent.
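
One lightweight way to keep custom code specifications from getting lost is to register acquisition-specific transforms right next to the base field mapping, so both live in one place. The sketch below uses assumed source and target field names and does not reflect any actual system:

```python
# Base field mapping from hypothetical source fields to target fields.
BASE_MAPPING = {
    "ACCT_NO": "account_number",
    "CUST_NM": "customer_name",
}

# Acquisition-specific custom code, keyed by target field. Registering it
# here makes it hard for a custom spec to miss the base programming.
CUSTOM_TRANSFORMS = {
    "customer_name": lambda value: value.strip().title(),
}

def map_record(source):
    target = {}
    for src_field, dst_field in BASE_MAPPING.items():
        value = source.get(src_field, "")
        transform = CUSTOM_TRANSFORMS.get(dst_field)
        target[dst_field] = transform(value) if transform else value
    return target

print(map_record({"ACCT_NO": "0042", "CUST_NM": "  JANE DOE "}))
# {'account_number': '0042', 'customer_name': 'Jane Doe'}
```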

Complicated data may be too cumbersome to write into specs, but translation tables will usually help make this process more manageable. In one acquisition, we had to convert more than 200 combinations of account officer initials. We used translation tables to house this data outside the data-mapping document. We needed 12 pages of tables, but since this information was a key part of testing, the tables benefited both the programming and testing staff.
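
A translation table can be as simple as a two-column file loaded at conversion time. A minimal sketch, assuming a CSV of old officer initials mapped to new officer codes; the initials and codes here are invented:

```python
import csv
import io

# In practice this table would live in its own file alongside the
# data-mapping document; it is inlined here so the sketch is
# self-contained. Initials and officer codes are hypothetical.
TABLE_CSV = """old_initials,new_officer_code
JQP,OFF-017
MRB,OFF-023
TKL,OFF-102
"""

def load_translation_table(text):
    reader = csv.DictReader(io.StringIO(text))
    return {row["old_initials"]: row["new_officer_code"] for row in reader}

TABLE = load_translation_table(TABLE_CSV)

def translate_officer(initials):
    # Unmapped initials go onto the exception list rather than
    # converting silently.
    if initials not in TABLE:
        raise KeyError(f"no translation for officer initials {initials!r}")
    return TABLE[initials]

print(translate_officer("MRB"))  # OFF-023
```

Failing loudly on unmapped initials keeps surprises on the exception list instead of letting bad values slip into the new system.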

Testing the data
Verification, or testing, of data means having a clear understanding of what the data looked like before, and what it should look like after, programming, custom code, and file maintenance are completed. During the acquisition of one small community bank, no date of last customer contact was consistently kept or updated for CD or IRA accounts. Since most financial institutions use this date to track account dormancy, the decision was made to substitute the conversion date, to prevent accounts from being flagged as dormant because of inaccurate file dates. During testing, it would therefore be critical to confirm that the original data was changed to the appropriate new date during the system conversion.
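
Both the fix and its verification fit in a few lines. A sketch with a hypothetical conversion date, account types, and field names:

```python
from datetime import date

CONVERSION_DATE = date(2002, 6, 30)  # hypothetical conversion date

def reset_last_contact(account):
    """Substitute the conversion date for the unreliable last-contact
    date on CD and IRA accounts."""
    if account["type"] in ("CD", "IRA"):
        account["last_contact"] = CONVERSION_DATE
    return account

def verify_last_contact(accounts):
    """Post-conversion test: every CD/IRA record carries the new date."""
    for account in accounts:
        if account["type"] in ("CD", "IRA"):
            assert account["last_contact"] == CONVERSION_DATE, account

converted = [reset_last_contact(a) for a in [
    {"id": 1, "type": "CD", "last_contact": None},
    {"id": 2, "type": "IRA", "last_contact": date(1998, 1, 5)},
    {"id": 3, "type": "DDA", "last_contact": date(2001, 3, 2)},
]]
verify_last_contact(converted)
print("last-contact dates verified")
```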

End users should play an integral part in developing test scripts. Ideally, the users should participate in testing, but their level of involvement may depend upon current management philosophy. It is well worth the effort to include users in the testing process, because the success (or perception of success) of the conversion increases tremendously when the users are involved.

Working with the users will also give the analysts a better understanding of what test scripts to write. During the first few acquisitions I was involved in, we had ongoing problems with postconversion errors on all deposit applications. We started bringing the users into the testing process earlier, gave them access to the test region for online viewing, and provided them with reports from the acquired bank and customized reports from the input files. After we implemented this process, conversion errors dropped by 95 percent.

Another part of testing involves balancing the amount of data coming in on the files with what ends up on the system. Some data may be legitimately “lost” or dropped, and it should be easy to account for if before-and-after balancing is part of testing. For example, most financial institutions allow CD accounts to transfer interest payments into another checking or savings account. During processing, interest-transfer relationship data would routinely drop because the transfer account number was no longer open or valid. In those cases, the difference in before-and-after totals is legitimate.
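
Balancing reduces to one equation: records in should equal records out plus records legitimately dropped. A minimal sketch, using invented interest-transfer records:

```python
def check_balance(records_in, records_out, records_dropped):
    """Confirm that records in equals records out plus records
    legitimately dropped, so any remaining gap is a real problem."""
    ins, outs, dropped = len(records_in), len(records_out), len(records_dropped)
    assert ins == outs + dropped, (
        f"unexplained difference: {ins} in, {outs} out, {dropped} dropped"
    )
    print(f"balanced: {ins} in = {outs} out + {dropped} dropped")

# Hypothetical interest-transfer relationships; CHK-0 is a closed
# account, so its relationship is expected to drop during processing.
transfers_in = [{"cd": "CD-1", "to": "CHK-9"}, {"cd": "CD-2", "to": "CHK-0"}]
dropped = [t for t in transfers_in if t["to"] == "CHK-0"]
converted = [t for t in transfers_in if t["to"] != "CHK-0"]
check_balance(transfers_in, converted, dropped)
```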

Sharing the testing results with the end users and your key contacts should always be the final step of the testing process. This step helps both groups understand what was tested, gives them the chance to sign off on the results, and offers the opportunity to request additional testing.
