Legacy Data Quality – Morgan Stanley

Project Background

Our client, a top-five investment bank, required a reference data system providing real-time control of data quality across all of its legacy master databases. The bank approached PaceMetrics for a solution to its data governance problems: enforcing rules-based data policies across 13 million entities.

Key concerns for the client included:

  • Data quality: The bank needed visibility into data quality from external authoritative sources, so that exceptions could be repaired before bad data reached downstream users.
  • Turnaround time: Data repairs had to be made more quickly and easily than under the existing process, which could be carried out only by experienced database administrators.
  • No interruptions: The bank wanted a system that would not interfere with the current master database, since the many (100+) existing consuming systems could not afford any disruption.
  • Self-sufficiency: The data management team required the ability to manage the data quality management system themselves.

Solution

Assure, PaceMetrics’ enterprise data management platform, was implemented to manage reference data quality, displaying quality status across many data types in a unified dashboard. Assure provided the following features:

  • Continuous monitoring of data quality, with alerts to the Data Management team whenever bad data is detected
  • The ability for the Data Management team to make data repairs or ad hoc data changes rapidly through the user interface, within the controls of a rules-based environment
  • A traffic-light view that identifies problems, their origin and their root cause in real time, so the team can make changes and instantly cleanse the data
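The rules-based monitoring and traffic-light view described above can be sketched in a few lines. This is an illustrative assumption only, not Assure's actual API: the `Rule` type, `traffic_light` function, thresholds, and sample records below are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical rule: a named predicate applied to each entity record.
@dataclass
class Rule:
    name: str
    check: Callable[[dict], bool]

def traffic_light(records, rules, amber_threshold=0.01, red_threshold=0.05):
    """Evaluate all rules against all records and derive a status.

    Returns a (status, failures) pair, where failures lists
    (record id, rule name) for every failed check and status is
    GREEN/AMBER/RED based on the share of failing checks.
    """
    failures = [
        (r.get("id"), rule.name)
        for r in records
        for rule in rules
        if not rule.check(r)
    ]
    total = len(records) * len(rules)
    rate = len(failures) / total if total else 0.0
    if rate >= red_threshold:
        status = "RED"
    elif rate > amber_threshold:
        status = "AMBER"
    else:
        status = "GREEN"
    return status, failures

# Illustrative rules for security reference data.
rules = [
    Rule("isin_present", lambda r: bool(r.get("isin"))),
    Rule("currency_valid", lambda r: r.get("currency") in {"USD", "EUR", "GBP"}),
]

records = [
    {"id": 1, "isin": "US0378331005", "currency": "USD"},
    {"id": 2, "isin": "", "currency": "USD"},  # fails isin_present
]

status, failures = traffic_light(records, rules)
```

In a sketch like this, each failure carries the offending record and rule, so an exception can be surfaced for repair before the data reaches downstream consumers.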

Benefits

By implementing Assure, our client is realising the following benefits:

  • Visibility into the quality of all its reference data before it is published to downstream users
  • No need for highly skilled database administrators to make ad hoc changes to the reference data, so resources are used more efficiently
  • Turnaround time for data repairs and data changes reduced from weeks to minutes
  • A single user interface for maintaining reference data that spans multiple business domains and multiple databases
  • Self-sufficiency in maintaining its own reference data model, data sources, and data quality rules and policies