Reference data is an important asset in financial firms. Due to the recent crisis in global markets, regulatory changes, and the explosion of derivative and structured products, the need for reliable market and reference data has become a central focus for financial institutions. Accurate data is the key element of any financial transaction, and faulty data is a major component of operational risk.
Reference data used in financial transactions can be classified as static or dynamic:
- Static Data: Data elements with unalterable characteristics, such as financial instrument data, indexes, legal entities/counterparties, and markets and exchanges.
- Dynamic Data: Variable data, such as closing and historical prices and corporate actions.
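The static/dynamic split above can be sketched as two record types. This is a minimal illustration only; the class and field names (ISIN, issuer, exchange, closing price) are assumptions for the example, not a standard schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)  # frozen: static attributes should not change
class StaticReferenceData:
    """Unalterable instrument attributes (illustrative fields)."""
    isin: str      # instrument identifier
    issuer: str    # legal entity / counterparty
    exchange: str  # market or exchange where the instrument trades

@dataclass
class DynamicReferenceData:
    """Variable data, e.g. a closing price for a given date."""
    isin: str
    as_of: date
    closing_price: float

# A static instrument record paired with a dynamic daily price record
bond = StaticReferenceData(isin="US0000000001", issuer="ExampleCorp", exchange="NYSE")
quote = DynamicReferenceData(isin=bond.isin, as_of=date(2024, 1, 2), closing_price=101.25)
```

Keeping the two kinds of data in separate structures mirrors how they are maintained: static records change only by exception, while dynamic records are refreshed daily or in real time.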
Reference data is stored and used across the front office, middle office, and back office systems of financial institutions. Over a transaction life cycle, reference data flows through various systems and applications, both internally and externally. Problems caused by faulty reference data continue to exist and lead to increased operational risk and cost.
To reduce data-related risks and issues and contain cost, financial institutions are looking at innovative solutions to improve data management efficiency. Centralization, standardization, and automation of data management processes are key to achieving this goal.
Financial institutions face several challenges in managing reference data:
- Poor data quality, lack of global standards, data silos, and multiple data sources, leading to inefficiency across the data governance process.
- Data duplication and redundancy across various business functions.
- Lack of data governance policies.
- Lack of standardized data definitions.
- Time consuming data source onboarding process.
- Inconsistent data, leading to poor reporting and management.
- High manual intervention in data capture and validation processes.
Poor data quality leads to increased operational risk, higher costs, and unreliable reporting. To address these issues, financial institutions should:
- Deploy a centralized reference data management system and create a data management framework.
- Create a golden copy of the reference data received from various sources within the organization, accessible to all business functions.
- Update the data daily or in real time at this single point.
- Validate data in a single place before distributing it to the relevant business functions.
- Resolve data exceptions centrally to avoid issues in downstream systems.
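The golden-copy steps above can be sketched as a small consolidation-and-validation pipeline. This is a simplified sketch under stated assumptions: the source names, the priority-based merge rule, and the validation checks are all illustrative, not a specific vendor's model.

```python
# Assumed source ranking: lower number means the more trusted feed.
SOURCE_PRIORITY = {"vendor_a": 1, "vendor_b": 2, "internal": 3}

def build_golden_copy(feeds):
    """Merge per-source records keyed by ISIN into one golden record each,
    keeping the record from the highest-priority (lowest-numbered) source."""
    golden = {}
    for source, records in feeds.items():
        for rec in records:
            key = rec["isin"]
            current = golden.get(key)
            if current is None or SOURCE_PRIORITY[source] < SOURCE_PRIORITY[current["source"]]:
                golden[key] = {**rec, "source": source}
    return golden

def validate(golden):
    """Validate once, centrally; split records into a clean set for
    distribution and an exception list for central resolution."""
    clean, exceptions = {}, []
    for key, rec in golden.items():
        if rec.get("currency") and rec.get("price", 0) > 0:
            clean[key] = rec
        else:
            exceptions.append((key, "missing currency or non-positive price"))
    return clean, exceptions

# Two hypothetical feeds reporting overlapping instruments
feeds = {
    "vendor_b": [{"isin": "US1", "price": 100.5, "currency": "USD"}],
    "vendor_a": [{"isin": "US1", "price": 100.4, "currency": "USD"},
                 {"isin": "US2", "price": -1.0, "currency": "EUR"}],
}
golden = build_golden_copy(feeds)          # US1 taken from vendor_a (higher priority)
clean, exceptions = validate(golden)       # US2 flagged as an exception, not distributed
```

The point of the sketch is the ordering: records are merged into one golden record and validated in one place, so downstream systems only ever see the clean set, and exceptions are resolved centrally rather than in each consuming application.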
Centralizing data management delivers:
- Improved process efficiency.
- Reduced operational and data management costs.
- Greater control over data quality and change management.
- Reduced turnaround time for new business needs and new regulatory requirements.
- Early detection and resolution of potential data issues.
Magic Experience And Expertise