23rd October 2018

Scrutiny on the Data Supply Chain


by Martijn Groot, VP of Product Management, Asset Control

The idea of a ‘supply chain’ is most commonly associated with manufacturing; however, the concept is now increasingly being applied to the way financial services firms manage data. While businesses across the financial services space deal with growing volumes of raw data rather than raw materials, the parallels are striking.

As with any supply chain, being able to trace materials – or data – across the whole process is essential. In the data supply chain, financial services companies need to understand and audit what happens to the data at each stage: who has looked at it, how it has been verified and what decisions have been made along the way, with a full record kept of each. Ultimately, they need traceability: the ability to track the journey of any piece of data across the supply chain and see both where it has been and where it finally ends up.
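By way of illustration, a minimal sketch of what such a traceability record might look like is shown below; the field names and actions are hypothetical rather than any prescribed schema, but the idea is simply to capture, for each data item, where it came from, who touched it, what checks it passed and what decisions were taken.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageEvent:
    """One step in a data item's journey through the supply chain."""
    timestamp: datetime
    actor: str    # user or system that touched the data
    action: str   # e.g. "sourced", "validated", "approved", "distributed"
    detail: str   # free-text note, e.g. which rule fired or why a value changed

@dataclass
class DataPointAudit:
    """Full audit trail for a single data item, from source to consumer."""
    item_id: str                      # e.g. an internal security/price identifier
    source: str                       # vendor feed or internal system it came from
    events: list[LineageEvent] = field(default_factory=list)

    def record(self, actor: str, action: str, detail: str = "") -> None:
        self.events.append(LineageEvent(datetime.now(timezone.utc), actor, action, detail))

# Example: trace a price from sourcing through validation to sign-off.
audit = DataPointAudit(item_id="BOND-12345/close-price", source="vendor_feed_a")
audit.record("feed-loader", "sourced", "received in end-of-day file")
audit.record("quality-engine", "validated", "passed staleness and tolerance checks")
audit.record("analyst.jsmith", "approved", "manual sign-off after four-eyes review")

for e in audit.events:
    print(e.timestamp.isoformat(), e.actor, e.action, "-", e.detail)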

The benefit for financial services firms at the end of this data supply chain is data that supports informed opinion, which in turn drives risk, trading and business decisions.

Bringing the data together in this way is important for many financial services firms. After all, the reality is that these businesses, even more so today than before the crisis, typically have many functional data silos in place, a problem made worse by the wave of mergers and acquisitions across the sector in recent years. Today, market risk may have its own database, as may credit risk, finance stress testing and product control. In fact, every business line may have its own data set, and each of these groups will also have its own take on data quality.

More and more financial services firms appreciate that this situation is no longer sustainable. The end-to-end process outlined above should help to counteract it, but why is this happening right now?

Regulation is certainly a key driver. In recent years, we have seen the advent of the Targeted Review of Internal Models (TRIM) and the Fundamental Review of the Trading Book (FRTB), both of which demand that a consistent data set is in place. The costs and the regulatory repercussions of failing to comply are only likely to rise over time.

Second, it is becoming increasingly costly to keep all these different silos alive. Many of them are internally developed systems whose original developers are often no longer with the business or have a completely different set of priorities, which makes for a very costly infrastructure. Finally, there is a growing consensus that if a standard data dictionary and vocabulary of terms and conditions are used within the business, and everyone has common access to the same data set, this will inevitably drive better and more informed decision-making across the business.
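To make the idea of a standard data dictionary slightly more concrete, the sketch below shows one possible shape for such a shared vocabulary; the terms, definitions and owners are invented for illustration, not a recommended standard.

# A minimal sketch of a shared data dictionary: every business line refers to
# the same agreed definition instead of maintaining its own variant.
DATA_DICTIONARY = {
    "maturity_date": {
        "definition": "Date on which the principal of the instrument is repaid.",
        "type": "date (ISO 8601)",
        "owner": "Reference Data team",
    },
    "clean_price": {
        "definition": "Quoted bond price excluding accrued interest.",
        "type": "decimal, quoted per 100 nominal",
        "owner": "Market Data team",
    },
    "issuer_rating": {
        "definition": "Long-term credit rating of the issuing entity.",
        "type": "enum",
        "allowed_values": ["AAA", "AA", "A", "BBB", "BB", "B", "CCC", "D"],
        "owner": "Credit Risk team",
    },
}

def describe(term: str) -> str:
    """Return the agreed definition of a term, or flag it as undefined."""
    entry = DATA_DICTIONARY.get(term)
    return entry["definition"] if entry else f"'{term}' is not in the shared dictionary"

print(describe("clean_price"))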


Finding a Way Forward

To address these issues and overcome the data challenges outlined above, organisations can begin by ensuring they have a 360-degree view of all the data coming into the organisation. They need to know exactly what data assets the firm holds – what is already on the shelf, what is being bought and what is being collected or created internally. In other words, they need a comprehensive view of exactly what data enters the organisation, how and when it does so, and in what shape and form.
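One simple way to picture that 360-degree view is as a firm-wide inventory of data assets, each tagged with where it comes from, how it arrives and what it covers. The sketch below is purely illustrative, with invented asset names and delivery details.

from dataclasses import dataclass

@dataclass
class DataAsset:
    """One entry in a firm-wide inventory of data coming into the organisation."""
    name: str
    origin: str     # "purchased", "collected" or "created internally"
    delivery: str   # how and when it arrives, e.g. "daily end-of-day file"
    content: str    # what the asset actually covers

inventory = [
    DataAsset("vendor_eod_prices", "purchased", "daily end-of-day file (SFTP)",
              "closing prices for listed equities and bonds"),
    DataAsset("internal_trade_marks", "created internally", "intraday export from trading system",
              "trader marks for illiquid positions"),
    DataAsset("counterparty_reference", "collected", "weekly extract from onboarding system",
              "legal entity names, identifiers and hierarchies"),
]

# A firm-wide view: what do we buy versus what do we create ourselves?
for asset in inventory:
    print(f"{asset.name:<25} {asset.origin:<20} {asset.delivery}")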

Firms therefore need to be clearer not only about what data they are collecting internally but also about what they are buying. With a better understanding of this, they can make more conscious decisions about what they need and what is redundant, and avoid a lot of ‘unnecessary noise’ when improving their data supply chain.

They also need to be able to verify the quality of the data, of course – and that effectively means putting in place a data quality framework that covers a range of dimensions, from completeness and timeliness to accuracy, consistency and traceability.
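Such a framework can be made concrete as a set of named checks, one per dimension. The sketch below illustrates three of them in simplified form; the tolerances, field names and sample values are assumptions for the purpose of the example, not a recommended rule set.

from datetime import date, timedelta

# Simplified checks for three of the dimensions mentioned above; a real framework
# would parameterise tolerances per data type and record every failure for audit.

def check_completeness(record: dict, required_fields: list[str]) -> bool:
    """Completeness: every required field is present and populated."""
    return all(record.get(f) not in (None, "") for f in required_fields)

def check_timeliness(as_of: date, max_age_days: int = 1) -> bool:
    """Timeliness: the data is no older than the agreed cut-off."""
    return date.today() - as_of <= timedelta(days=max_age_days)

def check_consistency(price: float, prev_price: float, tolerance: float = 0.15) -> bool:
    """Consistency: today's value does not move more than the tolerance versus yesterday's."""
    return prev_price > 0 and abs(price - prev_price) / prev_price <= tolerance

record = {"isin": "XS0000000000", "close_price": 101.2, "as_of": date.today()}
print("complete:  ", check_completeness(record, ["isin", "close_price", "as_of"]))
print("timely:    ", check_timeliness(record["as_of"]))
print("consistent:", check_consistency(record["close_price"], prev_price=100.8))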

To deal with all these data supply chain issues, businesses of course need the right governance structure and organisational model in place. Consultants can help here by advising on processes and procedures, ensuring, for example, that the number of individual departments independently sourcing data is reduced and that there is a clear view of what constitutes fit-for-purpose data.

The Role of Technology

Technology can, of course, play a key role in helping organisations get a better handle on their data supply chains. For most businesses, a primary requirement is good data sourcing and integration capability: systems that understand financial data products as well as the different data models and schemas used to identify instruments, issuers, taxonomies and financial product categorisations.

The chosen solutions should also be able to move quickly and easily from one set of identifiers and classification schemes to another. Organisations also need workflow and workflow-integration capabilities so that users can easily interact with the data, whether to include their own data in the integration or to check the results of the various screening rules that affect data quality.
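As a simplified illustration of what moving between identifier schemes involves, the sketch below uses a small cross-reference table (with fabricated identifiers) to translate, for example, an ISIN into the internal or vendor code another system expects.

# Illustrative cross-reference between identifier schemes for the same instrument.
# The identifiers below are fabricated for the example.
CROSS_REFERENCE = [
    {"internal_id": "SEC-000123", "isin": "US0000000001", "cusip": "000000AA1", "vendor_code": "V:ABC123"},
    {"internal_id": "SEC-000124", "isin": "DE0000000002", "cusip": None,        "vendor_code": "V:DEF456"},
]

def translate(value: str, from_scheme: str, to_scheme: str):
    """Map an identifier in one scheme to its equivalent in another, if known."""
    for row in CROSS_REFERENCE:
        if row.get(from_scheme) == value:
            return row.get(to_scheme)
    return None

# e.g. a risk system holding ISINs asks for the vendor code used by the pricing feed
print(translate("US0000000001", "isin", "vendor_code"))   # -> V:ABC123
print(translate("DE0000000002", "isin", "cusip"))         # -> None (no CUSIP held)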

Businesses also need a data reporting capability. The technology chosen to fulfil this role must be able to provide metrics on all the different data sources the organisation has bought: what benefit it has gained from them, what quality they are, what gaps remain in the data, and how far the organisation has got in making this data available to business users for ad-hoc use.
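In practice, that reporting capability comes down to a handful of summary figures per source, such as coverage, pass rates and gaps. The sketch below shows one simplified way such metrics might be derived; the source names and numbers are invented.

# Illustrative per-source metrics: coverage (how many expected items arrived),
# quality (share of received items passing checks) and gaps (items missing entirely).
usage_log = {
    "vendor_feed_a": {"expected": 5000, "received": 4950, "passed_checks": 4890},
    "vendor_feed_b": {"expected": 1200, "received": 1100, "passed_checks": 1020},
}

for source, stats in usage_log.items():
    coverage = stats["received"] / stats["expected"]
    pass_rate = stats["passed_checks"] / stats["received"]
    gaps = stats["expected"] - stats["received"]
    print(f"{source}: coverage {coverage:.1%}, pass rate {pass_rate:.1%}, {gaps} items missing")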

Beyond understanding and monitoring their supply chains and ensuring that auditing and traceability are in place, financial services businesses must also ensure that data governance and data quality checking are fully implemented. After all, to get the most from their data supply chains they must make the data itself readily available to users to browse, analyse and use in support of decision-making processes that ultimately drive business advantage and competitive edge.

