Empowering business users with mature data governance

By Martijn Groot | 14 July 2017

Rapidly evolving digital competition is shining a spotlight on the IT infrastructure of many financial institutions. It demands both a review of internal models to reduce costs and improve business agility, and external capabilities that can facilitate partnerships with the burgeoning fintech ecosystem and enhance direct customer interaction.

IT is now tasked with automating an end-to-end supply chain involving different service providers – and that means finding a way to expose data to these third parties easily yet securely. Yet, in the melee of legacy application integration and decommissioning, and the removal of line of business duplication, information consumers across the business are still constrained by data silos and proprietary data discovery methods. The creation of links to external providers likewise remains fraught with risk. Where is the data governance: the ability to manage permissions, comply with data privacy laws, adhere to content license agreements or safeguard commercially sensitive information?

Internal IT rationalisation

IT rationalisation has become a major focus for financial services firms over the past couple of years – from Deutsche Bank’s Strategy 2020, which includes modernising outdated and fragmented IT architecture and reducing the number of operating systems, hardware and software applications, to HSBC’s Simplify the Bank plan, which includes an architecture-led strategy to halve the number of applications across the whole group over a 10-year period.

This emphasis on streamlining complex infrastructure is being driven by the new competitive and regulatory landscape. The last ten years have made it very apparent that continuing with line of business data silos has become a significant risk, given both the cost of regulatory compliance, with its demands for cross-sectional reporting, and the implications for speed of business change.

As a result, a key part of this rationalisation process has been an investment in APIs to enable interoperability between applications and, hopefully, support the eradication of duplicate applications. However, while many organisations have appointed Data Stewards with a remit to determine data and application requirements across specific business functions, the siloed mentality remains due to a lack of data governance maturity. From cost reduction to business agility, the realisation of any successful application rationalisation or data supply chain improvement project will require significantly improved models for data governance. 

Exposing data

At the same time, of course, the business focus is turning increasingly outward, as organisations recognise the importance of the new financial ecosystem. In addition to rationalisation efforts, IT is now also tasked with automating an end-to-end supply chain involving different service providers.

With a need to expose data to the new fintech partners, as well as customers, many banks are putting in place their own API marketplaces through which they expose their data to selected third parties. While such changes in the retail market are being driven in the EU by the revised Payment Services Directive (PSD2), corporate products in cash, foreign exchange, liquidity and finance data will also demand new APIs.

Given this demand for openness both internally and externally, a common, cross-application taxonomy of products and services and a uniform data dictionary are clearly important. Without them, services could still be added on top of existing infrastructures, but the integration would be brittle, error-prone and unable to meet the quality levels or interaction speeds clients expect.
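To make this concrete, a shared data dictionary can be as simple as a machine-readable catalogue of agreed field names, definitions, permitted values and ownership that every application references. The sketch below is illustrative only; the field names, categories and stewardship labels are hypothetical rather than drawn from any particular standard or product.

```python
from dataclasses import dataclass, field

@dataclass
class DictionaryEntry:
    """One agreed definition of a business term, shared across applications."""
    name: str                 # canonical field name, e.g. "counterparty_lei"
    definition: str           # plain-language meaning agreed by data stewards
    data_type: str            # e.g. "string", "decimal", "date"
    allowed_values: list[str] = field(default_factory=list)  # empty means unconstrained
    owner: str = ""           # accountable data steward or business function

# Hypothetical entries for a cross-application product taxonomy
DATA_DICTIONARY = {
    "instrument_class": DictionaryEntry(
        name="instrument_class",
        definition="Top-level product category used in all downstream reporting.",
        data_type="string",
        allowed_values=["cash", "fx", "fixed_income", "equity", "derivative"],
        owner="Group Data Office",
    ),
    "counterparty_lei": DictionaryEntry(
        name="counterparty_lei",
        definition="Legal Entity Identifier of the counterparty (ISO 17442).",
        data_type="string",
        owner="Client Reference Data",
    ),
}
```

With a catalogue of this kind in place, every API, report and data mart can validate against the same definitions rather than re-deriving them within each silo.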

But this model has to go further. Consolidating data from different sources, mastering it, subjecting it to quality controls and creating a common data model is a great start, but how is that data being consumed? Requiring business users to rely on the IT team and proprietary APIs to gain access to this data makes for a steep and costly learning curve, undermines the value of the data and compromises both the IT rationalisation vision and the creation of a successful financial ecosystem.

Explore & exploit

It is essential to empower business users to explore and exploit this consistent information resource, not only to meet regulatory demands but to support business change. Replacing proprietary tools for data access and discovery with industry-standard APIs – such as a Representational State Transfer (REST) API – will simplify the integration of standard data discovery tools. In addition, the use of a standard data schema within a data mart will provide a shared understanding of terminology, definitions and values. The combination of a standard data model with a REST API will enable business users to gain access to this golden copy repository in a lightweight fashion – without reliance on the intervention of IT.
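As a minimal sketch of what that lightweight access might look like, the snippet below queries a hypothetical golden-copy endpoint over HTTP. The URL, parameters and response structure are assumptions made for illustration; a real implementation would follow the firm's own published API specification.

```python
import requests

# Hypothetical golden-copy endpoint exposed by the data-mastering platform
BASE_URL = "https://datamart.example.com/api/v1"

def fetch_instruments(instrument_class: str, as_of: str, token: str) -> list[dict]:
    """Retrieve mastered instrument records for one product class and date."""
    response = requests.get(
        f"{BASE_URL}/instruments",
        params={"instrument_class": instrument_class, "as_of": as_of},
        headers={"Authorization": f"Bearer {token}"},  # entitlement-checked token
        timeout=30,
    )
    response.raise_for_status()       # fail loudly on 4xx/5xx rather than silently
    return response.json()["items"]   # assumed response envelope: {"items": [...]}

# Example: a business user pulls the FX universe for a reporting date
fx_universe = fetch_instruments("fx", "2017-06-30", token="<user-token>")
print(len(fx_universe), "records retrieved")
```

Because the interface is plain HTTP returning a standard schema, the same call can be made from a spreadsheet add-in, a data discovery tool or a partner system without bespoke client libraries.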

Opening up a single, consistent data source to business users via standardised, self-service technologies is transformative. A simple browser-based interface that enables business users to select the required data on demand, with the addition of formatting and frequency options, effectively opens up the data asset to drive new value. Data can be accessed, integrated into other systems and/or explored via standard data discovery tools – all without complex, proprietary Java-based tooling.

Obviously this model has to be controlled – from avoiding a data deluge to maintaining confidentiality, the data cannot be left open to everyone. The ability to manage permissions for service providers, internal users and customers is essential if the organisation is to ensure compliance with data privacy laws, adherence to content license agreements and protection of commercially sensitive information. A REST API should therefore control access to specific data sets, so that data is not exposed to users who are not permitted to see it because of license constraints or data sensitivity.
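A hedged sketch of that server-side control is shown below: before any record is returned, the user's entitlements are checked against the data set's license and sensitivity tags. The entitlement model, tag names and data set names are hypothetical; actual controls would reflect the firm's own license agreements and privacy obligations.

```python
# Hypothetical entitlement check applied before a REST response is built.
# Data set tags and user entitlements are illustrative, not a real product's model.

LICENSED_DATASETS = {
    "vendor_prices": {"license": "vendor_a_internal", "sensitivity": "commercial"},
    "client_positions": {"license": None, "sensitivity": "confidential"},
    "reference_data": {"license": None, "sensitivity": "public"},
}

def is_permitted(dataset: str, user_entitlements: set[str]) -> bool:
    """Allow access only if the user holds the license and clearance the data set requires."""
    tags = LICENSED_DATASETS.get(dataset)
    if tags is None:
        return False                                   # unknown data sets are denied by default
    if tags["license"] and tags["license"] not in user_entitlements:
        return False                                   # content license not held
    if tags["sensitivity"] != "public" and tags["sensitivity"] not in user_entitlements:
        return False                                   # sensitivity clearance not held
    return True

# Example: an external fintech partner holding only public entitlements
partner = {"public"}
for name in LICENSED_DATASETS:
    print(name, "->", "allowed" if is_permitted(name, partner) else "denied")
```

Denying by default and layering license checks on top of sensitivity checks keeps the API open for legitimate self-service use while respecting content agreements.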

With the right security measures in place, information that would have taken business users weeks to access while waiting for IT can now be discovered and reported on in days. Given the increasing need for reports – both regulatory and for the data discovery that supports business change – this self-service access to trusted, standardised data is key. In addition to reducing the cost of business change, a REST API also enables simple, lightweight integration that reduces infrastructure costs and avoids the need for expensive, highly trained specialists.

Conclusion

In today’s competitive market, IT rationalisation is an imperative. Line of business silos need to be removed and operational costs reduced. At the same time, an integrated financial ecosystem is becoming vital in both retail and corporate markets. A mature data governance model that leverages new enablers, including APIs and standard data dictionaries, is key to achieving both these rationalisation and extension goals.

Only by creating a centralised data source and exploring new standardised technologies to mobilise data and empower users throughout the business will the vision of agile, simplified and competitive financial services business models be realised.