Banking, Regulation and Big Data…Too Big to Change?

By Matt Shaw, Associate Partner – Crossbridge, the financial markets consultancy

A bitter pill to swallow
In 2012 many investment banks find themselves in a situation where profits have been flat or in decline for the last two to three years, and they must now incur mandatory costs for the next two to three years to implement a raft of regulations, compliance with which will reduce their profitability yet further.  Putting issues of timing, extra-territoriality, regulators' capacity, and economic and political diversions (such as the US Presidential Elections and the Euro-crisis) to one side, it is clear that the financial services industry is entering a period of sustained reform.

Jamie Dimon, chief executive of JPMorgan Chase, estimates that Dodd-Frank alone will add an additional $400m-$600m to the firm's annual cost base.  By some estimates, the return on equity of the banking sector as a whole will fall from about 20 percent to around 12 percent, which does not provide investors with a great deal of compensation for the volatility of the capital markets.  For at least the next few years, already-depleted change budgets will mainly be spent on US, European Union and global regulatory reforms.  These variously seek to address some of the systemic stability issues which have come to light during the recent financial crises and, combined, they affect almost every area of the financial services industry – the wholesale, retail and insurance markets, from individual and corporate customers, to buy- and sell-side participants, to trading venues and infrastructure providers.

Under observation
Regulators are starting to put much more emphasis on the quality, richness, volume and transparency of the data they receive from financial institutions.  They are using this data as a proxy to enforce additional and more robust controls, and to demand increased substantiation that regulated firms are adhering to policy, providing full disclosure and complying with the law.

By mandating the use of the unique Legal Entity Identifier (LEI) on all trades, regulators are effectively forcing firms to clean up their counterparty data structures and values, and the internal linkages with the various front office businesses which enable counterparties to trade.  Over the next few years regulators seem to be moving towards a single global view of counterparties, but exactly how market participants will obtain and register their LEIs is not yet finalised.  It is another matter which sovereigns, governments, corporates and individuals will accept this scrutiny of their banking transactions and intrusion into their privacy.  Competing identity schemes are (unfortunately) being put forward by different regulators, although the LEI seems to be approaching a 'tipping point': if it is adopted by one of the major European or global regulations, it will surely become the industry standard.  The LEI could become the 'passport to trade' for market participants who fall under the regulations (which is most).
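For illustration, the 20-character LEI format proposed under ISO 17442 carries two trailing check digits computed with the ISO 7064 MOD 97-10 scheme (the same one used for IBANs).  Below is a minimal Python sketch of that check, on the assumption that this 20-character format is the one finally adopted:

```python
# Sketch of LEI check-digit handling under ISO 17442 / ISO 7064 MOD 97-10.
# Assumes the proposed 20-character alphanumeric format (18-char base +
# 2 numeric check digits); not a production validator.

def _to_digits(s: str) -> int:
    # Letters map to two-digit numbers: A=10, B=11, ..., Z=35.
    return int("".join(str(int(c, 36)) for c in s))

def lei_check_digits(base18: str) -> str:
    # Append "00", take the MOD-97 remainder, and subtract from 98.
    return f"{98 - _to_digits(base18.upper() + '00') % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    # A well-formed LEI leaves a remainder of 1 under MOD 97.
    return len(lei) == 20 and lei.isalnum() and _to_digits(lei.upper()) % 97 == 1
```

A firm could run such a check at the point of trade capture to reject malformed identifiers before they propagate into downstream books and records.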

There are similar initiatives to create global identifiers for standardised derivative products and their underlying contracts, and these too may become more feasible as regulators force more and more OTC products onto exchanges and through central counterparty clearing.  As with the LEI, we could see the formation of centralised securities and contract registries.  What will this do for financial innovation and the creation of new products?  Of course, it remains to be seen how and where the market for specialised bilateral contracts and structures (to meet a specific risk profile and market view) will persist in the new environment…but the demand will be met.

In theory at least, improved identification should lead to improved classification and linkage of transactions, counterparties and products, and this is what drives many risk and accounting provisions, controls, limits and reports.  Given enough computing power and manpower, local and global regulators (and by implication the firms they regulate) should be able to build a more detailed picture of the network of transactions and risk concentrations across different products, counterparties, industries and countries – but do regulators really expect to turn top-down surveillance into bottom-up sousveillance?
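As a toy illustration of what clean identification enables: once every transaction carries consistent identifiers, building a picture of risk concentrations reduces to grouping by the attribute of interest.  All records and field names below are hypothetical:

```python
from collections import defaultdict

# Hypothetical trade records; in practice these would carry real LEIs,
# product identifiers and classification codes.
trades = [
    {"lei": "LEI-A", "industry": "Energy", "notional": 10.0},
    {"lei": "LEI-B", "industry": "Energy", "notional": 5.0},
    {"lei": "LEI-A", "industry": "Energy", "notional": 2.5},
]

def concentrations(records, key):
    # Roll up notional exposure by any identifying attribute:
    # counterparty, industry, country, product, and so on.
    totals = defaultdict(float)
    for record in records:
        totals[record[key]] += record["notional"]
    return dict(totals)
```

The same roll-up, run by a regulator across firms rather than within one, is what turns trade-repository data into a system-wide concentration map – provided the identifiers actually line up.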

Where does it hurt?
Underpinning much of this change is data – the information which is used to identify, classify and enrich a firm's transactions and to consolidate, control and report on its books and records.  At the data level the different regulations (Dodd-Frank, EMIR/MiFID2, FATCA, Basel 2.5/3, Recovery and Resolution Planning, CASS, and IFRS to name a few) often come crashing together.  If we focus instead on the different data domains, we can see common patterns start to emerge.


Trade/Transaction

  • Near real-time (15-minute) derivatives trade reporting, utilising new 'global' identifiers, generating huge volumes of data for cross-industry 'trade repositories'.

Party/Legal Entity

  • More robust identification and more extensive classification schemes will need to be applied.
  • More detailed relationships (e.g. agent, principal) may need to be stored for more granular (fund-level) credit risk calculation and disclosure.

Internal Legal Entity Structure

  • Holding company, subsidiary, special purpose vehicle, and branch ownership structures require clarification for insolvency planning and (potential) segregation of wholesale and retail businesses.

Product

  • More extensive identification and categorisation is needed to support treasury and finance accounting changes and more sophisticated risk analysis.

Account

  • Customer and ledger accounts will need additional controls put in place to limit trading and investment activity, segregate collateral and assets, apply new taxes and withholding regimes, and report more off-balance-sheet assets.
  • Transactions and balances may need migrating (novating), and collateral netting improvements made, as OTC trades shift to central counterparties.

Book/Organisation Structure

  • Firms need to understand the impact on lines of business for insolvency and segregation planning.
  • New regulatory capital, funding and liquidity charges may also result in transaction migrations from trading to banking books in order to seek more cost-efficient booking models.

Location

  • The location of any, or all, of the above data elements may need to be known in order to comply with extra-territoriality clauses and for insolvency planning.

Figure 1 shows some of the data improvements and changes firms might be expected to make.
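Pulling those data domains together, a firm's 'golden source' trade record would need to carry at least one identifying attribute from each.  The sketch below is purely illustrative – every field name is an assumption, not a reference to any real schema:

```python
from dataclasses import dataclass

# Illustrative only: one field per data domain discussed above.
@dataclass
class TradeRecord:
    trade_id: str           # trade/transaction identifier reported to a repository
    counterparty_lei: str   # party/legal entity (LEI)
    internal_entity: str    # booking subsidiary, SPV or branch
    product_id: str         # standardised product/contract identifier
    account: str            # customer or ledger account
    book: str               # trading vs banking book / line of business
    booking_location: str   # for extra-territoriality and insolvency planning
```

The point of such a record is less the fields themselves than their linkage: each attribute must resolve unambiguously against the corresponding reference-data domain.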

Whether they have a leading-edge service-oriented messaging infrastructure or a dual-key, reconciliation-based information architecture, Crossbridge believes that firms need to review the quality, ownership and standards of their data lifecycle and the flexibility of their storage solutions, and map these onto their book of work, in order to identify where cost savings can be achieved.  We recommend a consistent approach to the delivery of data solutions, from developing business rules, to data and structural enhancements, to BAU process and capture updates, through to data remediation and migration.  Figure 2 shows some of the trade-offs to be considered.


Scope

Population reduction versus complexity of BAU process updates.

Client Focus

Impact on new and existing client lifecycle service from the group, division, line of business and platform perspectives (with regional variations).

Risk Appetite

Non-compliance risk assessment (data quality and false positives/negatives).

Data Quality

Cost-benefit analysis of data quality (fitness for purpose, not perfection).

Enterprise Solution

Buy (vendor), build (in-house/open-source) or partner (externalise).

Delivering coherent and cost-effective solutions is made difficult given that data organisations need to support project-focused 'point deliveries' with fixed compliance deadlines.  The situation is further complicated by inflexible or monolithic data capture, storage and distribution platforms, and unclear ownership amongst the many suppliers and consumers of the data.

A miraculous recovery
Some technology start-ups (Google and Facebook included) start with a data model and subsequently develop their business models on top.  Since their business models are under threat, maybe it is time for banks to develop data models to retro-fit to their already well established (but often multi-faceted, multi-vendor) business and operating models?  It is often said that there is no competitive advantage to data.  Maybe, maybe not, but a firm-wide focus on data quality can reduce operating and transaction costs, improve risk aggregation and management, and optimise balance-sheet and inventory utilisation.  More and more firms are starting to view data as an asset and understand the importance of data quality as an enabler not just for regulatory compliance, control and reporting processes, but also for integration, differentiation, reputation and profitability enhancing initiatives.

Complex financial product accounting, risk analytics and valuation models are becoming increasingly commoditised – companies like OpenGamma already offer highly modular, scalable, fault-tolerant, cloud-based (or cloud-ready) accounting and risk management platforms.  Can it be long before we see the emergence of Microsoft Azure Ledger, Google Financial Risk Analytics, or Amazon Transaction Banking?  If banks overcome their 'not built here' syndrome they could begin to work with trusted and proven platform service partners who promise to cut operating costs and 'not be evil'.  As more OTC products become standardised on exchanges, algorithmic trading and STP will be used to drive down the 'cost per trade'.  Could a 'big data strategy' steal a march on regulators and competitors and allow a firm to refocus on value-creation activities, such as trading and hedging strategies, structuring and contract origination, advisory, sales and marketing?

As we have seen above, regulators are beginning to force the issue with more prescriptive identity and classification schemes, which will result in the externalisation of firms' trade, counterparty and product information in centralised regulatory repositories.  Vendors such as Avox (a commercial subsidiary of the DTCC) are already starting to realise the value in this model as they match, merge, validate and re-distribute counterparty data to and from a number of large banks – they offer a 'Counterparty in the Cloud' service (although the matching algorithm is a little more 'fuzzy' than for iTunes).  It is not a great leap to envisage them (or a competing vendor or consortium) leveraging the standardisation afforded by the LEI and extending their business and data validation teams to provide a shared KYC and client on-boarding service.  Although anti-competition and risk concentration concerns would need to be addressed, the benefits to clients are clear; they will only have to go through the standard regulatory KYC 'passport application' process once and then negotiate legal and commercial 'visas' with each of the firms with which they wish to trade.  Maybe some aspects of the global markets' client on-boarding process will be outsourced yet further and delegated back to the governments, regulators and credit ratings agencies that mandate these 'approved' classifications, ratings and identities for counterparties and products.  If you are going to demand ID, you have to issue the passports.
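To make the 'fuzzy' matching idea concrete, here is a deliberately simple name-similarity sketch using only Python's standard library.  A real vendor pipeline would match on many more attributes (addresses, registration numbers, hierarchies) with tuned thresholds, so the names and threshold below are assumptions for illustration:

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    # Case-insensitive similarity ratio between 0.0 and 1.0.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match(candidate: str, master: list[str], threshold: float = 0.85):
    # Return the closest master-file entry, or None below the threshold.
    # The 0.85 cut-off is an illustrative assumption, not a vendor setting.
    best = max(master, key=lambda m: name_similarity(candidate, m))
    return best if name_similarity(candidate, best) >= threshold else None

master = ["Acme Holdings Ltd", "Globex Corporation"]  # hypothetical golden records
```

Anything falling below the threshold would, in practice, be routed to a data validation team for manual review rather than silently discarded – which is exactly the human-in-the-loop service the vendors described above sell.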
