January 2020

Record Year for Regulatory Fines Demands a Paradigm Shift

Fines for financial services firms that misreport regulatory data are on the rise, as regulators bid to eradicate inaccurate and out-of-date transaction reporting.

By Keith Whelan, Head of Strategic Relationship Management

A recent report by law firm RPC has found that the value of fines issued in 2019 by the Financial Conduct Authority (FCA) rocketed up by 550% when compared to 2018’s figures. This included a hefty £62.6m for misreporting financial data, with institutions being held responsible for failing to spot data discrepancies that could indicate fraud or market abuse.

According to the FCA, some of the largest firms in the world failed to provide accurate and timely transaction reporting. A Tier 1 US bank was fined £34m for transaction breaches and a Tier 1 European bank £27.6m for similar offences. A dozen other banks were also hit with charges.

Commenting on one of the fines, Mark Steward, Director of Enforcement and Market Oversight at the FCA, said: “The failings demonstrate a failure over an extended period to manage and test controls that are vitally important to the integrity of our markets.”

This lack of control is obviously a serious and pervasive problem that needs to be addressed with the right technology, people and process. But it’s not an easy problem to solve.

 

Complexity at all levels

Accurate regulatory reporting is hard because of the myriad of different systems and data formats in place at most institutions. Most organisations also operate in multiple jurisdictions, each with its own rules, regulations and ever-evolving requirements, so complexity that was hard enough to manage to begin with quickly multiplies.

Getting the right data together in the right place from a range of internal and external sources, with little in the way of data standards, is a common challenge. Ensuring the accuracy and completeness of data (in other words, its integrity) is not easy. Once the data is in place, transforming and enriching it to satisfy the demands of the final reports is a headache, and one that traditionally takes a lot of manual effort.
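The core of such a control is a reconciliation: comparing what was booked internally against what was actually reported. The sketch below is a minimal Python illustration, not Duco's implementation, and the field names (trade IDs, notional, currency) are hypothetical.

```python
# Minimal sketch of a transaction reconciliation: match internal trade
# records against a trade repository extract and flag breaks.
# All identifiers and field names here are illustrative assumptions.

internal = {
    "T-1001": {"notional": 5_000_000, "currency": "GBP"},
    "T-1002": {"notional": 2_500_000, "currency": "EUR"},
    "T-1003": {"notional": 750_000, "currency": "USD"},
}

repository = {
    "T-1001": {"notional": 5_000_000, "currency": "GBP"},
    "T-1002": {"notional": 2_400_000, "currency": "EUR"},  # notional mismatch
    # T-1003 absent: the trade never reached the repository
}

def reconcile(internal, repository):
    """Return a list of (trade_id, reason) breaks between the two sides."""
    breaks = []
    for trade_id, record in internal.items():
        reported = repository.get(trade_id)
        if reported is None:
            breaks.append((trade_id, "missing from repository"))
        elif reported != record:
            breaks.append((trade_id, "field mismatch"))
    # Trades the repository holds that were never booked internally
    for trade_id in repository.keys() - internal.keys():
        breaks.append((trade_id, "unexpected in repository"))
    return breaks

print(reconcile(internal, repository))
# [('T-1002', 'field mismatch'), ('T-1003', 'missing from repository')]
```

In practice each break would feed an exception-management workflow rather than a print statement, but the shape of the control is the same: every record must match, and every unmatched record must be explained.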

A roadblock to automating this process and putting out timely and accurate regulatory reports is often antiquated data integrity and reconciliation tools. These legacy tools are simply not up to the job of ‘managing and testing controls’ needed to ensure the integrity of regulatory reporting data, which the FCA pinpointed as the crux of the issue.

 

Legacy technology doesn’t cut it

Legacy on-premise systems were initially built to deal with one type of data – cash or equities for example – with extensive custom development work needed to handle bespoke regulatory requirements.

The systems were often developed in a pre-financial crash world. Whilst the regulatory world has changed significantly, these systems have simply not kept pace with the data demands and controls now needed to support regulatory compliance reporting.

Getting these controls in place using legacy technology takes months, sometimes years, and all the while reports could be incorrect and incomplete. It’s no coincidence that some of the recent fines were for breaches stretching back across nearly a decade.

In addition, the pace of regulatory change is high. As soon as controls are in production, a change can occur, or a new regulation with different requirements can come into force. What’s needed is a new approach to this problem, enabling firms to set up data integrity checks and reconciliations in a very short timeframe, and adapt them as requirements change.

 

Turning complexity into clarity

The back office is extremely complex with a range of different systems and data formats to navigate. Traditional systems are unable to help as they are often hampered by the narrow design focus from when they were first built, leading to a proliferation of spreadsheets for the more complex regulatory reconciliations.

To turn this complexity into clarity requires systems that are agile and flexible. Systems built to easily ingest data from a variety of disparate sources, and to standardise and normalise data ready for reporting.
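What “standardise and normalise” means in practice can be sketched simply: map each source system’s layout onto one common schema so records become directly comparable. The toy Python example below assumes two hypothetical source layouts; the real transformations would of course be far richer.

```python
# Toy sketch of normalising trade records from two source systems into
# one common schema before reconciliation. Source layouts, field names
# and conventions are hypothetical assumptions for illustration.
from datetime import datetime

def from_system_a(row):
    # System A: ISO dates, notional quoted in whole units
    return {
        "trade_id": row["id"],
        "trade_date": row["date"],  # already "YYYY-MM-DD"
        "notional": float(row["notional"]),
    }

def from_system_b(row):
    # System B: UK-style dates, notional quoted in thousands
    return {
        "trade_id": row["ref"],
        "trade_date": datetime.strptime(row["dt"], "%d/%m/%Y").strftime("%Y-%m-%d"),
        "notional": float(row["amt_k"]) * 1_000,
    }

a = from_system_a({"id": "T-1", "date": "2020-01-15", "notional": "5000000"})
b = from_system_b({"ref": "T-1", "dt": "15/01/2020", "amt_k": "5000"})
print(a == b)  # prints True: the records are now comparable field by field
```

Once every source lands in the same schema, one set of controls can run across all of them, regardless of where the data originated.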

A system like Duco, for example, enables regulatory experts and operations professionals to set up checks and controls extremely quickly, between internal systems, regulators, competent authorities, trade repositories, APAs, ARMs, other third parties, external databases and so on. You can then monitor data integrity across systems throughout the reporting lifecycle, rather than at a single point in the process.

When you look at the benefits of agile, cloud-hosted systems it is easy to see why they are a better fit for the control and regulatory frameworks required today:

  • Speed: install the system in a day, set up new controls in hours and be fully compliant in just a few weeks
  • Responsiveness: respond instantly whenever regulatory requirements shift, meaning you’re future-proof from day one
  • Ease of use: there is no steep learning curve. Operations or compliance staff can learn the system in just a few hours and build controls without coding
  • Flexibility: with just one system you can cover all jurisdictions across any regulation – MiFID II, EMIR, SFTR, Dodd Frank (to name just a few)

 

Make the shift

Large regulatory fines are on the increase, but with the latest cloud-based systems you can start to trust your data, creating regulatory controls and reconciliations quickly and accurately. These systems give you the essential back office controls that the FCA is looking for.

That’s the paradigm shift that’s needed in order to avoid regulatory scrutiny and fines, while keeping your reputation intact.

For more information, download our factsheet on ensuring EMIR and MiFID II compliance.

 
