Unleashing the power of automation: redefining Capital Markets with data


A Coalition Greenwich report released last week – "Data Automation: The Workflow Efficiency Game" – tells us, in numbers, something Xceptor has known for 20 years: technology and automation in capital markets are a game-changer. Without them, financial institutions will continue to contend with the risk and inconvenience of slow, inefficient processes and the never-ending remediation of errors caused by incomplete or untrusted data.

The market also has deep concerns about its ability to meet T+1 timelines. 20% of respondents – most of them in the C-suite or senior management – say they're not sure they'll be ready. Perhaps more concerning, 75% of respondents believe it's unlikely that the market will be prepared. These startling figures shine a bright light on the reality that capital markets firms must solve their data automation challenges if they want to meet regulatory requirements and, ultimately, remain competitive.

The report covers middle- and back-office issues in the context of data automation and workflow processing, with a few consistent themes the industry must continue to explore and address:

1. The regulatory burden is heavy and getting heavier

Meeting regulatory requirements is the single most important driver for financial institutions to adopt new technologies and embrace sophisticated data cleansing and automation processes. As regulations change, new rules bring about the need for new processes. The more these can be automated, the better.

It’s not just new regulations. Meeting existing regulations becomes more challenging as trading volumes increase, new asset classes evolve, and firms continue to search for yield. As resulting data volumes expand, particularly in unstructured data, ingesting, transforming, standardizing, and cleansing this data is often still a manual, time-intensive, and error-prone exercise with a knock-on negative impact on meeting regulatory needs.

The problem is widespread. According to Coalition Greenwich, manual data cleansing processes are used by almost two-thirds of the market for more than 10% of their data. This is not just inefficient and error-prone; it’s resource-intensive, often requiring "bodies" to be "thrown at" the problem.

Add to this the fact that as many as 30 data sources, with inconsistent data quality and formats, need to be integrated into workflow processes, and it’s clear why the current approach to data is unsustainable.

2. Technology can do it all

There's little doubt that technology can address these challenges. Market professionals know it, technology specialists acknowledge it, and regulators recognize it. This includes the use of AI, which is likely to be transformative, as long as it's subject to appropriate guardrails and governance.

Highly complex, unstructured data such as PDFs, contracts, faxes, and forms usually cannot be integrated into existing workflow architectures, and neither data solutions nor process solutions are equipped for it. Data solutions manage data but do not handle processes; process automation software cannot effectively address the data challenge; and business applications are often too narrow to cover the entire trade lifecycle.

3. The in-house vs third-party debate

Coalition Greenwich recognizes that third-party solutions capable of ingesting data in any format – standardizing, cleansing, and validating it, then outputting it to the required workflow for automated processing – can be extraordinarily powerful.

Overall, the most efficient processes appear to be those that are either entirely proprietary or entirely third-party-led, with effectiveness dropping when a "mix and match" approach is adopted. Only about half of the market uses any form of third-party solution – unfortunate, given that offline reconciliation rates and manual processing are lower at firms working with a specialist provider.

Delivering trusted data downstream reduces errors and minimizes risk, leading to a decrease in the total cost per trade. There are also opportunities for new or improved revenues through quicker client onboarding, product implementation, and improved customer service.

The time to value for all of this must be a key factor when considering solutions. Time to value for cost savings and revenue generation is more likely and quicker to implement when solutions don't have to be built from scratch. However, they must meet the unique needs of the financial services industry, including the need for scalable, auditable, and transparent solutions that can be configured to meet the specific requirements of firms and business lines.

Data transformation answers many of the challenges

On average, our clients see an 86% reduction in trade capture errors and an 86% improvement in trade handling times, and they achieve a 95% straight-through processing rate. The reliance on manual effort ("throwing bodies" at the problem) is also significantly reduced: clients log almost 2,000 fewer hours of overtime when they use our data automation solutions.

That the market has recognized the necessity of data automation is an excellent first step. Now the challenge is implementing it correctly – in a scalable, robust, and agile way that future-proofs firms and meets their evolving data requirements. Our clients are well on their way to achieving this goal.

Watch our on-demand webinar featuring Audrey Blater, author of "Data Automation: The Workflow Efficiency Game": Surviving the T+1 deadline – a guide to efficient data and automation.
