Data quality has never been more integral to efficient financial management. Access to accurate, timely and comprehensive data helps large corporates and financial institutions to reduce risks and costs, as well as increase efficiency and performance. High data quality also supports deeper insights, better forecasts and sharper decision-making. So what’s new? Over the past decade, rapid and far-reaching changes in the regulatory, technological and competitive landscape have steeply increased levels of automation and digitization in financial management processes, heightening the importance of frictionless data flows. Unsurprisingly perhaps, 63% of all large companies now have a chief data officer, a figure expected to rise to 90% by 2019, according to NewVantage Partners. Combined, the ‘mega-trends’ listed below have sharply raised the importance of data quality:
All firms need to ensure data can be integrated and enriched upstream and downstream along the value chain, carrying information and enabling insight. But specific sectors face particular challenges, which often serve to further emphasize the need for improvements in data management, governance and quality.
Challenge: Missing or incorrect market data inputs, resulting in inaccurate reporting, incomplete generation of transactions and payment instructions, and over- or under-utilization of limits due to bad valuations.

Solution: Implementation of regular, automated checks to ensure external market feeds are running correctly, including ‘sanity checks’ on rates, plus data completeness checks to ensure instrument setup contains all relevant fields. This includes analysis and automated exception reporting when market data or other mandatory information is missing or potentially incorrect.
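Checks of this kind can be sketched in a few lines. The snippet below is a minimal, illustrative example, not OmniFi’s actual implementation: it flags rates that are missing or have moved implausibly day-on-day, and instruments whose static data lacks a mandatory field. The field names and the 10% tolerance are assumptions for illustration.

```python
# Illustrative market-data sanity and completeness checks.
# Rates are assumed to arrive as {instrument: rate} dicts; the mandatory
# field list and tolerance are hypothetical, not a vendor specification.

REQUIRED_FIELDS = {"isin", "currency", "rate_type"}  # assumed mandatory fields

def check_market_data(today, yesterday, tolerance=0.10):
    """Return exceptions: missing rates and implausible day-on-day moves."""
    exceptions = []
    for instrument, prev_rate in yesterday.items():
        rate = today.get(instrument)
        if rate is None:
            exceptions.append((instrument, "missing rate"))
        elif prev_rate and abs(rate - prev_rate) / abs(prev_rate) > tolerance:
            exceptions.append((instrument, "rate moved more than tolerance"))
    return exceptions

def check_instrument_setup(instrument):
    """Return the mandatory fields missing from an instrument's static data."""
    return sorted(REQUIRED_FIELDS - instrument.keys())
```

In practice, such checks would run automatically after each feed import, with the exception list routed to the responsible user rather than printed.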
Challenge: Errors encountered in downstream systems, following transfer of month-end accounting data from the trading, treasury and investment management system to the client’s ERP system, requiring manual and extremely time-consuming resolution by staff.

Solution: Automated data completeness checks ensure transaction deal entries are correct and downstream systems are in sync. Mapping tables are deployed to export data to the client’s ERP system; when a deal is entered before the mapping tables have been updated, an exception report notifies the user.
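The mapping-table pattern described above can be sketched as follows. This is a hypothetical example, assuming each deal carries a code that must be mapped to an ERP account; the field names are illustrative, not the actual export format.

```python
# Illustrative mapping-table export with exception reporting.
# Deal and mapping structures are assumed for the example.

def export_deals(deals, mapping):
    """Split deals into exported ERP rows and an exception list of deal ids
    whose code has no entry in the mapping table yet."""
    exported, exceptions = [], []
    for deal in deals:
        account = mapping.get(deal["code"])
        if account is None:
            exceptions.append(deal["id"])  # mapping table not yet updated
        else:
            exported.append({"id": deal["id"], "erp_account": account})
    return exported, exceptions
```

The key design point is that an unmapped deal never reaches the ERP system silently: it is held back and surfaced in the exception report for the user to resolve.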
Challenge: Following a project to centralize the treasury function, the client requires daily quality checks of bank statements, used for audit purposes. With over 3,500 bank accounts, the client needs an automated solution.

Solution: A fully automated report, set up in OmniFi, quickly identifies incorrect or missing statements so staff can follow up. The report also verifies received balances, confirming that each day’s closing balance equals the following day’s opening balance, and that the difference between opening and closing balances corresponds to the day’s movements.
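The two balance checks are simple reconciliation rules, sketched below under assumed statement fields (date, opening, movement, closing); this is an illustration of the logic, not the OmniFi report itself.

```python
# Illustrative bank-statement balance checks for one account:
# (1) previous day's closing balance must equal the next day's opening balance;
# (2) opening balance plus net movement must equal the closing balance.
# Statement fields are assumed; the tolerance guards against rounding noise.

def check_statements(statements, tol=0.005):
    """Return findings for a date-ordered list of daily statements."""
    findings = []
    for prev, curr in zip(statements, statements[1:]):
        if abs(prev["closing"] - curr["opening"]) > tol:
            findings.append((curr["date"], "opening does not match prior closing"))
    for stmt in statements:
        if abs(stmt["opening"] + stmt["movement"] - stmt["closing"]) > tol:
            findings.append((stmt["date"], "movement does not reconcile"))
    return findings
```

At the client’s scale, the same rules would simply run across all 3,500+ accounts each morning, producing an exception list instead of requiring anyone to eyeball statements.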
The core processes of any organization reflect its unique mix of counterparties, systems, controls, portfolios, cash flows and legal entities. Indeed, many blue-chip corporates and large financial institutions use front-to-back, all-encompassing trading, treasury and investment management systems precisely because of their flexibility, i.e. the ability to configure their modules and functionality to meet precise, bespoke needs.
But system flexibility and configurability can lead to data quality issues and low STP rates if not deployed carefully and methodically. In today’s environment, few can afford the cost, time and risk issues that occur when transaction and data flows grind to a halt between systems. Problems commonly arise from incorrectly formatted static data used to meet regulatory reporting requirements, or from failing to include the necessary ISIN code or accounting classification when setting up a new instrument type, leading to settlement or reporting failures. However, it is possible to improve process efficiency and data completeness while still meeting existing internal policies and objectives.
Through SkySparc’s deep and extensive consulting experience and the unique process and data management capabilities of OmniFi, we can help clients to improve data quality and ensure transaction and information flows adapt efficiently to present and future challenges. Although SkySparc offers a variety of flexible, customizable service models for consulting projects and support, we typically support clients’ data quality needs in two main ways:
The future pace of change is hard to predict but is already rapid. AI and related technologies are increasingly being deployed to automate and manage back-office processes, with robots quickly and efficiently identifying and resolving exceptions. Human staff are already spending much less time on manual, routine processing tasks. They are responding to technology-generated alerts if there is a problem, but otherwise are focusing more on intuitive, value-added skills that contribute more meaningfully to strategic objectives, such as risk analysis, decision-making and relationship-building.
Individual firms will be on their own journey to this highly automated state, dictated by their own priorities. Whether reducing reliance on manual processing to achieve incremental efficiency gains or meet regulatory compliance requirements, or refining data flows as part of a larger strategic project, SkySparc has the expertise and resources to help clients achieve their data quality objectives.