Global Banking and Finance Review is an online platform offering news, analysis, and opinion on the latest trends, developments, and innovations in the banking and finance industry worldwide.

Posted By Jessica Weisman-Pitts

Posted on January 3, 2022

Why ‘intelligent data automation’ is important — and how to harness it

By Douglas Greenwell, Chief Strategy Officer, Duco 

The way a business manages its growing volumes of data can make the difference between clear insights that aid growth, and slow reporting processes that cost time and money and fall short of regulatory standards.

The accelerated momentum for improving data quality in financial services has been driven by growth in digital services, all of which rely on high-quality data, combined with the pandemic shedding light on the issues of data integrity and data reconciliation.

Laws and regulations such as BCBS 239, Sarbanes-Oxley, Basel III, SFTR, IFRS 17 and GAAP, along with new US depository regulations, are making compliance increasingly complex, requiring organisations to guarantee that appropriate systems and procedures are in place to effectively manage and control operational risk.

Despite the mission-critical reliance on good quality data, many financial services firms are still using outdated legacy systems and manual processes to manage their reconciliations. These systems are unable to handle complex and changing data formats, resulting in inconsistencies that require human intervention.
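To make the reconciliation problem concrete, here is a minimal sketch of the matching step such systems automate: comparing two feeds keyed by trade ID and flagging breaks for review. The function, field names and data are purely illustrative assumptions, not any vendor's actual API.

```python
def reconcile(internal, external, tolerance=0.01):
    """Match records from two sources by key and flag breaks.

    internal, external: dicts mapping trade ID -> amount.
    Returns (matched, breaks); breaks lists IDs that are missing
    from one feed or whose amounts differ beyond the tolerance.
    """
    matched, breaks = [], []
    for trade_id in sorted(internal.keys() | external.keys()):
        if trade_id not in external:
            breaks.append((trade_id, "missing in external feed"))
        elif trade_id not in internal:
            breaks.append((trade_id, "missing in internal feed"))
        elif abs(internal[trade_id] - external[trade_id]) > tolerance:
            breaks.append((trade_id, "amount mismatch"))
        else:
            matched.append(trade_id)
    return matched, breaks

# Illustrative feeds: an internal ledger versus a counterparty statement.
ledger = {"T1": 100.00, "T2": 250.50, "T3": 75.25}
counterparty = {"T1": 100.00, "T2": 250.75, "T4": 10.00}
matched, breaks = reconcile(ledger, counterparty)
```

In practice each break found this way must be investigated by hand, which is exactly the human intervention that grows unmanageable as data volumes and formats multiply.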

The current state of data reconciliation 

Financial services organisations are at breaking point with their current legacy systems and manual processes.

Manual processes and legacy systems are not only costing organisations significant money and man-hours, they are also causing problems with transparency, which can have costly effects in the form of regulatory fines.

Financial firms are at this breaking point because many organisations are finding that the amount and complexity of data they now handle as a business is unmanageable with their current systems and processes. And this complexity is leading to transparency issues, with the lack of a transparent, consolidated view of reconciliations being a major problem.

Concerns about data inconsistency and lineage issues are also at the front of companies' minds as they plan for future growth. Without automation, automated data lineage is not possible: instead of a holistic view of the data presented in a structured way, financial services firms are left with unstructured data silos, where teams and individuals cannot see how upstream data feeds affect them or which downstream business processes rely on their data.
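Data lineage of this kind can be pictured as a dependency graph: each feed points to the processes that consume it, and the impact of a bad feed is everything reachable downstream. A minimal sketch, with entirely hypothetical feed and process names:

```python
# Hypothetical lineage graph: each node maps to its direct consumers.
LINEAGE = {
    "trades_feed": ["position_calc", "pnl_report"],
    "position_calc": ["risk_report"],
    "pnl_report": [],
    "risk_report": [],
}

def downstream(node, graph):
    """Return every process that directly or indirectly depends on node."""
    seen = set()
    stack = [node]
    while stack:
        for child in graph.get(stack.pop(), []):
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen
```

With a graph like this, a firm can answer "if this feed breaks, which reports are affected?" in one traversal; without it, that question is answered by institutional memory, if at all.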

In fact, in a study conducted by Duco, surveying 300 heads of global reconciliation utilities, chief operating officers, heads of financial control and heads of finance transformation at large financial services organisations, more than two in five (42%) of financial services firms said they are currently struggling with poor data quality and data integrity within their organisations.

Why change is difficult

While agility and fraud prevention are big drivers of the move towards automation and away from legacy systems, often the biggest business case for change is simply cost control. When businesses assess the total cost of ownership of their technology, from data normalisation, data preparation and infrastructure to hosting, running costs and upgrades, the opportunity for technology rationalisation becomes clear.

However, despite there being a compelling argument for changing the status quo, businesses are still finding it difficult to shake things up.

According to our survey, 44% believe that reconciliation without manual processes would be too challenging due to the different types and sources of data they are dealing with. A further 42% believe that the risk of disrupting their business to improve data reconciliation is not worth the benefits of data automation.

But, while there is still some nervousness around the perceived disruption that a move to automation and machine learning will involve, the appetite to become more automated is strongly evident amongst financial services organisations.

Moving towards intelligent data automation

The pandemic, however, has provided the much-needed impetus for change, arriving just as intelligent data automation (IDA) has become commercially viable.

IDA is a data management strategy that uses no-code, cloud-based technology to automate and control all financial, operational and commercial data across an organisation — helping firms to cut costs significantly while reducing risk and improving compliance.

With its use of fully customisable, low-cost solutions that can sit alongside or on top of legacy systems, an IDA approach is key to not only successfully managing data, but to unlocking the full benefits of that data for the business.

By employing an over-arching, self-optimised level of automation, an IDA approach enables businesses to get a detailed view of data across the entire enterprise. With this level of insight, financial services organisations can better understand the performance of their operations, uncover and address weaknesses and identify new opportunities, all of which drives greater efficiency and agility across the organisation and improves regulatory reporting accuracy.

With internal and external factors pressurising firms to change, financial services organisations are beginning to look towards IDA as a tool to secure their prosperity in the long term.

Encouragingly, almost half (49%) of financial services firms surveyed say that intelligent data automation is the future, and organisations will need to embrace it to survive. Furthermore, 42% say they will investigate the use of more machine learning in 2021 for the purposes of intelligent data automation.

Covid-19 has accelerated the adoption of machine learning and data automation. We can expect this momentum to continue, driven by the benefits to the business and the end user that make IDA a game-changer for compliance, risk management and cost reduction in financial services.
