Good data quality is one of the keys to the success of any business. Advances in technology have given us the ability to collect and store vast amounts of data. For tech giants like Amazon, Google and Facebook, the collection and monetisation of this data is their core business and their whole infrastructure is geared towards it. However, most of the financial services businesses that make up our local economy pre-date the so-called information age and their business models have built up around paper-based processes. These processes may now be digital, but how the resulting data is stored and maintained can be patchy at best.
Most of the work we do is helping international trust and corporate services providers improve their business processes and get more value from their investments in technology. There are huge gains to be had from digitalising processes and using technology, but all of this requires good quality data.
Why is data quality so important?
Apart from the obvious benefits of having up-to-date, accurate information in a single place, your data is a valuable business asset and it needs to be looked after.
Many businesses will have dedicated data administration teams who are responsible for maintaining data integrity. Workflows (either manual or digital) will be used to ensure that only the data admin team can add or update certain records. In other businesses the client-facing administration teams may be responsible for maintaining the data on their own clients. Whilst there is no right or wrong way to manage data, it is important to have proper controls in place to ensure consistent data quality.
Without effective controls there is a risk of inconsistent data and/or duplicate records. This is often the case with address and contact records in client relationship management systems.
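As a simple illustration, the sketch below shows the kind of duplicate check a data admin team might automate over CRM contact records. The field names and the matching rule (a normalised name plus postcode) are hypothetical examples, not a recommendation for any particular system.

```python
# A minimal sketch of a duplicate-record check; field names are hypothetical.
from collections import defaultdict

def normalise(value: str) -> str:
    """Lower-case and strip punctuation/spaces so near-identical entries compare equal."""
    return "".join(ch for ch in value.lower() if ch.isalnum())

def find_duplicates(records: list[dict]) -> list[list[dict]]:
    """Group records sharing a normalised (name, postcode) key; groups of 2+ are likely duplicates."""
    groups = defaultdict(list)
    for rec in records:
        groups[(normalise(rec["name"]), normalise(rec["postcode"]))].append(rec)
    return [group for group in groups.values() if len(group) > 1]

contacts = [
    {"id": 1, "name": "ABC Trustees Ltd", "postcode": "JE2 3AB"},
    {"id": 2, "name": "A.B.C. Trustees Ltd.", "postcode": "je2 3ab"},
    {"id": 3, "name": "XYZ Fiduciary", "postcode": "GY1 1AA"},
]
for group in find_duplicates(contacts):
    print("Possible duplicates:", [rec["id"] for rec in group])
```

Even a crude rule like this surfaces duplicates that manual spot checks miss; real matching would also need to cope with typos and abbreviations.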
As we move to a more connected workplace the need to maintain accurate records becomes even greater. An error in one system is likely to be replicated in all connected systems. On the other hand, integrated systems mean there are fewer opportunities for data integrity gaps, because information is entered once rather than rekeyed into each system.
As stated earlier, we also collect more data, so the problem of data management is much bigger than it was, say, ten years ago. Whilst the problem is getting bigger, the benefits of looking after your data and the opportunities it creates are also greater.
Benefits of good data quality
Within financial services there are some obvious benefits that will only be realised when you are in control of your internal data quality:
Compliance with legislation
Having good data integrity and quality is essential when it comes to demonstrating compliance with relevant legislation. The specific laws that need to be adhered to are beyond the scope of this article, but the two that affect most businesses are the data protection (GDPR) laws covering personal data, and the company laws which mandate the recording and retention of various data items, including proper financial record keeping.
Whether you rely on manual paper-based records, spreadsheets or database systems, the data within them needs to be accurate and up to date, and, in the case of personal data, you need to have permission to hold it.
Regulatory Reporting
Businesses in most sectors, but particularly financial services, are required to provide data to government and regulatory bodies. The Channel Islands have been at the forefront of automating these returns with initiatives such as the JFSC MyRegistry Portal and similar online systems for tax reporting. Businesses benefit from these systems when they can connect them directly to their internal client management systems and databases. However, this relies on having accurate data.
Unfortunately I see many organisations struggling to implement these technologies because they still need a manual checking process to deal with data issues in their core systems.
Straight-through-processing (STP)
STP is used to interface transactions from counterparties (usually banks or brokers) directly into your core systems without rekeying the data. This can save significant time and increase accuracy, but will only be beneficial when your internal data is aligned with the counterparty's.
Without the necessary controls on maintaining data, such as bank account details, you will never be able to fully benefit from processes such as online payment systems, automatic reconciliation and bookkeeping. In the STP projects I have been involved in, the most time-consuming part is the initial data cleansing before connecting systems together.
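To make this concrete, here is a minimal sketch of the matching step behind an automatic reconciliation. The record shapes are hypothetical; the point is that every reference or amount that doesn't align between your records and the counterparty's becomes a manual exception.

```python
# A minimal sketch of automatic reconciliation; record shapes are hypothetical.

def reconcile(statement_lines: list[dict], ledger_entries: list[dict]):
    """Match bank statement lines to ledger entries on (reference, amount).
    Anything unmatched on either side is flagged for manual review."""
    ledger_index = {(e["reference"], e["amount"]): e for e in ledger_entries}
    matched, exceptions = [], []
    for line in statement_lines:
        entry = ledger_index.pop((line["reference"], line["amount"]), None)
        if entry is not None:
            matched.append((line, entry))
        else:
            exceptions.append(line)  # bank movement with no matching ledger entry
    unposted = list(ledger_index.values())  # ledger entries with no bank movement
    return matched, exceptions, unposted

statement = [{"reference": "INV-1001", "amount": 2500.00},
             {"reference": "INV-1002", "amount": 180.50}]
ledger = [{"reference": "INV-1001", "amount": 2500.00},
          {"reference": "INV-1003", "amount": 99.00}]
matched, exceptions, unposted = reconcile(statement, ledger)
print(f"{len(matched)} matched, {len(exceptions)} bank exceptions, {len(unposted)} unposted")
```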
Client portals
Your customers are probably demanding online access to your business, whether to buy products, consume services or receive reports. Gone are the days when we were happy to receive quarterly statements; we want up-to-date information now, at our convenience, not the bank's!
Implementing portal technology can reduce the time spent responding to client queries. However, unless your data is accurate you will be putting your business reputation at significant risk by connecting your clients directly to their data. Would you be happy to give your clients online access to relevant data in your back office systems? Of course, once you have taken this step to provide data through a portal and developed the processes to ensure data quality, your other internal processes will also benefit.
Outsourcing
When you have accurate data it opens up new possibilities for improving efficiency and lowering costs. Outsourcing non-core activities to lower-cost jurisdictions is only effective when your service provider has access to accurate and well-controlled source data. Many outsourcing projects fail to deliver the predicted benefits because of the additional time spent dealing with data issues. My golden rule: never outsource a problem. Fix your internal data issues before handing the processes to a service provider.
APIs and RPA
Technology advances may have created these data problems by allowing us to collect, store and process more and more data, but technology also provides the solution.
Manual data checking processes can be replaced by automated data integrity and exception reports. Using your technology effectively can also remove many inherent inefficiencies within your internal processes. This is definitely the case with modern IT systems that use APIs (Application Programming Interfaces) to join them together. However, even with non-API systems, relatively low-cost RPA (Robotic Process Automation) techniques can often be used to manipulate and transfer information between systems, which increases efficiency and reduces the chance of human error.
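As an illustration of the first point, the short sketch below produces a basic data integrity exception report. The mandatory fields and record shape are hypothetical; in practice a scheduled job would run checks like this directly against the core system's database and circulate the output to the data team.

```python
# A minimal sketch of an automated exception report; fields are hypothetical.

MANDATORY_FIELDS = ["name", "date_of_birth", "address", "tax_residency"]

def integrity_exceptions(clients: list[dict]) -> list[str]:
    """Return one human-readable exception per missing or empty mandatory field."""
    issues = []
    for client in clients:
        for field in MANDATORY_FIELDS:
            if not client.get(field):  # catches missing, empty and None values
                issues.append(f"Client {client.get('id', '?')}: missing {field}")
    return issues

clients = [
    {"id": "C001", "name": "J. Smith", "date_of_birth": "1970-01-01",
     "address": "1 High St", "tax_residency": "Jersey"},
    {"id": "C002", "name": "A. Jones", "date_of_birth": "",
     "address": "2 Low St", "tax_residency": None},
]
for issue in integrity_exceptions(clients):
    print(issue)  # e.g. schedule daily and email the report to the data team
```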
Technologies to support data quality management
Typically we see organisations using Excel to manage data quality. Whilst this isn’t wrong, there are much better tools that allow you to manage data more effectively.
One of the best we have come across is Alteryx. Alteryx is a powerful data analytics tool that enables data-driven decision making in your business. At Solitaire Consulting we are active users of Alteryx and are supported by one of the leading Alteryx partners in the UK and Europe, Jersey-based Continuum.
Using Alteryx you can connect directly into the databases of your core applications and carry out many data analysis tasks, from basic exception reporting to sophisticated data analytics and reporting.
All businesses rely on data, not only to deliver their services, but also to support decision making. No important decisions should be made without supporting data, so it goes without saying that data needs to be accurate and reliable.
Many leaders I talk to have big ambitions for the digitalisation of their businesses. They want to be able to automate processes with workflows, implement client portals and use RPA. However, their biggest challenges in achieving this often stem from incomplete and unreliable data sets spread across disparate IT systems. If you don't tackle these data issues now, it becomes much harder to implement and benefit from new technologies.