“It is a capital mistake to theorise before one has data,” according to the famous fictional detective Sherlock Holmes. But what you do with the data once you have it is equally important. Today, data analytics goes far beyond the Excel spreadsheet.
The discipline of data analysis has traditionally been dominated by manual and unstructured processes. Just as insurance has tended to lag sectors such as banking in adopting technology, risk management has trailed disciplines such as finance in adopting technology-driven analysis.
Consequently, risk evaluation has historically been basic, attempting to learn from past occurrences and failures – providing hindsight rather than insight.
That is, however, beginning to change.
Requirements and capabilities
Three factors are driving that transformation:
1) The first is simply the growing availability of data from both within and outside organisations. The increased sophistication of Enterprise Resource Planning (ERP) systems makes capturing and extracting data from source systems easier. The rise of big data sources – both public and private databases – and a mass of data from the Internet of Things (IoT) mean the volume of accessible data has never been greater. Conversely, attempting to address risks with limited data coverage will almost certainly mean some risks are overlooked.
2) The second is the increased availability of technological tools to store, structure, analyse and visualise data.
Cloud technology, for example, has eliminated capacity constraints and the need for in-house hardware to store large datasets. Meanwhile, tools such as Alteryx allow users to incorporate different types of information, including unstructured data such as text. And data visualisation and analytical tools are helping organisations make sense of it all – not only to understand the present and past but to predict the future.
One example is a proactive fraud detection system we used on a client project to predict fraudulent journal entries based on historical input data. The same AI-based system flagged outliers in the data, making reactive fraud detection easier. This not only helped our client address fraud risk but also highlighted overlooked controls that needed to be implemented.
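To make the approach concrete, the sketch below shows one common way such outlier flagging can work – an isolation forest over journal-entry features. It is a minimal illustration, not the client system itself: the library choice (scikit-learn) and the column names (amount, posting_hour, user_entry_count) are assumptions made for the example.

# Sketch: flagging anomalous journal entries with an isolation forest.
# Column names and thresholds are illustrative assumptions, not the
# client system described above.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical journal-entry features: value, time posted, user activity.
journals = pd.DataFrame({
    "amount":           [120.0, 95.5, 110.0, 98000.0, 130.2, 101.7],
    "posting_hour":     [10, 11, 14, 3, 9, 15],
    "user_entry_count": [40, 38, 45, 2, 41, 39],
})

# Fit on historical entries; contamination is the assumed outlier share.
model = IsolationForest(contamination=0.1, random_state=42)
journals["outlier"] = model.fit_predict(journals)  # -1 marks an outlier

# Entries flagged for manual review by the fraud team.
print(journals[journals["outlier"] == -1])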
Such tools proved invaluable in managing disruption to supply chains during the pandemic and its aftermath – bringing greater flexibility, resilience and visibility across suppliers. Requirements to screen potential new vendors (and their suppliers), for instance, create a massive manual workload. Linking to external databases such as sanctions lists and company filings, however, can automate the screening process to manage risks while rapidly onboarding suppliers.
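As a hedged illustration of that kind of automation, the short sketch below fuzzy-matches new vendor names against a sanctions list using only the Python standard library. All the party names are invented, and a real pipeline would query official, regularly updated sources rather than a hard-coded list.

# Sketch: screening new vendors against a sanctions list.
# All names are invented; a production system would pull from
# official sanctions databases and company filings.
from difflib import get_close_matches

sanctioned_parties = ["Example Trading GmbH", "Acme Export Ltd", "Globex Partners"]
new_vendors = ["ACME EXPORTS LIMITED", "Initech Supplies", "Globex Partner LLC"]

def screen(vendor):
    # Normalise case, then fuzzy-match to tolerate spelling variants.
    return get_close_matches(vendor.lower(),
                             [p.lower() for p in sanctioned_parties],
                             n=3, cutoff=0.7)

for vendor in new_vendors:
    hits = screen(vendor)
    if hits:
        print(f"Hold for review before onboarding: {vendor} ~ {hits}")
    else:
        print(f"No sanctions match: {vendor}")

The same pattern extends to company filings or adverse-media lists: the expensive manual step becomes an automated first pass, with humans reviewing only the flagged cases.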
3) Finally, regulation has helped drive the need for better risk analysis. Starting with banks and other financial institutions, requirements for aggregated risk data providing a single measure of total risk exposure have encouraged the use of more sophisticated tools for analysis.
Another good example of upcoming regulatory requirements is the new German (and soon EU) Supply Chain Act, which requires companies to monitor human rights and environmental risks in their supply chains. Supply chain processes are complex and generate vast amounts of data; without analysing that data using the right tools, companies are unlikely to comply with the act. In short, the risk of failing to adequately analyse risk has grown.
However, while the drivers for improved data analytics in risk management are compelling, the barriers, particularly for smaller and mid-market businesses, can still seem significant.
A clean break
Most obviously, the cost of more advanced tools, such as predictive analytics, can be substantial: Being at the cutting edge of technology requires significant investment. But less ambitious adoption of better data management, analysis and visualisation can still bring returns through the increased efficiencies of automation and improved visibility of historical risks – improving business intelligence, even if it stops short of advanced analytics.
Secondly, many businesses lack in-house expertise, and given current skills shortages, those gaps are hard to fill. At the end of 2022, research by the training company Skillsoft, surveying over 9,000 IT specialists worldwide, found that 76% of IT decision-makers faced critical skills gaps across their tech departments – an increase of 145% since 2016.
Furthermore, the recruiter Harvey Nash has warned that the growth of the global tech sector is in jeopardy due to a massive skills shortage. A recent survey found that more than two-thirds (67%) of digital leaders globally are now unable to keep pace with change because they struggle to attract the right talent.
Again, though, the problem is not insurmountable. Managed and subscription services are likely to be more efficient, effective solutions for many businesses anyway, and there is a continual flow of new providers.
Breaking the barriers
Nevertheless, the key barriers to greater uptake of data analytics for risk management are internal. The first is often a failure of businesses to use the in-house capabilities that they do possess. All companies collect at least some key performance indicators and business performance metrics, but responsibility for these is often kept within management.
Passing this on to the analytics function will not only make reporting quicker and more accurate but also free management to plan and shape an analytics strategy.
The second key barrier is data quality. A lack of data is rarely the key issue: The challenge is capturing, processing and storing it correctly. Without accepted standards for formats, content and accuracy, most of an organisation’s time, effort and expense will go into cleaning the data rather than analysing and drawing insights from it.
Ensuring acceptable data quality requires robust data governance and data management strategies. To be effective, however, it must start with education. Where information is collected manually, those responsible need to not only be taught how it should be entered but why – and the consequences for the business’s analytics if it goes wrong. Moreover, the effectiveness of the governance strategy needs to be monitored and regularly reviewed.
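To illustrate what such standards can look like in practice, the sketch below encodes a few format, content and accuracy rules as automated checks. The fields and rules are hypothetical examples, not a prescribed standard.

# Sketch: automated data-quality checks for format, content and accuracy.
# Field names and rules are hypothetical.
import pandas as pd

records = pd.DataFrame({
    "entry_id": ["JE-001", "JE-002", "JE-003", "003"],
    "amount":   [250.00, -75.50, None, 120.00],
    "currency": ["EUR", "EUR", "USD", "eur"],
})

issues = []
# Format: IDs must follow the agreed "JE-###" pattern.
bad_ids = records[~records["entry_id"].str.match(r"^JE-\d{3}$")]
issues += [f"Malformed ID: {i}" for i in bad_ids["entry_id"]]
# Content: every entry must carry an amount.
missing = records[records["amount"].isna()]
issues += [f"Missing amount: {i}" for i in missing["entry_id"]]
# Accuracy: currency codes must come from the agreed list.
bad_ccy = records[~records["currency"].isin({"EUR", "USD"})]
issues += [f"Unknown currency: {i}" for i in bad_ccy["entry_id"]]

for issue in issues:
    print(issue)

Checks like these can run before data reaches the analytics function, so effort goes into fixing entry processes rather than repeatedly cleaning their output.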
Without sufficient quality, no quantity of data will help. Or, as a quip often attributed to Mark Twain puts it: “Data is like garbage. You’d better know what you are going to do with it before you collect it.”