Rob Faulkner, at Software Solved, takes a look at data and how insurers and brokers can get more from it by using third-party specialists.
The insurance market is likely to be entering a significant hard market, possibly the most severe I have seen in my 23-year insurance career.
The signs are clear: most insurers will have been affected by the COVID-19 pandemic, with significant losses spanning most key classes of business. This comes on top of a market that had already started to harden over the last few years, and the damage the pandemic has done to both business and the economy.
A key strategy for insurance providers and brokers must be to broaden their use of advanced data analytics across their organisation, removing silos and connecting their disparate information systems.
Insurance organisations must look to develop their data strategy to improve efficiency, reduce costs, identify profitable areas and niches, and gain transparency of business performance across distribution, underwriting, risk consulting and claims. However, in order to implement this strategy, insurers and brokers need to understand what data they hold and where it is held, as in reality it is often stored in unconnected systems.
Data maturity assessments should therefore be carried out to establish where data is held. Bringing it all into one place is crucial to using it to understand whether new processes need to be put in place to mitigate risk, enhance growth and meet business objectives.
What are the challenges to becoming data mature?
Data maturity solutions allow insurers and brokers to expand their information-gathering network to anticipate and understand new insights, whilst also reaping the best possible value from the data they already hold. Ultimately, this improves an insurer's ability to collect, integrate, analyse and communicate data to create actionable insights.
With a hardening market looming it is more important than ever for insurers and brokers to find other ways to differentiate themselves. Understanding the data that they hold and how this can be utilised is the first step.
With data stored in disparate systems across multiple sources, such as distribution chains, partners, customers and suppliers, often in different formats, getting a central view and undertaking analysis is no mean feat.
Many insurers and brokers do not have the resources or the experience to know where to start in bringing disparate systems that don't communicate together into a centralised data warehouse, which is why outsourcing to specialist third-party data and software solutions providers is crucial.
Even before planning begins to bring these systems into one centralised source of truth, the desired output and end result need to be outlined and many questions need to be answered. These include: what reports does the business need to improve operational efficiency? What data currently sits outside the systems, such as undocumented business logic and processes shared between staff?
Third-party software solutions providers need to understand how systems are linked together and whether data is being mapped effectively, as data types and data keys are likely to differ across systems. Deciding where data systems will be hosted, in the cloud or on-premises, is another key issue. Even though many organisations are now moving to cloud solutions driven by remote working, leadership teams may still be sceptical about security, and third-party solutions providers are in a prime position to offer both insight and reassurance.
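To make the mapping challenge concrete, here is a minimal sketch of translating records from two source systems with different keys and formats into one common schema. The system names, field names and formats are hypothetical, not taken from any real insurance platform:

```python
from datetime import datetime

# Two hypothetical source systems storing the same policy data
# under different keys, types and date formats.
SYSTEM_A = [{"PolicyRef": "P-1001", "Premium": "1250.00", "Inception": "01/04/2024"}]
SYSTEM_B = [{"policy_id": "P-2002", "annual_premium": 980.5, "start_date": "2024-06-15"}]

# Per-system mapping: target field -> (source key, converter function).
MAPPINGS = {
    "system_a": {
        "policy_id": ("PolicyRef", str),
        "premium": ("Premium", float),
        "inception": ("Inception", lambda s: datetime.strptime(s, "%d/%m/%Y").date()),
    },
    "system_b": {
        "policy_id": ("policy_id", str),
        "premium": ("annual_premium", float),
        "inception": ("start_date", lambda s: datetime.strptime(s, "%Y-%m-%d").date()),
    },
}

def to_common_schema(record, system):
    """Translate one raw record into the shared target schema."""
    mapping = MAPPINGS[system]
    return {target: convert(record[source])
            for target, (source, convert) in mapping.items()}

# A single, unified view across both systems.
unified = ([to_common_schema(r, "system_a") for r in SYSTEM_A]
           + [to_common_schema(r, "system_b") for r in SYSTEM_B])
```

The key design point is that the mapping table, not the code, carries the per-system knowledge, so adding a third source system means adding one more entry rather than rewriting the pipeline.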
Automating tasks and reports should also be considered, along with the impact of this on ROI, HR and de-skilling. If a business doesn't have the right skill set, third-party software solutions providers can scale teams up through flexible resourcing to fill any gaps.
Why data quality is important for the insurance sector:
Technology is transforming how people communicate and conduct business. For brokers and insurers, technology can help to boost data flows, and value-adding analytical capabilities can take costs out of distribution and service. Technology is opening up new ways of engaging with risk managers, insurers and reinsurers, and providing the basis for richer discussions and sharper insights.
Data accuracy is key for the insurance industry, allowing insurers to identify homogeneous risk groups, which can inform decision making regarding claims. Missing data values can lead to an inconsistent level of detail, so standardised, consistent practices and rules for checking data quality are essential.
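As an illustration of what standardised data-quality rules can look like in practice, here is a minimal sketch that applies a fixed set of checks to claims records and reports failures per field. The field names and rules are illustrative assumptions, not a real insurer's schema:

```python
# Standardised rules: field name -> predicate that a valid value must satisfy.
RULES = {
    "claim_id":  lambda v: isinstance(v, str) and v != "",
    "amount":    lambda v: isinstance(v, (int, float)) and v >= 0,
    "loss_date": lambda v: v is not None,
}

def quality_report(records):
    """Count rule failures per field so data owners can prioritise fixes."""
    failures = {field: 0 for field in RULES}
    for record in records:
        for field, rule in RULES.items():
            value = record.get(field)  # a missing key counts as a failure
            if not rule(value):
                failures[field] += 1
    return failures

claims = [
    {"claim_id": "C1", "amount": 500.0, "loss_date": "2024-01-10"},
    {"claim_id": "",   "amount": -20,   "loss_date": None},   # fails all three rules? no: two
    {"claim_id": "C3", "amount": 130.0},                      # missing loss_date
]
print(quality_report(claims))  # {'claim_id': 1, 'amount': 1, 'loss_date': 2}
```

Because the rules live in one shared table, the same checks can be run consistently at every ingestion point, which is exactly the "standardised and consistent practices" the text calls for.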
Establishing mechanisms to improve data quality and generate better insights by collating external data sets, and identifying which external datasets could add value to the current data, should be one of the first orders of business.
While much importance is given to text analytics, computer vision techniques (image recognition, segmentation, object detection, custom object detection) applied using AI can also produce valuable insights for insurers. The challenge lies in creating a centralised repository of computer vision datasets that are useful for the insurance sector.
Existing datasets contain a lot of noise and require significant pre-processing; claims are often not integrated with risk scoring, leading to noisy ground truths for predictive risk analytics. Mechanisms for ongoing monitoring and revision of data quality are therefore needed, and this provides an opportunity for insurers and brokers to deliver even more value to each other.
The four-step approach to insurers becoming data mature:
Although many insurers and brokers are looking towards AI and automation to help them understand their data, they first need to undertake the four-step process below in order to become data mature.
ACCESS – Being able to access your data may sound like an obvious step, but it is the first and most fundamental to making your data maturity journey a successful one. Understanding where and how your data is stored, its accuracy, and how it needs to align with your business objectives will allow you to identify key projects that cement strong foundations.
ANALYSE – Once easy access to accurate data is established, you will be in a position to start looking at how best to analyse it. That may mean moving to smart visuals, or it could mean focusing on unlocking new value in your data to gain competitive advantage and an improved customer experience.
LEARN – As day-to-day analysis of your data becomes business as usual and further value has been realised, the potential for using machine learning or data science techniques to advance your insights can be considered. A project of this type relies on both the accuracy of the data and confidence that potential bias has been identified and excluded. Ultimately, this provides access to advanced levels of trends and insights.
PREDICT – This focuses on the ability to predict outcomes from your systems and data. It is often the result of successful explorations in the learn phase, and sees those models implemented in live systems as the production version of projects identified during the learn stage.
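The learn and predict stages above can be sketched in miniature. This is a deliberately simplified illustration on made-up data, not a real underwriting model: "learn" estimates an average claim count per risk band from historical policies, and "predict" applies those estimates to new business:

```python
from collections import defaultdict

# Hypothetical historical policies: risk band and observed claim count.
HISTORY = [
    {"band": "A", "claims": 0}, {"band": "A", "claims": 1},
    {"band": "B", "claims": 2}, {"band": "B", "claims": 2},
]

def learn(history):
    """LEARN: estimate the average claim count per risk band."""
    totals, counts = defaultdict(float), defaultdict(int)
    for policy in history:
        totals[policy["band"]] += policy["claims"]
        counts[policy["band"]] += 1
    return {band: totals[band] / counts[band] for band in totals}

def predict(model, policy, default=1.0):
    """PREDICT: expected claims for a new policy, with a fallback for unseen bands."""
    return model.get(policy["band"], default)

model = learn(HISTORY)
print(predict(model, {"band": "A"}))  # 0.5
print(predict(model, {"band": "C"}))  # 1.0 (unseen band falls back to the default)
```

A production system would use proper statistical or machine learning models, but the shape is the same: a learn step that is run offline against accurate, bias-checked data, and a predict step promoted to live systems once it has proven itself.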
For brokers and insurers, technology and resource from third-party data and software providers can boost data flows and provide value-adding analytical capabilities that take costs out of distribution and service. Technology that can access and enhance data is opening up new ways of engaging with risk managers, insurers and reinsurers, and providing the basis for richer discussions and sharper insights for consumers.
The insurance sector holds a plethora of data, and within it lie new opportunities that are especially important in challenging times such as these. By accessing and analysing that data, insurers and brokers can immediately make a huge impact on the level of service they are able to deliver.