Most people in the Insurtech sector have by now heard of the term API – the acronym for "application programming interface". But not everyone is aware of just how significant market-wide API development is for the future streamlining, interoperability and efficiency of a digital London Market and the wider re/insurance sector.
So, IE caught up with Nick Mair, co-founder of DQPro, to find out more. In a nutshell, Nick and his team are working with the leading Managing Agents in the London Market as well as global specialty insurance carriers to make sure that everyone can harness the highest quality data by checking that their data is right, no matter what systems they use.
Key to doing this is the use of the humble API to ensure that the myriad different platforms deployed by carriers, brokers, MGAs and beyond can talk to newer insurtech products. So, can linked APIs really help to smooth out the data quality difficulties in the London Market and beyond?
Let’s take a step back before we get stuck in and consider what exactly those data problems are and why they arise.
IE; People in the insurance sector often say data is the new oil, but just like crude oil it’s what you do with the raw material that counts. Is it really all about how you use the data?
NM; Yes, DQPro was founded partly because we could see that there was more data than ever that insurers and brokers could use. The key, however, is gathering the right data together, at each point of the insurance cycle, and making sure it’s accurate, compliant and that it supports, rather than hinders, downstream processes.
The big step forward in recent years has been Cloud-based systems, because they unlock vast amounts of new datasets in real time. In turn, you can get quotes much faster because you can price the risk accurately and, again, in real time. Rather than relying on data that was filed in the past, you’re accessing the data in the Cloud right now.
However, one of the problems with Specialty risks is the complexity of all the components involved, which can lead to manual checking or waiting for a specific part of the equation to be approved by an underwriter. It can be very challenging for firms to find, flag and fix their data issues before they incur cost, inefficiency and compliance risk in the back office.
The team at DQPro has been in deep conversation with the market about the volume of data and the different data sources firms handle since we began operating, and it became clear that there was a strong case for software that overlays all of the different systems they use and checks for data accuracy in real time.
DQPro has developed an API roadmap to further extend our reach. One of the first milestones of this roadmap was the launch of DQPro Connect earlier in 2022.
IE; So how does an API actually help?
NM; What DQPro Connect aims to do, first of all, is inform an end user in any system of any issues with the data they’re using. Secondly, we make sure the data being exchanged is compliant in various markets and meets London Market and other regulatory standards too.
The clever bit is bolting on access to third-party data systems, which obviously has a real-time benefit for insurers using older or legacy IT systems.
APIs provide the bridge between old and new tech; in our case, that means key data checks at any stage of the placing process, including comprehensive checks at the pre-bind stage.
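To make the idea concrete, a pre-bind data check of the kind described can be sketched as a simple rule engine that flags issues before a submission proceeds. This is a hypothetical illustration only, not DQPro's actual API or rule set; the field names and rules are assumptions.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch of a rule-based pre-bind data check.
# Field names and rules are illustrative assumptions.

@dataclass
class CheckResult:
    rule: str
    passed: bool
    message: str

def run_checks(record: dict,
               rules: list[tuple[str, Callable[[dict], bool], str]]) -> list[CheckResult]:
    """Apply each named rule to a submission record and collect the results."""
    results = []
    for name, rule, failure_message in rules:
        ok = rule(record)
        results.append(CheckResult(name, ok, "ok" if ok else failure_message))
    return results

PRE_BIND_RULES = [
    ("insured_name_present", lambda r: bool(r.get("insured_name")), "Insured name is missing"),
    ("currency_valid", lambda r: r.get("currency") in {"GBP", "USD", "EUR"}, "Unrecognised currency code"),
    ("limit_positive", lambda r: r.get("limit", 0) > 0, "Limit must be greater than zero"),
]

submission = {"insured_name": "Acme Marine Ltd", "currency": "USD", "limit": 0}
failures = [c for c in run_checks(submission, PRE_BIND_RULES) if not c.passed]
for f in failures:
    print(f"{f.rule}: {f.message}")
```

The point of the sketch is that checks run against the record as it stands, so a problem such as a zero limit surfaces before bind rather than in the back office.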
I think this kind of interoperability is just a first step in the direction of a truly flexible digital insurance market which at every stage is using data to gear itself towards more accurate underwriting data at source, reduced back office costs, and more profitable underwriting. The aim is to catch and identify data errors or control breaches proactively, before they impact the bottom line.
IE; Do you think the great insurtech boom is over and what we are looking at now is a refining of systems?
NM; I’ll go out on a limb and say yes, the latest insurtech cycle is complete. But I’m not just saying that to be controversial – I genuinely think that just proving you can disrupt things is no longer good enough; you have to deliver real-world results and make a day-to-day difference to the bottom line for the businesses you work with.
Which is, of course, what DQPro does, working with about 20 global specialty insurance firms. We think we do this well because we focus on the really tricky, granular problems that need solving to get the whole machine working more smoothly, and we are in constant conversation with the market to achieve this.
Ultimately, any insurtech – and we are no exception – has to be able to show that it is helping to generate real profits and/or savings each month for its customers.
IE; Do you think that AI is now making an impact too?
NM; It’s a great question, because AI has been a buzzword for so long and is clearly a data-hungry technology. Being a data quality nerd, I’d point out once again that garbage in equals garbage out when it comes to AI, machine learning or any similar initiative. Without high-quality data fundamentals, even the most sophisticated AI will underperform or draw incorrect conclusions.
And, looking at the way insurtech is being used in general across the insurance market, I would say that AI is only being used effectively about 20% of the time. We are really only about two to three years in when it comes to using AI to underpin systems. So there is a long way to go, and the AI journey really will follow the data quality journey closely – once the market works out how to standardise and properly check all data, the “input data” for AI will improve substantially, making the use case for AI a reality.
IE; Which are the areas or sectors where AI can make a big impact?
NM; In some ways, just like the insurtech boom of 2015-19, it’s not always best to think of things in terms of billion-dollar concepts, disrupting existing systems and so on. If we use AI to focus on near-term problems, then that’s a good thing. Ask yourself, ‘How can we be the bridge?’ That is what gets you to the next level.
In terms of the Commercial or Specialty sectors, you can see how AI has its limitations compared with Personal Lines, where it’s easier to scale up AI because you have similar sets of risks. But Specialty risks are inherently unique, and to price them accurately you need lots of data that is unique to that piece of infrastructure, supply chain and so on.
So where we are right now is that data is making a big difference. For example, each modern vessel at sea generates about 5,000 individual pieces of data, so you can cross-reference that information and add weather, political risk, cargo details and so on. That’s where AI can truly understand a Specialty risk – in real time.
IE; How can the insurance industry meet ESG standards using AI and data?
NM; Lloyd’s has been taking a lead on this, along with other big insurers. No new fossil fuel exploration will be covered and, more than that, London Market members can audit their clients’ supply chains, materials sources and so on quite quickly. We’ve reacted quickly from a data monitoring standpoint too – our users can run a series of checks during the quote process to make sure they are meeting the new ESG standards.
The great thing about software like DQPro and the DQPro Connect API is that it acts as a data bridge. In the past you might not have known exactly what you were insuring, or whether the companies in that risk chain were all ESG compliant. But now you can assimilate much more information from hundreds of subsidiaries in a short space of time. By using APIs and AI, Lloyd’s can also keep a log of past quotes and the data given at that time. So, in effect, you become a de facto register – a database that regulators or ESG compliance departments can call upon.
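The "de facto register" idea can be sketched as an append-only audit log: every quote is recorded together with a frozen snapshot of the data supplied at the time, so a regulator or compliance team can replay what was known when. This structure is a hypothetical illustration, not how Lloyd's or DQPro actually store quote data.

```python
import json
from datetime import datetime, timezone

# Hypothetical sketch of an append-only quote register. The schema
# (quote_id, timestamp, snapshot) is an assumption for illustration.

class QuoteAuditLog:
    def __init__(self) -> None:
        self._entries: list[dict] = []

    def record(self, quote_id: str, data: dict) -> None:
        """Append an immutable snapshot of the quote data with a UTC timestamp."""
        self._entries.append({
            "quote_id": quote_id,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "snapshot": json.dumps(data, sort_keys=True),  # frozen copy, not a live reference
        })

    def history(self, quote_id: str) -> list[dict]:
        """Return every snapshot ever recorded for a quote, oldest first."""
        return [e for e in self._entries if e["quote_id"] == quote_id]

log = QuoteAuditLog()
log.record("Q-001", {"insured": "Acme", "esg_checked": True})
log.record("Q-001", {"insured": "Acme", "esg_checked": True, "bound": True})
print(len(log.history("Q-001")))  # 2
```

Because entries are only ever appended and snapshots are serialised at write time, the log answers the audit question "what data was given at that time" without depending on the current state of any source system.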
Lloyd’s has a long history of understanding complex or unusual risks and AI will help us better understand and grow that business in the future.
IE; Interesting stuff, thanks for your time Nick.