The April theme at IE is AI and how it makes a real difference in understanding data, and acting on it, in real time. The benefits of using AI to collate and triage shared data on third-party platforms are arguably where the big gains are. Let’s explore a bit more.
This piece is by Errol Rodericks, EMEA & LATAM Product & Solutions Marketing Director, Denodo.
We have seen AI adoption in the insurance industry shift from cautious pilots to boardroom priorities. A recent industry survey of more than 400 insurance leaders describes an ‘early-majority’ phase, with nearly 80% experimenting with generative and agentic AI (GenAI) or planning to adopt it within two years – a clear sign that AI innovation is advancing. Broader surveys reinforce this trend: insurers are reallocating investment toward customer-facing GenAI use cases as controls and monitoring mature, reflecting increased confidence that AI can enhance both experience and efficiency. A multistate review in the U.S. found roughly 92% of health insurers aligning AI/ML governance with established principles, underscoring the rising bar for accountability and auditability.
For insurance companies, AI succeeds only if it strengthens the combined ratio – the sum of the loss and expense ratios that determines underwriting profitability. That ratio can be undermined by claims leakage, slow claims handling and high adjustment costs, while manual pricing and intake processes prolong underwriting cycles. At the same time, regulatory defensibility has become inseparable from operational performance: AI deployments must provide explainability, lineage and consistent policy enforcement, all of which influence how quickly new AI-driven workflows can be deployed.
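To make the arithmetic concrete, here is a minimal sketch of how the combined ratio works and why a couple of points of claims leakage matter. The figures are hypothetical, chosen purely for illustration:

```python
# Hypothetical book of business to illustrate the combined ratio.
earned_premium = 100_000_000   # premiums earned over the period
incurred_losses = 62_000_000   # claims paid plus reserves
expenses = 33_000_000          # acquisition and administrative costs

loss_ratio = incurred_losses / earned_premium   # 0.62
expense_ratio = expenses / earned_premium       # 0.33
combined_ratio = loss_ratio + expense_ratio     # 0.95 -> underwriting profit

# Two points of claims leakage push the same book toward break-even.
leakage = 0.02
print(f"Combined ratio:        {combined_ratio:.2%}")            # 95.00%
print(f"With 2pts of leakage:  {combined_ratio + leakage:.2%}")  # 97.00%
```

A combined ratio below 100% means the book is profitable on underwriting alone; above 100%, losses and expenses exceed premium, which is why even small leakage reductions flow straight to the bottom line.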
Addressing the last-mile data challenges
Modern lakehouses have become foundational, but many insurers still struggle to provide timely, trusted data. Data remains fragmented across systems, document stores and multiple clouds. Traditional approaches such as heavy extract, transform and load (ETL) pipelines and duplicated stores can slow processes and leave users struggling with conflicting versions or unclear definitions. Industry recommendations for data strategies emphasise that while lakehouses unify storage and compute, insurers also need real-time access to all sources, consistent business semantics and federated governance to avoid a last-mile bottleneck that diminishes the value of analytics and AI efforts.
Logical data management – the missing layer in insurance AI
A logical data management approach addresses these gaps by giving insurers unified access to distributed data across hybrid and multi-cloud environments, without the need to physically move it. Implemented as an abstraction layer, it delivers zero-copy, real-time data with centralised governance and a universal semantic model, enabling teams to work instantly with trusted, consistent views. This approach also reduces duplication, speeds decision-making and ensures AI models use current, controlled information. In practice, it reduces engineering overhead, improves claims and underwriting cycle times, and enhances explainability through shared definitions and complete lineage.
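As a rough sketch of what zero-copy access looks like in practice, the snippet below issues a single SQL query against a logical layer, which federates it at query time to the underlying systems. The DSN, view names and columns are hypothetical; logical data platforms typically expose standard JDBC/ODBC interfaces, so a generic client such as pyodbc can be used:

```python
import pyodbc  # generic ODBC client; the DSN below is a placeholder

# One connection to the logical layer. The join is federated at query time
# across the underlying systems (e.g. a policy admin DB and a lakehouse),
# so no data is copied into a new store first.
conn = pyodbc.connect("DSN=logical_data_layer")  # hypothetical DSN

# 'claims' and 'policies' are virtual views with shared, governed semantics;
# the names and columns are illustrative, not a real catalogue.
query = """
    SELECT c.claim_id, c.loss_amount, p.product_line, p.inception_date
    FROM   claims   AS c
    JOIN   policies AS p ON p.policy_id = c.policy_id
    WHERE  c.status = 'OPEN'
"""

for row in conn.cursor().execute(query):
    print(row.claim_id, row.loss_amount, row.product_line)
```

The point of the design is that consumers query one governed view with consistent semantics, while the layer handles source-specific dialects, pushdown and access policies behind the scenes.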
A global top-10 insurer faced this exact challenge. Despite having already built a modern lakehouse to drive self-service, the insurer found that analysts were still searching for the ‘right’ data across Oracle, SharePoint, Azure and other cloud-native sources, with manual ETL and inconsistent governance standing in the way. By introducing a logical data layer paired with Databricks, the insurer was able to establish universal semantics and federated controls, reducing delivery times for critical use cases from weeks to hours. The breakthrough came not from replicating more data, but from enabling governed, real-time access to what already existed.

Partnerships that speed data delivery
Insurers are increasingly pairing their lakehouse architectures with logical data layers that integrate seamlessly with their existing governance frameworks. This combined approach delivers unified access across hybrid and multi-cloud estates, applies consistent policies and accelerates AI and analytics, enabling teams to move confidently from experimentation to production while maintaining fine-grained control of sensitive data. Real-world adoption increasingly validates this architecture: a logical data layer delivers real-time unified access and secure, entitlement-aware data delivery without overhauling core systems.
Centralised governance for regulated AI
With heightened regulation, insurers increasingly need governance they can demonstrate, not just document. A logical data layer meets this need by centralising access controls, entitlements and full lineage across lakehouses, core systems and software-as-a-service applications. For insurers deploying GenAI, those capabilities support trustworthy retrieval, policy-aware prompts and traceable outputs, all aligned with rising regulatory expectations. The same governed, real-time data also improves business performance. Claims handlers gain a unified view of policies, losses, documents and external signals, reducing leakage and lowering adjustment costs. Underwriters and pricing models can instantly assemble data from multiple systems and datasets, improving quote speed and risk selection. These operational gains ultimately improve the combined ratio by reducing losses, controlling expenses and removing the latency created by fragmented data estates.
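To illustrate what entitlement-aware, traceable GenAI retrieval might look like, here is a minimal sketch: the user’s entitlements are checked before any document can reach the prompt, and the document IDs used are recorded for lineage. All names, attributes and policy levels here are hypothetical, not a description of any particular product:

```python
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    text: str
    region: str          # entitlement attribute, e.g. "EMEA"
    classification: str  # "public", "internal" or "restricted"

@dataclass
class User:
    user_id: str
    regions: set
    max_classification: str

LEVELS = {"public": 0, "internal": 1, "restricted": 2}

def entitled(user: User, doc: Document) -> bool:
    """Policy check applied *before* retrieval, not after generation."""
    return (doc.region in user.regions
            and LEVELS[doc.classification] <= LEVELS[user.max_classification])

def build_prompt(user: User, question: str, corpus: list) -> tuple:
    """Return a policy-aware prompt plus the doc IDs used, for lineage/audit."""
    allowed = [d for d in corpus if entitled(user, d)]
    context = "\n".join(d.text for d in allowed)
    lineage = [d.doc_id for d in allowed]
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return prompt, lineage

# Example: a restricted AMER memo never reaches an EMEA handler's prompt.
user = User("u-42", {"EMEA"}, "internal")
docs = [
    Document("d1", "EMEA motor claims guidance...", "EMEA", "internal"),
    Document("d2", "AMER health reserving memo...", "AMER", "restricted"),
]
prompt, lineage = build_prompt(user, "What is our motor leakage policy?", docs)
print(lineage)  # ['d1'] - only entitled documents reached the prompt
```

In a logical data layer these checks are enforced centrally at the access layer rather than re-implemented per application, which is what makes the governance demonstrable rather than merely documented.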
Getting ahead in the AI race
The insurers that lead in this AI era will be those that connect, govern and deliver data at speed. A logical data management approach modernises existing lakehouse and cloud environments without moving data, proving value quickly and scaling safely. In an industry as margin-sensitive as insurance, governed, real-time delivery is what turns GenAI experimentation into measurable profit.
