Fortegra | News & Insights

Joe Lettween Provides Insights on Platform Modernization

Written by Fortegra | May 6, 2026 1:35:01 PM

Now is the time to accelerate platform modernization in insurance 

Joe Lettween, Chief Innovation, Data Science, and Technology Officer, recently shared insights with Insurance Thought Leadership on why the insurance industry is at a turning point in platform modernization.

Historically, legacy systems have restricted insurers' agility as technology has advanced. Artificial intelligence (AI) now allows a shift from small, incremental operational changes toward large-scale modernization that can strengthen competitiveness, partnerships, and decision-making at scale.

Read the full article here from Insurance Thought Leadership, and learn more about the key points below.

Why legacy platforms keep insurers grounded

Many insurers still rely on monolithic core systems that tightly integrate functions like policy administration, billing, claims, and underwriting. While deeply embedded and critical to daily operations, these systems were not designed for flexibility or openness, making modernization feel complex and high-risk. The challenge is less about their age and more about their underlying design—closed architectures and fragmented data models that limit integration, slow innovation, and make it challenging to adopt new technologies like AI.

This creates a difficult tradeoff. To take advantage of AI and advanced automation, insurers often need to first invest in significant data cleanup and restructuring—efforts that are both costly and time-intensive. As a result, the focus has shifted from whether to modernize to how to do so strategically, balancing the need for transformation with the realities of maintaining business continuity.

The data mindset that determines success

Modern, open systems have the potential to significantly improve underwriting speed, claims outcomes, and automation, but their effectiveness depends heavily on the quality of the underlying data. For many insurers—particularly those working across diverse distribution networks—data inconsistency is a major barrier. Partners vary widely in technical maturity, bringing different formats, structures, and integration capabilities, making it difficult to standardize, validate, and use data effectively across programs. When these foundations are weak, the impact is felt across operations, from delayed onboarding and fragmented claims processes to manual reporting and error-prone data handling.

To address these challenges, forward-looking insurers are focusing on strengthening data at the source—validating inputs earlier, improving ingestion processes, and enabling more dynamic, program-level analytics. As expectations around speed, transparency, and integration continue to rise, the ability to manage and exchange high-quality data is emerging as a key differentiator in building scalable, high-performing insurance programs.
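As an illustration of what "validating inputs earlier" at the point of ingestion can look like, here is a minimal sketch in Python. The field names and rules are hypothetical examples, not drawn from the article; a production system would use a shared schema agreed with each distribution partner.

```python
from datetime import date

# Hypothetical required fields for an incoming partner policy record.
REQUIRED_FIELDS = {"policy_id", "effective_date", "premium"}

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    if "premium" in record:
        try:
            if float(record["premium"]) <= 0:
                errors.append("premium must be positive")
        except (TypeError, ValueError):
            errors.append("premium is not numeric")
    if "effective_date" in record:
        try:
            date.fromisoformat(record["effective_date"])
        except (TypeError, ValueError):
            errors.append("effective_date is not ISO formatted (YYYY-MM-DD)")
    return errors

# Rejecting bad records at ingestion keeps downstream claims and
# reporting workflows from inheriting the inconsistency.
clean = validate_record({"policy_id": "P-100", "effective_date": "2026-01-01", "premium": "125.50"})
dirty = validate_record({"policy_id": "P-101", "premium": "-5"})
```

Catching a malformed date or negative premium here, before the record reaches policy administration or claims, is what turns data quality from a cleanup project into a standing control.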

Knowing where, and where not, to apply AI

One of the most important decisions in modernization is not which AI tools to use, but where to apply them. AI delivers the greatest value when focused on specific workflows and customer interactions—such as automating data validation, enhancing claims review, or supporting underwriting decisions—where it can augment human judgment and improve efficiency without requiring major system changes. These targeted applications reduce friction and generate immediate benefits while working alongside existing infrastructure.

In contrast, applying AI directly to core transaction systems like policy administration, billing, and claims settlement introduces greater risk due to regulatory, financial, and data integrity requirements. Without strong governance and reliable data foundations, these efforts can create more issues than they solve. Successful organizations take a more disciplined approach, modernizing their architecture first and then layering in AI where it adds clear value, avoiding the pitfalls of overextension and ensuring more sustainable, effective transformation.

How AI changes the modernization equation

AI is not only accelerating platform modernization in insurance but fundamentally reshaping how it happens, shifting away from high-risk, multi-year “big bang” system replacements toward more incremental and flexible approaches. It does this in two distinct ways—by changing how applications are built and deployed, and by embedding intelligence directly into workflows—requiring organizations to clearly distinguish between the two to avoid misaligned investments.

AI development tools: building and deploying faster

The first wave of AI impact in insurance technology is transforming how applications are built, with AI-assisted development tools dramatically accelerating the design, testing, and deployment process. What once required large teams, external vendors, and lengthy timelines can now be accomplished by smaller internal teams in a matter of weeks, enabling rapid development of tools that enhance reporting, streamline claims intake, and improve data validation without replacing core systems. By embracing these capabilities, insurers can move faster, reduce reliance on third parties, and build more flexible, in-house solutions that deliver immediate operational value while strengthening long-term technical expertise. 

Embedding AI in workflows: decisions at scale

The second wave of AI adoption is more transformative, embedding intelligence directly into operational workflows to automate and improve core business decisions. Unlike development tools, this approach integrates AI into processes such as underwriting, claims triage, and pricing, fundamentally reshaping how decisions are made rather than simply enhancing existing tasks. However, its effectiveness depends heavily on strong, consistent data foundations, making data quality a critical prerequisite rather than a parallel effort.

Together with AI-driven development, this shift is redefining the economics of modernization—enabling more organizations to access advanced capabilities, streamline operations, and unlock greater value from their technology investments.

Choosing the right retirement strategy for legacy systems

When exiting legacy systems, organizations must weigh a few key factors: transaction volume, regulatory complexity, partner dependencies, and appetite for operational risk. These considerations tend to point to three patterns:

The strangler pattern

Instead of replacing a legacy system all at once, organizations can use microservices and APIs to incrementally build new capabilities around it, gradually transitioning functionality until the legacy platform can be retired. Because the old system keeps running until each piece is proven, this approach works well for large systems and minimizes disruption and risk.
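The routing layer behind a strangler migration can be sketched roughly as follows. This is a simplified illustration with hypothetical capability names; in practice the routing usually lives in an API gateway or reverse proxy rather than application code.

```python
# Hypothetical strangler facade: route each capability to the rebuilt
# service once it exists, and fall back to the legacy monolith otherwise.

def legacy_monolith(capability: str, payload: dict) -> str:
    # Stand-in for the existing core system, which still handles
    # everything that has not yet been carved out.
    return f"legacy handled {capability}"

def new_claims_intake(payload: dict) -> str:
    # Stand-in for a newly built, API-driven claims-intake service.
    return "new claims-intake service handled request"

# As functions are rebuilt, they are registered here; all other
# traffic continues to flow to the monolith untouched.
MIGRATED = {"claims_intake": new_claims_intake}

def route(capability: str, payload: dict) -> str:
    handler = MIGRATED.get(capability)
    if handler is not None:
        return handler(payload)
    return legacy_monolith(capability, payload)

migrated = route("claims_intake", {})
unmigrated = route("billing", {})
```

Retiring the legacy platform then amounts to growing the migrated set one capability at a time until the fallback branch is never taken.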

Microservicing and modular decomposition

Organizations can modernize incrementally by carving out specific functions—like claims intake, document generation, or rating—from a monolithic system and rebuilding them as independent, API-driven services, creating flexibility, cleaner integrations, and reduced transformation risk while the core system remains in place.

Sunsetting and runoff

For legacy systems tied to shorter policy lifecycles, a managed wind-down, shifting new business to modern platforms while maintaining but not enhancing the old system, can be the most practical approach. In practice, effective modernization strategies often combine this with incremental replacement and domain-level decomposition, which requires thoughtful decisions about which approach fits each part of the business.

The right conditions for change

Complete alignment across systems and data models isn’t realistic in insurance, but improving data exchange through more interactive, near-real-time integration is both achievable and impactful. Ultimately, success will be defined not by the most advanced platform, but by the most adaptable—built on open integration, flexible data structures, and a collaborative approach that meets partners where they are.