Technology has finally caught up to the complexity of commercial risk. Insurance, however, hasn’t caught up to its own data. For Julianna Muir, founder and CEO of Vellum Insurance, the industry’s growing retreat from entire categories of risk - from cyber to climate - isn’t driven by a lack of tools or innovation. It’s driven by delayed, distorted, and fragmented data moving through an outdated value chain.
“The farther back in the value chain you sit, the more delayed the data you get,” Muir explained. “The more hands that have touched it, the murkier it becomes, and the less fidelity that asset has.”
That loss of clarity has real consequences. Without timely, high-fidelity visibility into exposure, insurers are forced to make blunt decisions, often exiting risk entirely rather than managing it with nuance.
In areas like climate and cyber, Muir sees a familiar pattern emerge.
“I often say the industry likes to throw the baby out with the bathwater,” she said. “We’re out of Florida. We’re out of cyber.”
Those decisions aren’t necessarily irrational. They’re the best decisions insurers can make with the data they have at the time. When exposure is only partially visible, or visible too late, the safest option becomes disengagement.
The opportunity, Muir argues, isn’t just to innovate underwriting at the front of the value chain, but to modernize the back of it. That’s where capital providers, reinsurers, and portfolio managers need clearer, more immediate insight into what they actually hold.
“If they had more real-time exposure and visibility into the same set of information,” she said, “they could operate with a level of nuance they simply don’t have today.”
Despite years of hype around big data and AI, Muir believes the industry is still stuck on a foundational problem: there is no shared data language.
“What one carrier calls a product, another calls a line of business, and another calls a coverage,” she noted.
Without standardization, even the most advanced analytics fall short. AI can process enormous volumes of information, but only once that information is structured, trusted, and understood.
“You can’t just throw data at a machine and let it tell you the answer,” Muir said. “You first have to structure that data, understand it, trust it, and standardize it; then you can drive insights.”
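To make that concrete, a minimal sketch of the standardization step she describes might look like the following; the carrier records, field names, and canonical mapping are all invented for illustration.

```python
# Hypothetical example: three carriers label the same concept differently
# ("product" vs. "line of business" vs. "coverage"). Mapping those labels
# to one shared vocabulary is the prerequisite step before any analytics.

CANONICAL_FIELDS = {
    "product": "line_of_business",
    "line of business": "line_of_business",
    "coverage": "line_of_business",
}

def standardize(record: dict, field_map: dict) -> dict:
    """Rename carrier-specific field labels to the shared schema."""
    return {field_map.get(key.lower(), key.lower()): value
            for key, value in record.items()}

carrier_a = {"Product": "Commercial Cyber", "Premium": 12_000}
carrier_b = {"Line of Business": "Commercial Cyber", "Premium": 9_500}
carrier_c = {"Coverage": "Commercial Cyber", "Premium": 14_250}

portfolio = [standardize(r, CANONICAL_FIELDS)
             for r in (carrier_a, carrier_b, carrier_c)]

# Only after normalization can exposure be aggregated on a single key.
total_premium = sum(r["premium"] for r in portfolio)
```

Trivial as the mapping looks, it encodes exactly the agreement Muir says the industry lacks: a shared answer to what each field means.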
The irony, she added, is that these challenges aren’t new. The industry has been talking about big data for decades. What’s changed is the volume and velocity of information now flowing through insurance ecosystems, and the urgency to finally do the foundational work required to leverage it.
Legacy systems are often blamed for slowing transformation, but Muir sees the mindset as the bigger obstacle.
“There’s often an ‘it works, don’t fix it’ mentality,” she said. “This is the way we’ve always done it.”
In a highly regulated industry, that hesitation is understandable. Auditors demand traceability. Regulators expect consistency. And insurers can’t afford operational disruption.
“It’s not like you have to rip out the old and replace it fully with new,” Muir explained. “These things can run in parallel.”
By testing new technologies alongside existing processes, insurers can evaluate where automation improves outcomes, and where human judgment remains essential.
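One way to read “run in parallel” is a shadow-mode setup, sketched below with invented function names: the existing process stays authoritative while the candidate automation sees the same inputs but only writes to a comparison log.

```python
# Hypothetical shadow-mode harness. The legacy workflow still drives every
# outcome; the automated path is evaluated on identical inputs but never
# acted on, so auditors keep one traceable decision per case.

def legacy_triage(submission: dict) -> str:
    """Stand-in for the existing, human-driven workflow."""
    return "refer"

def automated_triage(submission: dict) -> str:
    """Stand-in for the candidate automation being tested."""
    return "accept" if submission.get("tiv", 0) < 1_000_000 else "refer"

def handle(submission: dict, audit_log: list) -> str:
    decision = legacy_triage(submission)      # authoritative result
    shadow = automated_triage(submission)     # recorded, never enacted
    audit_log.append({"legacy": decision,
                      "shadow": shadow,
                      "agree": decision == shadow})
    return decision

audit_log: list = []
handle({"tiv": 750_000}, audit_log)
# Over time, the log shows where automation matches human judgment and
# where it diverges, before anything in production is replaced.
```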
“Part of the next five years,” she said, “will be understanding where human judgment is your competitive advantage, and where it’s simply a matter of doing things faster and better.”
As insurers move toward more modular, customized risk solutions, the demands on data infrastructure shift from volume to adaptability.
“There’s a huge spectrum of technological sophistication in this industry,” Muir said. Some organizations are just beginning to build data warehouses. Others employ hundreds of data scientists. Regardless of maturity, every insurer faces the same challenge: data flows both upstream and downstream, across brokers, MGAs, reinsurers, and capital partners. That data must support not just reporting and compliance, but real-time visibility into balance-sheet risk.
“If your data structure is too rigid,” Muir warned, “you won’t be able to scale.”
True scalability, she argues, isn’t about the number of rows in a database. It’s about the breadth of information an organization can ingest, adapt to, and act on as new risks and new underwriting variables emerge.
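What an adaptable structure might mean in practice: the sketch below, with invented names, keeps a small fixed core per exposure record plus an open attribute map, so a new underwriting variable can be ingested without a schema migration.

```python
# Hypothetical illustration of schema adaptability: a fixed core plus an
# open-ended attribute map, so variables that didn't exist at design time
# (say, a wildfire score) can arrive without a database migration.

from dataclasses import dataclass, field
from typing import Any

@dataclass
class ExposureRecord:
    policy_id: str
    line_of_business: str
    # Open attributes absorb new underwriting variables as they emerge.
    attributes: dict[str, Any] = field(default_factory=dict)

# Yesterday's feed carried only total insured value.
legacy = ExposureRecord("POL-001", "property", {"tiv": 2_500_000})

# Today's feed adds a climate variable; the structure doesn't change.
current = ExposureRecord("POL-002", "property",
                         {"tiv": 1_800_000, "wildfire_score": 0.73})

# Downstream consumers read what they understand and ignore the rest.
for record in (legacy, current):
    print(record.policy_id, record.attributes.get("wildfire_score", "n/a"))
```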
Despite insurance’s reputation for risk aversion, Muir sees a recurring habit when it comes to technology adoption: buying what feels safe rather than what’s fit for the future.
“The industry often buys the expensive tool that someone else used,” she said, “because they won’t get in trouble for it.”
But that approach can lock organizations into systems that struggle to evolve. The better question, she argues, isn’t what worked yesterday, but what will still work tomorrow.
“Is this tool nimble enough to service me as the business changes?” she asked.
For insurers navigating increasingly complex and volatile risk landscapes, the answer to that question may determine whether they continue to retreat or finally re-enter the risks they’ve been walking away from.