Waiting for AI clarity? "You're already behind"

Compliance chief on balancing speed and safety in tech rollouts, and why there's "no perfect time"

By Gia Snape

As artificial intelligence (AI) reshapes the insurance industry, compliance leaders are stepping into a new role that combines regulatory expertise with strategic innovation.

For Shannon Woods, chief compliance officer at The Mutual Group, the key to navigating this fast-moving landscape has been creating strong guardrails that enable innovation without compromising on trust or accountability.

“Compliance can’t be just a ‘check-the-box’ function anymore,” Woods told Insurance Business. “It must be strategic, proactive, and embedded in the organization’s broader mission.”

That mission is especially critical now, as insurers adopt AI tools across underwriting, claims processing, legal workflows, and customer service. According to Woods, companies that wait for perfect clarity on regulations before taking action risk falling behind.

Fighting “perfection paralysis” in a fast-moving world

Woods grew up on a farm in Iowa, an upbringing that instilled a deep respect for resilience, hard work, and community. These values continue to guide her leadership style as she helps The Mutual Group balance emerging technologies with its member-first approach.

“Growing up, you learned to solve problems fast, and that mindset still shapes how I operate,” she said. “The Mutual Group is about long-term stability, not quarterly earnings. That gives us room to innovate responsibly.”

One of Woods’ biggest lessons in AI adoption? Don’t wait for perfect conditions before making a move.

“If you wait for everything to be clear and tidy, you’re already behind,” she said. “Some of the best ideas come from spotting what’s broken and fixing it, even if the fix isn’t perfect yet.”

This philosophy extends to her approach to technology. She offered two pieces of advice for overcoming “perfection paralysis”: “First, be curious and have the courage to challenge the status quo. Some of the best innovations come from people who spot what’s not working and suggest a better way.

“Second, don’t wait for perfect conditions. If you do, you’re already behind, especially in tech and insurance, where things move fast. Whether we’re evaluating compliance tools or exploring a new model entirely, I believe in leaning in early and bringing others with me. That early action gives us the time and flexibility to implement proactive, thoughtful solutions rather than reacting under pressure.”

Best practices for building a culture of compliance and innovation

When it comes to best practices, Woods believes that creating consistency across jurisdictions starts with clarity of purpose.

The Mutual Group’s legal team continually tracks federal and state regulatory shifts, identifying risks and opportunities early. These insights are then shared across project teams to build compliant solutions from the ground up. “It’s not about reacting to regulations,” Woods said. “It’s about anticipating them and aligning innovation accordingly.”

This forward-looking approach has enabled her organization to integrate AI into a variety of core processes, including claims triaging, legal drafting, and subpoena response.

However, while Woods embraces early adoption, she also emphasized caution. At The Mutual Group, an internal AI governance program ensures transparency, fairness, and oversight, while a cross-functional AI committee evaluates every new tool, vetting for data usage, bias risk, and required human involvement.

“We don’t allow AI to make policyholder-impacting decisions without human review,” Woods said. “It’s a support system, not a substitute.”

The company also leans on external standards such as the NAIC’s model bulletin to shape its internal guardrails, while maintaining flexibility to tailor governance to each new use case.

In high-risk areas such as underwriting or pricing, Woods stressed the importance of thoughtful deployment. Each potential application is assessed for efficiency gains, regulatory implications, and ethical risks. Vendors must pass a rigorous review process before their tools are greenlit, followed by an AI committee evaluation focused on fairness, accuracy, and impact.

This measured approach has allowed The Mutual Group to create internal audit frameworks and compliance monitoring tools that go beyond regulatory minimums, strengthening transparency and reinforcing trust with regulators, partners, and members alike.

And while much of the industry frets about AI’s impact on jobs, Woods views it differently. At The Mutual Group, AI helps teams make sharper decisions by refining quotes, streamlining legal opinions, and accelerating claims processes, but it doesn’t replace human expertise.

“What we’ve seen is that AI enhances rather than eliminates roles. It provides our teams with better data and insight. For example, in underwriting, it may help refine quotes or speed up decisions,” Woods said. “But it doesn’t take away the role; it sharpens it. We view AI as a tool that makes our people better, not redundant.”
