Future of Data Management

A Seamless Experience Erodes Trust in Finance, but 'Responsible Friction' Builds It Back

The Data Wire - News Team | October 10, 2025

Olivia Hyde, co-founder of Prompts to Profit, explains how an obsession with finding AI use cases can create inefficiencies for financial institutions that deploy enterprise AI.

Key Points
  • Olivia Hyde, co-founder of Prompts to Profit, explains how an obsession with finding AI use cases can create inefficiencies for enterprise AI.

  • She introduces the "iceberg model" as a new approach to help leaders understand AI in finance, where most of the work is accomplished unseen.

  • Hyde concludes that the need for responsible friction in finance to build trust and ensure security will only grow as the development of new technology continues to outpace regulatory changes.

What the user sees is just the tip of the iceberg. The vast majority of what gets done, including complex data processing, risk analysis, and compliance checks, occurs below the surface, either 'under the hood' in the code or in the 'back of the house' with internal teams. That's where the real heavy lifting of AI is being done.

Olivia Hyde

Co-founder
Prompts to Profit

Too many leaders are chasing the wrong prize in the race to agentic AI. Now, an obsession with finding new "use cases" for the technology has enterprises throwing money at problems without a meaningful return. Most are asking, "What can we do with AI?" when the more valuable question is, "What is the most important problem we can solve with AI?"

For an expert's take, we spoke with Olivia Hyde, co-founder of the Prompts to Profit podcast. As the former Head of Strategy and Operations at Ghost and with a background that includes director-level roles at McKinsey & Company and Fidelity Investments, Hyde is a strategy and product leader deeply familiar with this complex terrain. In fact, her passion for the subject extends so far beyond the corporate world that she co-founded a podcast with former McKinsey colleagues to continue the conversation.

  • The tip of the iceberg: Understanding AI in finance requires a new approach, Hyde says. Calling it "the iceberg model," she explains that users see only a few features on the surface while most of the work remains unseen. "What the user sees is just the tip of the iceberg. The vast majority of what gets done, including complex data processing, risk analysis, and compliance checks, occurs below the surface, either 'under the hood' in the code or in the 'back of the house' with internal teams. That's where the real heavy lifting of AI is being done."

To illustrate how quickly this underwater engine is evolving, Hyde points to a project from just two years ago. An insurance business tried to use AI to query massive blocks of policies for compliance data. But the technology's nondeterministic nature meant it couldn't handle complex actuarial projections. Today, that picture is radically different.

  • Sentiment as a signal: In finance, the "under the hood" work has officially moved from simple retrieval to sophisticated analysis, Hyde says. Today, some quant analysts even use AI to predict stock movements by analyzing market sentiment. "Stock valuations aren't just about financial statements. They are highly behavioral. If you can use AI to scan the internet and analyze the 'chatter,' the sentiment and feeling people have about a stock, you gain a powerful signal. That behavioral data can be just as critical for predicting a stock's movement as the company's balance sheet."
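To make that idea concrete, here is a minimal sketch of how a sentiment signal might be blended with a fundamentals signal. It is illustrative only, not Hyde's or any quant desk's actual model; the word lexicon, weights, and headlines are invented assumptions.

```python
# Illustrative only: a toy blend of "chatter" sentiment with a fundamentals
# signal. The lexicon, weights, and headlines are hypothetical assumptions,
# not a real trading model.
POSITIVE = {"beat", "surge", "upgrade", "record", "strong"}
NEGATIVE = {"miss", "plunge", "downgrade", "lawsuit", "weak"}

def sentiment_score(texts: list[str]) -> float:
    """Average per-document sentiment in [-1, 1] from a naive word count."""
    scores = []
    for text in texts:
        words = text.lower().split()
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        total = pos + neg
        scores.append((pos - neg) / total if total else 0.0)
    return sum(scores) / len(scores) if scores else 0.0

def blended_signal(sentiment: float, fundamentals: float, w_sent: float = 0.4) -> float:
    """Weighted blend of behavioral and fundamental signals, both in [-1, 1]."""
    return w_sent * sentiment + (1 - w_sent) * fundamentals

headlines = [
    "Acme shares surge on strong guidance after earnings",
    "Analyst downgrade cites weak demand for Acme's core product",
]
print(round(blended_signal(sentiment_score(headlines), fundamentals=0.2), 3))
```

In practice the sentiment score would come from a trained model rather than a word list, but the blending step captures the point: behavioral chatter becomes just another input alongside the balance sheet.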

  • The data mesh: Looking forward, Hyde sees this invisible infrastructure becoming even more powerful. "Every large company struggles with data that's trapped in different silos. New protocols can act like a connective mesh that unifies these disparate systems. Instead of having to move all the data to one place, it allows AI to query across all of it at once, annotating responses with provenance data so users can audit or validate the response, an especially important security measure in financial services. That way, you don't miss critical information just because it's stored in the wrong place."
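As a rough sketch of what that "connective mesh" might look like in practice, the snippet below queries a few hypothetical in-memory silos in place and tags every record with provenance metadata so a reviewer can audit where each answer came from. The silo names, schemas, and records are assumptions for illustration; real deployments would rely on actual federation protocols and data catalogs.

```python
# Illustrative only: query several data "silos" in place and annotate every
# record with provenance so a reviewer can audit where each answer came from.
# The silo names, schemas, and records are hypothetical.
from datetime import datetime, timezone

SILOS = {
    "crm": [{"customer_id": "C-102", "risk_flag": "low"}],
    "ledger": [{"customer_id": "C-102", "balance": 18_250.00}],
    "kyc_vault": [{"customer_id": "C-102", "verified": True}],
}

def federated_query(customer_id: str) -> list[dict]:
    """Query each silo where it lives and tag results with provenance metadata."""
    results = []
    for silo_name, records in SILOS.items():
        for record in records:
            if record.get("customer_id") == customer_id:
                results.append({
                    "data": record,
                    "provenance": {
                        "source": silo_name,
                        "retrieved_at": datetime.now(timezone.utc).isoformat(),
                    },
                })
    return results

for row in federated_query("C-102"):
    print(row["provenance"]["source"], "->", row["data"])
```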

Contrary to popular wisdom, the pursuit of frictionless experiences in finance often overlooks a critical truth about money and trust, Hyde explains. The answer, she argues, is "responsible friction": the idea that experience design isn't about eliminating all friction, but about applying it intelligently to build trust.

  • The friction paradox: For example, Hyde recounts a recent call to her bank to close an old account. When the representative who called her back asked what she wanted to do with the funds without verifying her identity first, she immediately felt uneasy. "I was on the phone with my bank, and the representative was ready to move my money without asking for my name, my birthday, or any verification at all. It felt deeply insecure. In finance, if you don't pause the experience to confirm you're talking to the right person, you create anxiety."

That anxiety stems from a simple truth: humans need a bit of friction at security milestones to feel safe. While AI may be capable of eliminating the technical need for conspicuous identity validation, that visible check remains an important part of the user's psychological experience. "AI is a user-centered experience," Hyde explains. "Sure, I could remove friction in the name of efficiency, but it ultimately isn't helpful if it degrades the user's trust."
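A minimal sketch of responsible friction in code might look like the following, where a transfer simply cannot proceed until an explicit verification step passes. The checks and thresholds are illustrative policy assumptions, not any bank's actual controls.

```python
# Illustrative only: a deliberate verification checkpoint before moving funds.
# The verification method and threshold are hypothetical policy choices.
from dataclasses import dataclass

@dataclass
class TransferRequest:
    account_id: str
    amount: float
    identity_verified: bool = False

def verify_identity(request: TransferRequest, answers_correct: int, required: int = 2) -> TransferRequest:
    """Responsible friction: require at least `required` correct security answers."""
    request.identity_verified = answers_correct >= required
    return request

def execute_transfer(request: TransferRequest) -> str:
    if not request.identity_verified:
        # Pause the experience rather than silently proceeding.
        return "BLOCKED: identity not verified; escalate to verification flow"
    return f"OK: moved {request.amount:.2f} from {request.account_id}"

req = verify_identity(TransferRequest("ACCT-0042", 5_000.00), answers_correct=1)
print(execute_transfer(req))  # BLOCKED until verification passes
```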

  • An emotional journey: The key to navigating the friction paradox is found in understanding a customer's psychological needs, Hyde explains. "Any customer interaction, whether it's opening an account or checking a balance, is an emotional journey. The key is to map that journey not just by the steps a user takes, but by their state of mind at each moment. Ask, 'What does my customer feel right now? What do they need psychologically to feel secure and confident?' When you solve for those emotional needs, the right design choices become obvious."

  • An invisible assist: Stripe's Link feature is a masterclass in this approach, Hyde says. Its invisible convenience is the perfect surface-level example of her iceberg theory at work. "The best frictionless experiences are the ones you don't even notice. With a feature like Stripe's Link, the system has already done the work for you, recommending the card you typically use for a specific type of purchase. You just click 'confirm' without ever realizing that a complex decision was made on your behalf. The convenience is invisible."
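The underlying pattern can be sketched simply: remember which card a customer usually uses for a given kind of purchase and pre-select it so the user only has to confirm. The example below is a generic illustration with invented purchase history, not Stripe's actual implementation of Link.

```python
# Illustrative only: recommend the card a customer most often uses for a
# purchase category, so checkout pre-fills it and the user just confirms.
# The history and categories are hypothetical; this is not Stripe's code.
from collections import Counter

purchase_history = [
    ("travel", "visa_4242"),
    ("travel", "visa_4242"),
    ("travel", "amex_1005"),
    ("groceries", "amex_1005"),
]

def recommend_card(history: list[tuple[str, str]], category: str) -> str | None:
    """Return the most frequently used card for the category, if any."""
    counts = Counter(card for cat, card in history if cat == category)
    return counts.most_common(1)[0][0] if counts else None

print(recommend_card(purchase_history, "travel"))  # visa_4242
```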

At the other end of the spectrum, however, is immense risk. To make her point, Hyde applies the analogy of self-driving cars. "Much like with autonomous vehicles, in financial services, there is simply no acceptable margin for error." Here, even a small mistake can be catastrophic for both the customer and the market. "Because the stakes are so high and the regulatory environment is so strict, you can't fully automate final decisions. You still need a 'human in the loop' to act as the ultimate circuit-breaker, providing the final review and sign-off before an action is taken."
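A minimal sketch of that circuit-breaker pattern is shown below: automation can propose an action, but nothing executes until a named reviewer signs off. The statuses, roles, and action payload are illustrative assumptions.

```python
# Illustrative only: automation proposes, a human approves, then the action runs.
# Statuses, roles, and the action payload are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ProposedAction:
    description: str
    status: str = "PENDING_REVIEW"
    approvals: list[str] = field(default_factory=list)

def approve(action: ProposedAction, reviewer: str) -> ProposedAction:
    """Record the human sign-off that acts as the circuit-breaker."""
    action.approvals.append(reviewer)
    action.status = "APPROVED"
    return action

def execute(action: ProposedAction) -> str:
    if action.status != "APPROVED":
        return f"HELD: '{action.description}' awaits human review"
    return f"EXECUTED: '{action.description}' (signed off by {', '.join(action.approvals)})"

proposal = ProposedAction("Rebalance portfolio toward short-duration bonds")
print(execute(proposal))                        # HELD until a human signs off
print(execute(approve(proposal, "risk_officer")))
```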

Ultimately, the environment demands a level of caution that will always temper the pace of innovation, Hyde concludes. "There is a fundamental and unavoidable conflict in this space: AI technology evolves in a matter of months, while financial regulation changes over a matter of years. This gap means there will always be a lag between what is technologically possible and what is legally and ethically permissible."
