
Future of Data Management

A New, Intuitive Blueprint for Financial Data Architecture Prioritizes Business Outcomes Over Scale

The Data Wire News Team | November 5, 2025

Subhash Kommuru, VP of Enterprise Architecture at Wells Fargo, explains how a new blueprint for data architecture can help leaders avoid repeating costly mistakes with AI.

Key Points
  • While past data investments created more complexity than value, the next wave of AI spending risks repeating the same mistake.

  • Subhash Kommuru, VP of Enterprise Architecture at Wells Fargo, explains why leaders must address AI's biggest risk—inconsistent results—and stop treating data as a technical afterthought.

  • The model Kommuru proposes starts with setting a business goal, embracing simplicity, and building iteratively with lean, agile architectures.

When you think about a data warehouse, it's traditionally an afterthought. The core application does the hard work, and only after the transaction is done does the data warehouse come into the picture.

Subhash Kommuru

Vice President of Enterprise Architecture
Wells Fargo

*The views and opinions expressed by Subhash Kommuru are his own and do not necessarily represent those of any former or current employers.

After years of investing in large-scale data infrastructure, enterprises are shifting their focus to value. In the past, systems often created more complexity than clarity, making it challenging to identify ROI. Now, as AI drives the next investment cycle, leaders in highly regulated industries have a new mandate: build data systems that deliver measurable business insights.

For an insider's take, we spoke with Subhash Kommuru, Vice President of Enterprise Architecture at Wells Fargo. A senior software and enterprise architect with more than 20 years of experience, Kommuru has held leadership roles at BrightInsight and UnitedHealth Group, and his expertise in API strategy, cloud adoption, and Site Reliability Engineering (SRE) puts him at the forefront of this conversation. In his experience, many of the issues plaguing data teams stem from a single, foundational flaw in their approach.

"When you think about a data warehouse, it's traditionally an afterthought. The core application does the hard work, and only after the transaction is done does the data warehouse come into the picture," Kommuru says. To break this reactive cycle, he proposes a new approach. Here, data becomes a consumer of AI, using agents to create a "triangulation of information" by pulling customer, product, and interaction data in parallel.
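The "triangulation" Kommuru describes can be sketched as three agents queried concurrently rather than in sequence. A minimal illustration, assuming hypothetical fetcher functions (the names, fields, and latencies below are illustrative, not from the source):

```python
import asyncio

# Hypothetical agents, each responsible for one slice of the picture.
async def fetch_customer(cid):
    await asyncio.sleep(0.01)  # simulate I/O latency
    return {"customer_id": cid, "segment": "retail"}

async def fetch_product(cid):
    await asyncio.sleep(0.01)
    return {"products": ["checking", "credit card"]}

async def fetch_interactions(cid):
    await asyncio.sleep(0.01)
    return {"last_contact": "2025-10-30"}

async def triangulate(cid):
    # All three agents run in parallel; their results merge into one view.
    customer, product, interactions = await asyncio.gather(
        fetch_customer(cid), fetch_product(cid), fetch_interactions(cid)
    )
    return {**customer, **product, **interactions}

view = asyncio.run(triangulate("C-1001"))
print(view)
```

The point of the pattern is that no single system of record has to anticipate the question; the combined view is assembled on demand from independent sources.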

  • The consistency conundrum: But with that power must come caution, Kommuru explains. For him, the biggest AI risk isn’t failure but a lack of consistency. "What I'm worried about is reproducing that result over and over again. In a regulated industry like banking or healthcare, this is key. A model can't recommend Tylenol for a headache one day and then claim the next day that Tylenol was never intended for headaches at all. You cannot have that much variation."

For Kommuru, the inconsistency factor requires a redefinition of modern AI governance. The goal goes beyond lineage to "traceability and being able to reproduce the same result over and over again," he explains. But the high-stakes nature of this risk can also lead to indecision.

  • Action over analysis: To combat this, Kommuru suggests leaders adopt a core principle from AI itself to escape the "analysis paralysis" trap. "We can learn from a core principle in AI: you don’t have to be perfect. Decision-making doesn't have to be perfect all the time either. If you just sit and think about it, you will fall into analysis paralysis. You have to be willing to put the first step forward."

A philosophy of iterative action is the cultural foundation for a successful data governance strategy, Kommuru continues. From there, he outlines a clear blueprint for building a modern data architecture, starting with a single guiding rule.

  • Confusion in, chaos out: The first principle is to define a clear goal before touching any data, Kommuru says. "If you don't know what outcome you want, AI won't know what to produce. It's just going to spit out 'garbage in, garbage out' for you."

  • Purpose before petabytes: "You have to start with an objective," he advises. "For example, you could decide you want to help a blind person by using computer vision to create glasses that capture an image of a product and read its expiration date aloud. That vision gives you one part of the solution, and data gives you the rest. But if you start with the data, you won't know what to do with it."

The second principle is a direct challenge to the complexity that plagues many enterprise systems. For Kommuru, sophistication is found in simplicity, not complication.

  • Complexity is failure: "People try to make things very, very complicated. The moment you make things complicated, you're bound to failure. If it's not simple, you haven't thought enough. You're probably either overthinking or underthinking," he says. "Every extra layer is a chance for misalignment. It's like architecting a cathedral to house a single candle, or building a whole pharmacy when all the consumer wants is a painkiller."

  • Resist the scale: Dismissing the old "bigger is better" ethos of the big data era, Kommuru says, "Just because your database can now hold petabytes of data doesn't mean that you need to bring that much data. You need to make sure you cleanse your data properly."

  • Discernment over dashboards: "We don’t need more dashboards. We need more discernment. The next generation of data leaders must build systems that earn trust, not just scale," he continues.

Ultimately, Kommuru advocates for a fundamental shift in how data systems are developed, transitioning from rigid, project-based approaches to an agile, iterative process. To illustrate the practical application of his "action over analysis" mindset, he offers an analogy: "You don't have to build the whole car right away. You can build one bike, then add a motor, and then you can build the car. You can do all of those things on the application side. But on the data side, people are still trying to just rush into it."

In conclusion, Kommuru identifies a specific villain that represents this outdated, complex approach: "Think about the medallion framework—it just triples my storage and compute costs. I want a lean zone architecture that can grow with my business and adapt to new requirements as they come, rather than forcing me to rebuild the whole thing." As a result, he says, the mission is clear for the next generation of data leaders. "The speed with which data must feed into real-time use cases is the intuitive portion that our next generation needs to pick up on."
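Kommuru's cost objection can be made concrete with back-of-the-envelope arithmetic. A medallion layout persists a copy of the data at each of its bronze, silver, and gold layers, while a lean zone architecture might keep only a raw landing zone plus one curated zone. The figures below (data volume, price per terabyte, layer counts) are illustrative assumptions, not numbers from the source:

```python
# Assumed figures for illustration only.
raw_tb = 100                 # volume of raw data, in terabytes
price_per_tb_month = 23.0    # object storage price, $ per TB per month

# Medallion: bronze (raw), silver (cleansed), gold (aggregated)
# each persist their own copy of the data.
medallion_copies = 3
medallion_cost = raw_tb * medallion_copies * price_per_tb_month

# Lean zone: a raw landing zone plus a single curated zone
# that serves consumers directly.
lean_copies = 2
lean_cost = raw_tb * lean_copies * price_per_tb_month

print(f"medallion: ${medallion_cost:,.0f}/mo  lean: ${lean_cost:,.0f}/mo")
```

Compute follows the same multiplier, since each extra layer is another pipeline run; the lean approach trades the medallion's fixed three-tier promotion path for zones that are added only when a requirement demands them.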
