AI

AI Leader Says Enterprise AI Isn't a Mandate Problem, It's a Systems Problem

The Data Wire - News Team | April 22, 2026

Prashant Parida, Leader of AI Strategy, Transformation, and Portfolio Management at IBM, argues that AI adoption requires both top-down and bottom-up collaboration and suggests a four-quarter budget approach with quarterly governance.

Credit: The Data Wire
Key Points
  • Top-down AI mandates often clash with ground-level realities, leaving data teams to fix fragmented systems and decades of technical debt.

  • Prashant Parida, AI Strategy, Transformation, and Portfolio Management Leader at IBM, notes that successful adoption requires leaders to ensure AI is grounded in individual work.

  • Parida emphasizes a four-quarter approach that connects a top-down strategy with bottom-up execution, serving as a necessary map for enterprise transformation.

You must figure out all three buckets of operations: individual work, the functional work within team workflows, and the enterprise-level workflows between functions. Unless you map all three, AI transformation won't work.

Prashant Parida

AI Strategy, Transformation & Portfolio Management Leader
IBM

Enterprise AI transformation has a direction problem. Boards pressure CEOs, CEOs push mandates to executives, and executives pass them down to teams — but somewhere between the strategy deck and the daily work, the translation breaks down. The ambition is enterprise-wide, but the execution is often another story.

Prashant Parida, a longtime enterprise technology leader who has advised and operated within Fortune 15 organizations, argues that the disconnect runs deeper than coordination or commitment. As a leader responsible for AI strategy, transformation, and portfolio management across large-scale consulting operations, he sees a pattern: organizations treat AI as a mandate problem when it is fundamentally a systems problem—one that spans data, workflows, decision ownership, skills, and alignment with business incentives. Unless those systems are addressed together, AI programs will continue to stall.

"Unless you figure out all three buckets — the individual work, the functional workflow, and the enterprise-level connections between functions — and tie that to revenue growth or margin, it is not going to work."

The friction often starts with the hype cycle. When boards react to weekly generative AI headlines, it creates a cascade of misaligned expectations for the frontline workers who actually have to implement the technology. The result, in many cases, is what Parida calls "AI theater" — organizations layering AI interfaces or copilots on top of fragmented data and disconnected workflows, creating the appearance of progress without delivering real intelligence or outcomes.

  • Trickle-down tech: Parida says the pressure pattern is predictable but rarely productive. "So far, with generative AI, there is a new release every other week. Boards create pressure on CEOs, and CEOs pass it on to the executives. But for that pressure to translate to the people who are really doing the work, it has to translate to the individual's daily tasks."

The real culprit behind that translation problem is often decades of technical debt from M&A activity and legacy applications that have gone years without modernization. Many large organizations find that successive mergers and acquisitions leave them with siloed applications, fragmented data models, and integration challenges.

For enterprises looking to get beyond that theater, unified data architectures and consolidated data lakes have become foundational. That demand for integration is one reason cloud data platforms like Snowflake and Databricks are experiencing rapid market growth.

  • Stuck at 98 percent: Parida argues that near-complete data readiness still isn't enough to unlock enterprise-scale AI value — the threshold is decision-grade data, complete and reliable enough to act on. "Before AI can really make an impact, your data has to be decision-grade. Companies will struggle until they go down to the grassroots level and fix both the workflows and the data."

That missing two percent is a human problem. Strict policies that wall employees off from sanctioned AI tools create a frustrating velocity gap: workers default to off-system workflows and personal tools to get their jobs done. Without deliberate attention to how people actually share data, employees tend to keep information in local files and side channels, and that localized hoarding quietly undermines the centralized data lakes companies are spending millions to build.

However, many employees outside technical roles may not immediately know which parts of their day can be automated or augmented. Broad messaging fails here; one-on-one task categorization is the only tactical way to achieve adoption.

  • Lost in translation: AI doesn't just happen by mandate, according to Parida. You have to physically sit down and map the work. "There is so much talk about agentic orchestration, but I might not know how to apply it because I'm not a technical person. Somebody has to sit down with me and actually categorize my tasks: identifying what is repetitive and can be handled by agents, what requires human analysis and judgment, and what relies on creative strategy. Unless someone figures that out with me, I will just come in and listen to the corporate talk without changing."
  • Buckets to bottom lines: All that said, mapping tasks is inherently tactical. The practice fits into a larger strategic framework only when individual tasks are integrated into team-level and cross-functional workflows in ways that reflect organizational goals. Parida explains that connecting those dots ties daily execution to board-level financial metrics. "You must figure out all three buckets of operations: the individual work being done, the functional work that happens within team workflows, and the enterprise-level workflows between functions. Unless you map all three, and then connect how that work impacts revenue growth or profit margins, the AI transformation is not going to work."
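Parida's mapping exercise can be made concrete with a simple sketch. The Python below is illustrative only: the task names and the three handling categories (repetitive work for agents, judgment work for humans, creative strategy work) paraphrase his description, and the data itself is hypothetical.

```python
# Illustrative sketch of Parida's three-bucket task mapping.
# All task names below are hypothetical examples, not from the interview.

TASKS = [
    # (task, bucket, handling) -- handling follows Parida's categorization:
    # "agent" = repetitive, "human" = analysis/judgment, "creative" = strategy
    ("compile weekly status report", "individual", "agent"),
    ("review vendor contract terms", "individual", "human"),
    ("triage support tickets", "functional", "agent"),
    ("quarterly roadmap planning", "functional", "creative"),
    ("order-to-cash handoff between sales and finance", "enterprise", "agent"),
]

def coverage_by_bucket(tasks):
    """Group tasks into the three buckets and flag any bucket left
    unmapped -- Parida's precondition for transformation to work."""
    buckets = {"individual": [], "functional": [], "enterprise": []}
    for task, bucket, handling in tasks:
        buckets[bucket].append((task, handling))
    missing = [name for name, items in buckets.items() if not items]
    return buckets, missing

buckets, missing = coverage_by_bucket(TASKS)
print("unmapped buckets:", missing or "none")
```

The point of the exercise is the gap check: if any bucket comes back empty, the map is incomplete and, by Parida's argument, the transformation stalls there.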

For many legacy companies, traditional IT budgets are heavily weighted toward keeping the lights on, leaving limited room for new data foundations and AI enablement work that do not deliver immediate savings. Flipping that ratio, Parida argues, is a prerequisite for transformation.

  • Flipping the bill: Parida says the budget math is the real barrier, not executive ambition, and that the organizations willing to restructure their IT spend around AI readiness will be the ones that pull ahead. "Typically, the challenge with IT has been that 70 or 80 percent of your budget goes into maintaining what you have. That's mission critical; that's what the business is running on. Now you only have 15 or 20 percent of an innovation budget. For you to really clear up the data and catch up to the AI train, you have to do it the reverse. You have to make 80 percent of your investments available. The winners will be people who are very smart about how you really find the money to fix the data issue, enable people, and then really get onto the train."
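The budget math in that quote is easy to make concrete. A minimal sketch using the percentages Parida cites; the $100M total is an assumed example figure, not from the article.

```python
# Sketch of the budget-flip arithmetic Parida describes.
# total_budget is a hypothetical example, not a figure from the article.
total_budget = 100_000_000  # assumed $100M annual IT budget

# Typical split per Parida: 70-80% run-the-business, 15-20% innovation.
run_share_pct = 80
innovation_today = total_budget * (100 - run_share_pct) // 100   # $20M

# The "reverse" he argues for: ~80% available for data and AI enablement.
innovation_flipped = total_budget * 80 // 100                    # $80M

print(f"innovation budget today:   ${innovation_today:,}")
print(f"innovation budget flipped: ${innovation_flipped:,}")
print(f"multiplier: {innovation_flipped / innovation_today:.0f}x")
```

On these assumptions the flip quadruples the funds available for data cleanup and enablement, which is why Parida frames finding that money as the real competitive test.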

Because of that math, leaders often manage against a delayed payoff: infrastructure and enablement costs hit early, while the resulting margin impact can take multiple quarters to materialize. Since clear ROI lags initial deployments, Parida recommends a staged investment model that measures AI value differently over time, prioritizing initiatives with clear business cases.

  • The four-quarter countdown: Parida suggests a four-quarter approach to AI adoption, budgeting, and ROI measurement: "In the first quarter, you are building the UI, organizing datasets, and delivering the first version of the MVP. That milestone is your initial ROI. In the second quarter, you measure adoption. You don't have exact productivity savings yet, but if employees are using the tool repeatedly, that proves its value. By the third quarter, that repeated use translates into reduced cycle times. Finally, when you hit the fourth quarter, you see the actual financial savings. You must use different metrics to justify your ROI at different maturity stages, rather than assuming everything must be an immediate financial metric."
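The staged-metrics idea amounts to a lookup from maturity stage to the metric that justifies ROI at that point. A minimal sketch following the quote; the function name and the exact metric labels are ours, while the quarter-by-quarter progression is Parida's.

```python
# Sketch of Parida's four-quarter ROI staging.
# Metric labels paraphrase the quote; the mapping is his framework.
ROI_METRIC_BY_QUARTER = {
    1: "MVP delivered (UI built, datasets organized)",
    2: "adoption (repeated employee use of the tool)",
    3: "reduced cycle times from that repeated use",
    4: "actual financial savings",
}

def roi_metric(quarter: int) -> str:
    """Return the ROI metric appropriate to the initiative's maturity,
    rather than forcing a financial metric from day one."""
    if quarter < 1:
        raise ValueError("quarters are 1-indexed")
    # Beyond Q4, financial savings remain the steady-state metric.
    return ROI_METRIC_BY_QUARTER[min(quarter, 4)]

for q in range(1, 5):
    print(f"Q{q}: {roi_metric(q)}")
```

The design choice worth noting is the clamp at quarter four: once an initiative matures, financial savings become the standing measure, but demanding them in quarter one would, on Parida's account, kill projects before adoption can prove their value.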

The reality of enterprise AI is far less glamorous than the headlines suggest. For many teams, the path forward relies less on downloading the latest model release and more on the unglamorous work of systematically connecting people, processes, data, and capital. The organizations that progress, Parida notes, are the ones that treat AI not as a mandate to be passed down but as a system to be built up, from the individual task level all the way to the balance sheet. "You have different metrics to justify your ROI at different stages. That's where enterprises should find it meaningful, rather than just saying everything is financial metrics."
