Pure Storage VP Says 2026 Will Spark Major AI Infrastructure Shifts as Enterprises Scale Up
Nirav Sheth, VP of Worldwide Sales & Customer Success Engineering at Pure Storage, discusses how AI’s transition to production will push businesses to modernize their infrastructure in 2026.

Key Points
As AI moves into production, legacy infrastructure and cost models struggle to keep up, exposing major weaknesses.
Nirav Sheth, VP of Worldwide Sales & Customer Success Engineering at Pure Storage, highlights the challenges businesses face as they scale AI.
Sheth emphasizes the need for flexible, multi-platform infrastructure strategies to meet AI demands and ensure seamless growth.

The AI honeymoon is over. What began as curiosity-driven experimentation now carries a hard expectation to deliver real ROI. As models move from the lab into production, long-ignored weaknesses in legacy infrastructure are surfacing, along with cloud cost assumptions that don’t survive at scale. The pressure is pushing business leaders to reckon with how adaptable their architecture actually is.
Nirav Sheth, VP of Worldwide Sales & Customer Success Engineering at Pure Storage, sees these converging pressures creating a clear inflection point for enterprise leaders. Drawing on experience leading large, global teams at Okta, Google, and Cisco, he points to a moment when architectural decisions can no longer be deferred or treated as an AI-only concern.
"2026 is going to be a pinnacle year because a perfect storm of outside forces will really get customers to think about their next-generation data and architectural strategy," Sheth says. For many organizations, that shift begins when AI leaves the sandbox and collides with the realities of production scale, cost, and operational accountability.
- Reality check: As AI moves into production, the question stops being what’s possible and starts being what actually holds up. "We’re shifting from an era of experiments and pilots into an era where we really need to understand the ROI," Sheth says. "What are the outcomes and the impact that these investments are going to have?" In practice, that scrutiny often exposes unexpected constraints. In one high-profile technology environment, he recalls, "the network started to give up before our storage platform did in terms of throughput," forcing a broader architectural rethink to support production-scale AI.
As organizations scale to production, they need assurance that their platform can evolve alongside them. That has led many leaders to take a more pragmatic approach, shifting focus from perfecting datasets to working with "good enough" data. An emerging priority is using production data where it resides, avoiding the cost and complexity of replication. The goal is seamless growth without a disruptive "rip-and-replace" upgrade down the road.
The economics of scaling AI can't be ignored either. While the public cloud can be ideal for pilots, it quickly becomes cost-prohibitive at full scale, Sheth notes. That financial reality is driving many companies to repatriate workloads to more efficient hybrid environments. Sheth points to New York-Presbyterian Hospital, which used AI on imaging data to improve diagnoses; by shifting much of that work from the public cloud to a hybrid setup, the hospital now saves over $2 million annually, money it can reinvest in other critical infrastructure needs.
- Building resilience: The Broadcom acquisition of VMware has significantly disrupted the market, particularly with its changes to licensing models. This shift is pushing many enterprises to adopt a more flexible, multi-platform "Lego block" approach to their infrastructure. Sheth explains, "It's not about choosing one option over another; it's about creating the right combination of solutions that fit your needs." He adds, "Customers are excited because, no matter what mix they choose, we can provide the right answer for every combination."
- Common ground: Sheth reinforces his "Lego block" analogy with a specific, multi-platform integration strategy aimed at de-risking customer investment. "We deliver agnostic support through integrations that reduce licensing costs in VMware environments," he says. "We also offer a dedicated enterprise architecture for Nutanix and our Portworx platform, which provides a common layer for data portability in container-based applications."
- Support local data: Geopolitical tensions are accelerating data sovereignty regulations, forcing multinational corporations to keep customer data within local markets. Sheth explains, "With the geopolitical climate, there's a surge of focus on data sovereignty. Government and regulatory bodies are mandating that data for customers served in certain markets has to stay in those markets." That shift makes manual data placement increasingly risky and impractical, underscoring the need for an intelligent, automated control plane that enforces data residency policies without human intervention.
Navigating today’s challenges, Sheth argues, requires more than just a flexible platform. It demands a shift from transactional vendor relationships to deep, collaborative partnerships. "Ultimately, we want to be their partners on this journey," he says. "Everything we've done has been so that we can truly be viewed as a partner to our customers and not just a supplier."
For Sheth, customer-centricity isn’t just about a high Net Promoter Score; it’s about taking actions that go beyond the typical sale. "The network conversation is a great example. We don't sell networking, but we want to provide the expertise to help our customers navigate what a next-generation networking strategy looks like. We'll help them architect it the right way, even though we have nothing to sell them in that space."




