How Data Reduction Software Shields Cloud Budgets from Hardware Price Volatility as AI Surges
As AI demand causes cloud costs to spiral, Dan Kogan, VP of Cloud + New Products at Pure Storage, explains how data reduction software cuts bills and unlocks funds for innovation.

Key Points
With AI workloads causing a hardware supply crunch and driving up cloud costs, many IT budgets are facing a runaway line item.
Dan Kogan, VP of Cloud + New Products at Pure Storage, explains that the most effective way to regain control is not through negotiation but by using smart software to fundamentally reduce data consumption.
He argues that by shrinking their data footprint, companies can reallocate savings from infrastructure "plumbing" to fund high-value innovation, such as new AI services.
“When you shrink the amount of data you’re storing, the math changes dramatically, and suddenly your cloud spend becomes predictable instead of a runaway line item,” Kogan says.

AI-driven demand is tightening hardware supply and pushing up the cost of core cloud infrastructure. As enterprise cloud bills climb, multi-year reserved instances no longer guarantee stability. Data reduction software offers a more durable lever by shrinking storage footprints before costs spiral and creating predictable economics even as hardware prices rise.
Dan Kogan, Vice President of Cloud and New Products at Pure Storage, has over two decades of experience in the analytics and cloud industries. He has a notable track record of driving business growth, including achieving 4x revenue growth in Pure Storage's cloud sector and leading the marketing efforts at Azuqua that resulted in its successful acquisition by Okta. His approach centers on a capability most enterprises overlook: intelligent data reduction.
“Storage is one of those unavoidable building blocks of every cloud application, but most customers don’t realize how much control they actually have. When you shrink the amount of data you’re storing, the math changes dramatically, and suddenly your cloud spend becomes predictable instead of a runaway line item,” says Kogan.
- From data center to silver screen: The company's expertise in data compression is the foundation of the Pure Storage Cloud offering, which shrinks data with a typical 4:1 data reduction ratio. This turns a petabyte of data into just 250 terabytes, slashing a customer’s cloud bill. In fact, the concept is so central to the company’s identity that its hardware famously had a cameo on the HBO show Silicon Valley. "Pied Piper's compression algorithm was actually based on Pure Storage's technology," Kogan says, noting that the company advised on the show.
- The million-dollar math: The data-reduction approach has delivered concrete savings for major enterprises, and its effectiveness is evident in how consultancies are starting to use it as a financial tool. In a case Kogan wrote about recently, a large public utility achieved significant cost reduction over five years by switching from Amazon Elastic Block Store to Pure Storage Cloud. "You start to see a massive drop-off in your spend, and your five-year TCO ends up being a 40% cost savings," Kogan says. In another example, a global consultancy used the software in a gain-share model to find $35 million in savings for a consumer packaged goods company on a portion of its Azure estate. "We're saving them around $15 million over five years," he notes.
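The arithmetic behind these figures is straightforward to sketch. A minimal illustration follows; the per-GB price below is an assumed placeholder, not a quoted cloud rate:

```python
# Sketch of the data-reduction math described above.
# The $/GB-month price is an illustrative assumption, not a quoted rate.

def monthly_storage_cost(logical_tb: float, price_per_gb_month: float,
                         reduction_ratio: float = 1.0) -> float:
    """Cost of storing `logical_tb` of data after applying a reduction ratio."""
    physical_gb = logical_tb * 1000 / reduction_ratio  # decimal TB -> GB
    return physical_gb * price_per_gb_month

PRICE = 0.10  # assumed $/GB-month for block storage

raw = monthly_storage_cost(1000, PRICE)                         # 1 PB, no reduction
reduced = monthly_storage_cost(1000, PRICE, reduction_ratio=4)  # 1 PB stored as 250 TB

print(f"Unreduced:   ${raw:,.0f}/month")      # $100,000/month
print(f"4:1 reduced: ${reduced:,.0f}/month")  # $25,000/month
```

At a 4:1 ratio, three quarters of the physical capacity, and its cost, simply disappears from the bill, which is why the effect compounds so visibly over a five-year TCO window.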
Beyond data reduction, the strategy addresses what some call the "performance-for-capacity trap," where enterprises overprovision expensive storage to meet peak performance demand. The Pure Storage Cloud Dedicated solution decouples performance from capacity, eliminating that compromise. The platform also provides a hedge against vendor lock-in, a relevant topic given the recent shake-up following Broadcom's acquisition of VMware. By acting as a "neutral arbiter," Kogan explains, the platform is designed to allow companies to migrate data between different VM types, which helps preserve long-term platform freedom.
But the deeper value lies in reallocating savings to fund innovation. As AI forces a rethink of infrastructure, leaders grapple with how to fund new, high-value AI services. Storage optimization unlocks savings that can be directly reallocated to fund these new initiatives, in line with modern FinOps principles that address the capex pressure Big Tech faces in the AI era.
- Plumbing for progress: Kogan frames the shift as redirecting spend from infrastructure overhead to strategic investment. "You save money on what we call 'plumbing.' It’s the not interesting, non-sexy stuff in storage. Then you can go spend that on much more interesting things like AWS's Bedrock AI services," he says. "Customers aren't actually going to spend less in the cloud overall, but they can spend less on plumbing and more on high-value initiatives."
- The software advantage: The "insane" hunger for raw materials created by AI means that customers will likely face increased costs for core building blocks for the next two to three years. "The cost of NAND, the raw material that goes into flash, has doubled over the last four months," Kogan notes, explaining that this added cost is often passed back to the customer. "Because there is no raw flash component in our software, the price stays flat even as hardware costs climb," he says. A software-only architecture insulates customers from hardware price volatility, providing predictability even as underlying cloud infrastructure prices rise.
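The pass-through dynamic can be modeled in a few lines. This is a toy projection with assumed starting prices and an assumed hardware inflation rate, chosen only to show the divergence, not to forecast actual NAND pricing:

```python
# Toy model: hardware-linked storage price vs. a flat software-defined price.
# The starting price and the annual growth rate are illustrative assumptions.

def projected_price(start: float, annual_growth: float, years: int) -> list[float]:
    """Per-unit price trajectory under compound annual growth."""
    return [round(start * (1 + annual_growth) ** y, 4) for y in range(years + 1)]

hardware_linked = projected_price(0.10, 0.25, 3)  # assumed 25%/yr hardware inflation
software_flat   = projected_price(0.10, 0.00, 3)  # flat software-only price

for year, (hw, sw) in enumerate(zip(hardware_linked, software_flat)):
    print(f"Year {year}: hardware-linked ${hw:.4f}/GB vs software ${sw:.4f}/GB")
```

The point of the sketch is the shape of the curves, not the numbers: a price tied to NAND compounds with each hardware repricing cycle, while a software-defined price stays where it started.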
While vendor prices are largely uncontrollable, enterprises still have effective options. The most effective path to predictability comes from intelligently reducing consumption, which offers far more control than simply renegotiating vendor prices. "The most effective lever executives have to pull is to find software that can ultimately reduce how much storage they purchase and consume," Kogan stresses. "That is the way to drive their costs down."