In January, tech sector stocks tumbled when a Chinese artificial intelligence startup, DeepSeek, unveiled a large language model that operates at a much lower cost than U.S. versions while maintaining similar capabilities. DeepSeek reports that it spent just $5.6 million to train its AI, compared to the more than $100 million committed by OpenAI for GPT-4. While some critics question the veracity of DeepSeek’s claims, the intensifying demand for data center construction to support AI and other technologies is undisputed.
“The news related to DeepSeek hasn’t impacted our long-term forecast of where AI workloads in the data center sector are going,” explains Andrew Batson, senior director of Americas data center research and strategy for JLL, during a webinar. “Our view is that the cost per query for AI will decline over time. You’ll hear about Jevons paradox, or the idea that as the cost of any resource — in this case AI — declines, the usage increases. That’s what we’re seeing in the AI space.”
America’s tech companies understand the growing need for this digital infrastructure and are spending big. Batson says that Amazon, Microsoft, Google and Meta have announced close to $1 trillion in capital expenditures towards data centers over the last five years. That investment, he continues, could translate into approximately 30 gigawatts of new and redeveloped data center space worldwide.
Due to the media attention AI receives, Batson says many people overemphasize how much it’s currently driving data center development. The reality is that internet traffic, enterprise processes and storage are still massive users of computing power. The sheer amount of data being created each year is increasing at a 24% compound annual rate, and AI is just one vector of that growth.
According to data from Avison Young, the global datasphere is expanding exponentially and is expected to more than double from 101 zettabytes in 2022 to 221 zettabytes by the end of 2026. The report highlights the internet of things, autonomous vehicles and AI as examples of emerging technologies with “the potential to consume orders of magnitude more data than typical applications today.”
For 2025, Batson estimates that about 15% of data center workloads are related to AI, since most people still interact with it in a limited capacity. But as AI embeds itself into other technologies, programs and services, that number could balloon to 40% by the end of the decade.
“Within the industry, there’s a wide range of forecasts,” he continues. “We don’t know how quickly AI adoption will happen across sectors, but for now, AI is an important yet small component of overall data center workloads, but we expect that to increase over time.”
Arizona’s data center sector
In recent years, Greater Phoenix has established itself as a data center darling thanks to its low propensity for natural disasters, available infrastructure and ability to offer low-latency connectivity to nearby West Coast tech hubs. A JLL market report notes that operators want to deploy quickly, making speed to market an advantage for the region.
“The [region] is seeing huge demand, with very low vacancy and most new facilities fully preleasing before completion,” the report reads.
Data from Cushman & Wakefield corroborates this claim, with the market posting an 84% prelease rate, showing that wholesale occupiers are turning to under-construction product for their needs. The report also highlights that Greater Phoenix saw 411 megawatts of colocation leasing activity in 2022, expanding to 669 megawatts by the end of 2024.
As data centers have become a more regular sight in the Valley, some municipalities have started to restrict their construction over concerns about water consumption and a perception that they don’t create enough jobs considering how much land these projects require.
During ULI Trends Day, JD Beatty, senior manager of site selection for Americas at Iron Mountain Data Centers, notes that water efficiency is also a top concern. Newer builds from Iron Mountain, he continues, are air-cooled and utilize a closed-loop system, meaning the facilities consume about the same amount of water as other similarly sized industrial buildings, or several thousand gallons daily.
“Certainly, previous data center developments were using a lot more water,” Beatty says. “But the industry has listened, and the majority of what’s being built in Arizona uses an air-cooled design because we know how critical that is for the state.”
Dan Diorio, senior director of state policy for the Data Center Coalition, adds that since the sector is the “new kid on the block” for many states, addressing these concerns and showing the value data centers bring is crucial.
Nationwide, Diorio says that every job at a data center supports another six and a half jobs elsewhere in the economy. Counting the direct, indirect and induced jobs in Arizona, he notes the sector supported 81,730 jobs in 2023, a 34% increase from 2017. During that same period, labor income increased 60% to $6.23 billion.
“Just from ‘22 to ‘23, the contribution [to Arizona’s] GDP was $21.59 billion, and tax revenues were $1.69 billion,” Diorio concludes. “The key point is that [data centers] have a substantial impact net of any incentives, and that affects the quality of life in these communities, from [helping fund] schools to hospitals.”