
The Impact of AI on Data Center Markets

From increased computing power and capacity driving hyperscale customers to ramp up their build plans over the next five years, to the use of colocation and wholesale providers to build facilities and then lease back capacity, to a rising Edge market offering better connectivity away from the main hubs, artificial intelligence (AI) is shifting the way companies conduct business in the data center space.

The increasing use of AI in data centers is significantly impacting computing and power capacity requirements, according to Ben Burgett, business unit leader of advanced technology at Gray. He says this is because AI applications, particularly deep learning models, require vast amounts of computational power and storage to analyze data. As AI workloads become more complex and data-intensive, traditional data centers may struggle to keep up with demand and will require high-performance processors.

 

“For instance, a single AI model can consume as much energy as five cars do in their entire lifetime,” Burgett points out. Specific examples of this trend can be seen in the growth of AI-related hardware deployments. “The adoption of Graphics Processing Units (GPUs) and Tensor Processing Units (TPUs) in data centers has surged to accelerate AI tasks due to their specialized capabilities. In 2019, NVIDIA, a leading GPU manufacturer, reported that over 80% of its data center revenue came from AI-related applications.”

 

These components require ample power and cooling to operate efficiently, contributing to increased space requirements within data centers. “As a result, hyperscale customers, such as large enterprises, are expanding their data centers to accommodate the growing AI workloads,” Burgett says. “These expansions are driven by the need to house more servers and other AI-specific hardware.”

 

To maintain the performance and reliability of AI infrastructure, hyperscale data centers often require geographic distribution and redundancy, leading to the construction of new facilities in various regions, he adds.

Accommodating the AI trend

 

AI workloads require specialized hardware, high-density computing, extensive power, and unique cooling capabilities that traditional data center setups simply may not be designed to meet, Burgett says. To meet the demand for AI-driven data center infrastructure, he says, colocation and wholesale data center providers have had to adapt their services in the following ways:

 

  • Infrastructure upgrades. Providers are investing in hardware upgrades, including the deployment of AI-optimized servers with GPUs and TPUs to support AI workloads.
  • Power and cooling solutions. AI workloads generate significant heat, necessitating efficient cooling systems. Data centers are implementing advanced cooling technologies such as liquid cooling to manage the increased heat density.
  • High-density racks. Providers are incorporating high-density racks to maximize computing power per square foot and efficiently utilize space.
  • Scalable solutions. Providers offer scalable solutions that allow customers to adjust their infrastructure according to their AI workload demands.
  • Strategic locations. Providers are establishing data centers in regions with suitable access to power, resources, and connectivity to cater to the needs of AI-driven applications.

 

On the Edge

 

Edge computing refers to the decentralized processing of data closer to the source, reducing latency and the need to send data to centralized data centers. Burgett says the AI boom is significantly impacting the Edge market. For example, AI applications are being used at the Edge to enable real-time data analysis and decision-making. This is crucial for applications like autonomous vehicles, industrial IoT, and smart cities.

 

Other impacts that AI is having on the Edge market include reduced latency, data privacy and security, improved connectivity, and distributed AI networks. “By processing AI workloads locally at the Edge, latency is minimized, leading to faster response times for critical AI applications,” Burgett says. He adds that some AI applications, such as facial recognition, require data processing on-premises due to privacy concerns. “Edge computing allows data to be processed locally, reducing the need to transmit sensitive information to central data centers.”
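
To make the local-processing point concrete, the sketch below shows, in plain Python, an Edge node that scores sensor readings on-site and forwards only a compact summary upstream. It is a minimal illustration under assumed names, thresholds, and data, not a description of any specific Gray or customer deployment.

    # Illustrative sketch: score readings locally at the Edge and ship only an
    # aggregate summary to the central data center. The threshold, field names,
    # and batch values are hypothetical.
    from statistics import mean

    def local_inference(reading: float, threshold: float = 0.8) -> bool:
        """Stand-in for an on-device model; flags readings above a threshold."""
        return reading > threshold

    def process_batch(readings: list[float]) -> dict:
        """Run inference at the Edge and build a compact summary for the core site."""
        flags = [local_inference(r) for r in readings]
        return {
            "count": len(readings),        # readings processed locally
            "alerts": sum(flags),          # readings that crossed the threshold
            "mean_value": round(mean(readings), 3),
        }

    if __name__ == "__main__":
        batch = [0.42, 0.91, 0.77, 0.88, 0.12]
        # Only this summary, not the raw readings, would leave the Edge site.
        print(process_batch(batch))

Because the raw readings never traverse the network, the pattern delivers both the latency and the data-privacy benefits Burgett describes.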

 

As AI applications demand extensive data transfer, the Edge market is driving the need for improved connectivity and 5G deployments in remote or underserved areas away from major data center hubs, Burgett notes. He adds that the growth of AI at the Edge is leading to the emergence of distributed AI networks, in which AI models are deployed across multiple Edge devices, sharing insights while reducing dependency on centralized data centers.

Anticipated future trends

 

When asked what specific trends or developments industry experts might anticipate in the future of AI in data centers, Burgett offered several:

 

  • AI at the Edge. The deployment of AI workloads at the Edge will continue to grow, driven by the need for low-latency processing and real-time decision-making.
  • Specialized AI hardware. AI-specific hardware designs and accelerators will become more prevalent, offering improved performance and power efficiency for AI workloads.
  • Energy-efficient data centers. Data centers will continue to focus on energy efficiency and sustainable practices, driven by environmental concerns and cost-saving incentives.
  • Federated learning. Federated learning, where AI models are trained locally on Edge devices and share insights with a central model, will gain traction, enabling privacy-preserving AI applications (see the sketch after this list).
  • AI-driven automation. AI will be increasingly used to optimize data center operations, automate resource allocation, and enhance overall efficiency.
  • Hybrid cloud and multi-cloud AI deployments. Organizations will combine on-premises data centers with public and private clouds to create a flexible and scalable AI infrastructure.
  • AI for data center security. AI will be leveraged for advanced threat detection, intrusion prevention, and security analytics in data centers.
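
As a rough illustration of the federated learning item above, the following Python sketch trains a simple one-parameter model on each device's private data and averages only the learned weights into a central model. The devices, data, learning rate, and round count are invented for demonstration and do not refer to any particular framework.

    # Minimal federated-averaging sketch: each "edge device" fits y = w * x on
    # its own local data, and only the learned weight (never the raw data) is
    # averaged into the global model. All numbers below are made up.

    def local_train(w: float, data: list[tuple[float, float]],
                    lr: float = 0.01, epochs: int = 20) -> float:
        """Run a few gradient-descent steps on one device's private data."""
        for _ in range(epochs):
            grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
            w -= lr * grad
        return w

    def federated_round(global_w: float,
                        devices: list[list[tuple[float, float]]]) -> float:
        """Each device trains locally; the server averages the returned weights."""
        local_weights = [local_train(global_w, d) for d in devices]
        return sum(local_weights) / len(local_weights)

    if __name__ == "__main__":
        # Three devices holding private samples drawn from roughly y = 3x.
        devices = [
            [(1.0, 3.1), (2.0, 5.9)],
            [(1.5, 4.4), (3.0, 9.2)],
            [(0.5, 1.6), (2.5, 7.4)],
        ]
        w = 0.0
        for _ in range(10):  # ten communication rounds
            w = federated_round(w, devices)
        print(f"global weight after training: {w:.2f}")  # settles near 3

Only the weights travel to the central server, which is the privacy-preserving property the trend above points to.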

 

Burgett notes that these predictions are speculative and may evolve based on advancements in AI technology, regulatory changes, and market dynamics.

    Some opinions expressed in this article may be those of a contributing author and not necessarily Gray.
