
AI data centre cooling patents rise 50% amid energy concerns

Thu, 23rd Oct 2025

Patent filings related to AI data centre cooling systems have increased by more than 50% since 2019, reflecting growing technology sector efforts to address the environmental impact of AI workloads, according to new data from Appleyard Lees.

The firm's fifth annual Inside Green Innovation: Progress Report analysed patent activity through the end of 2023 and reveals a significant surge in inventions targeting more efficient cooling technology for data centres.

Energy demand of AI

Appleyard Lees Partner Ean Davies said that rising global adoption of generative AI systems is increasing data centre energy demands considerably compared to conventional online activity.

"A typical, single generative AI prompt can use 10 to 100 times the energy of a simple internet (e.g. Google) search. As more users worldwide make use of AI systems, it is easy to see how power consumption can only increase.

Davies explained that this dynamic is driving the industry to develop technological solutions across both hardware and facilities management.

"And this is now prompting technology companies to innovate ways - from chip to data centre - of optimising how they manage the effects of computing power needed for AI."

Growth in cooling patents

The report highlights sustained growth over the last two decades in the volume of patents filed for server and data centre cooling applications. While there were 87 such patent filings in 2003, the figure stood at 353 in 2023, and filing volumes have remained near record levels since 2020: applications from 2021 to 2023 numbered 383, 377 and 353 respectively.

Davies noted that high energy consumption for cooling has long been an established challenge for enterprise data centres and is set to become more pronounced as AI workloads intensify.

"Even before the advent of AI, this is unsurprising, as cooling systems in enterprise data centres require more than 30% of the total energy consumed. And as the use of AI data centres increases, and workloads become more intensive, traditional air-cooling methods are likely to be insufficient."

According to the report, recent innovations in the sector include direct-to-chip cooling systems, which deliver liquid coolant to power-dense components. One patent application from NVIDIA describes a liquid-cooled "cold plate" designed to contact semiconductor devices, such as graphics processing units (GPUs) commonly deployed in AI computing. This system features a replaceable intermediary layer allowing for adaptable configurations depending on the circuit board layout.
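
To give a sense of the heat such systems must move, the short Python sketch below applies the standard coolant heat balance, Q = mass flow x specific heat x temperature rise, to a single cold plate loop. The flow rate, coolant properties and temperature rise used here are illustrative assumptions, not figures from the NVIDIA filing.

    # Illustrative heat-balance calculation for a direct-to-chip cold plate loop.
    # All figures are assumptions for illustration, not values from any patent filing.

    def cold_plate_heat_removal_watts(flow_lpm: float, delta_t_c: float,
                                      density_kg_per_l: float = 1.0,
                                      specific_heat_j_per_kg_c: float = 4186.0) -> float:
        """Heat removed (W) by a liquid loop: mass flow x specific heat x temperature rise."""
        mass_flow_kg_per_s = flow_lpm * density_kg_per_l / 60.0  # litres/min -> kg/s
        return mass_flow_kg_per_s * specific_heat_j_per_kg_c * delta_t_c

    # Assumed example: 1.5 L/min of water-based coolant warming by 10 degC across the plate
    print(f"{cold_plate_heat_removal_watts(1.5, 10.0):.0f} W")  # roughly 1,050 W

Under these assumed conditions a single modest coolant loop can carry away around a kilowatt of heat, in the range of a high-end AI GPU's power draw, which is why delivering liquid to the chip scales better than moving the equivalent volume of air.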

Microsoft has filed a separate patent application detailing microfluidic channels embedded directly into printed circuit boards, in combination with pumps to control coolant flow across the board, thereby enhancing localised cooling.
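
As a generic illustration of the control side of such a design (not Microsoft's patented approach), the sketch below shows a simple proportional loop that raises pump output as a monitored board temperature climbs above a setpoint; the setpoint, gain and duty-cycle limits are assumed values.

    # Generic proportional pump-control sketch for a liquid-cooling loop.
    # Setpoint, gain and duty-cycle limits are illustrative assumptions, not from any patent.

    SETPOINT_C = 65.0              # target board temperature (assumed)
    GAIN = 0.08                    # extra pump duty per degC above the setpoint (assumed)
    MIN_DUTY, MAX_DUTY = 0.2, 1.0

    def pump_duty(board_temp_c: float) -> float:
        """Return a pump duty cycle between MIN_DUTY and MAX_DUTY, rising with excess temperature."""
        excess = max(board_temp_c - SETPOINT_C, 0.0)
        return min(MAX_DUTY, MIN_DUTY + GAIN * excess)

    for temp in (60.0, 68.0, 75.0):
        print(f"{temp:.0f} degC -> pump duty {pump_duty(temp):.2f}")

Localised sensing and flow control of this kind allows coolant delivery to track hotspots on the board rather than running the whole loop at full power.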

Reducing AI energy requirements

Global data centre energy use was around 415 terawatt-hours (TWh) in 2024, equivalent to about 1.5% of the world's electricity consumption and on a similar scale to annual electricity usage in countries including Saudi Arabia, Iran and Mexico, as detailed in the Appleyard Lees report.
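
The 1.5% share follows from simple division against total global electricity consumption, which is roughly 28,000 TWh a year; that world total is an approximate outside figure, not taken from the report.

    # Rough check of the reported share; the world total is an approximate assumption.
    data_centre_twh = 415.0
    world_electricity_twh = 28_000.0  # approximate annual global consumption (assumed)
    print(f"Share: {data_centre_twh / world_electricity_twh:.1%}")  # about 1.5%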

In response, technology companies are seeking patent protection for systems that lower the energy requirements of AI systems and data centres. IBM, for example, is working on a workload allocation engine designed to assign computing tasks to servers that would produce the fewest carbon emissions. The company has also submitted patents referencing energy-aware and carbon-aware operational strategies for cloud computing.
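
As a minimal sketch of the idea of carbon-aware placement (not a description of IBM's patented engine), the Python below assigns a job to whichever candidate server is expected to emit the least CO2, given an assumed energy cost for the job and assumed grid carbon intensities per site.

    # Minimal carbon-aware placement sketch; servers and figures are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Server:
        name: str
        grid_gco2_per_kwh: float   # carbon intensity of the local grid
        efficiency_factor: float   # relative energy needed per unit of work (lower is better)

    def estimated_emissions_g(server: Server, job_kwh: float) -> float:
        """Estimated grams of CO2 emitted by running the job on this server."""
        return job_kwh * server.efficiency_factor * server.grid_gco2_per_kwh

    def pick_server(servers: list[Server], job_kwh: float) -> Server:
        """Assign the job to the server with the lowest estimated emissions."""
        return min(servers, key=lambda s: estimated_emissions_g(s, job_kwh))

    servers = [
        Server("us-east", grid_gco2_per_kwh=380.0, efficiency_factor=1.0),
        Server("nordics", grid_gco2_per_kwh=40.0, efficiency_factor=1.1),
        Server("asia-south", grid_gco2_per_kwh=600.0, efficiency_factor=0.9),
    ]
    print(pick_server(servers, job_kwh=12.0).name)  # "nordics" under these assumed figures

A production scheduler would also weigh latency, data locality and capacity, but the core trade-off is the same: route flexible work to the cleanest available electricity.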

Similarly, Google is seeking to patent an AI accelerator designed to boost computational output and reduce data movement, thereby lowering power consumption in deep learning inference applications.

The report points out that while patent filings illuminate a significant cross-section of industry innovation, not all breakthroughs are necessarily disclosed in public patent literature, as some businesses may choose to keep operational optimisations as trade secrets.

Outlook

Davies commented that the push for greater energy efficiency in AI hardware and infrastructure will remain a key trend as adoption accelerates within the technology sector and more broadly.

"As AI adoption accelerates across industries, the demand for more energy-efficient AI technologies has become a critical priority, both to manage operational costs and to reduce environmental impact.
"Therefore, we anticipate a continued increase in innovation in the coming years, focused on minimising energy consumption of AI hardware and data centres, such as advances in cooling technologies, workload optimisation and low-power AI architectures."