Adventures in data centre power and cooling

By Chris Adams, President & COO, Park Place Technologies
Monday, 1st October 2018. Posted by Phil Alsop

We data centre managers and engineers don't generally find ourselves the centre of attention at cocktail parties the way fighter pilots, mountaineers, or storm chasers do. Nothing draws a crowd like adventure stories, but tales in technology lean more toward supporting roles or humorous interludes, as when an ill-fated weasel brought down the Large Hadron Collider at CERN.

Our time as adventure heroes may be coming, however. The immense energy demands of data centres, which by some projections could devour one-fifth of the world's electricity by 2025, are sending operators on a quest for solutions. And their destinations, both real and conceptual, are far afield, with a few even entering that final frontier.

The call of the wild

There used to be pretty reliable "rules" when it came to siting a data centre. Most enterprises would put a facility on their premises or near their headquarters, which tended to cluster them in metropolitan areas. Colocation and early cloud service providers (CSPs) also selected urban zones for proximity to customers and telecommunications hubs.

Now, times are changing, and data centres are moving out. Real estate costs, expansion concerns, and even threats like terrorism are driving some data centres to the peripheries. Ireland, for one, has found itself a tech darling for this reason.

Then there are the data centres going further afield to capture power and cooling advantages that aren't available in the likes of London or New York. One target, in the U.S. in particular, is towns with underutilised power infrastructure, much of which exists in post-manufacturing regions. Interestingly, data centres may be the spark for much-needed reinvestment in these communities.

More ambitious are the data centre operators following the simple logic of cooling: The hotter the climate, the more energy the air con will consume. Ergo, finding naturally colder spots on the planet can save money.
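
One crude way to quantify that logic is with the ideal (Carnot) limit on chiller efficiency, which falls as the gap between outdoor air and the chilled-supply setpoint widens. The sketch below is a back-of-the-envelope illustration, not a model of any real facility; the 18°C setpoint, the 0.5 "fraction of Carnot" factor, and the 1 MW IT load are all assumed figures.

```python
# Crude, illustrative model: electrical power needed to reject heat
# falls as the outdoor temperature drops toward the supply setpoint.
def cooling_power_kw(it_load_kw: float, t_outdoor_c: float,
                     t_supply_c: float = 18.0,
                     carnot_fraction: float = 0.5) -> float:
    """Estimated chiller power (kW) to reject it_load_kw of heat."""
    if t_outdoor_c <= t_supply_c:
        return 0.0  # outside air is cold enough: "free" cooling (fans aside)
    t_cold = t_supply_c + 273.15   # chilled-supply setpoint, in kelvin
    t_hot = t_outdoor_c + 273.15   # heat-rejection temperature, in kelvin
    cop = carnot_fraction * t_cold / (t_hot - t_cold)  # fraction of Carnot COP
    return it_load_kw / cop

for t in (10, 25, 40):  # frigid, temperate, and desert-summer outdoor temps
    print(f"{t:>2} degC outdoors -> ~{cooling_power_kw(1000.0, t):5.1f} kW "
          f"to cool a 1 MW IT load")
```

Real-world chillers fare considerably worse than this idealised curve, but the trend is the point: the same IT load that cools for nearly nothing in a frigid climate demands serious electrical overhead in a hot one.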

Iceland is getting a lot of looks with its cool-to-frigid temperatures and the added bonus of an energy infrastructure that is 100% renewable and primarily geothermal. Admittedly, the sometimes-high latency and active volcanoes have turned some people off, but there's always Finland's Arctic reaches, the Norwegian fjords, or the Swiss Alps to consider.

In North America, Canada is promoting itself as a pretty chill place for data centres, and the Rocky Mountain region of the U.S. is trying that pitch as well. At the same time, Oregon and Washington State are garnering interest with a combination of temperate clime and plentiful hydropower.

Not to be outdone are some hot but sunny locales. An Australian operator, for instance, has just announced the country's first solar-powered "behind the grid" data centre, giving it direct access to wholesale electricity prices. These types of projects could make deserts and areas between the Tropics of Capricorn and Cancer appealing.

Then there was the recent news about Microsoft's undersea data centre, sunk off the Orkney Islands. Not only do facilities under the ocean have cooling advantages in and of themselves, but such experiments are bringing us closer to data centres in space, where our hardware will someday enjoy low temperatures and plenty of sun beyond our atmosphere. If only they needed engineers along for the ride, our space travel dreams could finally come true.

Air and back again

One thing about quests: the stories are mostly about the outbound journey. Take The Hobbit, or There and Back Again: over 90,000 words on reaching the destination and a quick summary of the return.

In contrast, we in IT spend time coming back around. We went from centralised computing to distributed client-server models, and then back to more centralised clouds for a decade. Now, we're pushing out to the edge. And when it comes to keeping machine temperatures within the operating zone, we started with liquid cooling, went to air, and now are headed back again.

Liquid cooling had fallen out of style, even for mainframes; Big Blue took a 15-year hiatus until the introduction of a water-cooled zEnterprise machine in 2010. But now liquid is all the rage, and the reason boils down to Moore's law, which posits that the number of transistors on a chip, and with it computing muscle, will double roughly every two years.
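
That doubling compounds quickly, which is easy to see with a line of arithmetic: growth after n years is 2^(n/2).

```python
# Doubling every two years: growth factor after n years is 2 ** (n / 2).
for years in (2, 6, 10):
    print(f"{years:>2} years -> {2 ** (years / 2):.0f}x the transistors")
# 2 years -> 2x, 6 years -> 8x, 10 years -> 32x
```

More transistors packed into the same silicon means more heat concentrated in the same space, and that is where the cooling story turns.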

Annual CPU performance growth has slowed, but enterprise needs continue to expand. Machine learning has kicked processing requirements into overdrive. As a result, accelerator processors, mostly GPUs, are hot, literally. Uptake is strong, but the higher thermal density requires intense cooling, often more than air systems can deliver.
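
The limit comes down to simple sensible-heat arithmetic: the airflow required scales with the heat load divided by air's heat capacity and the allowable temperature rise. A quick sketch of that relationship, where the rack power tiers and the 10 K temperature delta are illustrative assumptions:

```python
# Sensible-heat balance: P = rho * V_dot * cp * dT
# => required volumetric airflow V_dot = P / (rho * cp * dT)

RHO_AIR = 1.2    # kg/m^3, density of air near sea level
CP_AIR = 1005.0  # J/(kg*K), specific heat of air

def required_airflow_m3s(heat_load_w: float, delta_t_k: float) -> float:
    """Airflow (m^3/s) needed to carry heat_load_w at a delta_t_k temp rise."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

CFM_PER_M3S = 2118.88  # 1 m^3/s is roughly 2118.88 cubic feet per minute

for rack_kw in (5, 15, 40):  # legacy, dense, and GPU-era racks (assumed tiers)
    flow = required_airflow_m3s(rack_kw * 1000, delta_t_k=10.0)
    print(f"{rack_kw:>2} kW rack -> {flow:4.1f} m^3/s (~{flow * CFM_PER_M3S:5.0f} CFM)")
```

A 5 kW rack needs under 900 CFM, well within reach of ordinary fans; a 40 kW accelerator rack needs roughly 7,000 CFM, which is where moving heat with liquid starts to beat moving it with air.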

Google recently went public about its data centre retrofits. For years, the hyperscale provider had designed facilities to enable liquid cooling but hadn't found it necessary, because spreading out over more floor space was always cheaper and easier. Its third generation of Tensor Processing Units made the technology indispensable.

Compared with air cooling, liquid infrastructure does require some adjustment from an engineering and maintenance perspective. Initial fears about flooding sensitive equipment are generally overstated: cold plates and heat exchangers typically keep water-based coolants away from the electronics, and immersion systems rely on non-conductive dielectric fluids. Nonetheless, dealing with rear-door heat exchangers and direct-to-chip cooling systems can involve disconnecting dripless quick-connect couplings, while those fixing or replacing immersion-cooled components will get their hands a bit dirtier.

On the upside, liquid cooling can actually make data centres more comfortable for human habitation. Engineers accustomed to the noisy fans and extreme temperature differentials between the hot and cold aisles of the air-cooled centre may be pleasantly surprised after liquid conversion.

The unmanned mission to the edge

One of the challenges in Microsoft's undersea data centres, and one we must confront in sending computing pods to space, is automation. Those facilities will be lights-out and hands-off, so remote monitoring and control will be essential. More than such futuristic aspirations, however, it is edge computing that may serve as the primary impetus for the rapid development of facilities command-and-control automation.

In an edge scenario, data won't always migrate to centralised data centres. Much of it will eventually make the trek, as archival storage, AI-based analytics, and other functions can be most efficiently performed there. But a lot of raw data, processing capacity, and even storage could be under-utilised or wasted simply because there won't be the manpower available to manage the vast amounts of data created by edge computing.

Industry commentators looking at edge prospects estimate that the number of remote data processing locations per centralised data centre facility is likely to run to ratios of at least 50:1, and possibly several orders of magnitude higher.

Imagine managing more than 1,000 outposts for a single enterprise. It can't be done if the power and cooling systems, not to mention the IT hardware, require a lot of manual checking and maintenance. There's probably not enough HVAC, electrician, or tech talent on Earth to staff all the locations that are soon to be built, and travelling engineers only mitigate the problem when an enterprise runs dozens, not thousands, of micro-data centres.

The only option will be to enhance automation to the extent that edge computing facilities can operate unattended for long periods. Engineering interventions would likely be limited to disaster-level downtime issues and major upgrades. Modular solutions start looking very attractive. Companies may purchase edge-ready, plug-and-play pods with power, cooling, and automated monitoring and control built in, or providers might operate edge facilities on their behalf, turning to much the same solutions to manage such offerings at scale.
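
In practice, "unattended" might look like a telemetry loop that watches power and cooling readings and only pages a human when a threshold is breached. The following is a minimal sketch of the idea; every sensor name and threshold here is hypothetical, and a real pod would poll its building-management and baseboard-management controllers rather than simulated readings.

```python
import time
from dataclasses import dataclass

# Hypothetical thresholds for an unattended micro-data centre pod;
# real deployments would tune these per site and per equipment vendor.
THRESHOLDS = {
    "inlet_temp_c": 32.0,     # max allowable server inlet temperature
    "coolant_flow_lpm": 8.0,  # min coolant flow, litres per minute
    "ups_charge_pct": 40.0,   # min UPS battery charge
}

@dataclass
class Reading:
    inlet_temp_c: float
    coolant_flow_lpm: float
    ups_charge_pct: float

def check(reading: Reading) -> list[str]:
    """Return alerts for any breached thresholds (empty list = healthy)."""
    alerts = []
    if reading.inlet_temp_c > THRESHOLDS["inlet_temp_c"]:
        alerts.append(f"inlet temp high: {reading.inlet_temp_c} C")
    if reading.coolant_flow_lpm < THRESHOLDS["coolant_flow_lpm"]:
        alerts.append(f"coolant flow low: {reading.coolant_flow_lpm} L/min")
    if reading.ups_charge_pct < THRESHOLDS["ups_charge_pct"]:
        alerts.append(f"UPS charge low: {reading.ups_charge_pct}%")
    return alerts

# Simulate one healthy sample and one failing sample; a production loop
# would raise tickets or dispatch an engineer instead of printing.
for sample in (Reading(24.0, 12.5, 95.0), Reading(35.5, 6.1, 88.0)):
    alerts = check(sample)
    print("OK" if not alerts else "; ".join(alerts))
    time.sleep(0.1)  # stand-in for the polling interval
```

The design choice that matters is the escalation policy: thresholds and automated remediation handle the routine, and scarce human engineers are reserved for the exceptions.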

AI-based predictive maintenance technology, including Park Place Technologies' ParkView, is a step down this road. But there are many innovation iterations left on our way to smart grids, smart cities, and life among the Jetsons.

In other words, the adventure continues.