Rising workloads, tougher sustainability regulations, and cost pressures will increasingly squeeze data center operators in 2024 and force tradeoffs between business expansion and infrastructure reliability.

Credit: Gorodenkoff / Shutterstock

Five key data center trends will impact operators in the coming year, according to the Uptime Institute. The research firm highlighted the top trends – which include tougher sustainability regulations, supply chain impacts from AI, and liquid cooling adoption – in a report released earlier this month.

Together these conditions are not only driving innovation but also increasing infrastructure complexity, the institute’s researchers said. Data centers will have to balance reliability, efficiency, emerging technologies, and business expansion goals.

“A common thread among these trends is that most of these challenges facing the industry are actually a result of its own ongoing success, seen through continued growth in demand,” said Douglas Donnellan, a research analyst at Uptime Institute, at a recent webinar.

Key to navigating these challenges are tech approaches such as infrastructure automation, standardized designs, economies of scale, cross-domain skill building, and data analytics.

Here are Uptime’s top five trends for 2024:

1. Sustainability regulations get tougher

According to the institute, the top prediction for this year is that the data center sector will continue to use more power, and emit more carbon, as its footprint rapidly grows. As a result, publicly stated net-zero goals and other commitments will become harder and more expensive to maintain. Some companies may have to backtrack on their commitments or increase their investments in energy efficiency to meet tighter reporting and accountability regulations.

“It’s going to be a challenging period for the sector over the next five years,” said Andy Lawrence, the institute’s executive director of research. He pointed to new regulations and stricter reporting requirements in Europe, the UK, China, and other geographies.

“A lot of different countries now are beginning to apply some quite strict laws,” Lawrence said. “In the EU in particular, the Energy Efficiency Directive is going to require organizations to report energy and carbon use and a lot more else besides – things like water, workloads, and megawatt use and so on.”

Even when the targets are voluntary and set by the companies themselves, the fact that they are published alongside financial reports may make them legally binding, he added. “Companies will have to close that gap,” he said. “I think a lot of companies are going to go rather quiet and possibly walk back on some of their commitments. But, hopefully, there will be some serious attention and investment paid to the issue of efficiency and a lot of new innovation will happen.”

2. AI demand impacts supply chains more than facilities

While artificial intelligence innovations like ChatGPT have created huge excitement in the industry since late 2022, their effect on most data center operators in the near term will be limited.

“AI’s impact will be widely felt throughout the industry,” said Uptime Institute research analyst Jacquie Davis. “But mostly that will be indirectly through straining equipment supply chains, pushing server chip power levels, and making operators rethink their resiliency posture.”

In 2024, the major hyperscalers will be hosting the most demanding models, she said. This will prevent the runaway facility power and cooling requirements that many operators fear. And when data center operators do get AI workloads, they’ll most likely distribute them throughout the data center to minimize the impact on power and cooling systems.

“But even operators who have little to no AI deployment are still likely to find themselves in competition for other facility equipment,” she added.
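Davis’s point about spreading AI deployments rather than concentrating them can be made concrete with a toy example. The sketch below is not from the Uptime report and does not reflect any operator’s actual tooling; the rack budgets and server power draws are hypothetical. It simply shows how a greedy “most headroom first” placement keeps any single rack’s power and cooling load from spiking.

```python
# Illustrative only: a toy greedy heuristic for spreading high-power AI servers
# across racks by remaining power headroom. Rack capacities and server draws
# are hypothetical example values, not figures from the Uptime report.

def place_servers(racks, servers):
    """Assign each server to the rack with the most remaining power headroom.

    racks: dict of rack name -> power capacity in kW
    servers: list of (server name, power draw in kW) tuples
    Returns a dict of rack name -> list of assigned server names.
    """
    headroom = dict(racks)                      # remaining capacity per rack
    placement = {name: [] for name in racks}

    # Place the largest loads first so they land where headroom is greatest.
    for server, draw in sorted(servers, key=lambda s: s[1], reverse=True):
        rack = max(headroom, key=headroom.get)  # rack with most headroom
        if headroom[rack] < draw:
            raise ValueError(f"No rack can host {server} ({draw} kW)")
        headroom[rack] -= draw
        placement[rack].append(server)
    return placement


if __name__ == "__main__":
    racks = {"rack-A": 30.0, "rack-B": 30.0, "rack-C": 30.0}   # kW budgets (assumed)
    ai_servers = [("gpu-1", 10.4), ("gpu-2", 10.4), ("gpu-3", 10.4),
                  ("gpu-4", 10.4), ("web-1", 0.8), ("web-2", 0.8)]
    for rack, hosted in place_servers(racks, ai_servers).items():
        print(rack, hosted)
```

In practice, placement also has to respect network locality, redundancy zones, and how cooling is distributed across the hall, which this sketch ignores.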
3. Data center software gets smarter

New data center infrastructure management tools using machine learning have emerged that can optimize efficiency, maintenance, and capacity utilization. But traditional DCIM and BMS vendors have been slow to evolve, said research analyst Max Smolaks. Instead, a group of startups has appeared that leverage sensor, monitoring, and other facilities data to improve cooling, predict problems, reduce waste, and discover stranded capacity. They include Phaidra, Coolgradient, QIO, TycheTools, Vigilent, and EkkoSense.

However, most data centers have been slow to adopt these technologies for a variety of reasons, including implementation risks, the need for more data collection and network connectivity, and cybersecurity concerns due to the bigger attack surface.

On a scale of one to five, with five being fully autonomous, most data centers are now at level three, Smolaks said. “But we’re finally seeing the beginning of a shift towards level four – and then level five.”

At level three, data centers use software to track physical equipment characteristics, location, and operational status. At level four, machine learning starts to be used for prediction, service management, and making recommendations about optimizing the data center for power consumption or cooling. At level five, AI is used to automatically manage data center operations.

Moving up the maturity scale will require data center operators to learn how to collect, manage, and analyze data. “You might also need to hire analysts and data scientists,” said Smolaks.

However, AI can also help alleviate some staffing issues. Many procedures haven’t been documented, and data centers rely on engineers to know what handle to turn and what toggle to tweak, he said. “But there aren’t enough engineers,” he said. “We’re facing a staffing shortage – and also a silver tsunami. The most experienced people are going to leave.” AI can be used to codify their knowledge in software before they leave, he said.

4. Direct liquid cooling won’t solve efficiency challenges

Direct liquid cooling is exciting and has very promising physics, said Davis, and expectations for data center adoption are high. “But there are some challenges yet to be solved,” she added.

For example, one downside to liquid cooling is the impact on resiliency, she said. If there’s an outage that affects a liquid cooling system, temperatures may rise more quickly than with an air-cooled system. “Some cold plate systems might offer less than a minute of ride-through time due to the small amount of coolant that they contain.”

In addition, the efficiency of liquid cooling is limited by the gradual pace of adoption, she said. Many data centers have hybrid cooling environments or shared infrastructure, and there are trade-offs between cooling performance, capacity, and interoperability. As a result, liquid-cooling and air-cooling systems will probably coexist in data centers for several years, she said, which limits opportunities for optimization.

“The net impact of direct liquid cooling is going to be an investment in performance rather than an investment in efficiency,” she said.
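Davis’s sub-minute ride-through figure can be sanity-checked with basic thermal arithmetic. The numbers below (coolant volume, rack heat load, allowable temperature rise) are illustrative assumptions, not values from the Uptime report; the point is only that a small coolant volume buys very little time once pumps or chillers stop rejecting heat.

```python
# Back-of-envelope estimate of cold-plate ride-through time.
# All inputs are illustrative assumptions, not figures from the Uptime report.

COOLANT_VOLUME_L = 50.0    # liters of coolant in the loop (assumed)
COOLANT_DENSITY = 1.0      # kg per liter, roughly water
SPECIFIC_HEAT = 4186.0     # J/(kg*K) for water
RACK_POWER_W = 80_000.0    # 80 kW rack heat load (assumed)
ALLOWED_RISE_K = 15.0      # allowable coolant temperature rise (assumed)

# Energy the coolant can absorb before exceeding the allowed rise: m * c * dT
coolant_mass_kg = COOLANT_VOLUME_L * COOLANT_DENSITY
absorbable_energy_j = coolant_mass_kg * SPECIFIC_HEAT * ALLOWED_RISE_K

# With heat rejection stopped, the rack keeps dumping RACK_POWER_W into the loop.
ride_through_s = absorbable_energy_j / RACK_POWER_W

print(f"Ride-through time: {ride_through_s:.0f} seconds")   # ~39 seconds
```

Even doubling the coolant volume or halving the load in this rough model only stretches the window to a couple of minutes, which is consistent with Davis’s point that resiliency has to be designed for rather than assumed.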
5. Hyperscale campuses emerge

Massive new hyperscale colocation campuses will emerge to meet surging compute demand, spanning millions of square meters and offering gigawatts of power capacity, Uptime predicts. They will support different types of tenants, including hyperscale cloud providers and individual enterprises looking for colocation space, according to research analyst John O’Brien.

The biggest of these will be in North America, but other hyperscale campuses are being developed in the Asia-Pacific region and elsewhere. “There’s massive scale in and around Northern Virginia,” he said. “The level of investment dwarfs that of other regions – we’re looking at $45 billion being spent there, four times more than the next nearest region, which is Asia Pacific.”

The biggest challenges these facilities face are securing power, he said, and ensuring the use of renewable or carbon-free energy. Other issues that affect location include connectivity, proximity to customers and other data centers, taxes and regulations, and the labor pool.

When these new hubs do arise, they’ll be networked globally and have new levels of automation, efficiency, and power, O’Brien said.