Before each dinner, I ask my son to consider everything that went into preparing his meal. We start with Mom or Dad cooking, which leads to working hard to provide healthy food, and so we give thanks for everyone and everything that makes working and living well possible. Then we acknowledge the grocery stores, farmers, truck drivers, mobile machines, water treatment facilities, engineers, chemists — you name it, we give thanks for it. As we go through this exercise, it becomes clear how interconnected one simple meal is to just about everything. It’s like playing Six Degrees of Kevin Bacon, except we get to eat the bacon.
We can do the same before watching a YouTube video or asking ChatGPT a question and consider how much digital technology, and therefore electricity, we use in daily living. Electricity powers the data centers that make most things in modern life possible. Thousands of data centers worldwide constantly receive, compute, and send information, which demands a lot of electricity.
The International Energy Agency (IEA) released its “Electricity 2024” report in January, sparking debate about data centers straining global grids. The IEA estimates that global data center electricity consumption was 460 terawatt-hours (TWh) in 2022 and could exceed 1,000 TWh in 2026, roughly the amount of electricity Japan uses in a year. The agency projects that data centers will account for one-third of the additional electricity demand in the U.S. through 2026 and that EU data center consumption will grow by about 30%, with Ireland and Denmark accounting for 20% of that additional demand.
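To put those projections in perspective, here is a quick back-of-envelope sketch, using only the IEA figures quoted above, of the annual growth rate they imply:

```python
# Back-of-envelope check of the IEA projection: what annual growth rate
# turns 460 TWh (2022) into 1,000+ TWh (2026)?
start_twh, end_twh = 460, 1_000
years = 2026 - 2022

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")  # ~21.4% per year

# For scale: Japan's annual electricity consumption is roughly 1,000 TWh,
# which is why the IEA uses it as the comparison point.
japan_twh = 1_000
print(f"2026 projection vs. Japan: {end_twh / japan_twh:.0%}")
```

In other words, the projection implies data center demand growing by more than a fifth every year, far outpacing overall grid growth.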
Artificial intelligence (AI), 5G networks, the Internet of Things (IoT), and cloud-based services are pegged as primary contributors. At Design World, our team increasingly reports on AI-embedded hardware and software with ChatGPT-esque functionalities and machine learning (ML) capabilities that improve efficiency, productivity, and safety. Thus, as more manufacturers adopt advanced digital solutions, it’s wise to consider how local improvements impact broader energy consumption.
For instance, AI requires significant computing resources, with large language models (LLMs) such as ChatGPT accounting for much of the load. However, as with any new technology, that demand can be seen as a launching pad for improvement. In fact, Google researchers found that by following just four best practices, they could “reduce energy by 100x and emissions by 1000x” for ML workloads and “keep ML under 15% of Google’s total energy use.” Yet the IEA predicts that if Google switched to an entirely AI-powered search tool, it could consume nearly 10 times the electricity of current searches, exacerbating the issue.
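The arithmetic behind that “nearly 10 times” figure is worth sketching out. The per-query energy values below are the ones cited in the IEA’s Electricity 2024 report; the daily search volume is a commonly cited estimate, not an IEA number, so treat the result as an order-of-magnitude illustration:

```python
# Rough per-query comparison behind the IEA's "nearly 10x" figure.
# Per-query energy values as cited in the IEA Electricity 2024 report;
# the search volume is an assumed, commonly cited estimate.
wh_per_search = 0.3      # typical Google search (IEA-cited figure)
wh_per_llm_query = 2.9   # ChatGPT-style request (IEA-cited figure)
searches_per_day = 9e9   # assumed daily search volume

ratio = wh_per_llm_query / wh_per_search
extra_twh_per_year = ((wh_per_llm_query - wh_per_search)
                      * searches_per_day * 365 / 1e12)
print(f"Per-query ratio: {ratio:.1f}x")  # ~9.7x
print(f"Extra demand if every search ran an LLM: {extra_twh_per_year:.1f} TWh/yr")
```

Even under these rough assumptions, routing every search through an LLM would add several terawatt-hours of annual demand, which is why efficiency best practices matter at this scale.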
With net-zero targets set for 2050 looming, some say that energy problems are just being pushed around the supply chain and never really solved. Others say this double-edged sword will dull as technology and infrastructure advance. Though today’s AI is an energy hog, it’s also an asset for advancing energy solutions, suggesting that the very thing causing today’s ills could eventually be its own antidote. Until then, perhaps we can lessen the strain by optimizing our digital technologies, weighing how local efficiency gains affect broader energy demands, and embracing AI responsibly as a potential ally for sustainability.