As technological advancements in Generative AI (GenAI) accelerate, the increasing energy demands associated with powering these technologies raise important sustainability concerns. With greater computing power comes higher energy usage, leading to significant environmental impacts. Companies leveraging GenAI must focus on energy efficiency to balance the benefits of innovation with the urgent need to mitigate climate change.
Global technological progress is advancing at an unprecedented pace, but productivity growth remains stagnant. In the U.S., productivity has grown at a sluggish rate of just 1.4% since 2005, despite major technological innovations. Traditionally, economic growth has been driven by population expansion, productivity increases, and debt accumulation. With population growth and debt expected to remain stagnant, the burden of future economic growth will fall primarily on improving productivity. This puts GenAI and its potential for transformative productivity gains at the center of the conversation, but at what environmental cost?
Jeff Wong, Global Chief Innovation Officer of Ernst & Young, recently highlighted the potential of GenAI to drive major technological change, but he also noted the significant environmental implications that come with its adoption. Wong points out that a single GenAI query requires six to 10 times more energy than a traditional internet search, an increase driven by the complexity of the large language models (LLMs) that power GenAI.
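To put the six-to-10-times figure in perspective, a back-of-envelope estimate helps. In the sketch below, the 0.3 Wh baseline for a traditional search and the one-billion-queries-per-day volume are illustrative assumptions, not figures from the sources above.

```python
# Back-of-envelope estimate of the per-query energy gap. The 0.3 Wh
# baseline for a traditional search and the daily query volume are
# assumptions for illustration, not figures from the article's sources.

SEARCH_WH = 0.3                    # assumed energy per traditional search (Wh)
GENAI_MULTIPLIER = (6, 10)         # range cited by Wong

low, high = (SEARCH_WH * m for m in GENAI_MULTIPLIER)
print(f"Estimated GenAI query energy: {low:.1f}-{high:.1f} Wh")

queries_per_day = 1_000_000_000    # hypothetical daily query volume
extra_mwh = (high - SEARCH_WH) * queries_per_day / 1_000_000  # Wh -> MWh
print(f"Worst-case extra demand vs. search: {extra_mwh:,.0f} MWh/day")
```

Even under these rough assumptions, the per-query difference compounds into thousands of megawatt-hours per day at web scale.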
The Climate School at Columbia University has raised concerns about whether AI's contribution to decarbonization outweighs its growing energy consumption. Currently, data centers that support GenAI queries account for 2.5% to 3.7% of global greenhouse gas (GHG) emissions, exceeding the emissions of the entire aviation industry. This footprint is expected to grow as AI models become more advanced and more energy is consumed in processing increasingly complex queries.
PwC’s 2023 Emerging Technology Survey projects that computing performance will increase four-fold by 2028, while processing workloads surge by 50 times as models grow to billions of parameters. This growth in processing needs, coupled with escalating demand for AI solutions, is set to drive data-center energy consumption sharply higher; data centers in Europe alone, for instance, are projected to see a 28% rise in energy consumption.
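The gap between those two projections can be made concrete with a quick calculation: if hardware performance improves four-fold while workloads grow 50-fold, net processing demand still rises roughly 12.5-fold absent other efficiency gains.

```python
# The projections above imply that, absent further efficiency gains,
# processing demand outpaces hardware improvement by a wide margin.

performance_gain = 4     # projected performance improvement by 2028 (PwC)
workload_growth = 50     # projected growth in processing workloads (PwC)

net_multiplier = workload_growth / performance_gain
print(f"Net growth in processing demand: {net_multiplier:.1f}x")  # 12.5x
```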
Reducing the carbon footprint of GenAI requires concerted efforts in four key areas:
Sparse Mixture-of-Experts (MoE) Architectures: One promising approach is the sparse MoE architecture, which increases the number of model parameters without proportionally increasing computational load: only a small subset of "expert" subnetworks is activated for each input. This improves performance while keeping energy costs lower.
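As a rough illustration of how sparse gating decouples parameter count from compute, here is a minimal top-k MoE forward pass. The shapes, the softmax gate, and the random weights are illustrative only, not any production model's design.

```python
# Minimal sketch of sparse top-k gating, the core of a Mixture-of-Experts
# layer: only k of E experts run per token, so compute scales with k
# rather than with the total parameter count.
import numpy as np

rng = np.random.default_rng(0)
E, d, k = 8, 16, 2                      # experts, hidden dim, experts used per token

gate_w = rng.normal(size=(d, E))        # gating network weights (illustrative)
experts = [rng.normal(size=(d, d)) for _ in range(E)]  # one weight matrix per expert

def moe_forward(x):
    """Route input x through only its top-k experts."""
    logits = x @ gate_w                 # score every expert...
    top = np.argsort(logits)[-k:]       # ...but keep only the k best
    weights = np.exp(logits[top])
    weights /= weights.sum()            # softmax over the selected experts only
    # Only k expert matmuls execute here, no matter how large E grows.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

y = moe_forward(rng.normal(size=d))
print(y.shape)                          # (16,)
```

Doubling the number of experts here would double the parameter count but leave the per-token compute unchanged, which is the efficiency argument made above.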
Specialized Hardware: A shift toward more energy-efficient hardware, such as Tensor Processing Units (TPUs), is crucial for minimizing energy use in AI processing. Studies show that TPUs are significantly more energy-efficient than traditional GPUs (Jouppi et al., 2017). Specialized hardware like TPUs offers optimized performance for machine learning tasks, using less power and emitting fewer GHGs.
Optimized Data Centers: Data centers focused on machine learning have made significant strides in optimizing infrastructure to reduce overheads such as cooling costs. By improving Power Usage Effectiveness (PUE), they achieve greater energy efficiency. For example, cooling systems are essential to manage the immense heat generated during GenAI processing. Water-based cooling systems are highly efficient in reducing energy consumption and minimizing environmental impact.
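PUE itself is a simple ratio: total facility energy divided by the energy delivered to IT equipment, with 1.0 as the theoretical ideal. A small sketch, using made-up numbers rather than measurements from any real facility:

```python
# Power Usage Effectiveness (PUE): total facility energy divided by the
# energy delivered to IT equipment. 1.0 is the theoretical ideal; the
# figures below are made up for illustration.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

legacy = pue(total_facility_kwh=1800, it_equipment_kwh=1000)     # heavy cooling overhead
optimized = pue(total_facility_kwh=1100, it_equipment_kwh=1000)  # e.g. efficient water cooling

print(f"Legacy PUE: {legacy:.2f}, optimized PUE: {optimized:.2f}")
# At the same IT load, the optimized facility draws ~39% less total energy.
```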
Renewable Energy Integration: The transition to renewable energy sources is another crucial step in reducing the carbon footprint of GenAI. Many data centers are increasingly relying on solar and wind power to meet their growing energy demands, reducing their operational carbon emissions (Acun et al., 2023). Despite these efforts, the proliferation of AI-specific hardware in data centers risks widening the gap between operational and embodied carbon footprints (Wu et al., 2022).
In addition to these technological innovations, AI software providers can themselves develop innovative ways to optimize AI in order to cut carbon. RSe Global, through its AI-driven solutions, prioritizes efficiency in AI processing to help reduce carbon footprints. By employing techniques such as efficient data retrieval and streamlined processing models, RSe can minimize unnecessary computational loads for its clients. For instance, RSe Global breaks complex data into smaller, relevant chunks before passing it to the AI for processing, reducing energy consumption and avoiding redundant computation while maintaining model accuracy and performance.
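The chunk-then-filter pattern described above can be sketched as follows. The keyword-overlap scoring is a hypothetical stand-in for illustration; it is not RSe Global's actual (non-public) method.

```python
# Hedged sketch of "chunk, then filter": split a document into chunks and
# pass only query-relevant chunks to the model, cutting the tokens it must
# process. The keyword-overlap score below is a hypothetical stand-in.

def chunk(text, size=6):
    """Split text into chunks of at most `size` words."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def relevant_chunks(query, chunks, top_n=2):
    """Return the top_n chunks sharing the most words with the query."""
    q = set(query.lower().split())
    return sorted(chunks,
                  key=lambda c: len(q & set(c.lower().split())),
                  reverse=True)[:top_n]

doc = ("Solar output rose in Q2. Wind capacity grew. "
       "Unrelated HR policy text. More unrelated text. "
       "Grid storage costs fell sharply in Q2.")
chunks = chunk(doc)
selected = relevant_chunks("solar output Q2", chunks)
# Only `selected` -- not the whole document -- is sent to the model,
# so the model processes fewer tokens per query.
print(f"{len(selected)} of {len(chunks)} chunks forwarded")
```

In practice the relevance step would use embeddings or another retrieval method, but the energy logic is the same: fewer tokens through the model means less compute per query.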
Surprisingly, only 22% of business leaders consider sustainability a top issue in their GenAI deployments. Companies must take proactive steps to account for the emissions GenAI produces, starting with measuring those emissions and factoring them into deployment decisions.
As technology evolves, C-Suite executives and Chief Information Officers (CIOs) must play a crucial role in balancing innovation with sustainability. Wong emphasizes that these leaders need to guide companies in understanding the convergence of technology and its business benefits, while also assessing the environmental costs. Wong adds, "A lot of people in the technology world are very attuned to the fact that sustainability is an important issue for the planet, even outside and beyond their industry."
While the future of GenAI and other disruptive technologies holds immense potential, their rapid adoption must come with a commitment to sustainability. Companies must adopt energy-efficient practices, leverage specialized hardware, and integrate emissions accounting into their decision-making processes. As the landscape of AI and sustainability converges, technology leaders have a pivotal role in ensuring that the future of AI is as environmentally responsible as it is transformative.