The AI revolution - what's the true cost to telcos?
Feature



Nadine Hawkins investigates whether telcos are ready to handle the upsurge in AI compute power demands

The stratospheric growth of AI functionality being incorporated into applications has left many experts with grave concerns about whether telcos are ready to handle the rapidly increasing demands. Content-generation tools such as ChatGPT are driving a massive surge in demand for computing power, which is likely to have a huge impact on network capacity, energy consumption, and the environment.

Generative AI (GenAI) consumes considerable computing resources to train the models that process data and generate content. Currently, ChatGPT only uses data up to 2021; retraining the model regularly to include up-to-the-minute data would require a significant upsurge in computing power. At the moment ChatGPT operates as a stand-alone service, but incorporating it into a search engine, for example, could see its user base jump from millions to billions in a short period of time.

Generative AI expert Henry Ajder said, “The astonishing progress we’ve seen with generative AI over the last few years is partly due to researchers deploying huge amounts of compute and data to train new models. The resulting advances in generative AI have been met with much excitement and concern, but the environmental impact of powering GenAI is often left out of the picture.

“While we’re seeing researchers push for more efficient and less computationally demanding models, the unavoidable reality is that compute, whether local or via cloud services, uses a huge amount of energy produced by fossil fuels. AI will inevitably become an essential part of global infrastructure, meaning the resulting carbon emissions will only grow if we don’t accelerate the transition to renewable sources in powering the generative revolution.”

WHAT IS THE TRUE COST?

It’s predicted that over the next few years data centre infrastructure will grow by around 10%, with the AI boom primarily responsible for the substantial growth as data centres scramble to meet demand. The increase in compute is likely to exceed the growth rate of the hyperscale community.

Craig Huffman, co-founder and CEO, Metro Edge Development Partners, said, “Most companies today are using GPUs (graphics processing units), which have incredible compute power, but the only way they work effectively is to have the clusters as close together as possible, and most data centres are not able to handle the heat load. That said, the increase in compute is estimated to be faster than the typical hyperscale community has grown over the past three years. I envision an exponential amount of growth in the AI space over the next few years.”

Lim May-Ann, director of the Fair Tech Institute, Access Partnership believes that awareness around the true cost of using AI needs to be communicated effectively.

“The challenge will lie in getting the general public to understand that accessing and using generative AI – especially services which are "free to use" – still carries a cost, in the form of power demand and draw and therefore carbon emissions. Nasdaq has some initial research showing that just training the GenAI will consume 1,287 MWh, or what they describe as 121 US households’ worth of energy use in a year.

“Telcos will face pressures to upgrade their networks to prepare for the increased demand. This will be challenging for countries which have either just started or are in the middle of their capitalisation journey for 4G infrastructure roll-out and are not quite ready to invest in 5G networks yet.”
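The two figures quoted above can be cross-checked with a quick back-of-the-envelope calculation. The per-household figure below is an assumption (the commonly cited average annual US household electricity consumption of roughly 10.6 MWh), not a number from the article:

```python
# Sanity check on the quoted Nasdaq comparison.
TRAINING_MWH = 1287            # quoted training energy for the GenAI model
HOUSEHOLD_MWH_PER_YEAR = 10.6  # assumed US household average (not from the article)

households = TRAINING_MWH / HOUSEHOLD_MWH_PER_YEAR
print(round(households))  # → 121, consistent with the comparison quoted above
```

The arithmetic confirms the two numbers are internally consistent under that assumption.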

Rafael Possamai, data centre facility manager, Bluebird Network, believes that while the investment is likely to be substantial, it’s possible to minimise some of the traditional hardware procurement challenges.

“The significance of the investment in hardware to support the increase in generative AI largely depends on the scale and scope of the deployment, the specific models being used, and the overall strategy of the companies. While the investment in hardware can be significant, it's important to consider that both Google and Microsoft have a strong foothold in cloud computing and infrastructure. This gives them flexibility to allocate resources as needed, minimising some of the challenges. However, as AI models continue to grow in complexity and size, the investment in hardware to support their deployment will likely remain a substantial consideration.”

WHAT'S THE IMPACT ON THE SUPPLY CHAIN?

Indeed, supply chain issues are also likely to be a cause for concern. Huffman said, “One of the largest GPU providers has commented that they are unable to get access to GPUs to do lab testing because of the high demand today. Organisations that are able to create a product similar to the NVIDIA GPU (the leader in this space today) will likely face the same supply chain issues delivering product in a timely manner due to the heavy growth in AI. To fulfil current demand, the supply chain will have to increase delivery by over 100% to keep up with the rate of orders.”

Whilst companies have been evolving and preparing for the GenAI boom for years, they are in uncharted waters as the use of AI grows at an exponential rate, from predicting and diagnosing network anomalies through to improving customer service. The pressure to invest in infrastructure and storage, enhance network performance and balance environmental impact makes this one of the most challenging eras the industry has faced.

Steve Alexander, chief technology officer, Ciena, says, “Service providers must first address some necessary architectural efficiencies and evolve their networks to a more streamlined, open, and programmatic infrastructure that serves as the foundation for innovation in the distributed services and content era.

“Essentially, you can’t automate what you can’t see, so AI-driven automation and monetisation can’t happen until there is full visibility into the network infrastructure, providing a ‘single source of truth’.”

THE ENVIRONMENTAL TOLL

The complexity of GenAI means it uses significantly more energy than traditional forms of computing. Currently, how much power and water (via data centre cooling systems) it takes to run AI models can only be estimated, but as those models get bigger and more powerful, the impact on the environment is likely to increase.

David Hirst, group executive for Australian sovereign data centre provider Macquarie Data Centres, said, “AI will have a profound impact on every person, in every company, in every industry. This big AI bang will lead to a data explosion like we’ve never seen before. At the very heart of it are data centres, which will power the AI revolution.

“AI presents a new sustainability challenge and opportunity for the industry. On one hand, the data-intensive workloads produced by AI training and inferencing will see power consumption rise further. On the other, the technology itself has the ability to transform industry efficiency.

“Operational efficiency and environmental sustainability go hand in hand. Operating efficiently not only benefits the environment but also reduces costs. AI will have the ability to make infrastructure more efficient through real-time insights on energy and cooling consumption, as well as drive power usage effectiveness numbers down even further. Meanwhile, real-time insights from predictive maintenance can extend the life expectancy of data centres and reduce landfill.

“The time taken for the hundreds of millions, potentially billions, of simple mathematical calculations done by AI is being reduced significantly by specialised hardware including graphics processing units and tensor processing units, while cooling methods such as liquid and direct-to-chip are gaining traction to keep the high-density infrastructure that AI requires cool.”
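The power usage effectiveness (PUE) metric Hirst refers to has a simple definition: total facility energy divided by the energy delivered to IT equipment, so 1.0 is the theoretical ideal. The figures below are illustrative assumptions, not data from the article:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    1.0 means every kilowatt-hour reaches the IT load; the gap above 1.0
    is overhead such as cooling, lighting and power conversion losses.
    """
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 15,000 kWh/day total draw, 10,000 kWh to IT load.
print(pue(15_000, 10_000))  # → 1.5
```

More efficient cooling, such as the direct-to-chip methods mentioned above, reduces the numerator without touching the IT load, which is what "driving PUE down" means in practice.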

New legislation comes into play in January 2024 in the form of the EU’s Corporate Sustainability Reporting Directive, which will see businesses become accountable for their environmental impact.

THE LEGAL PICTURE

Mandatory ESG disclosures and external verification have huge implications for how companies will report in the future, and data centres are currently believed to account for 1% of all greenhouse gas emissions.

Hirst believes that the industry must now balance a huge growth cycle while at the same time improving their sustainability credentials.

“Our digital footprint continues to grow and so does its impact on the environment. However, data centres are the most efficient and sustainable environment for running the digital economy," he commented.

“The onus to protect the environment in the AI era doesn’t rest with the industry and its partners alone. Governments across the globe can support the tech industry in achieving environmental, social and governance (ESG) goals by encouraging the adoption of sustainable practices through policy changes and financial incentives including grants.

“Thinking bigger, governments need to prioritise investment into renewable energy. This will provide the digital world with conscience-free consumption and sustainable computing.

“Organisations leveraging data centre resources have an important role to play too. Hyperscalers, clouds and governments need to seek out data centre partners flexible enough to adapt to new technologies like AI, and that maximise efficiency to run it sustainably.

"As the custodians of this technology, it’s important to work together with our customers and governments to continue fuelling this innovation in the most ethical and sustainable way.”
