
AI: Powering the engine

Chief Risk Officer, Data & Technology, ANZ

2024-11-11 00:00

We all have an opinion about the rising tide of artificial intelligence (AI), but one certainty is it’s already pervading our day-to-day lives.

It is ubiquitous: from digital voice assistants, to navigating by Google Maps, to streaming service recommendation algorithms, to ChatGPT.

This is not to mention the myriad ways many businesses – including our own – are working to innovate and scale AI to create efficiency and boost productivity.

But this rapidly expanding use of AI demands much more computer processing than anything we’ve run before. It’s a bit like going from Go-Kart racing to Formula 1 – you need a bigger and better engine and a lot more fuel.

To coin a local term – it needs more grunt. But how do we get this grunt? And why is Australia so well positioned to take advantage of this boom?

How AI works

First, we must understand the huge changes in physical infrastructure across many countries that are making this technology possible. AI can be understood through four distinct levels, each representing a critical component in the AI ecosystem.

At the foundation lies power, the essential energy required to run the compute and infrastructure necessary to support AI operations. Above this, data centres play a crucial role in storing and processing vast amounts of data required for AI algorithms to function effectively.

Moving up the hierarchy, AI vendors provide the specialized tools, platforms and services that enable the development and deployment of AI solutions. At the top, AI applications represent the practical implementations of AI technology, delivering intelligent functionalities and insights across various industries and use cases.
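
For readers who think in code, the four levels can be sketched as a simple ordered structure, from foundation to top. This is purely an illustration paraphrasing the description above, not an industry-standard taxonomy:

```python
# The AI ecosystem as described above, ordered from foundation to top.
# Names and descriptions paraphrase the article; this is illustrative only.
AI_STACK = [
    ("power", "energy required to run the compute and infrastructure"),
    ("data centres", "store and process the data AI algorithms need"),
    ("AI vendors", "tools, platforms and services for building AI solutions"),
    ("AI applications", "practical implementations across industries"),
]

# Print the stack from the foundation upwards.
for level, (name, role) in enumerate(AI_STACK, start=1):
    print(f"Level {level}: {name} -- {role}")
```

The ordering matters: each level depends on everything beneath it, which is why constraints at the bottom (power, data centres) ripple all the way up to the applications.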

Implementing AI

When we bring a new application into the bank, we don’t typically think too much about issues like power and data centres. This is because we already have strong agreements to cover this part of the chain.

Cloud brought us and many other organisations the ability to store our data somewhere else, or ‘in the cloud’. In reality, the cloud is just someone else’s storage area. And that is what a data centre is – someone else’s storage centre, and we rent the space.

But something significant is happening in these two crucial parts of the cloud ecosystem – and it directly impacts Australia. 

Why does AI need so much grunt? 

The rapidly expanding use of AI demands bigger and better engines with more “grunt” – specifically, large language models or LLMs.

An LLM is a neural network, meaning it uses many layers to understand patterns. It’s like the most intense pivot table you’ve ever seen in an Excel workbook with 20 worksheets.

But this beast of an Excel workbook can do anything. It can write code, translate languages and write sentences… sounds familiar, right? ChatGPT?

Unfortunately, our monthly power consumption has doubled just to run this Excel spreadsheet – because every time a question is asked of it, it works through all the layers many times over.
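
To make the layered-workbook analogy concrete, here is a minimal sketch of why every query is expensive. The layer count and sizes are made-up toy numbers (real models have dozens of layers and billions of weights), but the principle holds: each question must flow through every layer, every time.

```python
import random

# Toy "LLM": a stack of layers, each transforming the whole input.
NUM_LAYERS = 20      # like a workbook with 20 worksheets
WIDTH = 8            # size of the vector passed between layers

# Fixed random "weights" for each layer.
layers = [
    [[random.random() for _ in range(WIDTH)] for _ in range(WIDTH)]
    for _ in range(NUM_LAYERS)
]

def forward(vector, counter):
    """Pass one query through every layer, counting the work done."""
    for weights in layers:
        # Multiply the vector by this layer's weight matrix.
        vector = [sum(w * x for w, x in zip(row, vector)) for row in weights]
        counter["multiplications"] += WIDTH * WIDTH
    return vector

counter = {"multiplications": 0}
forward([1.0] * WIDTH, counter)

# Every single question repeats all of this work from scratch.
print(counter["multiplications"])  # 20 layers x 8 x 8 = 1280 multiplications
```

Scale those toy numbers up to billions of weights per layer and millions of queries per day, and the power bill of the analogy becomes very real.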

AI power crunch

Between 2017 and 2021, the electricity used by Meta, Amazon, Microsoft, and Google – the main providers of commercially-available cloud computing – more than doubled.

LLM-enabled services such as Google Translate execute 500 billion word translations every day. This has been coined the AI power crunch, and it is widely estimated power consumption for data centres will double again by 2030.

And they will need to secure more power. Google’s ‘Council Bluffs’ data centre in Iowa is a US$5 billion investment. In September it was reported Microsoft bought an additional 16 acres of land in India for data centres, which followed 25 acres in 2022 and 48 acres in Hyderabad. 

The value of the global cloud computing market will surge to more than four times its current value, from US$546 billion to US$2.32 trillion, according to law firm Allens.

When Amazon bought a nuclear power plant in Pennsylvania earlier this year, it said “data centres are at the heart of growth” when asked why it needed so much power. According to the Australian Information Industry Association, Australia is one of the top five data centre hubs in the world.

Microsoft is committing $5 billion to expand its data centre footprint to 29 sites in Australia, which is a 37 per cent share of its total data centre footprint worldwide. In 2023, Amazon purchased 13.2 hectares in Melbourne, as well as making a $30 million land purchase in Sydney.

But why Australia?

What makes us so special? We are seen as having an abundance of renewable energy resources: sun, wind and the essential minerals for clean energy technologies, ranging from batteries to wind turbines.

And this is important, because our existing conventional electricity grids simply will not be able to cope with the demand that is emerging. Our stable government and location in the Asia-Pacific region also contribute to ideal conditions for investment.

The so-called “hyperscalers” – Amazon, Microsoft, Meta, Google – are taking matters of power supply and resilience into their own hands. They are not risking their supply chains on the conventional power transmission networks of government-provided grids, and they want clean energy.

In 2020, Google set a goal to run 24/7 on carbon-free energy by 2030. With the global footprint of Google data centres steadily increasing, it is currently at 64 per cent clean energy.

Demand for AI compute is a significant headwind to that goal, though. In its 2024 Environmental Report, Google says some regions where it has data centres are hard to decarbonise.

This presents an opportunity for Australia, and Google has recently signed contracts to purchase four gigawatts of clean energy generation capacity from Australia, Texas and Belgium. If you happen to be driving through the solar farm in the NSW Riverina district – those are Google panels hard at work.

This injection of demand for renewable energy may be the trigger to transform job security in rural areas of Australia where there is an appetite to sustain and grow towns that currently rely on jobs linked to coal-fired power plants.

The hyperscalers understand the need to re-invent how they source power. 

Will they revolutionise the renewable energy industry?

If you’ve ever visited an ANZ data centre in Melbourne, you would know they are rows and racks of computer servers. Informally described as a ‘hot hotel’ for our computer hardware, data centres need to be cooled.

Similar to the challenge of shared power, there is a need for shared water to run through and cool the data centres – much like a car engine. In its sustainability report, Google states it ‘replenished’ 3.9 billion litres of water in 2023, which represents 18 per cent of its current water consumption.

While those numbers sound large, they are significantly smaller than the water consumption of the mining and manufacturing industries. However, the hyperscalers are investing in how to sustainably use this precious resource in the locations in which they operate.

So how should a business with expanding AI use take all this in and navigate the world of data centres?

First, if you are using an AI vendor for an AI-enabled application that does not own the compute and processing power, a watch point is its resilience as a third party to the costs charged by a hyperscaler. Look closely at the company’s return on investment.

Second, businesses need to retain their agency in choosing a hyperscaler provider for compute and processing power – ensuring that competition remains for the data centre supply chain.

Third, businesses need to think about how they factor the sustainability metrics of the hyperscalers into their own ESG reporting. These are all valid questions corporates need to ask themselves as their need to utilise AI grows.

Because as AI becomes more integrated into our lives, understanding the costs and opportunities of running its systems is just as crucial.

Michelle Pinheiro is Chief Risk Officer for Data & Technology at ANZ.

The views and opinions expressed in this communication are those of the author and may not necessarily state or reflect those of ANZ.
