Marissa Loewen started using artificial intelligence as a project management tool in 2014; she has autism and ADHD, and it helps her organise her thoughts.
"We try to use it conscientiously, though, because we do realise that there is an impact on the environment," she said.
Her personal AI use isn't unique anymore.
Now it's a feature in smartphones, search engines, word processors and email services.
Every time someone uses AI, it uses energy that is often generated by fossil fuels.
That releases greenhouse gases into the atmosphere and contributes to climate change.
And it's getting harder to live without it.
AI is largely powered by data centres that field queries, store data and deploy information.
As AI becomes ubiquitous, the power demand for data centres increases, leading to grid reliability problems for people living nearby.
"Since we are trying to build data centres at a pace where we cannot integrate more renewable energy resources into the grid, most of the new data centres are being powered by fossil fuels," said Noman Bashir, computing and climate impact fellow with MIT's Climate and Sustainability Consortium.
The data centres also generate heat, so they rely on fresh water to stay cool.
Larger centres can consume up to 18.9 million litres a day, or close to eight Olympic swimming pools, according to the Environmental and Energy Study Institute.
It's difficult to imagine, because for most users the impact isn't visible, said Sasha Luccioni, AI and climate lead at the AI company Hugging Face.
"In one of my studies, we found that generating a high-definition image uses as much energy as charging half of your phone. And people were like, 'That can't be right, because when I use Midjourney (a generative AI program), my phone battery doesn't go down,'" she said.
Jon Ippolito, professor of new media at the University of Maine, said tech companies are constantly working to make chips and data centres more efficient, but that does not mean AI's environmental impact will shrink.
That's because of an effect known as the Jevons paradox.
"The cheaper resources get, the more we tend to use them anyway," he said.
When cars replaced horses, he said, commute times didn't shrink. We just travelled further.
How much AI programs contribute to global warming depends on a lot of factors, including how warm it is outside the data centre that's processing the query, how clean the grid is and how complex the AI task is.
Information sources on AI's contributions to climate change are incomplete and contradictory, so getting exact numbers is difficult.
But Ippolito tried anyway.
He built an app that compares the environmental footprint of different digital tasks based on the limited data he could find.
It estimates that a simple AI prompt, such as "Tell me the capital of France," uses 23 times more energy than the same question typed into Google without its AI Overview feature.
"Instead of working with existing materials, it's writing them from scratch. And that takes a lot more compute," Luccioni said.
And that's just for a simple prompt.
A complex prompt, such as "Tell me the number of gummy bears that could fit in the Pacific Ocean," uses 210 times more energy than the AI-free Google search.
A three-second video, according to Ippolito's app, uses 15,000 times as much energy.
It's equivalent to turning on an incandescent lightbulb and leaving it on for more than a year.
It's got a big impact, but it doesn't mean our tech footprints were carbon-free before AI entered the scene.
Watching an hour of Netflix, for example, uses more energy than a complex AI text prompt.
An hour on Zoom with 10 people uses 10 times that much.
"It's not just about making people conscious of AI's impact, but also all of these digital activities we take for granted," he said.
Ippolito said he limits his use of AI when he can.
He suggests using human-captured images instead of AI-generated ones.
He tells the AI to stop generating as soon as he has the answer to avoid wasting extra energy.
He requests concise answers, and he adds "-ai" to his Google searches so they don't trigger an AI Overview for queries where he doesn't need it.
Loewen has adopted the same approach. She said she tries to organise her thoughts into one AI query instead of asking it a series of iterative questions.
She also built her own AI that doesn't rely on large data centres, which saves energy in the same way that watching a movie you own on a DVD is far less taxing than streaming one.
"Having something local on your computer in your home allows you to also control your use of electricity and consumption. It allows you to control your data a little bit more," she said.
Luccioni uses Ecosia, a search engine that runs efficient algorithms and puts its profits toward planting trees to minimise the impact of each search.
Its AI function can also be turned off.
ChatGPT also has a temporary chat function, so the queries you send to the data centre get deleted after a few weeks instead of taking up storage space.
But AI accounts for only a fraction of data centres' energy use.
Ippolito estimates roughly 85 per cent goes to other activities, such as data collection by sites like TikTok and Instagram, and cryptocurrency mining.
His answer there: make use of screen time restrictions on your phone to limit time scrolling on social media.
Less time means less personal data collected, less energy and water used, and fewer carbon emissions entering the atmosphere.
"If you can do anything that cuts a data centre out of the equation, I think that's a win," Ippolito said.
Australian Associated Press