Apple just put AI in millions of people’s pockets. The company is rolling out what it calls Apple Intelligence this week, bringing some basic text generation and image editing features to iPhone, iPad, and Mac users who opt in. I’ve been testing these tools through the developer beta version of the software for a couple of months now, and they’re pretty mediocre. But this is only the beginning.
Generative AI, once a parlor trick for the tech-obsessed, is fast becoming the main event for major software releases. As Apple pushes its version of the technology, Google is building AI into its Android operating system and forcing everyone to look at AI Overviews at the top of virtually every Google Search. OpenAI and Meta are building their own AI-powered search engines, while the startup Perplexity already has one. Microsoft and Anthropic recently announced new, super-powerful AI agents that can complete complex tasks much like humans would. (Disclosure: Vox Media is one of several publishers that have signed partnership agreements with OpenAI. Our reporting remains editorially independent.)
While some companies have had generative AI products out in the wild for over a year, the arrival of Apple Intelligence marks an inflection point for the mainstreaming of the technology. Apple Intelligence is only available on the latest Apple devices, but over half the phones in the United States are iPhones. As people upgrade, millions more can tap into the new technology.
If you’re not already using AI, you probably will be soon — whether you like it or not.
“We’re getting AI, especially generative AI, shoved down our throats with little to no transparency, and honestly, the opt-out mechanisms are either nonexistent or complicated,” said Sasha Luccioni, AI researcher and climate lead at Hugging Face, a platform for sharing AI and machine learning tools.
If that fills you with dread, it’s understandable. Maybe you feel bad participating in the race to build a superintelligent AI nobody asked for. You may feel complicit for using AI models trained on copyrighted material without paying the creators. You probably feel just plain bad about the flood of AI slop that’s ruining the internet even if you did not personally create the slop.
Then there are the climate consequences of it all. AI, in its many shapes and forms, requires a lot of energy and water to work. A lot. That might make you feel downright guilty about using AI.
There’s a chance Apple Intelligence is more guilt-free than the other big AI options as far as energy is concerned. Apple says it keeps the processing for certain AI features, like GenMoji and Image Playground, entirely on your device. That means less reliance on energy-intensive data centers.
We don’t know exactly how much energy AI uses at these data centers. Using data from a recent Microsoft Research study, Shaolei Ren, an engineering professor at the University of California Riverside, came up with this: Asking ChatGPT to write two 200-word emails uses roughly the same amount of energy as a Tesla Model 3 would need to drive one mile. Because they generate so much heat, the processors that generated those emails would also require about four half-liter bottles of water to cool down.
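For a rough sense of scale, the comparison above can be turned into a back-of-envelope calculation. The 250 Wh-per-mile figure for a Model 3 below is an assumption based on typical EPA-style efficiency ratings, not a number from the study itself:

```python
# Back-of-envelope sketch of the emails-vs-Tesla comparison.
# Assumption (not from the article): a Tesla Model 3 uses roughly
# 250 Wh per mile. The article's figures of "one mile" of driving
# and "four half-liter bottles" of water cover two 200-word emails.
TESLA_WH_PER_MILE = 250          # assumed efficiency, Wh per mile
EMAILS = 2                       # the study's unit of comparison
WATER_LITERS = 4 * 0.5           # four half-liter bottles

energy_per_email_wh = TESLA_WH_PER_MILE / EMAILS
water_per_email_l = WATER_LITERS / EMAILS

print(f"~{energy_per_email_wh:.0f} Wh and "
      f"~{water_per_email_l:.1f} L of water per email")
```

Under those assumptions, a single generated email works out to roughly 125 Wh of electricity and a liter of cooling water, which is why the per-query numbers add up so quickly at scale.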
The consequences of such energy profligacy become clearer if you scale up. The amount of electricity used by data centers, where AI processing largely takes place, is predicted to grow by 160 percent by the end of the decade, and carbon dioxide emissions could more than double as a result, according to Goldman Sachs. Meanwhile, the amount of water needed will also spike, so much so that by 2027, AI’s thirst could be equal to half the annual water withdrawal of the United Kingdom.
These are all estimates based on limited data because the tech companies building AI systems, including Apple, Google, Microsoft, and OpenAI, do not share exactly how much energy or water their models use.
“We’re just looking at the black box because we have absolutely no idea of the energy consumption for interacting with the large language models,” Ren said. He compared the situation to searching for flights on Google and being able to see the carbon emissions for each leg. “But when it comes to these large language models, there’s absolutely none, zero, no information.”
The lack of transparency about AI’s energy demands also runs counter to these tech companies’ sustainability promises. There’s good reason to believe that AI is leading directly to those promises being broken.
Due to increases in data center energy usage, Google saw its greenhouse gas emissions increase by 48 percent from 2019 to 2023, despite a pledge to cut emissions by 50 percent from its 2019 levels by 2030. The company no longer claims to be carbon neutral. Microsoft similarly saw a 29 percent jump in emissions from 2020 to 2023. While Microsoft has promised to be carbon negative by 2030, it is now openly struggling with ways to make that happen while keeping pace with AI innovation.
This is what an arms race looks like. It’s worth pointing out here that energy usage started to spike around the time that OpenAI knocked the world’s socks off with its surprise release of ChatGPT in November 2022. The chatbot became the fastest-growing app ever, capturing 100 million users in two months and kick-starting the AI gold rush in Silicon Valley. Now, 40 percent of all venture capital money in cloud computing goes to generative AI companies. OpenAI itself announced a $6.6 billion funding round in early October — the largest venture capital round of all time — giving it a $157 billion valuation.
With such staggering amounts of money at play, it’s perhaps no surprise that energy efficiency takes a back seat to growth and innovation. Companies like OpenAI want the models that power their AI technology to get bigger so they can get better and outperform competitors. And the bigger the model, the greater the energy demand — at least for now. Over time, it’s likely that performance will get more efficient thanks to advances in chip technology, data center cooling, and engineering.
“Because the innovation happened so quickly around when ChatGPT burst onto the scene, you would expect, initially, for the efficiency to be at its lowest point,” Josh Parker, head of sustainability at chipmaker Nvidia, told me.
Still, the most energy-intensive products are now what companies like OpenAI, Google, and Meta are pushing the hardest. Those include real-time chatbots, voice assistants, and search engines. These features enlist larger models and require more advanced chips to work at the same time to reduce latency, or lag. Put simply, they have to do a lot of hard math problems all at once and very quickly. That’s why generating a couple of emails can take as much electricity as driving a Tesla a mile.
Apple, however, seems to present itself as an exception. As part of its promise to protect user privacy, the company says it handles as many Apple Intelligence tasks as it can on your device without sending queries to data centers. That means when you opt in to Apple Intelligence, you download a small generative AI model that can handle pretty simple tasks on your phone. Your iPhone battery, unlike a grid-connected cloud data center, has a limited amount of power, which forces Apple Intelligence to handle these tasks with some efficiency. Maybe on-device AI is the guilt-free version of the future after all.
The problem, of course, is that we don’t know exactly how Apple Intelligence works. But there is some insight you can gain: You can actually access a log of your activity on Apple Intelligence by going to Settings and then Privacy and Security. There, you’ll see an option to export an Apple Intelligence Report. Unless you’re a developer, the file you download will be hard to read, since it’s full of code, but it does technically reveal which tasks are handled on your device and which ones get sent up to Apple’s Private Cloud Compute servers, giving them a bigger carbon footprint. Apple also says its servers run on 100 percent renewable energy, part of the company’s broader commitment to be carbon neutral by the end of the decade.
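If you do want to poke at that exported report programmatically rather than read raw code, a sketch like the following could tally on-device versus cloud requests. This is hypothetical: Apple does not document the file’s schema, so the field names `modelRequests` and `executionEnvironment` here are illustrative assumptions you would need to adjust to match your actual export:

```python
import json
import tempfile
from collections import Counter

# Hypothetical sketch: the field names "modelRequests" and
# "executionEnvironment" are illustrative assumptions, not a
# documented schema. Adjust them to match your actual export file.
def summarize_report(path: str) -> Counter:
    """Tally where each Apple Intelligence request was handled."""
    with open(path) as f:
        report = json.load(f)
    return Counter(
        req.get("executionEnvironment", "unknown")
        for req in report.get("modelRequests", [])
    )

# Stand-in for a real downloaded report, written to a temp file:
sample = {"modelRequests": [
    {"executionEnvironment": "onDevice"},
    {"executionEnvironment": "privateCloudCompute"},
    {"executionEnvironment": "onDevice"},
]}
with tempfile.NamedTemporaryFile("w", suffix=".json",
                                 delete=False) as f:
    json.dump(sample, f)
    path = f.name

counts = summarize_report(path)
print(dict(counts))
```

The point of a tally like this is simply to see what share of your own usage stays on the phone versus travels to Apple’s Private Cloud Compute servers, since only the latter carries a data center footprint.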
So again, if you’re feeling dread or guilt about AI in your life, that’s understandable. It is clear that this technology, in its current state, consumes vast and increasing amounts of energy, contributing to greenhouse gas emissions and worsening human-caused climate change. It is also true that you might not have a choice, as big tech companies make generative AI more foundational to their products. You can opt out of Apple Intelligence or never opt in. But you’ll find it’s more difficult, if not impossible, to opt out of AI products from Google, Meta, and Microsoft. (If you want to try, here’s a helpful guide.)
“I don’t think there’s a reason to feel guilty,” Luccioni said. “But I do think there’s a reason — as with climate change in general — to ask for more information, to ask for accountability on behalf of the companies that are selling us this stuff.”
If AI is supposed to solve all our problems or destroy us all or both, it would be nice to know the details. We could ask ChatGPT, but that might be a huge waste of energy.
Update, 10:40 am, November 1, 2024: This story, originally published October 31, has been updated with additional details about Apple Intelligence and its energy use.
A version of this story was also published in the Vox Technology newsletter.