AI Computing on Pace to Consume More Energy Than India, Arm Says (news.bloomberglaw.com)
from Hypx@fedia.io to technology@lemmy.world on 06 May 2024 22:16
https://fedia.io/m/technology@lemmy.world/t/775553

AI’s voracious need for computing power is threatening to overwhelm energy sources, requiring the industry to change its approach to the technology, according to Arm Holdings Plc Chief Executive Officer Rene Haas.

#technology

Rognaut@lemmy.world on 06 May 2024 22:56 next collapse

Sounds like some sensationalized bullshit. They don’t give a single number or meaningful statement and they are paywalled.

kakes@sh.itjust.works on 06 May 2024 23:18 collapse

I don’t disagree that they should back up their claim, but it does intuitively make sense. AI models - GPT-style LLMs in particular - are typically designed to push the limits of what modern hardware can provide, essentially eating whatever power you can throw at them.

Pair this with a huge AI boom and corporate hype cycle, and it wouldn’t surprise me if it was consuming an incredible amount of power. It’s reminiscent of Bitcoin, from a resource perspective.

FaceDeer@fedia.io on 07 May 2024 00:43 collapse

No, it makes no sense. India has over a billion people. There's no way that amount of computing power could just magically have poofed into existence over the past few years, nor the power plants necessary to run all of that.

BakerBagel@midwest.social on 07 May 2024 00:55 next collapse

The current LLMs kinda suck, but companies have fired huge swaths of their staff and plan on putting LLMs in their place. Either those companies hire back all those workers, or they get the programs to not suck. And making LLMs actually capable of working unsupervised will take more and more energy.

MakePorkGreatAgain@lemmy.basedcount.com on 07 May 2024 02:07 next collapse

LLMs will probably improve at an exponential rate - similar to how CPUs did in the 80s/90s. In ~10 generations LLMs will likely be very useful.

FaceDeer@fedia.io on 07 May 2024 02:25 next collapse

Sure, but it's simply not physically possible for AI to be consuming that much power. Not enough computers exist, and not enough ability to manufacture new ones fast enough. There hasn't been a giant surge of new power plants built in just the past few years, so if something was suddenly drawing an India's worth of power then somewhere an India's worth of consumers just went dark.

This just isn't plausible.
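
For a rough sense of scale, here is a ballpark comparison; both figures are assumed orders of magnitude for illustration, not precise statistics:

```python
# Ballpark comparison: India's annual electricity use vs. all data centres worldwide.
# Both figures below are rough, assumed orders of magnitude for illustration only.
india_twh_per_year = 1_500        # assumed: India's yearly electricity consumption, ~1,500 TWh
datacenters_twh_per_year = 350    # assumed: rough global data-centre electricity use

ratio = india_twh_per_year / datacenters_twh_per_year
print(f"India uses roughly {ratio:.0f}x the electricity of every data centre combined (ballpark).")
```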

kakes@sh.itjust.works on 07 May 2024 04:52 collapse

My take is that LLMs are absolutely incredible… for personal use and hobby projects. I can’t think of a single task I would trust an LLM to perform entirely unsupervised in a business context.

Of course, that’s just where LLMs are at today, though. They’ll improve.

revv@lemmy.blahaj.zone on 07 May 2024 02:41 next collapse

If only there had been another widespread, wasteful prior use of expensive and power hungry compute equipment that suddenly became less valuable/effective and could quickly be repurposed to run LLMs…

FaceDeer@fedia.io on 07 May 2024 04:25 collapse

Pretty sure the big AI corps aren't depending on obsolete second-hand half-burned-out Ethereum mining rigs for their AI training.

TheOctonaut@mander.xyz on 07 May 2024 03:13 collapse

This is a future prediction, not a current observation.

I’m not saying it’s correct as a prediction, but “where are the extra power plants” is not a good counter-argument.

FaceDeer@fedia.io on 07 May 2024 04:27 collapse

A couple of months ago the average temperature where I live was well below freezing. Now it's around twenty degrees C.

By this time next year it'll be thousands of degrees!

assassinatedbyCIA@lemmy.world on 06 May 2024 23:23 next collapse

I wonder if they made an error as simple as this in their projections. There’s no guarantee that AI interest continues to grow.

foggy@lemmy.world on 06 May 2024 23:42 next collapse

Weird metric, but ok.

JoShmoe@ani.social on 07 May 2024 00:18 next collapse

They finally reached crypto miner level awareness.

NeoNachtwaechter@lemmy.world on 07 May 2024 01:59 next collapse

I wonder why countries let them.

Using up more electric power than is available is NOT a simple matter of demand and supply.

If they actually pull too much from the grid, they are going to cause damage to others, and maybe even to the grid itself.

john89@lemmy.ca on 07 May 2024 16:12 collapse

Because they’re not actually pulling too much from the grid to cause damage to others or even the grid itself.

Any musings about curtailing AI due to power consumption are just bullshit for clicks. We’ll improve efficiency and increase productivity, but we won’t reduce usage.

frezik@midwest.social on 08 May 2024 12:44 collapse

Improving the models doesn’t seem to work: arxiv.org/abs/2404.04125

We comprehensively investigate this question across 34 models and five standard pretraining datasets (CC-3M, CC-12M, YFCC-15M, LAION-400M, LAION-Aesthetics), generating over 300GB of data artifacts. We consistently find that, far from exhibiting “zero-shot” generalization, multimodal models require exponentially more data to achieve linear improvements in downstream “zero-shot” performance, following a sample inefficient log-linear scaling trend.

It’s taking exponentially more data to get better results, and therefore, exponentially more energy. Even if something like analog training chips reduce energy usage ten fold, the exponential curve will just catch up again, and very quickly with results only marginally improved. Not only that, but you have to gather that much more data, and while the Internet is a vast datastore, the AI models have already absorbed much of it.
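
A toy sketch of what that log-linear trend implies for data (and thus energy) requirements; the constants here are made up for illustration, not taken from the paper:

```python
import math

# Toy log-linear fit: downstream accuracy grows with log(samples), so each fixed
# accuracy gain requires a multiplicative (exponential) increase in training data,
# and roughly in compute/energy with it. Constants are illustrative only.
a, b = 0.05, 0.0   # assumed slope and intercept of accuracy = a * ln(samples) + b

def samples_needed(target_accuracy: float) -> float:
    """Invert the toy fit to estimate the data needed for a target accuracy."""
    return math.exp((target_accuracy - b) / a)

for acc in (0.40, 0.50, 0.60, 0.70):
    print(f"accuracy {acc:.2f} -> ~{samples_needed(acc):.2e} samples")
# Each +0.10 step multiplies the required data by e^(0.10/0.05) ≈ 7.4x.
```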

The implication is that the models are about as good as they will be without more fundamental breakthroughs. The thing about breakthroughs like that is that they could happen tomorrow, they could happen in 10 years, they could happen in 1000 years, or they could happen never.

Fermat’s Last Theorem remained an open problem for 358 years. Squaring the Circle remained open for over 2000 years. The Riemann Hypothesis has remained unsolved after more than 150 years. These things sometimes sit there for a long, long time, and not for lack of smart people trying to solve them.

bilb@lem.monster on 07 May 2024 04:03 next collapse

Take that, India! 😎

whoreticulture@lemmy.blahaj.zone on 07 May 2024 06:29 next collapse

I can’t think of a single thing AI does that is worth the amount of energy consumption.

FiniteBanjo@lemmy.today on 07 May 2024 06:39 next collapse

I can’t think of a single thing AI does

loutr@sh.itjust.works on 07 May 2024 07:01 next collapse

Come on that’s not fair, it’s very good* at drawing album covers and video game assets, which gives more time to artists to go work for Starbucks or Amazon instead of doing something they actually enjoy.

* passable actually, but much cheaper.

FiniteBanjo@lemmy.today on 07 May 2024 07:07 collapse

I’m not in the industry per se, but I wouldn’t hire an AI to do art for my games or album covers.

ArkyonVeil@lemmy.dbzer0.com on 07 May 2024 14:06 next collapse

Correction: AI in the LLM/diffusion sense is a decent tutor for cheap. It can cobble together rough temp art and, if used by an actually capable artist, make cool stuff.

Anything else and it’s a garbage firehose, it’s the undisputed king of mediocrity. Which, given the standards of SPAM and the modern web, is exactly what it’s being used for.

What a shame.

FiniteBanjo@lemmy.today on 07 May 2024 16:19 collapse

Anything you learn from AI has a margin of error that could ruin you.

ArkyonVeil@lemmy.dbzer0.com on 07 May 2024 17:10 next collapse

Still saves a ton of time compared to learning from somewhat related tutorials, garbage courses, or digging through the modern spam-infested web.

It’s a decent tutor; I never said it’s perfect. I won’t hesitate to say that using it as an assistant has bumped up my productivity and learning by roughly 50% when it comes to programming.

Of course, it has its myriad problems, especially in bleeding-edge fields like AI development, where libraries sometimes iterate nightly. There’s also its tendency to not exactly teach, but instead answer your specific question. So you still need to have some initiative and still rely on a few human resources.


HOWEVER, I do agree that blindly copy-pasting code from an AI is a TERRIBLE idea. And all the buzz about AI developers seems like a disaster waiting to happen (and it certainly will!).

UnderpantsWeevil@lemmy.world on 07 May 2024 18:25 collapse

And we’re rapidly liquidating the reserves of useful information in order to feed this beast.

Google results are declining as websites like Stack Exchange and Reddit crap out, Wikipedia pages are filling up with misinformation, and news articles are increasingly full of nonsense and procedurally generated fluff.

It’s not just garbage on its face. It’s a cancer that’s spreading through the rest of our internet archives, blotting out the good and bloating front pages with bad data.

FiniteBanjo@lemmy.today on 07 May 2024 18:28 collapse

It also gets worse every generation as it recursively feeds on the bad data.

[deleted] on 07 May 2024 16:11 next collapse

.

FiniteBanjo@lemmy.today on 07 May 2024 16:19 next collapse

Thou doth protest too much, whilst instigating in forums.

john89@lemmy.ca on 07 May 2024 17:02 collapse

Found one.

bizzle@lemmy.world on 07 May 2024 17:16 collapse

We don’t use that word anymore, I’m sure you can find a better one.

UnderpantsWeevil@lemmy.world on 07 May 2024 18:07 collapse

You and AI, both.

FiniteBanjo@lemmy.today on 07 May 2024 18:09 collapse

I’m glad to know AI can’t think of a single thing AI does, although if it thought otherwise I still wouldn’t care.

Ibuthyr@lemmy.wtf on 07 May 2024 18:45 collapse

The only really useful AI thing is the denoiser in Adobe Lightroom. I can shoot pictures in pitch black darkness with the highest ISO settings. Obviously it is a grainy mess. The denoiser manages to clean that up while retaining all of the details. It’s really fucking great!

Anything else is just novelty bullshit.

whoreticulture@lemmy.blahaj.zone on 07 May 2024 19:23 collapse

Sounds useful, but not at all worth the amount of energy being used to produce AI. You could just use that energy to feed/house people who could do the labor of denoising.

Ibuthyr@lemmy.wtf on 07 May 2024 21:30 collapse

I know what you mean, but it’s not really possible to manually denoise a picture the way the AI denoiser does. Let alone within 10 seconds. Plus, it’s more of a niche usage. I don’t think it consumes all that much energy.

Generating shitty images, creating deepfakes, prompting all kinds of bullshit… now that is a waste of energy as it really just makes the world worse. AI generated articles are popping up all over the internet. They aren’t even reviewed anymore. Enshittification of the internet took some gigantic strides since the AI boom.

frezik@midwest.social on 08 May 2024 12:29 collapse

Do you know if the model is running locally or some cloud shit? If locally, the actual energy usage may be modest.

Energy spent training the model initially may have been prohibitive, though.
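
If it is running locally, a rough back-of-envelope looks like this (the wattage is an assumption, not a measurement of Lightroom’s denoiser):

```python
# Back-of-envelope: energy for one local denoising pass. Inputs are assumptions.
gpu_watts = 60            # assumed draw of a laptop GPU while the denoiser runs
seconds_per_image = 10    # runtime mentioned upthread

joules = gpu_watts * seconds_per_image
watt_hours = joules / 3600
print(f"~{joules:.0f} J ≈ {watt_hours:.2f} Wh per denoised image")
# ~600 J, about a minute of a 10 W LED bulb. The one-off training run is a
# separate, much larger energy budget that this estimate says nothing about.
```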

Ibuthyr@lemmy.wtf on 08 May 2024 20:44 collapse

Good question, I’ll look it up!

MonkderDritte@feddit.de on 07 May 2024 11:32 next collapse

Yeah, don’t AI everything, please.

dutchkimble@lemy.lol on 07 May 2024 14:15 next collapse

Soon they’ll need to make Duracells out of humans

crispyflagstones@sh.itjust.works on 07 May 2024 17:02 next collapse

The ENIAC drew 174 kilowatts and weighed 30 tons. It drew that 174 kilowatts to achieve a few hundred to a few thousand operations per second, while an iPhone 4 can handle 2 billion operations a second and draws maybe 1.5 W under heavy load.

Like, yeah, obviously, the tech is inefficient right now, it’s just getting off the ground.
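
Putting both machines on the same axis, operations per watt (taking ~5,000 ops/s as a generous figure for ENIAC; all numbers are order-of-magnitude only):

```python
# Order-of-magnitude efficiency comparison using the figures above.
eniac_ops_per_s, eniac_watts = 5_000, 174_000   # a few thousand ops/s at 174 kW
iphone_ops_per_s, iphone_watts = 2e9, 1.5       # ~2 billion ops/s at ~1.5 W

eniac_eff = eniac_ops_per_s / eniac_watts       # operations per watt
iphone_eff = iphone_ops_per_s / iphone_watts

print(f"ENIAC:    {eniac_eff:.3g} ops/W")
print(f"iPhone 4: {iphone_eff:.3g} ops/W")
print(f"ratio:    ~{iphone_eff / eniac_eff:.1e}x")   # roughly 5e10 times more efficient
```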

AlotOfReading@lemmy.world on 07 May 2024 17:55 next collapse

ML is not an ENIAC situation. Computers got more efficient not by doing fewer operations, but by making what they were already doing much more efficient.

The basic operations underlying ML (e.g. matrix multiplication) are already some of the most heavily optimized things around. ML is inefficient because it needs to do a lot of that. The problem is very different.
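
To make “it needs to do a lot of that” concrete, here’s a rough cost for one dense matrix multiply; the dimensions and the chip’s efficiency are assumed for illustration:

```python
# A matrix multiply of an (m x k) block by a (k x n) weight matrix costs about
# 2*m*k*n floating-point operations. Accelerators already run this close to
# their hardware limits; the expense comes from sheer volume, not waste per op.
m, k, n = 2048, 4096, 4096            # assumed batch/hidden dimensions
flops = 2 * m * k * n

flops_per_joule = 1e12                # assumed ~1 TFLOP per joule for a modern accelerator
energy_j = flops / flops_per_joule
print(f"{flops:.2e} FLOPs -> ~{energy_j * 1000:.0f} mJ for one layer, one step")
# A large model runs thousands of such multiplies per token, over trillions of tokens.
```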

crispyflagstones@sh.itjust.works on 07 May 2024 20:51 collapse

There’s an entire resurgence of research into alternative computing architectures right now, being led by some of the biggest names in computing, because of the limits we’ve hit with the von Neumann architecture as regards ML. I don’t see any reason to assume all of that research is guaranteed to fail.

AlotOfReading@lemmy.world on 07 May 2024 21:11 collapse

I’m not assuming it’s going to fail, I’m just saying that the exponential gains seen in early computing are going to be much harder to come by because we’re not starting from the same grossly inefficient place.

As an FYI, most modern computers are modified Harvard architectures, not Von Neumann machines. There are other architectures being explored that are even more exotic, but I’m not aware of any that are massively better on the power side (vs simply being faster). The acceleration approaches that I’m aware of that are more energy-efficient (e.g. analog or optical accelerators) are also totally compatible with traditional Harvard/Von Neumann architectures.

crispyflagstones@sh.itjust.works on 08 May 2024 11:06 collapse

I don’t know that comparing it to ENIAC was meant to suggest the exponential gains would be identical, but we are currently in a period of exponential gains in AI and it’s not exactly slowing down. It just seems unthoughtful and uncritical to measure the overall efficiency of a technology by its very earliest iterations, when the field it’s based on is moving as fast as AI is.

UnderpantsWeevil@lemmy.world on 07 May 2024 18:06 collapse

The ENIAC drew 174 kilowatts and weighed 30 tons.

Combined electricity use by Amazon, Microsoft, Google, and Meta more than doubled between 2017 and 2021, rising to around 72 TWh in 2021.

it’s just getting off the ground

That’s what we’re afraid of, yes.

crispyflagstones@sh.itjust.works on 07 May 2024 20:37 collapse

Yeah, uh huh, efficiency isn’t really a measure of absolute power use, it’s a measure of how much you get done with the power. Nobody calls you efficient if you do nothing and use no power to do that nothing. Google, Amazon, Microsoft, and Meta all together could not get anything done as companies if they all had to split an ENIAC (vastly less powerful than an older model iPhone) between them. This is a completely meaningless comparison.

Absolute power consumption does matter, but global energy consumption is approximately 160,000 TWh per year, so the doubling means the largest cloud providers all together are now using less than 0.05% of all the energy used across the world. And a chunk of that extra 36 TWh is going to their daily operations, not just their AI stuff.
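
The arithmetic behind that percentage, using the 72 TWh figure quoted above and roughly 160,000 TWh of global energy use per year:

```python
# Share of global annual energy use represented by the four companies' 2021 electricity figure.
big_four_twh = 72            # combined 2021 electricity use quoted above
global_energy_twh = 160_000  # approximate global energy consumption per year

print(f"{big_four_twh / global_energy_twh:.4%}")   # ~0.0450%, i.e. under 0.05%
```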

The more context I add in to the picture, the less I’m worried about AI in particular. The overall growth model of our society is the problem, which is going to need to have political/economic solutions. Fixating on a new technology as the culprit is literally just Luddism all over again, and will have exactly as much impact in the long run.

UnderpantsWeevil@lemmy.world on 07 May 2024 21:00 collapse

Google, Amazon, Microsoft, and Meta all together could not get anything done as companies

Google’s biggest revenue stream is advertisement

Amazon’s biggest revenue stream is data hosting for national militaries and police forces.

Microsoft’s biggest revenue stream is subscriptions to software that was functionally complete 20 years ago

Meta’s biggest revenue stream is ads again

So 72-TWh of energy spent on Ads, Surveillance, Subscriptions, and Ads.

Absolute power consumption does matter, but global power consumption is approximately 160,000 TWh

If these firms were operating steel foundries or airlines at 72-TWh, I would applaud them for their efficiency. Shame they’re not producing anything of material value.

The more context I add in to the picture, the less I’m worried about AI in particular.

It’s not for you to worry about. The decision to rapidly consume cheap energy and potable water is entirely beyond your control. Might as well find a silver lining in the next hurricane.

Semi_Hemi_Demigod@lemmy.world on 07 May 2024 21:05 next collapse

So 72-TWh of energy spent on Ads, Surveillance, Subscriptions, and Ads.

Capitalism truly does end up with the most efficient distribution of resources

crispyflagstones@sh.itjust.works on 07 May 2024 21:28 collapse

I don’t like these companies for their cooperation/friendly attitude towards nation-states either, but your comments are insipid. AWS has like 2 million businesses as customers. They have 30% market share in the cloud space; of course they provide cloud services to cops and militaries. They’re cheap, and one of the biggest providers, period. I can’t find any numbers showing their state contracts outweigh their business contracts.

And, sure, plenty of those business contracts are for businesses that don’t do anything useful, but what you don’t seem to understand is that telecoms is vital to industry and literally always has been. It’s not like there’s a bunch of virtuous factories over here producing tons of steel and airplanes, and a bunch of computers stealing money over there. Those factories and airlines you laud are owned by businesses, who use computers and services like AWS to organize and streamline their operations. Computers are a key part of why any industry is as productive as it is today.

AI, and I don’t so much mean LLMs and stable diffusion here, even if they are fun and eye-catching algorithms, will also contribute to streamlining the operations of those virtuous steel foundries and airlines you approve of so heartily. They’re not counterposed to each other. Researchers are already making use of ML in the sciences to speed up research. That research will be applied in real-world industry. It’s all connected.

Its not for you to worry about. The decision to rapidly consume cheap energy and potable water is entirely beyond your control. Might as well find a silver lining in the next hurricane.

By the same token, you shouldn’t worry about it either? So insipid.

UnderpantsWeevil@lemmy.world on 07 May 2024 22:36 collapse

AWS has like 2 million businesses as customers.

None of them hold a candle to the Wild and Stormy Cloud Computing contract issued by the NSA.

crispyflagstones@sh.itjust.works on 08 May 2024 10:29 collapse

I don’t like defending Amazon, but your arguments are shockingly ignorant. Stop making things up on the spot and do a shred of research. The cost of the Wild and Stormy contract is ~half a billion, while AWS’s annual revenues are projected to top $100 billion this year.

So, less than half a percent of AWS’s annual revenues. Stop just making shit up off the cuff.

UnderpantsWeevil@lemmy.world on 08 May 2024 12:04 collapse

The cost of the Wild and Stormy contract is ~half a billion

It’s ten billion.

crispyflagstones@sh.itjust.works on 09 May 2024 00:59 collapse

If you do the numbers out on that, the volume doubles to 1% of gross revenues over that time period. Not really bolstering the point you were trying to make here, but you did catch me merely skimming the article because of how dull and bad this conversation is. This conversation is pointless because at the end of the day, AI is literally just a potentially very useful tool, which is why everybody’s freaking out about it. Being against AI as such just because bad people are also using it is kind of pointless.
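
For reference, the arithmetic behind that 1% (assuming the $10 billion covers the full multi-year term and AWS revenue holds around $100 billion a year; both are assumptions):

```python
# Contract value as a share of revenue over an assumed ten-year term.
contract_usd_bn = 10          # reported total value of the contract
annual_revenue_usd_bn = 100   # projected AWS annual revenue, per the figure above
assumed_term_years = 10       # assumption: value spread over roughly ten years

share = contract_usd_bn / (annual_revenue_usd_bn * assumed_term_years)
print(f"{share:.1%}")         # 1.0% of gross revenue over that period
```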

UnderpantsWeevil@lemmy.world on 09 May 2024 12:13 collapse

the volume doubles to 1% of gross revenues

One contract from one state agency worth 1% of all your gross revenues is substantial.

you did catch me merely skimming the article because of how dull and bad this conversation is

Uh huh. Okay.

crispyflagstones@sh.itjust.works on 09 May 2024 14:01 collapse

Yeah, you were trying to argue AWS is basically for the NSA and cops. That hilariously false claim is what I’ve been consistently rebutting this entire time. You’re moving the goalposts, and have been this entire conversation, which is why this is a dull and bad conversation. You didn’t start out arguing that 1% is “substantial.” You made a rather different argument. I never disputed that a contract amounting to 1% of a company’s annual revenues is significant; I disputed that that 1% means AWS is just a cop shop. Because that’s not how anything works.

You were wrong, and you were making shit up, and you’re moving the goalposts to avoid having to admit being wrong.

UnderpantsWeevil@lemmy.world on 09 May 2024 14:12 collapse

My guy, you’re arguing with yourself at this point. At the least, learn to read your own material before you try to fact check someone.

crispyflagstones@sh.itjust.works on 09 May 2024 14:17 collapse

I usually do, when the other person in the conversation doesn’t seem like an insincere ass and I’m not looking up an open and shut factual question I already know the answer to, like “is the majority of AWS’s business from cops and the NSA?”

And I was off by like half a percent because I skimmed, and that half a percent doesn’t actually make your point for you. We’re not arguing, because you have no arguments.

iAvicenna@lemmy.world on 07 May 2024 21:11 next collapse

main use cases: government surveillance and chatbot girlfriends

Chessmasterrex@lemmy.world on 07 May 2024 21:12 next collapse

It won’t be needed because nobody will have a job to pay for it. I foresee Kurt Vonnegut’s book “Player Piano” on steroids.

explodicle@sh.itjust.works on 08 May 2024 13:06 collapse

This focus on individual applications shifts blame onto consumers, when we should be demanding that energy prices include the external cost of production. It’s like guilt tripping over the “carbon footprint” (invented by big oil) of your car.