Wow, so much energy spent to get worse search results compared to 10 years ago.
But the data mining is off the charts Bob! More AI! More server things! We must be ready for our robot overlords!
I hope we have another Carrington event. That'll fix it.
And I'll bet roughly 50-90% of that usage is idiots doing crypto/AI garbage.
No lol, the amount of power that cloud services use is atrocious. The serverless trend makes it all add up. It's cheaper in terms of hardware, but oh boy do all those layers of abstraction make things heavy, especially loading the applets and the communication between them (I'm not sure "applets" is even the right name).
I actually don't believe the ~5% figure at all, especially given the track record of honesty (or the lack of it) these companies have.
Sure, datacenters are rather efficient, but multiply that by 330 million people in the US and it adds up.
The amount of power AI and crypto require is orders of magnitude more than what pretty much any regular application requires. The company I work at uses somewhere around 2000 CPU cores worth of compute at AWS (and we have ~100 microservices; we are a fairly complex org that way).
Generally speaking, an 80-core CPU system draws ~200 W. That means my company's entire fleet eats about 5 kW when running full bore (which it isn't doing all the time). My company is not a small company.
Compare that to what a single NVIDIA A100 eats. Those GPUs draw up to 400 W, and when doing AI/crypto work you run them as hard as possible, meaning you're pulling the full 400 W. So just 12 AI or crypto apps will eat the same amount of power that my company, with 100 different applications, eats while running full bore. Now imagine the model training of something like ChatGPT, which can eat pretty much as many GPUs as you can throw at it.
To put all of this in perspective: 5 kW is roughly what a mini-split system consumes.
Frankly, I'm way more concerned about my company's travel budget in terms of CO2 emissions than I am about our datacenter usage.
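For anyone who wants to check the arithmetic, here's a quick back-of-envelope sketch. The inputs (2000 cores, 80 cores and ~200 W per box, up to 400 W per A100) are the rough figures quoted above, not measurements:

```python
# Back-of-envelope power comparison using the rough figures from the comment.

fleet_cores = 2000      # total CPU cores rented at AWS
cores_per_box = 80      # assumed cores per server
watts_per_box = 200     # assumed draw of one 80-core server at full load

fleet_watts = fleet_cores / cores_per_box * watts_per_box
print(f"CPU fleet at full bore: {fleet_watts:.0f} W")        # ~5000 W (5 kW)

a100_watts = 400        # peak draw of a single A100, per the comment
gpus_to_match = fleet_watts / a100_watts
print(f"A100s needed to match that: {gpus_to_match:.1f}")    # ~12.5
```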
I would bet that more efficient hardware and the consolidation of corporate IT infrastructure into data centers make this much more energy efficient than the alternative. The fact that we are running much more layered and compute-intensive systems doesn't really change that.
Good thing US taxpayers will finally pay for the grid upgrades that are much needed after decades of malinvestment by the captains of the industry.
We will pay higher rates too, because industry deserves lower rates 🤡
That’s a small price to pay for having no healthcare and a shorter lifespan than those in poorer Western countries.
I want my daddy owner to live his best life, and an adequate social policy for peasants is the sacrifice he is willing to make. I respect my daddy.
I have a solution: spin up a dozen nuclear power plants to fuel AI to solve the problem. If there’s not enough nuclear, just burn coal.
This, but essentially just build SMRs or microreactors into colo plans.
Maybe not so great for the ones in the middle of cities, where space is at a premium… but a lot of colos are in unassuming spaces in suburbs or BFE.
The factory must grow