OpenAI signs $1 trillion worth of chip deals to feed its AI habit (www.fudzilla.com)
from Powderhorn@beehaw.org to technology@beehaw.org on 07 Oct 23:08
https://beehaw.org/post/22560269

These numbers don’t make any sense to me, as the hed is about buying lots of chips, and the body is about power use. No matter how you slice it, $8.76/kWh is a terrible fucking investment … if that’s chip-inclusive, that’s another story.

Still, the audacity of saying “we’re going to invest $1 trillion” is Dr. Evil-level humour.

OpenAI is signing about $1 trillion (€940 billion) in deals this year for computing power to keep its artificial intelligence dreams humming.

On Monday the outfit inked a deal with AMD, following earlier tie-ups with Nvidia, Oracle and CoreWeave, as Sam Altman's company scrambles to secure enough silicon to keep ChatGPT online and the hype machine alive.

The latest commitments would give OpenAI access to more than 20 gigawatts of computing capacity over the next decade, roughly the output of 20 nuclear reactors. At about $50 billion per gigawatt, according to OpenAI’s estimates, the total tab hits that $1 trillion figure.

Analysts are not convinced this financial engineering makes any sense. DA Davidson analyst Gil Luria said: “OpenAI is in no position to make any of these commitments,” adding that it could lose about $10 billion this year.

#technology


ryper@lemmy.ca on 07 Oct 23:54 next collapse

These numbers don’t make any sense to me, as the hed is about buying lots of chips, and the body is about power use. No matter how you slice it, $8.76/kWh is a terrible fucking investment … if that’s chip-inclusive, that’s another story.

Data center scale is usually given in terms of power consumption, not computing power. The trillion dollars is meant to buy enough hardware to suck up 20GW of power, and probably none of the money will go towards power generation.
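Taking the article's own numbers at face value, the arithmetic is easy to check: 20 GW at OpenAI's stated $50 billion per gigawatt does land on $1 trillion, which works out to $50 per watt of data-center capacity. A quick sketch (the per-watt breakdown is my own framing, not from the article):

```python
# Figures from the article: 20 GW of capacity at ~$50B per GW.
gigawatts = 20
cost_per_gw = 50e9  # dollars, OpenAI's own estimate

total = gigawatts * cost_per_gw
cost_per_watt = cost_per_gw / 1e9  # $/GW divided by watts per GW

print(f"Total: ${total / 1e12:.1f} trillion")   # → Total: $1.0 trillion
print(f"Per watt: ${cost_per_watt:.0f}")        # → Per watt: $50
```

That $50/W buys the hardware that *draws* the power; as the comment notes, the electricity itself is a separate (and comparatively small) bill.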

lichtmetzger@discuss.tchncs.de on 08 Oct 00:49 next collapse

Ed Zitron’s gonna have a field day with this. OpenAI’s motto seems to be scaling “to infinity and beyond”. But what can you expect from a techbro CEO who takes Dyson spheres seriously.

Powderhorn@beehaw.org on 08 Oct 01:17 next collapse

That’s a Roomba that just rolls around, right?

floofloof@lemmy.ca on 08 Oct 01:31 collapse

Apparently they were never meant to be taken seriously. Dyson’s article was satirical. Techbros, unfortunately, don’t pick up on this.

Dyson spheres are a joke - Angela Collier

MaggiWuerze@feddit.org on 08 Oct 06:12 collapse

It's the torment nexus all over again.

chromodynamic@piefed.social on 08 Oct 03:05 collapse

It’s strange that the concept of efficiency seems to have been abandoned. Is consumption of vast computing resources no longer seen as an indication of a design flaw?

ryannathans@aussie.zone on 08 Oct 04:06 collapse

Cost efficiency is all there is now

chromodynamic@piefed.social on 08 Oct 04:11 collapse

I have to doubt the cost efficiency too.

ryannathans@aussie.zone on 08 Oct 05:17 collapse

Do you reckon OpenAI could get cheaper power or GPUs? Or something else? Could Nvidia get lower production costs for these?

chromodynamic@piefed.social on 08 Oct 08:46 collapse

I’m talking about the software side of things. Generative “AI” seems to be a “brute force” approach to artificial intelligence - just throwing hardware at the problem instead of finding a better approach.
Given the limitations of GenAI, it just feels crazy to keep going this way. Like a sunk-cost fallacy.
These are just my thoughts though, not a real scientific analysis.

ryannathans@aussie.zone on 09 Oct 00:07 collapse

Recent advancements using “dynamic sparsity” or “selective activation” approaches increase efficiency beyond “brute force”. This is how China began to compete without anywhere near the number of GPUs or power.
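The idea behind "selective activation" is that only a small subset of the model's parameters run for any given input, so compute per token drops even though total parameter count stays large. A minimal top-k gating sketch (all names here are hypothetical illustrations, not any particular lab's implementation):

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def sparse_forward(x, experts, gates, k=2):
    """Route the input through only the top-k scoring experts.

    `experts` and `gates` are plain callables standing in for
    sub-networks and a learned router; only k experts execute,
    which is where the compute savings come from.
    """
    scores = softmax([g(x) for g in gates])
    top_k = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
    norm = sum(scores[i] for i in top_k)
    # The other experts are skipped entirely this step.
    return sum(scores[i] / norm * experts[i](x) for i in top_k)

# Toy demo: 4 "experts" (simple scalings); only 2 run per input.
experts = [lambda x, a=a: a * x for a in (1.0, 2.0, 3.0, 4.0)]
gates = [lambda x, b=b: b * x for b in (0.1, 0.9, 0.2, 0.4)]
result = sparse_forward(1.0, experts, gates, k=2)
```

With k=2 of 4 experts active, roughly half the expert compute is skipped per input; scaled to models with hundreds of experts, the savings become the headline efficiency gain.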