from Powderhorn@beehaw.org to technology@beehaw.org on 07 Oct 23:08
https://beehaw.org/post/22560269
These numbers don’t make any sense to me, as the hed is about buying lots of chips, and the body is about power use. No matter how you slice it, $8.76/kWh is a terrible fucking investment … if that’s chip-inclusive, that’s another story.
Still, the audacity of saying “we’re going to invest $1 trillion” is Dr. Evil-level humour.
OpenAI is signing about $1 trillion (€940 billion) in deals this year for computing power to keep its artificial intelligence dreams humming.
On Monday the outfit inked a deal with AMD, which follows earlier tie-ups with Nvidia, Oracle and CoreWeave, as Sam Altman’s company scrambles to secure enough silicon to keep ChatGPT online and the hype machine alive.
The latest commitments would give OpenAI access to more than 20 gigawatts of computing capacity over the next decade, roughly the output of 20 nuclear reactors. At about $50 billion per gigawatt, according to OpenAI’s estimates, the total tab hits that $1 trillion figure.
Analysts are not convinced this financial engineering makes any sense. DA Davidson analyst Gil Luria said: “OpenAI is in no position to make any of these commitments,” adding that it could lose about $10 billion this year.
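For what it’s worth, the article’s own figures do multiply out to the headline number. Here’s a quick back-of-envelope using only the $50 billion per gigawatt and 20 GW quoted above; the 8,760 hours per year and the ten-year amortization are my assumptions, not anything stated in the article:

```python
# Sanity check on the article's figures. The hours-per-year and the ten-year
# amortization below are illustrative assumptions, not from the article.
capacity_gw = 20
cost_per_gw_usd = 50e9                               # OpenAI's own estimate, per the article
total_usd = capacity_gw * cost_per_gw_usd
print(f"total: ${total_usd / 1e12:.1f} trillion")    # -> $1.0 trillion, the headline figure

hours_per_year = 8760                                # assumption: hardware running flat out
kwh_per_decade = capacity_gw * 1e6 * hours_per_year * 10   # GW -> kW, over ten years
print(f"hardware cost per kWh consumed: ${total_usd / kwh_per_decade:.2f}")  # ~$0.57/kWh
```

That per-kWh figure covers hardware only and says nothing about the electricity bill, which is exactly the ambiguity the post above is complaining about: the number changes completely depending on what the money is assumed to buy.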
Data center scale is usually given in terms of power consumption, not computing power. The trillion dollars is meant to buy enough hardware to suck up 20GW of power, and probably none of the money will go towards power generation.
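To put that 20 GW in hardware terms, here’s a rough sketch; the ~1.2 kW per accelerator (chip plus cooling and networking overhead) is purely my assumption, not a number from the article or this thread:

```python
# Rough conversion of 20 GW of data-center capacity into accelerator counts.
# The 1.2 kW per accelerator (chip + cooling + networking overhead) is an
# assumed, illustrative figure, not from the article or the thread.
capacity_w = 20e9
watts_per_accelerator = 1200
print(f"~{capacity_w / watts_per_accelerator / 1e6:.0f} million accelerators")  # ~17 million
```

Whatever the exact per-chip draw, the order of magnitude is tens of millions of chips, which is why these deals get denominated in gigawatts rather than units.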
Ed Zitron’s gonna have a field day with this. OpenAI’s motto seems to be scaling “to infinity and beyond”. But what can you expect from a techbro CEO who takes Dyson spheres seriously.
That’s a Roomba that just rolls around, right?
Apparently they were never meant to be taken seriously. Dyson’s article was satirical. Techbros, unfortunately, don’t pick up on this.
Dyson spheres are a joke - Angela Collier
It’s the torment nexus all over again
It’s strange that the concept of efficiency seems to have been abandoned. Is consumption of vast computing resources no longer seen as an indication of a design flaw?
Cost efficiency is all there is now
I have to doubt the cost efficiency too.
Do you reckon OpenAI could get cheaper power or GPUs? Or something else? Could Nvidia get lower production costs for these?
I’m talking about the software side of things. Generative “AI” seems to be a “brute force” approach to artificial intelligence - just throwing hardware at the problem instead of finding a better approach.
Given the limitations of GenAI, it just feels crazy to keep going this way. Like a sunk-cost fallacy.
These are just my thoughts though, not a real scientific analysis.
Recent advances using “dynamic sparsity” or “selective activation” approaches push efficiency beyond brute force. This is how Chinese labs began to compete without anywhere near the same number of GPUs or power.
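For anyone curious, here’s a minimal sketch of what “selective activation” means in practice, in the mixture-of-experts style: a router picks the top-k experts per token, so only a fraction of the weights do any work. Everything here (sizes, the routing scheme, the softmax over the chosen experts) is illustrative, not any particular lab’s implementation:

```python
# Toy "selective activation" layer: only the top-k experts run per token,
# so compute scales with k rather than with the total number of experts.
# All sizes and the routing scheme are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 8, 2
router = rng.standard_normal((d_model, n_experts))
experts = rng.standard_normal((n_experts, d_model, d_model))   # one dense block per expert

def selective_layer(x):
    """x: (tokens, d_model) -> (tokens, d_model), touching only top_k experts per token."""
    scores = x @ router                                  # routing logits, (tokens, n_experts)
    chosen = np.argsort(-scores, axis=1)[:, :top_k]      # indices of the top-k experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = chosen[t]
        weights = np.exp(scores[t, sel])
        weights /= weights.sum()                         # softmax over the chosen experts only
        for w, e in zip(weights, sel):                   # only 2 of 8 matmuls per token here
            out[t] += w * (x[t] @ experts[e])
    return out

tokens = rng.standard_normal((4, d_model))
print(selective_layer(tokens).shape)                     # (4, 64), at ~25% of the dense FLOPs
```

A dense layer would push every token through all eight experts; the saving is roughly the ratio k/n, which is where the “compete with far fewer GPUs” claim comes from.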