Nvidia sells tiny new computer that puts big AI on your desktop (arstechnica.com)
from vegeta@lemmy.world to technology@lemmy.world on 15 Oct 03:16
https://lemmy.world/post/37365333

#technology

threaded - newest

SnoringEarthworm@sh.itjust.works on 15 Oct 04:25 next collapse

In fact, according to The Register, the GPU computing performance of the GB10 chip is roughly equivalent to an RTX 5070. However, the 5070 is limited to 12GB of video memory, which limits the size of AI models that can be run on such a system. With 128GB of unified memory, the DGX Spark can run far larger models, albeit at a slower speed than, say, an RTX 5090 (which ships with 32GB of VRAM). For example, to run the larger, 120-billion-parameter version of OpenAI’s recent gpt-oss language model, you’d need about 80GB of memory, which is far more than you can get in a consumer GPU.
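Back-of-envelope, that 80GB figure roughly checks out. A minimal sketch, assuming the released gpt-oss-120b weights are ~4-bit (MXFP4, so about 0.5 bytes per parameter) and a hypothetical 1.2x overhead factor for KV cache and activations:

```python
def model_memory_gb(params_billion: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Rough memory estimate: raw weight size times an overhead factor
    for KV cache, activations, and runtime buffers (overhead is a guess)."""
    return params_billion * bytes_per_param * overhead

# gpt-oss-120b at ~4-bit weights: ~60 GB raw, ~72 GB with overhead
print(round(model_memory_gb(120, 0.5), 1))   # 72.0

# The same model at 16-bit weights wouldn't fit even in 128GB:
print(round(model_memory_gb(120, 2.0), 1))   # 288.0
```

Longer contexts push the KV cache (and thus the real overhead factor) well past this estimate, which is why the quoted figure lands at "about 80GB" rather than 72.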

Or you could’ve just made GPUs, and then we’d all be gaming and calling each other shitheads in Valorant instead of - checks notes - literally stealing the water from poor communities.

Lembot_0004@discuss.online on 15 Oct 05:16 next collapse

You can game with bricks. Or a ball.

And throw away your notes. They are a completely disgraceful waste of paper.

SnoringEarthworm@sh.itjust.works on 15 Oct 07:01 collapse

Blocked for being a dick.

Engywuck@lemmy.zip on 15 Oct 09:09 collapse

Well, he’s right. Most likely you’re “wasting” energy and water as well, just in a different manner.

SnoringEarthworm@sh.itjust.works on 15 Oct 10:54 collapse

The difference is your comment managed to say that without being a dick about it.

mctoasterson@reddthat.com on 15 Oct 12:01 next collapse

If I had to come up with a steelman argument for small “AI focused” systems like this, I’d say that more development in this space makes the cost of entry cheaper, and eventually starves out the big tech garbage like OpenAI/Google/Microsoft.

If everyone who wants to use AI can process queries against a locally hosted open-source model with “good enough” results, that cuts out the big tech douchebags, or at least gives people an option not to participate in their data collection panopticon ecosystem.

NGram@piefed.ca on 15 Oct 12:32 collapse

Unfortunately Nvidia is also big tech so starving out (sort of) competitors doesn't help get rid of douchebags. It actually has the added risk of giving some of the douchebags a monopoly.

Buying one of those AMD Ryzen AI Max chips actually makes more sense now...

monogram@feddit.nl on 18 Oct 16:11 collapse

But this device will be air cooled; the freshwater argument is a real problem, but it only applies to hyperscalers and cloud AI.

This would actually be a good way to lower the demand for building more AI server farms.

MonkderVierte@lemmy.zip on 15 Oct 09:26 next collapse

Ok, but can you use it as a PC?

spacelord@sh.itjust.works on 15 Oct 10:32 collapse
unattributed@app.wafrn.net on 15 Oct 03:43 collapse

This thing is actually pretty cool…although it does have a bit of a power scaling issue compared to something like the AMD Strix Halo systems that are out now.

But dangit - that 200GbE networking….oooohhhh….


#computer #minipc #risc
athairmor@lemmy.world on 15 Oct 14:14 collapse

It’ll be interesting to see what hackers do with it when it fails to sell and the fire sale starts.