Nvidia overtakes Apple as the second most valuable company. (www.cnbc.com)
from Timely_Jellyfish_2077@programming.dev to technology@lemmy.world on 05 Jun 20:40
https://programming.dev/post/15116329

#technology

threaded - newest

BombOmOm@lemmy.world on 05 Jun 20:59 next collapse

Selling shovels during a gold rush is the best way to get rich. :)

RecallMadness@lemmy.nz on 06 Jun 08:50 collapse

While suing everyone else that makes shovel handles that work with your shovel heads.

WhyDoYouPersist@lemmy.world on 05 Jun 20:59 next collapse

Fuck this stupid world we’ve built.

_sideffect@lemmy.world on 05 Jun 21:17 next collapse

Lmao, stupidity

flamingo_pinyata@sopuli.xyz on 05 Jun 21:23 next collapse

Time to sell Nvidia stock. Congrats to Huang for pulling it off. Get out when you’re on top.

eager_eagle@lemmy.world on 06 Jun 01:16 next collapse

imagine how many leather jackets he can buy now

kromem@lemmy.world on 06 Jun 03:02 next collapse

Depends on whether they acquire/acqui-hire from here, or whether they don’t and get their lunch stolen by photonics plays.

Lemonyoda@feddit.de on 06 Jun 10:38 collapse

This is not how you do shares… :o

dogslayeggs@lemmy.world on 05 Jun 21:23 next collapse

I didn’t know there were that many PC gamers out there. /s

Seriously, though, the pivot from making video cards to investing in AI and crypto is kinda genius. The crypto thing mostly fell into their laps, but they leaned in. The AI thing, though, I’m not sure how they decided to focus on that or who first pitched the idea to the board; but that was business genius.

chrash0@lemmy.world on 05 Jun 21:31 next collapse

same as with crypto. the software community started using GPUs for deep learning, and they were just meeting that demand

dkc@lemmy.world on 05 Jun 21:45 next collapse

To your point, when I look at both crypto and AI I see a common theme: they both need a lot of computation; call it supercomputing. Nvidia makes products that provide a lot of compute. Until Nvidia’s competitors catch up, I think they’ll do fine as more applications requiring that much computation are found.

Basically, I think of Nvidia as a super computer company. When I think of them this way their position makes more sense.

Aceticon@lemmy.world on 06 Jun 16:13 collapse

Also, those things are highly parallelizable and mainly deal with vector and matrix data. The same design that works fine for modern 3D graphics (lots of really simple but fast processing units, optimized for vector and matrix operations, working in parallel) turns out to also work fine for things like neural networks. For example, each point of a frame displayed on the screen can be calculated in parallel with all the other points (in what’s called a fragment shader), and most 3D data is made of 3D vectors while the transforms are 3x3 (or 4x4 homogeneous) matrices. Likewise, the neurons in each layer of a neural network are quite simple and can all be processed in parallel (if the architecture weren’t layered, GPUs would be far less effective for it).

To a large extent, Nvidia got lucky that the stuff that became fashionable now works by doing lots of simple, highly parallelizable computations; otherwise it would have been the makers of CPUs that gained from the rise of this compute-hungry tech.
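A toy sketch of the point above: the same matrix primitive serves both workloads. This is just NumPy on the CPU with made-up shapes, not anything GPU-specific, but it shows that transforming vertices and evaluating a dense neural layer are the same kind of batched matrix operation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Graphics-style workload: transform a batch of 3D points with a 3x3 matrix.
points = rng.standard_normal((10_000, 3))   # 10k vertices
rotation = np.eye(3)                        # identity transform, for illustration
transformed = points @ rotation.T           # one op over all points at once

# Neural-net-style workload: one dense layer over a batch of inputs.
x = rng.standard_normal((10_000, 128))      # 10k samples
w = rng.standard_normal((128, 64))          # layer weights
activations = np.maximum(x @ w, 0.0)        # matmul + ReLU, every row independent

print(transformed.shape)   # (10000, 3)
print(activations.shape)   # (10000, 64)
```

Every row in both products is independent of every other row, which is exactly the property a GPU’s thousands of simple cores exploit.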

swayevenly@lemm.ee on 06 Jun 02:36 next collapse

DLSS was a necessity: a way to deliver performance gains at a pace their raw hardware could not keep up with.

kromem@lemmy.world on 06 Jun 03:01 next collapse

They were doing that for years before it became popular. The same tech for video graphics just so happened to be useful for AI and big data, and they doubled down on supporting enterprise and research efforts in that when it was a tiny field before their competitors did, and continued to specialize as it grew.

Supporting niche uses of your product can sometimes pay off if that niche hits the lottery.

webghost0101@sopuli.xyz on 06 Jun 04:52 collapse

Hardware made for heavy computing being good at stuff like this isn’t all that shocking though. The biggest gamble is whether a new technology will take off at all. Nvidia, just like Google, has the capital to diversify: bet on all the horses at once and drop the losers later.

slacktoid@lemmy.ml on 06 Jun 05:55 next collapse

To their credit they’ve been pushing GPGPUs for a while. They did position themselves well for accelerators. Doesn’t mean they don’t suck.

RecallMadness@lemmy.nz on 06 Jun 08:55 collapse

They were first to market with a decent GPGPU toolkit (CUDA) which built them a pretty sizeable userbase.

Then when competitors caught up, they made it as hard as possible to transition away from their ecosystem.

Like Apple, but worse.

I guess they learned from their gaming heyday that not controlling the abstraction layer (e.g. OpenGL, DirectX, etc.) means they can’t do lock-in.

K1nsey6@lemmy.world on 05 Jun 21:50 next collapse

Pelosi’s insider trading is paying off for her.

flop_leash_973@lemmy.world on 05 Jun 22:07 next collapse

The real game now is how long it will last before the hype dies and the floor falls out of “AI”, taking a good chunk of their stock gains with it.

Damage@feddit.it on 05 Jun 23:06 next collapse

Well, they also make good silicon that is apparently useful for different things, that may not change… If it’s good for the next fad as well, they’ll just stay on top.

bamboo@lemm.ee on 05 Jun 23:36 collapse

I don’t think generative AI is going anywhere anytime soon. The hype will eventually die down, but it’s already proved its usefulness in many tasks.

neshura@bookwormstory.social on 06 Jun 05:39 collapse

Is AI useful? Maybe. But is it profitable? AI will go the same way the .com era did: there will be a massive crash, and at the end of it you’ll see who actually had their pants on.

Nighed@sffa.community on 06 Jun 06:38 next collapse

Nvidia IS making a profit on it though. It’s the whole “in a gold rush, sell shovels” thing.

neshura@bookwormstory.social on 06 Jun 06:47 next collapse

My point is more that their revenue stream will take a temporary giant hit: while everyone is busy going bankrupt, the few AI companies that do make a profit will have better things to do than buy new accelerators right that instant.

Telodzrum@lemmy.world on 06 Jun 18:52 collapse

nVidia is selling shovels and picks during the AI gold rush.

bamboo@lemm.ee on 06 Jun 08:01 collapse

It can be quite profitable. A ChatGPT subscription is $20/m right now, or $240/year. A software engineer in the US is between $200k and $1m with all benefits and support costs considered. If that $200k engineer can use ChatGPT to save 2.5 hours in a year, then it pays for itself.
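The back-of-envelope math in the comment above checks out, as a quick sketch (the ~2080 work hours per year is an assumed figure, and $200k is the low end of the quoted range):

```python
# ChatGPT subscription vs. engineer time, per the comment's numbers.
subscription_per_year = 20 * 12           # $240/year
engineer_cost_per_hour = 200_000 / 2080   # ~$96/h, assuming 2080 work hours/year
hours_to_break_even = subscription_per_year / engineer_cost_per_hour
print(round(hours_to_break_even, 1))      # ~2.5 hours saved per year
```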

neshura@bookwormstory.social on 06 Jun 08:24 next collapse

It’s quite funny that you think ChatGPT is making a profit on that $20 subscription if it’s replacing a software dev.

The bust won’t be because it’s not profitable to use AI but because the companies selling the service cannot do so at rates which are both profitable and actually marketable. Case in point: OpenAI has not made a single cent of profit so far (or at least not reported a profit). The way AI is currently shoved in everywhere is not sustainable because the cost of running an AI model cannot be recuperated by most of these new platforms.

bamboo@lemm.ee on 06 Jun 16:00 collapse

OpenAI is a non-profit. Further, US tech companies usually take many years to become profitable. It’s called reinvesting revenue, more companies should be doing that instead of stock buybacks.

Let’s suppose hosted LLMs like ChatGPT aren’t financially sustainable and go bust, though. As a user, you can also just run them locally, and as smaller models improve, this is becoming more and more popular. It’s likely how Apple will be integrating LLMs into their devices, at least in part, and Microsoft is going that route with “Copilot+ PCs” that start shipping next week.

Integration aside, you can run 70B models on an overpriced $5k MacBook Pro today that are maybe half as useful as ChatGPT. The cost to do so exceeds a ChatGPT subscription, but to reuse my numbers from before, a $5k MacBook Pro running Llama 3 70B would have to save an engineer one hour per week to pay for itself in the first year. In subsequent years only the electricity costs would matter, which for a current-gen MacBook Pro would be about equivalent to the ChatGPT subscription in expensive energy markets like Europe, or half that or less in the US.

In short, you can buy overpriced Apple hardware to run your LLMs, do so with high energy prices, and it’s still super cheap compared to a single engineer such that saving 1 hour per week would still pay for itself in the first year.
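The hardware payback claim above works out with the same assumed figures ($200k/year engineer, ~2080 work hours/year):

```python
# Local-LLM hardware vs. engineer time, per the comment's numbers.
hardware_cost = 5_000                     # the $5k MacBook Pro
engineer_cost_per_hour = 200_000 / 2080   # ~$96/h, as before
hours_to_pay_off = hardware_cost / engineer_cost_per_hour
hours_per_week = hours_to_pay_off / 52    # spread over the first year
print(round(hours_to_pay_off), round(hours_per_week, 1))  # 52 hours, i.e. 1.0 h/week
```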

neshura@bookwormstory.social on 06 Jun 16:13 collapse

Yeah I don’t know why you keep going on about people using AI when my point was entirely that most of the companies offering AI services don’t have a sustainable business model. Being able to do that work locally if anything strengthens my point.

frezik@midwest.social on 06 Jun 11:39 collapse

I’ve seen pull requests filled with ChatGPT code. I consider my dev job pretty safe.

bamboo@lemm.ee on 06 Jun 16:02 collapse

ChatGPT isn’t gonna replace software engineers anytime soon. It can increase productivity though; that’s the value LLMs provide. If someone made a shitty pull request filled with obvious ChatGPT output, that’s on them and not the technology. Blaming ChatGPT for a programmer’s bad code is like blaming the autocomplete in their editor: just because the editor suggests something doesn’t mean you have to accept it when it’s wrong.

Zatore@lemm.ee on 06 Jun 03:46 next collapse

I’m holding on at least till the stock split

filister@lemmy.world on 06 Jun 03:52 next collapse

The big AI bubble

photonic_sorcerer@lemmy.dbzer0.com on 06 Jun 08:27 collapse

Nvidia and other chipmakers produce actual, useful products. They’ll be sitting pretty after the bubble pops.

mal3oon@lemmy.world on 06 Jun 09:43 collapse

Their main growth driver is data centers; if that demand dries up within 2 years, the bubble will pop. Especially if the theoretical architecture of neural networks changes, the need for this kind of high-performance hardware will decrease.

photonic_sorcerer@lemmy.dbzer0.com on 06 Jun 13:06 next collapse

Then we’ll all get cheaper GPUs! Oh no!

Aceticon@lemmy.world on 06 Jun 16:02 next collapse

Well, there was a period after the last Bitcoin bubble burst when the best way to get a good graphics card for cheap was buying a used one from the Bitcoin mining operations that were closing down.

lud@lemm.ee on 06 Jun 21:09 collapse

No, not really: the GPUs that data centers are buying don’t even have display outputs, so they’re useless to the vast majority of home users.

If you run a home lab then maybe.

frezik@midwest.social on 06 Jun 17:04 next collapse

See also: Sun Microsystems, who made tons of the servers that drove the dotcom boom. They didn’t fare so well afterward.

This is a “grab the pile of cash and be happy” situation.

Aux@lemmy.world on 06 Jun 17:27 collapse

CUDA has a lot of applications outside of AI. They’ll just refocus on the next bubble and will continue hoarding wealth.

[deleted] on 06 Jun 04:14 next collapse

.

zewm@lemmy.world on 06 Jun 05:10 next collapse

All that value and they still can’t get their video cards to work worth a shit in Linux.

victorz@lemmy.world on 06 Jun 08:35 next collapse

I’ve been using a 2080 Super since 2020 and it’s been mostly gravy. Granted, I’ve not been using anything Wayland-related. But I’m gaming on Steam and shit and it works wonderfully; better performance than on Windows, though there is some slight audio delay, a few milliseconds over Windows.

I’ve been looking to switch to Hyprland but it was a bit glitchy with gaming and screen sharing sometimes so I’m holding off on that until I jump over to the AMD ship. It’ll be sweet.

mal3oon@lemmy.world on 06 Jun 09:41 next collapse

What card are you using? Their Linux support in the past few years has been impressive. They even have open source drivers now (still beta). And thanks to Proton, gaming is seamless on Linux. I don’t see the issue you’re describing?

zewm@lemmy.world on 06 Jun 16:59 collapse

I was using a 3090 but had to swap to an AMD card due to too many crashes and visual glitches/artifacts.

dev_null@lemmy.ml on 06 Jun 11:39 next collapse

I’ve been using Nvidia cards on Linux for many years and never had issues. I did have issues with the laptop cards (Optimus switching), but on the desktop it was always flawless for me.

accideath@lemmy.world on 06 Jun 11:51 next collapse

I mean, they work. But the drivers aren’t as feature-complete as AMD’s or Intel’s. Wayland support was a strict no until very recently, gamescope support is still very hit-and-miss, and they’re less stable than the competition. They’re completely usable though. My 1650 runs well, most of the time.

dev_null@lemmy.ml on 06 Jun 12:05 collapse

When I was in the market for a new card 2 years ago I looked into AMD, but learned that they don’t work as well as Nvidia for GPU passthrough to VMs, which I need to work. I’d love to switch because Nvidia is a shit company, but AMD GPU’s just don’t work for my use case.

I’m curious though because I don’t know what I’m missing. What are the features in AMD drivers that make it more complete?

accideath@lemmy.world on 06 Jun 15:25 collapse

As I said, AMD works much better with Wayland and gamescope, and thus has, for example, HDR and VRR support. Besides that, their Linux drivers are open source and more stable.

But to my knowledge, AMD GPUs pass through just fine to VMs? What was your problem with them?

dev_null@lemmy.ml on 06 Jun 17:12 collapse

Do many distros use Wayland now? I use Kubuntu and it doesn’t, so that probably explains why I never ran into any issue with that. Gamescope looks like some Wayland tool too from what I see. I don’t have an HDR monitor either. Looks like good stuff, that I just never needed so never noticed it not working.

But to my knowledge, AMD GPUs pass through just fine to VMs? What was your problem with them?

I asked on the VFIO subreddit back then and was told AMD cards have a bug where you have to restart the PC to switch between host and VM (which makes it no better than dualbooting since you have to restart anyway), this was not the case on Nvidia.

So now that Nvidia has open source drivers and works on Wayland, what’s the difference? Just gamescope?

zewm@lemmy.world on 06 Jun 16:57 collapse

I guess you aren’t using Wayland. It’s abysmal with Wayland. Especially electron apps. They just flicker and crash.

dev_null@lemmy.ml on 06 Jun 17:35 collapse

No, I’m on Kubuntu, it doesn’t use Wayland.

lud@lemm.ee on 06 Jun 21:06 collapse

Pretty sure Wayland is installed by default and maybe even enabled by default on new installs.

On the login screen there should be a button to switch between x11 and Wayland.

dev_null@lemmy.ml on 06 Jun 21:21 collapse

Oh yeah there is a button to switch on the login screen, but X11 is the default and I never saw a reason to switch the default.

lud@lemm.ee on 06 Jun 21:39 collapse

Personally, I wouldn’t say “it doesn’t use Wayland” when it absolutely can with a single mouse click and it works great.

dev_null@lemmy.ml on 06 Jun 23:36 collapse

Yes, it can, and by default it doesn’t use it.

DaPorkchop_@lemmy.ml on 06 Jun 16:39 collapse

Why does everyone always complain about Nvidia support on Linux? I’ve been using Nvidia GPUs on Ubuntu and Debian for years and it has never required any more effort than ‘sudo apt install nvidia-driver’.

TheGrandNagus@lemmy.world on 06 Jun 16:42 next collapse

Because they’re notoriously fickle and bug/breakage-prone.

Lucidlethargy@sh.itjust.works on 07 Jun 01:25 collapse

I don’t know, I don’t find Linux folks very fickle. Seems like they have the opposite problem more often than not.

TheGrandNagus@lemmy.world on 07 Jun 13:35 collapse

I’m referring to Nvidia’s dogshit Linux drivers.

zewm@lemmy.world on 06 Jun 16:59 next collapse

It’s not difficult to install the drivers. I recently had to swap out my 3090 for an AMD card because Wayland just crashes and works poorly with Nvidia.

TheGrandNagus@lemmy.world on 07 Jun 13:36 collapse

You should probably rephrase that to say Nvidia crashes and works poorly with Wayland.

Saying Wayland works badly with Nvidia is a bit like saying Linux doesn’t support Photoshop, rather than the other way around.

zewm@lemmy.world on 07 Jun 16:20 collapse

Tomato tomato

TheGrandNagus@lemmy.world on 08 Jun 07:00 collapse

Not really, the wording completely changes who is at fault.

When you say Wayland doesn’t work for Nvidia, it’s blaming Wayland, but Linux/Wayland isn’t at fault here, Nvidia is for providing drivers that aren’t fit for purpose.

If Nvidia drivers broke on Windows, nobody would say “Windows is broken for Nvidia”, they’d say the opposite, but with Linux we act like the problem is Wayland, for some reason.

KneeTitts@lemmy.world on 06 Jun 17:09 next collapse

won’t

bitwolf@lemmy.one on 06 Jun 17:09 next collapse

In my experience newer kernels and Wayland + nvidia is a huge mess.

I switched to AMD and have had 0 downtime, all the cool features nvidia touts, and fully working Wayland with no effort at all.

helenslunch@feddit.nl on 08 Jun 03:51 collapse

I guess if you had a positive experience that must mean everyone else is lying?

venusaur@lemmy.world on 06 Jun 05:29 next collapse

AI is this decade’s .com boom. Brace yourself for the crash.

Baggie@lemmy.zip on 06 Jun 23:41 collapse

God I hope so, but the next thing will likely be even more stupid than this, like NFTs and crypto.

Setnof@feddit.de on 06 Jun 08:24 next collapse

“Valuable”

MonkderDritte@feddit.de on 06 Jun 11:30 next collapse

So they rip customers off? Got it.

frezik@midwest.social on 06 Jun 11:48 next collapse

Last year’s Nvidia keynote at Computex had Jensen trying to get the audience to have an awkward, AI-generated sing along. The market thought this was great and sent the market cap over $1T.

For this year’s keynote, Jensen wandered the stage like he was looking for his cat while rambling about language models. The market thinks this is great and sent the market cap over $3T.

For the second biggest company on Earth, he is a shockingly bad speaker, and completely ill prepared. For some reason, the market loves this guy.

FlyingSquid@lemmy.world on 06 Jun 12:21 collapse

Is it that the market loves him or is it that a CEO’s keynote isn’t really that big a deal and is mostly an ego-stroking event?

Because I’m guessing what the market actually loves is the new products that are announced.

frezik@midwest.social on 06 Jun 12:23 collapse

That’s the thing: no new products were announced.

FlyingSquid@lemmy.world on 06 Jun 12:23 next collapse

I take back what I said in that case.

bitwolf@lemmy.one on 06 Jun 17:07 collapse

For consumers. They’re pushing out giant power-hungry GPUs for data centers to power LLMs.

Most of the valuation is likely consumers hyping the bull run, and speculation about just how much b2b revenue they will get.

frezik@midwest.social on 06 Jun 17:24 next collapse

They didn’t, though. Blackwell was announced before this, and there weren’t any real specifics beyond showing some prototypes. There’s some software stuff about accelerating Pandas and pre-built LLMs. That’s about it.

bitwolf@lemmy.one on 06 Jun 19:30 collapse

Don’t product announcements usually precede the stock hype?

Bartsbigbugbag@lemmy.ml on 06 Jun 19:58 collapse

No, usually it’s buy the hype sell the news.

jj4211@lemmy.world on 07 Jun 13:54 collapse

This weekend I proposed to my girlfriend, here’s what it taught me about B2B sales…

jadelord@discuss.tchncs.de on 06 Jun 17:23 next collapse

It is all AI hype isn’t it?

balder1991@lemmy.world on 06 Jun 21:29 next collapse

Now we just have to wait for the crash.

pipe01@programming.dev on 06 Jun 23:48 next collapse

Always has been

ours@lemmy.world on 07 Jun 13:45 collapse

Not quite, it used to be crypto hype.

sirboozebum@lemmy.world on 07 Jun 01:58 collapse

AI wank

starman@programming.dev on 06 Jun 20:04 next collapse

I wonder why AMD stock hasn’t gone up as significantly (about 500% vs Nvidia’s 3000% over the last 5 years). They make GPUs too.

lorty@lemmy.ml on 06 Jun 20:49 next collapse

This is all AI hype, in which Nvidia is sadly far ahead of its competitors.

guacupado@lemmy.world on 07 Jun 00:14 collapse

Problem is, that’s why they’re jacking up their prices and pumping out GPUs so quickly that a good chunk of them are DOA, and their customer service sucks too. No one likes dealing with Nvidia on any level, which is why everyone is making their own ASICs to get away from having to buy Nvidia GPUs.

GoodEye8@lemm.ee on 06 Jun 22:03 next collapse

Because of a lot of things. On the graphics side, RTX and DLSS left AMD playing catch-up (even if RTX isn’t really that big of a deal now); then there was Nvidia cards being better at crypto mining, and now it’s Nvidia cards being better at AI computation, plus Nvidia pivoting into the AI hardware space…

If you want to boil it down to the undeniable, it’s that Nvidia is just better at marketing. Everyone knows what Nvidia is doing. What is AMD doing? Besides playing catch-up to Nvidia.

Timely_Jellyfish_2077@programming.dev on 07 Jun 02:19 collapse

In one word: CUDA

starman@programming.dev on 06 Jun 20:07 next collapse

<img alt="" src="https://programming.dev/pictrs/image/a215aeae-6cf9-445e-8f52-a70ae105fc5c.jpeg">

UnderpantsWeevil@lemmy.world on 06 Jun 20:55 collapse

Most rational stock market analysis.

phoneymouse@lemmy.world on 07 Jun 15:05 next collapse

With a quarter the revenue of Apple

MystikIncarnate@lemmy.ca on 07 Jun 15:27 collapse

I feel like the executives are all in this “AI” echo chamber. Like, most people grossly misunderstand what AI is, what it does and what it cannot do, with current tech… And all the execs are sitting around in a circle jerk making up solutions using AI, for which there is no problem to solve.

Don’t get me wrong, some companies are doing cool shit with it. Not necessarily practical shit, but cool nonetheless, other companies just seem to be drinking the AI Kool aid and throwing it at fucking everything for no goddamned reason just to get in on the hype. Investors are close behind, trying to ride the coattails of their “success” to riches, and it’s all just a self-reaffirming system with no basis in reality.

Nvidia is the one profiting here, all this AI smoke and mirrors needs something for it to run on top of, they’re selling the physical tools to make it go. Whether it goes somewhere useful or drives off a goddamned cliff, doesn’t matter to Nvidia in the slightest. They made their money. Get wrecked.