Linus Torvalds still uses an AMD RX 580 from 2017 — also ditches Apple Silicon for an Intel laptop (www.tomshardware.com)
from throws_lemy@lemmy.nz to tech@programming.dev on 02 Aug 05:10
https://lemmy.nz/post/26317262

Despite the rapid pace of GPU evolution and the hype around AI hardware, Linus Torvalds — the father of Linux — is still using a 2017-era AMD Radeon RX 580 as his main desktop GPU here in 2025. The Polaris-based card may be almost a decade old, but it has aged remarkably well in Linux circles thanks to robust and mature open-source driver support. Torvalds’ continued use of the RX 580, therefore, isn’t just boomer nostalgia. It’s a statement of practicality, long-term support, and his disdain for unnecessary complexity.

Spotted by Phoronix, the revelation came in a bug report about AMD’s Display Stream Compression (DSC), which was causing black-screen issues in Linux 6.17. Torvalds bisected the regression himself, eventually reverting the offending patch to keep kernel development moving. Ironically, DSC is what allows his Radeon RX 580 to comfortably drive his modern 5K ASUS ProArt monitor, a testament to how far open-source drivers have come.
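(For readers unfamiliar with the workflow: bisecting a regression is a binary search over the commit history, which git automates with git bisect; you repeatedly test the midpoint build until the first bad commit is isolated. Below is a minimal, purely illustrative Python sketch of the idea, where the commit names and the is_bad() check are hypothetical stand-ins for building and booting each candidate kernel and watching for the black screen.)

# Illustrative sketch of the bisection idea behind git bisect, not the real tool.
# Assumptions: commits are listed oldest to newest, the regression appears at some
# commit and persists afterwards, and is_bad() stands in for "build this kernel,
# boot it, and check whether the display goes black".

def first_bad_commit(commits, is_bad):
    """Binary-search for the earliest commit where is_bad(commit) is True."""
    lo, hi = 0, len(commits) - 1   # the first bad commit lies somewhere in commits[lo..hi]
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(commits[mid]):
            hi = mid               # regression is at mid or earlier
        else:
            lo = mid + 1           # regression is after mid
    return commits[lo]

if __name__ == "__main__":
    # Hypothetical history in which commit "e5" is the patch that breaks DSC.
    history = ["a1", "b2", "c3", "d4", "e5", "f6", "g7"]
    first_bad = history.index("e5")
    print(first_bad_commit(history, lambda c: history.index(c) >= first_bad))  # -> e5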

“… same old boring Radeon RX 580,” Torvalds wrote in an email to the Linux Kernel Mailing List (LKML), reverting the patch for now so development can continue uninterrupted. That one line from the man himself speaks volumes about his preference for stability over novelty.

#tech

mindbleach@sh.itjust.works on 02 Aug 05:19 next collapse

I have the same GPU and Blender still doesn’t give a fuck.

Lembot_0004@discuss.online on 02 Aug 05:51 next collapse

You: Please, draw a triangle.
Blender: I don’t give a fuck what you want – I won’t draw anything with this GPU!

(Safety :) here before someone starts seriously explaining that the 580 is totally enough for Blender)

ExtremeDullard@lemmy.sdf.org on 02 Aug 08:32 collapse

That’s pretty much my experience with Blender: the Blender release cycle seems to be hell-bent on shutting out everybody who doesn’t have the latest GPU-du-jour. i.e. if you don’t have infinite resources to throw at the latest compute-cum-space-heater device, you’re permanently stuck with a late version 2 or 3.

iAmTheTot@sh.itjust.works on 02 Aug 05:36 next collapse

I’m going to hazard a guess that he’s not doing a lot of high resolution, high refresh gaming on it.

olosta@lemmy.world on 02 Aug 05:40 next collapse

The reasons to upgrade from this GPU are to play AAA games from the last three years, do AI/ML, run creative tools (3D, video…), or to save a few watt-hours. If you don’t care about any of that, upgrading is just wasteful.

LodeMike@lemmy.today on 02 Aug 06:40 collapse

I remember watching an LTT video about Torvalds’ setup and they mentioned he said something like “I don’t game, this [the 580] is overkill”

And it probably still is overkill.

Sibbo@sopuli.xyz on 02 Aug 05:43 next collapse

“news”

Rentlar@lemmy.ca on 02 Aug 06:09 next collapse

That’s pretty RAD-eon.

who@feddit.org on 02 Aug 06:26 next collapse

isn’t just boomer nostalgia.

Of course it isn’t, Mr. Nasir. Linus is not a boomer.

ExtremeDullard@lemmy.sdf.org on 02 Aug 06:37 next collapse

“Boomer” has become an ageist term used indiscriminately by younger generations to refer to people they perceive as old. My generation said “Pop” or “Grandpa”.

Just like “Hacker” used to be something to be proud of and now means anyone, with or without skills, up to no good with computers, and just like “Beg the question” has nothing to do with supplication, this is simply the English language shifting in real time right in front of your eyes.

AllNewTypeFace@leminal.space on 02 Aug 09:42 collapse

“boomer” now just means “no-longer-young person”, and we’re a decade or two away from millennials being boomers.

BananaTrifleViolin@lemmy.world on 02 Aug 06:30 next collapse

This is tech writers thinking everyone lives like them. An 8-year-old graphics card is fine if you’re not doing high-end gaming or video editing. That card will still run a 4K desktop, and probably multi-screen 4K desktops, without any issue.

For most users, graphics cards have long been at a level where they don’t need upgrading. A mid-range graphics card from even 10 years ago is more than powerful enough to watch video or use desktop programs, and is even fine for a wide range of games.

It’s only if you want high-end 3D gaming that upgrading is needed, and arguably even that has gone beyond the point of diminishing returns in the last 5 years for the majority of users and titles.

I do game a fair bit, and my RTX 3070, which is 5 years old, really doesn’t need upgrading. Admittedly it was higher-end when it launched, but it still plays a game like Cyberpunk 2077 at high-end settings. It’s arguable whether most users would even notice the difference with the “ultra” settings on most games, let alone actually need them. New cards are certainly very powerful, but the fidelity jump for the price and power just isn’t there in the way it would have been when upgrading a card even 10 years ago.

iAmTheTot@sh.itjust.works on 02 Aug 14:34 next collapse

I do game a fair bit, and my RTX 3070, which is 5 years old, really doesn’t need upgrading.

Resolution and frame rate? Because at 4k mine was struuuuuggling, I had to get more VRAM.

fuckwit_mcbumcrumble@lemmy.dbzer0.com on 02 Aug 20:12 collapse

The 30 series really suffers from its lack of VRAM. Amazing GPU, just not enough VRAM to keep it fed.

TheV2@programming.dev on 03 Aug 08:16 collapse

I played Cyberpunk 2077 at the lowest settings with a GTX 1060. If realistic graphics can be a feature of a game, not caring about realistic graphics is a feature of mine.

matelt@feddit.uk on 02 Aug 07:24 next collapse

Nice, I don’t feel so bad about my RX 560 now

owiseedoubleyou@lemmy.ml on 02 Aug 08:44 next collapse

Damn, an RX 580 is now considered “outdated”?

RisingSwell@lemmy.dbzer0.com on 02 Aug 08:59 next collapse

Is that really surprising? It’s old as shit as far as computer hardware goes.

[deleted] on 02 Aug 10:09 next collapse

.

RisingSwell@lemmy.dbzer0.com on 02 Aug 10:14 next collapse

He’s using it because it fits his purpose. If you replaced my laptop GPU with a 580 a lot of my games would stop working.

[deleted] on 02 Aug 10:23 collapse

.

RisingSwell@lemmy.dbzer0.com on 02 Aug 11:00 collapse

My laptop, due to MSI’s poor decisions and my not knowing better, is 4K, so uh… most games, but definitely Cyberpunk, probably Baldur’s Gate 3, and CS2 probably has issues on some maps because other people bitch about Nuke and Train.

I played Cyberpunk originally at 1080p medium on a 1070 and it wasn’t the greatest experience, but I’ve had worse. I can probably provide a more updated list when I get home; it just requires scrolling because I have too many games installed.

[deleted] on 02 Aug 11:02 collapse

.

RisingSwell@lemmy.dbzer0.com on 02 Aug 11:08 collapse

I tried 1080p and it looks like shit on the 4K screen; I don’t really understand why.

Lesson learned though: next laptop is 1440p. I always learn one major lesson from a new computer; last time it was ‘hyperthreading matters’ and this time it’s ‘4K laptop is dumb’.

fuckwit_mcbumcrumble@lemmy.dbzer0.com on 02 Aug 20:10 next collapse

Ugh, laptops without hyperthreading should be illegal. At least the Core 200 series has E-cores, but you’d still need 2 more P-cores to make up for the lack of HT.

RisingSwell@lemmy.dbzer0.com on 03 Aug 02:00 collapse

My first gaming laptop had an i5-7600K, good and all, but yeah, no hyperthreading. On the plus side, that laptop was how I first learned that some laptops have replaceable CPUs, so I put an i7 in it after a couple of years.

fuckwit_mcbumcrumble@lemmy.dbzer0.com on 03 Aug 06:24 collapse

Wait. A gaming laptop had a desktop i5? If you’re going to put a desktop CPU into a laptop, it better be the best one. A laptop i7 would probably crush that thing unless it was overclocked, especially at 7th Gen, right as Zen was rolling out of the gate. Games had started to properly take advantage of more than 4 threads by that point.

4th or 5th Gen. was the last time you got replaceable laptop CPUs. But to this day they still make laptops with desktop CPUs.

RisingSwell@lemmy.dbzer0.com on 03 Aug 08:05 collapse

It was an old one, a Clevo; they make fat laptops. I had the i5 overclocked to 4.7 or 4.8 GHz? It didn’t appreciate it, but it also didn’t crash or thermal throttle, so the only real issue was the thrust produced by the fans. The i7-7700 went slightly higher but not hotter, and that’s what I put in it after I learned hyperthreading really mattered, not just having 4 cores.

I miss that laptop: 5 screws to take off the back and it just slides off? Fucking god-tier accessibility. I have an MSI Titan now and it’s like 17 screws and a bunch of clips. It also has the fastest processor that was available when I got it, because I learned my lesson haha.

[deleted] on 02 Aug 23:27 collapse

.

RisingSwell@lemmy.dbzer0.com on 03 Aug 02:01 collapse

I just left the screen at 4K and told the game 1080p. I figure I’ll ignore the potential issue until I run into performance problems in a couple of years.

CallMeAnAI@lemmy.world on 02 Aug 10:24 collapse

“Perfectly fine” 👌🤣 You’re just being argumentative and completely ignoring the purpose of 98% of non-OEM commercial GPUs.

Congratu-fuckin-lations on not buying a Porsche for your 5-minute commute. Elitist 🤣

[deleted] on 02 Aug 10:37 next collapse

.

Walk_blesseD@piefed.blahaj.zone on 03 Aug 02:15 collapse

Dude go outside

CallMeAnAI@lemmy.world on 02 Aug 10:25 collapse

No, it’s ml jackasses being pedantic and starting shit as usual.

[deleted] on 02 Aug 16:45 collapse

.

CallMeAnAI@lemmy.world on 02 Aug 20:07 collapse

Naw fuck you, 🙄 me you 🤡.

Everything Nvidia/AMD/Intel makes on the external side, including this, is targeted at gamers. You know exactly what you’re doing, and it makes you look dumb as fuck.

“I’m going to buy a $300 GPU to fucking run ssh” - pedantic moron Tankie.

[deleted] on 02 Aug 20:41 collapse

.

JATtho@lemmy.world on 02 Aug 11:58 collapse

It has been a GOAT-tier supported GPU on Linux. Still plays Doom (2016)-era games at nearly max settings. Since last year I have started calling it a potato that never rots.

guynamedzero@lemmy.dbzer0.com on 02 Aug 12:28 collapse

Ohhhhhh I might have to steal that phrase

Blackmist@feddit.uk on 02 Aug 09:07 next collapse

Good for him. My missus had one and it died pretty quickly due to RAM failure.

One of many reasons I’d never buy a laptop with soldered RAM.

CallMeAnAI@lemmy.world on 02 Aug 10:27 next collapse

Linus doesn’t game??? Holy fuck, let’s get channel 12 up in here to figure out what’s going on in Linus House.

vext01@lemmy.sdf.org on 02 Aug 12:18 next collapse

Me using onboard graphics…

socialsecurity@piefed.social on 02 Aug 14:38 next collapse

A lot of millennial gamers are going this route, since endless upgrading does not yield much improvement and, of course, fuck Nvidia.

Hardware should be used until it either does not do the job required or it breaks outright.

douglasg14b@programming.dev on 03 Aug 08:27 collapse

This is how I’ve always used hardware. Y’all out here buying up new parts each year they release?!?

It’s like iPhone crowd energy, but for PC parts I suppose.

thatradomguy@lemmy.world on 02 Aug 18:06 next collapse

Linus ditching Apple? Now that’s a cold day in hell.

Sidyctism2@discuss.tchncs.de on 03 Aug 02:34 collapse

same bud