RTX 4070 Super launch day sales are rumored to be a ‘disaster’ – what’s going on with Nvidia’s new GPU? (www.techradar.com)
from Hypx@kbin.social to technology@lemmy.world on 21 Jan 2024 02:59
https://kbin.social/m/technology@lemmy.world/t/777511

Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

#technology

threaded - newest

LOLjoeWTF@lemmy.world on 21 Jan 2024 03:09 next collapse

My Nvidia 1070 with 8GB of VRAM is still playing all of my games. Not everything gets Ultra, and my monitor isn’t 4K. Forever I am the “value buyer”. It’s hard to put money into something that is only marginally better, though. I thought 16GB would be a no-brainer.

MeatsOfRage@lemmy.world on 21 Jan 2024 03:20 collapse

Exactly, people get too caught up in the Digital Foundry-ification of ultra max settings running at a perfect ~120 unlocked frames. Relax, my dudes, and remember the best games of your life were Perfect Dark with your friends running at 9 FPS.

1080p is fine, medium settings are fine. If the game is good you won’t sweat the details.

umbrella@lemmy.ml on 21 Jan 2024 03:26 next collapse

30fps is fine too on most games…

A friend of mine makes do with a GTX 960 at 720p and is perfectly fine with it; the fun games run, even new ones.

Maybe an upgrade to Digital Foundry-perfect 120fps would be worth it if it weren’t so damn expensive nowadays outside the US.

swayevenly@lemm.ee on 21 Jan 2024 06:02 collapse

Not to shill for them, but Alex makes it a point to run tests and include optimized settings for non-flagship hardware in every review he does. I’m not sure where your Digital Foundry characterization is coming from.

And no, 30fps is not fine…

umbrella@lemmy.ml on 21 Jan 2024 07:09 collapse

I was referring to the OP I was responding to.

Ragdoll_X@lemmy.world on 21 Jan 2024 03:33 next collapse

As someone who really doesn’t care much for game graphics I feel that a comment I wrote a few months ago also fits here:

I’ve never really cared much about graphics in video games, and a game can still be great with even the simplest of graphics - see the Faith series, for example. Interesting story and still has some good scares despite the 8-bit graphics.

To me many of these games with retro aesthetics (either because they’re actually retro or the dev decided to go with a retro style) don’t really feel dated, but rather nostalgic and charming in their own special way.

And many other people also don’t seem to care much about graphics. Minecraft and Roblox are very popular despite having very simplistic graphics, and every now and then a new gameplay video about some horror game with a retro aesthetic will pop up on my recommended, and so far I’ve never seen anyone complain about the graphics, only compliments about them being interesting, nostalgic and charming.

Also I have a potato PC, and it can’t run these modern 8K FPS games anyway, so having these games with simpler graphics that I can actually run is nice. But maybe that’s just me.

FlyingSquid@lemmy.world on 21 Jan 2024 12:01 next collapse

I kind of feel the same way about TV resolution. I have a 1080p TV and a 720p TV and I’m honestly fine with them. Sure, there’s better quality out there, but I can always go to the movies if I want that. And I have the advantages of TVs without any ‘smart’ bullshit. They can’t even connect to the internet.

I’m not saying no one else should buy 8K TVs or whatever; if that’s what you want, fine. But there are plenty of people I’ve talked to who feel the same way as me, so I’m glad they haven’t done anything like make us all change to new TVs again like they did when they updated to HD.

Trainguyrom@reddthat.com on 21 Jan 2024 18:39 collapse

I literally have a higher resolution computer monitor than I do TV. My computer monitor costs more than my TV did too!

EssentialCoffee@midwest.social on 22 Jan 2024 21:14 collapse

FPS games tend to run better at lower settings anyway, which helps you be more competitive. You don’t want all the visual noise.

ABCDE@lemmy.world on 21 Jan 2024 03:55 next collapse

remember the best games of your life were Perfect Dark with your friends running at 9 FPS.

The frame rate was shat on at the time, and with good reason; that was unplayable for me. The best times were Halo 4-16 player local multiplayer.

Whom@midwest.social on 21 Jan 2024 09:16 next collapse

I agree that this happens to an extent, but Digital Foundry in particular makes a point to take into account the performance of the cards most used by regular people, and is one of the biggest forces in that space pushing people to not just hit “ultra” and move on, as you can see with their optimized settings series and the like, as well as getting the best out of older games in their retro series. They like games that look good and play smoothly, of course, but I don’t think it’s fair to associate them with that kind of ULTRA MAX OR DIE attitude.

I think there’s sometimes an overcorrection from the “gameplay over graphics” crowd. I’ve been part of that group before and get it; it’s frustrating when, from your perspective, the industry is ignoring the parts of games that you care about the most. But it’s a strange thing to pick on, because at the end of the day pretty things that feel smooth to play are wonderful! That can be done on a toaster with beautiful pixel art / low-poly 3D models, but it can also be done in dramatically different ways by pushing high-end hardware to its limits. There’s room for both and I adore both. Games are art like anything else, and it’d be strange to tell people who appreciate going to see a beautiful movie, shot on particularly nice film on location in expensive places, that they’re wasting their time just because it’s still a good movie if you watch it on an old laptop with awful web compression, or because an underground mumblecore film from 2003 is also great.

Graphics aren’t all that matter to me, but if the primary joy someone gets from gaming is seeing ultra-detailed and perfectly rendered scenes the best way they possibly can, good for them. Personally, I like getting good visuals when I can, but my primary concern is always framerate, as particularly in first-person games even 60fps often triggers my motion sickness and forces me to stick to short sessions. Ultimately I see this whole debate as a relic of the past that only made sense when the only games the average person had access to were AAA/AA releases. Low-spec gaming is better than it has ever been, with the indie scene continuing to go strong like it has for the past 15+ years and an ever-expanding backlog of classics that now run on just about anything, growing every year.

Crashumbc@lemmy.world on 22 Jan 2024 19:20 collapse

You lost me at 1080p. It’s a basic quality-of-life thing. 1440p is a HUGE upgrade even for regular computer use, not just gaming.

I run 4K, but I use/need it more for workspace at work than for gaming.

ReallyActuallyFrankenstein@lemmynsfw.com on 21 Jan 2024 03:21 next collapse

Yep, it’s the RAM, but also just a mismatched value proposition.

I think it’s clear at this point Nvidia is trying to have it both ways and gamers are sick of it. They used pandemic shortage prices as an excuse to inflate their entire line’s prices, thinking they could just milk the “new normal” without having to change their plans.

But when you move the x070 series out of the mid-tier price bracket ($250-450, let’s say), you’d better meet a more premium standard. Instead, they’re throwing mid-tier RAM into a premium-priced product that most customers still feel should be mid-tier priced. It also doesn’t help that it’s at a time when people generally just have less disposable income.

Shirasho@lemmings.world on 21 Jan 2024 03:26 next collapse

I don’t know about everyone else, but I still play at 1080p. It looks fine to me and I care more about frames than fidelity. More VRAM isn’t going to help me here, so it is not a factor when looking at video cards. Ignoring the fact I just bought a 4070, I wouldn’t skip over a 4070 Super just because it has 12GB of RAM.

This is a card that targets 1440p. It can pull weight at 4k, but I’m not sure if that is justification to slam it for not having the memory for 4k.

mozz@mbin.grits.dev on 21 Jan 2024 03:51 next collapse

Is it weird that until I read this I forgot that GPUs can make graphics

Deceptichum@kbin.social on 21 Jan 2024 03:55 next collapse

I’m fine playing at 30fps, I don’t really notice much of a difference. For me ram is the biggest influence in a purchase due to the capabilities it opens up for local AI stuff.

iAmTheTot@kbin.social on 21 Jan 2024 04:33 collapse

If someone says they don't notice a difference between 60 FPS and 120+ FPS, I think... okay, it is diminishing returns, 60 is pretty good. But if someone says they don't notice a difference between 30 and 60... you need to get your eyes checked mate.

Deceptichum@kbin.social on 21 Jan 2024 04:38 collapse

I notice a difference, it’s just not enough to make it a big deal for me. It’s like going from 1080 to 1440, you can see it but it’s not really an issue being on 1080.

Jakeroxs@sh.itjust.works on 21 Jan 2024 08:50 collapse

It depends on the game. In quick, action-packed stuff you can see the jumping, and in something like a shooter it can be a disadvantage.

For something like Slay the Spire tho, totally fine.

Obi@sopuli.xyz on 22 Jan 2024 18:18 collapse

I’m at the age where if games require such quick reactions that the difference in FPS matters, I’m going to get my ass handed to me by the younguns anyway…

Jakeroxs@sh.itjust.works on 22 Jan 2024 21:20 collapse

Well maybe if you had a 240hz monitor… ;)

Totally fair, just worth pointing out that it can/does make a difference in those games, as it can literally mean the difference between firing where someone was rather than where they are, because of how long it takes for you to see the next frame.
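To put rough numbers on that, here’s a quick sketch of how much a moving target can drift between the frames you actually see (the 5 m/s strafing speed is just a made-up figure for illustration):

```python
# Frame time vs. how far a target moves between displayed frames.

def frame_time_ms(fps: float) -> float:
    """Time between displayed frames, in milliseconds."""
    return 1000.0 / fps

target_speed_m_per_s = 5.0  # hypothetical strafing speed, purely illustrative

for fps in (30, 60, 120, 240):
    ft = frame_time_ms(fps)
    drift_cm = target_speed_m_per_s * (ft / 1000.0) * 100.0
    print(f"{fps:>3} fps: {ft:5.1f} ms per frame, target moves ~{drift_cm:4.1f} cm between frames")
```

At 30fps that hypothetical target has moved ~17 cm before you even see the next frame; at 240fps it’s ~2 cm. Your own reaction time dwarfs both, but the aim-point error is real.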

ABCDE@lemmy.world on 21 Jan 2024 03:56 next collapse

I think the only reason you’d really need that kind of grunt is on a 4K TV anyway, and even then you can use DLSS or whatever the other one is to upscale.

atocci@kbin.social on 21 Jan 2024 04:16 next collapse

My monitor is only 1440p, so it's just what i need. I ordered the Founders Edition card from Best Buy on a whim after I stumbled across it at launch time by coincidence. I'd been mulling over the idea of getting a prebuilt PC to replace my laptop for a few weeks at that point and was on the lookout for sales on ones with a 4070. Guess I'll be building my own instead now.

miss_brainfart@lemmy.ml on 21 Jan 2024 07:15 collapse

It can pull weight at 4k, but I’m not sure if that is justification to slam it for not having the memory for 4k.

There are many games that cut it awfully close with 12GB at 1440p, for some it’s actually not enough. And when Nvidia pushes Raytracing as hard as they do, not giving us the little extra memory we need for that is just a dick move.

Whatever this card costs, 12GB of vram is simply not appropriate.

PlasmaDistortion@lemm.ee on 21 Jan 2024 03:46 next collapse

My RTX 4060 Ti has 16GB of VRAM. What on earth makes them think people would go for 12GB?

lapommedeterre@lemmy.world on 21 Jan 2024 06:11 next collapse

Not being a power of 2 gives me displeasure.

lemmyvore@feddit.nl on 21 Jan 2024 07:18 collapse

It is in base 6.

NightAuthor@lemmy.world on 21 Jan 2024 14:50 collapse

And base 3 and 12. But we don’t really use those numbering systems.

paraphrand@lemmy.world on 21 Jan 2024 08:25 next collapse

I’ve seen people say that card is absurd. I’m not sure who is right there.

elvith@feddit.de on 21 Jan 2024 09:10 collapse

I have a 2060 Super with 8GB. The VRAM is currently enough for FHD gaming - or at least isn’t the bottleneck - so 12GB might be fine for this use case. BUT I’m also toying around with AI models, and some of the current models already ask for 12GB of VRAM to run the complete model. It’s not that I would never get a 12GB card as an upgrade, but you can be sure I’d research all the alternatives first, and then it wouldn’t be my first choice but a compromise, as it wouldn’t future-proof me in this regard.

AProfessional@lemmy.world on 21 Jan 2024 12:03 next collapse

Do you think there is a large overlap of people who buy $600-$900 cards and like 1080p?

Personally, my 3080 10GB runs out of VRAM at 1440p. I would never get <16GB again.

elvith@feddit.de on 21 Jan 2024 19:47 collapse

Hard to say. I was an early adopter of FullHD and always had the equivalent of an xx80 card. Then I stepped back a bit with the 970, as it was the best upgrade path for me (considering I was only upgrading the GPU and the CPU would very likely be the bottleneck moving forward). I was planning to move to higher resolutions with my new PC. Then my PSU fried my mainboard, CPU and GPU while COVID and cryptocurrencies caused huge price spikes on almost every component, and I had to pay way too much for what I’d get performance-wise. That’s why I’m running a 2060 Super now and staying on FHD.

I might consider upgrading the next time I need a new PC, as this left me in an awkward spot: If I want a higher resolution, I need a new monitor. If I buy one, I’d need a new GPU probably, too. And since my CPU would now be a bottleneck for the rig I should also change that in this process. Then I might want a new mainboard, as I’m currently only running on DDR-4 RAM, and so… the best way forward is basically a new PC (I might save some money by keeping my NVMe drive, etc…).

I’m not sure what I’m going to do in the future. Up until around the GTX 970, you could get a decent rig that plays current games in FHD on ultra or very high and would continue doing so for about 1-2 years. If you drop to medium-high, probably 4-5 years. You could easily get that for ~900-1000 bucks (or less). Nowadays, the GPU alone can get you close to this price range…

I get it. 1080p is about 2.1 megapixels, while 1440p is already 3.7 megapixels - that’s roughly 78% more pixels, and thus you need way more performance to render it (or rather rasterize and shade it). But still… I don’t like these prices.
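A quick sanity check of those numbers, using nothing but the standard 1920×1080, 2560×1440 and 3840×2160 pixel grids:

```python
# Pixel-count arithmetic for common resolutions.
resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "4K    (3840x2160)": 3840 * 2160,
}

base = resolutions["1080p (1920x1080)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x the pixels of 1080p")
```

That works out to about 2.07 MP for 1080p, 3.69 MP (1.78×) for 1440p, and 8.29 MP (4×) for 4K, so the jump to 1440p really is close to 80% more pixels to rasterize and shade.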

AA5B@lemmy.world on 21 Jan 2024 14:02 next collapse

Thanks, that was going to be exactly my question. I don’t see anyone choosing low memory for video, but I had no idea what AI needs.

elvith@feddit.de on 21 Jan 2024 20:00 collapse

You can run Stable Diffusion XL on 8GB of VRAM (to generate images). For beginners, there’s e.g. the open source software Fooocus, which handles quite a lot of work for you - it sends your prompt to a GPT-2 model (running on your PC) to do some prompt engineering for you and then uses that to generate your images and generally features several presets, etc. to get going easily.

Jan (basically an open source software that resembles ChatGPT and allows you to use several AI models) can run in 8GB, but only for 3B models or quantized 7B models. They recommend at least 16GB for regular 7B models (which they consider “minimum usable models”). Then there are larger, more sophisticated models, that require even more.

Jan can also run on the CPU in your regular RAM. Since it’s chatting with you, it’s not too bad when it spits out words slowly, but a GPU is / would be nice here…
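As a rough sketch of where those VRAM figures come from - the parameter counts below are approximate, and this ignores activations, context cache and framework overhead, which add a few more GB on top:

```python
# Back-of-the-envelope VRAM needed just to hold a model's weights.

def weights_gb(params_billion: float, bits_per_weight: int) -> float:
    """Size of the weights alone, in GiB."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1024**3

for label, params, bits in [
    ("3B model, fp16", 3.0, 16),
    ("7B model, fp16", 7.0, 16),
    ("7B model, 4-bit quantized", 7.0, 4),
    ("SDXL (~3.5B params total), fp16", 3.5, 16),
]:
    print(f"{label}: ~{weights_gb(params, bits):.1f} GB for the weights alone")
```

That lands around 5.6 GB for a 3B fp16 model, 13 GB for a 7B fp16 model, 3.3 GB for a 4-bit quantized 7B model and 6.5 GB for SDXL’s weights, which lines up with “8GB works for SDXL and quantized 7B, 16GB recommended for regular 7B”.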

AA5B@lemmy.world on 22 Jan 2024 12:30 collapse

Thanks

Fungah@lemmy.world on 23 Jan 2024 01:30 collapse

I have a 4090 and I feel the pinch on VRAM with AI. It’s never enough.

DoctorButts@kbin.melroy.org on 21 Jan 2024 04:32 next collapse

4070 is $600. That seems like total shit to me. That's why.

FiskFisk33@startrek.website on 21 Jan 2024 05:16 next collapse

GPUs haven’t been reasonably priced since the 1000 series.

And now there’s no coin mining promising some money back.

Sibbo@sopuli.xyz on 21 Jan 2024 06:14 next collapse

You mean Nvidia GPUs? I got my 6750XT for 500€, and I think it’s a good price for the performance I get.

popekingjoe@lemmy.world on 21 Jan 2024 07:38 next collapse

Yeah right? I got my 6700 XT for just over $400USD. It was a great deal.

StopSpazzing@lemmy.world on 22 Jan 2024 07:10 collapse

Just got my brand new 6800 XT for $350, upgrading from a 970. Screw Nvidia.

Amir@lemmy.ml on 22 Jan 2024 12:44 collapse

970 Ti doesn’t even exist

StopSpazzing@lemmy.world on 22 Jan 2024 14:59 collapse

Whoops meant SC

2xar@lemmy.world on 21 Jan 2024 07:58 next collapse

That is still overpriced, I think, although much less egregious than what Nvidia is doing. Launch MSRP for the HD 7850, which was in the same category as the 6700 XT today (upper mid-tier), was 250 USD. A few years prior, the 4850 started at 200 USD. Even the RX 480 started at only 230 USD. And those were all very decent cards in their time.

Whom@midwest.social on 21 Jan 2024 09:32 next collapse

Yeah, I think it’s important not to lose perspective here and let expectations slide just because Nvidia are being more awful right now. Make no mistake, value went out the window a long time ago and AMD are also fucking us, just a little less hard than their main competitor. Even adjusting for inflation, what used to get you the top of the line now gets you last-gen midrange.

reinar@distress.digital on 21 Jan 2024 16:38 next collapse

Launch MSRP for the HD 7850, which was in the same category as the 6700 XT today (upper mid-tier), was 250 USD.

There’s much more effort involved in producing a modern GPU now. Either way, if Nvidia were truly greedy, they’d close the gaming GPU business right away and produce only AI accelerators. You could take the same 4070, add $200 worth of GDDR chips to the layout, and sell it for $15k minimum; shit would be on backorder.

SailorMoss@sh.itjust.works on 21 Jan 2024 19:55 collapse

I bought a GTX 780 for $500 MSRP circa 2013. I considered that to be crazy expensive at the time, but I was going all out on that system. Currently I run a GTX 1080 Ti (bought used) with 11GB of VRAM, and they want me to spend $600 for 1 more GB of VRAM? The PS5 has 16GB of shared memory; 16GB should be the entry level of VRAM for a system that’s expected to keep up with this generation of graphics. There’s no reason for Nvidia to do this other than to force users to upgrade sooner.

The funny part is the market is so fucked that reviewers are lauding this as a decent deal. I think the 1080 Ti will last me until OLED matures and I finally upgrade from a 1080p monitor. According to the Steam survey, most gamers are in a similar boat.

[deleted] on 21 Jan 2024 16:18 next collapse

.

Amir@lemmy.ml on 22 Jan 2024 12:44 collapse

That’s a shit deal when the 4070 is €550

9488fcea02a9@sh.itjust.works on 21 Jan 2024 15:31 collapse

The new mining is AI… TSMC is at max capacity. They’re not going to waste too many wafers making gaming GPUs when AI accelerators are selling for $30k each.

FiskFisk33@startrek.website on 21 Jan 2024 16:23 collapse

ugh

wooki@lemmynsfw.com on 21 Jan 2024 05:57 next collapse

If they don’t drop the price by at least 50%, goodbye Nvidia.

So no more Nvidia. Hello Intel.

lemmyvore@feddit.nl on 21 Jan 2024 06:44 collapse

I don’t think they care. In fact, I think they’re going to exit the consumer market eventually; it’s just peanuts to them, and the only reason they’re still catering to it is to use it as field testing (and you’re paying them for the privilege, which is quite ironic).

genie@lemmy.world on 21 Jan 2024 07:02 next collapse

Right? TPUs make more sense at scale (especially for LLMs & similar). The consumer market is more about hype and being a household name than it is about revenue.

Kyrgizion@lemmy.world on 21 Jan 2024 08:42 collapse

This. Corporations are lining up in droves for GPUs to run AI applications. Nvidia doesn’t care about regular consumers because we aren’t even their primary market anymore, just a bonus to be squeezed.

wewbull@feddit.uk on 21 Jan 2024 11:12 collapse

If Nvidia pivots completely out of the consumer space, which I can totally see coming, they are making the company totally dependent on the AI hype train. That’s a fairly precarious position in my eyes. I’ve yet to see an actual application it solves with enough reliability to be more than just a curiosity.

willis936@lemmy.world on 21 Jan 2024 13:03 next collapse

They leaned their strategy pretty hard into mining when that was on the table. They for sure chase trends and alienate their base. If there’s any way to juice near-term profits, they’ll take it. It’s working out for them right now, so surely it will forever.

ScreaminOctopus@sh.itjust.works on 21 Jan 2024 13:44 collapse

They got astronomically lucky that the crypto boom fed directly into the AI boom as it ended, otherwise it would have been 2017 all over again.

agitatedpotato@lemmy.world on 22 Jan 2024 13:05 collapse

Yeah, but if they pump their valuation high enough, they will have plenty of time to sell off shares before their decisions start to affect the rest of the people who work there.

the_q@lemmy.world on 21 Jan 2024 06:13 next collapse

laughs in 6800XT

caseyweederman@lemmy.ca on 21 Jan 2024 06:44 next collapse

Remember when eVGA decided they would rather leave the market entirely than spend one more day working with Nvidia?

dependencyinjection@discuss.tchncs.de on 21 Jan 2024 09:01 next collapse

Really?

frezik@midwest.social on 21 Jan 2024 10:48 collapse

Yup. It was something like 90% of their revenue, but 25% of their profit.

AlexisFR@jlai.lu on 21 Jan 2024 11:45 collapse

And now they have 0 revenue and 0 profit.

Filthmontane@lemmy.world on 21 Jan 2024 11:53 next collapse

Well according to the previous math, they retained 10% of their revenue and 75% of their profits. I know, math is hard.

altima_neo@lemmy.zip on 21 Jan 2024 15:29 next collapse

They aren’t gonna get far just making keyboards and power supplies, though. They wound down their motherboard line too, I believe. They let Kingpin go.

Filthmontane@lemmy.world on 22 Jan 2024 00:00 collapse

Companies get by making keyboards and power supplies all the time.

AlexisFR@jlai.lu on 21 Jan 2024 16:33 collapse

They are in the process of closing down this year.

FlyingSquid@lemmy.world on 21 Jan 2024 11:57 next collapse

They still exist. However their website also says they’re “America’s #1 NVIDIA partner,” so…

frezik@midwest.social on 21 Jan 2024 12:51 collapse

They do seem to be winding down operations as a whole, though. It’s a deliberate choice on the owner’s part.

Daveyborn@lemmy.world on 21 Jan 2024 14:51 next collapse

Yeah sadly they weren’t gonna be able to stay the same with their remaining products being expensive niche case motherboards and good power supplies. Hopefully the employees got good gigs elsewhere at least.

[deleted] on 21 Jan 2024 20:56 collapse

.

downhomechunk@midwest.social on 21 Jan 2024 18:11 collapse

I wish they would have started putting out AMD products. PowerColor just doesn’t feel like a flagship partner the way EVGA was to Nvidia.

shasta@lemm.ee on 21 Jan 2024 20:08 collapse

I would’ve actually switched to AMD if EVGA did

Rakonat@lemmy.world on 21 Jan 2024 06:46 next collapse

Nvidia is overpricing their cards and limiting stock, acting like there is still a GPU shortage from all the crypto bros sucking everything up.

Right now, their competitors are beating them at hundreds of dollars below Nvidia’s MSRP like for like, with the only true advantages Nvidia has being ray tracing and arguably VR.

It’s possible we’re approaching another shortage with the AI bubble, though for the moment that seems to be pretty far off.

TL;DR Nvidia is trying to sell a card at twice its value because of greed.

genie@lemmy.world on 21 Jan 2024 06:59 next collapse

Couldn’t agree more! Abstracting to a general economic case – those hundreds of dollars are a double-digit percentage of the overall cost! A double-digit % cost increase for a single-digit % performance gain doesn’t quite add up, @nvidia :)

Especially with Google going with TPUs for their AI monstrosities, it makes less and less sense at large scale for consumers to pay the Nvidia tax just for CUDA compatibility. Especially with the entrance of things like SYCL that help programmers avoid vendor lock-in.

Evilcoleslaw@lemmy.world on 21 Jan 2024 09:14 collapse

They’re beating AMD at ray tracing, upsampling (DLSS vs FSR), VR, and especially streaming (NVENC). For the latter, look at the newly announced beta partnership with Twitch and OBS, which will bring higher-quality transcoding and easier setup only for Nvidia for now, and soon AV1 encoding only for Nvidia (at first, anyway).

The raw performance is mostly there for AMD with the exception of RT, and FSR has gotten better. But Nvidia is doing Nvidia shit and using the software ecosystem to entrench themselves despite the insane pricing.

mihies@kbin.social on 21 Jan 2024 10:40 next collapse

And they beat AMD in efficiency! I'm (not) surprised that people ignore this important aspect which matters in noise, heat and power usage.

MonkderZweite@feddit.ch on 21 Jan 2024 12:27 collapse

Tom’s Hardware did a test; the RX 6800 is the leader there. The next one, the RTX 3070, is 4.3% worse. Are their newer cards more efficient than AMD’s newer cards?

pycorax@lemmy.world on 21 Jan 2024 13:43 next collapse

They seem to be but honestly, this generation hasn’t been very impressive for both team green and red. I got a 6950 XT last year and seeing all these new releases has only proven that I made a good investment.

Daveyborn@lemmy.world on 21 Jan 2024 14:57 collapse

Nothing compelling enough for me to hop off of a Titan Xp yet. (Bought a Titan because it was cheaper than a 1070 at the time because of scalpers.)

Crashumbc@lemmy.world on 22 Jan 2024 18:47 collapse

30 series, maybe.

For 40-series power usage, Nvidia destroys AMD.

The 4070 uses WAY less than a 3070… it’s 200W (220W for the Super), which is barely more than my 1070’s 170W.

altima_neo@lemmy.zip on 21 Jan 2024 15:30 next collapse

And AI. They’re beating the pants off AMD at AI.

Evilcoleslaw@lemmy.world on 21 Jan 2024 22:07 collapse

True enough. I was thinking more of the gaming use case. But even beyond AI and just a general compute workload they’re beating the pants off AMD with CUDA as well.

umbrella@lemmy.ml on 21 Jan 2024 19:12 collapse

Streaming performance is really good on AMD cards, IME. Upscaling is honestly close and getting closer.

I don’t think better RT performance is worth the big premium or the annoyances Nvidia cards bring. Doubly so on Linux.

CosmoNova@lemmy.world on 21 Jan 2024 09:33 next collapse

I mean yeah, when I’m searching for GPUs I specifically filter out anything that’s less than 16GB of VRAM. I wouldn’t even consider buying it for that reason alone.

Thorny_Insight@lemm.ee on 21 Jan 2024 13:16 collapse

And here I am thinking upgrading from two 512MB cards to a GTX 1660 SUPER with 6GB of VRAM is going to be good for another 10 years. What the heck does someone need 16 gigs for?

redditReallySucks@lemmy.dbzer0.com on 21 Jan 2024 13:25 next collapse

Gaming in 4K, or AI (e.g. Stable Diffusion or language models).

pycorax@lemmy.world on 21 Jan 2024 13:41 next collapse

Future proofing. GPUs are expensive and I expect to be able to use it for at least the next 7 years, even better if it lasts longer than that.

SPRUNT@lemmy.world on 21 Jan 2024 15:58 next collapse

VR uses a lot of RAM.

Treczoks@lemmy.world on 21 Jan 2024 16:29 next collapse

And I thought I had the lamest card on the block with my 2GB. …

[deleted] on 21 Jan 2024 16:46 next collapse

.

barsoap@lemm.ee on 21 Jan 2024 21:03 next collapse

AI. But you’re right, my 4GB 5500 XT is putting up a valiant fight so far, though I kinda dread trying out CP77 again after the big patch - it’s under spec now. It was a mistake to buy that thing in the first place; I should’ve gone with 8GB, but I just had to be pigheaded with my old “workstation rule”: don’t spend more on the GPU than on the CPU.

Crashumbc@lemmy.world on 22 Jan 2024 18:58 collapse

Unless you’re gaming, that’s fine.

But if you want to play any newer AAA games (even ones less than 5-8 years old) or use more than 1080p, you’ll need better.

sandwichfiend@c0tt0n.world on 21 Jan 2024 11:13 next collapse

re: Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

@Hypx @technology GPUs are still too expensive for me

UnfortunateShort@lemmy.world on 21 Jan 2024 11:43 next collapse

Why people no buy our GPU anymore?

Because I can get a whole fucking console for the price of a lower midrange GPU. My only hope is Intel’s Battlemage at this point.

Kbobabob@lemmy.world on 21 Jan 2024 12:28 next collapse

Because you are completely against AMD?

UnfortunateShort@lemmy.world on 21 Jan 2024 13:54 collapse

I have an all-AMD system, but they have become too expensive as well. Just Nvidia with a 20% discount, save for the 7900 XTX, which is completely out of the question for me to begin with.

Kbobabob@lemmy.world on 21 Jan 2024 15:31 collapse

Cheaper Nvidia ain’t bad. This is coming from someone that uses a 3080 Ti and refuses to use AMD GPUs because of shit way in the past. I use their processors though, those are amazing; I just wish they had support for Thunderbolt.

jaxxed@lemmy.world on 21 Jan 2024 16:37 next collapse

Does Intel allow AMD to license Thunderbolt? USB might be better to support in the long term.

flatlined@lemmy.dbzer0.com on 21 Jan 2024 18:31 collapse

You can get AMD with Thunderbolt. The motherboards with Thunderbolt headers are bloody expensive, and you’ll need a 200-buck add-in card (which needs to match the motherboard manufacturer, I think), so it’s not exactly cheap, but it is possible.

Kbobabob@lemmy.world on 21 Jan 2024 18:33 collapse

I understand you can shoehorn just about anything you want into a system but that’s not the same as supporting it IMO.

flatlined@lemmy.dbzer0.com on 22 Jan 2024 01:18 collapse

Agreed, and in my experience (Asus board) it’s functional but a bit buggy, so not an easy recommendation. Still, if you want or need team red it’s an option. Price premium sucked, but wasn’t actually noticeably more than if I’d gone team blue. Not sure I’d do it again in hindsight though. Fully functional but only 90% reliable (which is worse than it seems, in the same way a delay of “only” a second every time you do something adds up to a big annoyance) is perhaps not worth it for my use case.

sake@lemmy.world on 21 Jan 2024 17:07 next collapse

Yeah, I’m still running a GTX 970 since GPU prices went bonkers right after I bought it. It was the last generation with a decent performance-price balance.

Fuck the market. I’ll just stick with this one until it dies on me.

GhostlyPixel@lemmy.world on 21 Jan 2024 19:25 next collapse

I will hold onto my $700 3080 until it spits fire, cannot believe I was lucky enough to get it at launch.

darkkite@lemmy.ml on 21 Jan 2024 20:54 collapse

Yeah, but then you have to play on a console without mods or cheap games.

Try buying a used GPU and gaming on a 1080p monitor and you’ll be able to have great graphics without a lot of money.

AlexisFR@jlai.lu on 21 Jan 2024 11:44 next collapse

Wait, they didn’t put the 4070 super at 16 GB?

AProfessional@lemmy.world on 21 Jan 2024 11:57 next collapse

They clearly believe customers will always buy Nvidia over AMD, so why bother competing? Just make an annoyingly segmented lineup.

Kbobabob@lemmy.world on 21 Jan 2024 12:27 next collapse

Nope. Even my 3080 Ti has 12GB. I was waiting for the 4000 series refresh, but I think I’ll just wait and see what the 5000 series looks like.

[deleted] on 21 Jan 2024 17:58 collapse

.

Crashumbc@lemmy.world on 22 Jan 2024 18:48 collapse

If I understand correctly, it’s a limitation of the bus width, which they kept the same.
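If that’s right, the arithmetic works out roughly like this (assuming the usual 32-bit interface per GDDR6X chip and 2GB chips, which I believe matches current cards):

```python
# Rough sketch of why a 192-bit bus tends to mean 12GB of VRAM.
bus_width_bits = 192   # 4070 / 4070 Super memory bus
bits_per_chip = 32     # each GDDR6X chip has a 32-bit interface
gb_per_chip = 2        # common density today: 2GB (16Gb) per chip

chips = bus_width_bits // bits_per_chip   # -> 6 chips
capacity_gb = chips * gb_per_chip         # -> 12 GB

print(f"{bus_width_bits}-bit bus -> {chips} chips x {gb_per_chip} GB = {capacity_gb} GB")
# 16GB would need a wider 256-bit bus (8 chips), denser chips,
# or a clamshell layout with memory on both sides of the board.
```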

Binthinkin@kbin.social on 21 Jan 2024 15:16 next collapse

You all should check prices comparing dual-fan 3070s to 4070s; there’s a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry and wound up getting blue-balled hard.

Aren’t they taking the 4080 completely off the market too?

TheGrandNagus@lemmy.world on 21 Jan 2024 15:59 collapse

Aren’t they taking the 4080 completely off the market too?

Apparently they stopped production of it months ago. Whatever still exists on shelves is only there because nobody has been buying them.

Honestly this has been the worst 80-class Nvidia card ever. The GTX 480 was a complete joke but even that managed to sell ok.

altima_neo@lemmy.zip on 21 Jan 2024 15:32 next collapse

The RAM is so lame. It really needed more.

Performance exceeding the 3090, but limited by 12 gigs of RAM.

Kazumara@feddit.de on 21 Jan 2024 18:56 next collapse

$600 for a card without 16GB of VRAM is a big ask. I think getting an RX 7800 XT for $500 will serve you well for longer.

NIB@lemmy.world on 22 Jan 2024 16:00 next collapse

12GB of VRAM is not a bottleneck in any current game on reasonable settings. There is no playable game/settings combination where a 7800 XT’s 16GB offers any advantage. Or do you think a 15fps average is more playable than a 5fps average (because the 4070 Super is RAM-bottlenecked)? Is this indicative of future potential bottlenecks? Maybe, but I wouldn’t be so sure.

The 4070 Super offers significantly superior ray tracing performance, much lower power consumption, superior upscaling (and frame generation) technology, better streaming/encoding stuff, and even slightly superior rasterization performance to the 7800 XT. Are these things worth sacrificing for 100€ less and 4GB more VRAM? For most people they aren’t.

AMD’s offerings are competitive, not better. And the internet should stop sucking their dick, especially when most of the internet, including tech-savvy people, don’t even use AMD GPUs. Hell, LTT even made a series of videos about how they had to “suffer” using AMD GPUs, yet they usually join the Nvidia-shitting circlejerk.

I have an AMD 580 card and have bought and recommended AMD GPUs to people since the 9500/9700 Pro series. But my next GPU will almost certainly be an Nvidia one. The only reason people are complaining is because Nvidia can make a better GPU (as shown by the 4090) but they choose not to, while AMD literally can’t make better GPUs but choose to only “competitively” price their GPUs instead of offering something better. Both companies suck.

Caitlyynn@lemmy.blahaj.zone on 22 Jan 2024 16:36 next collapse

Found the Nvidia fanboy

YeetPics@mander.xyz on 22 Jan 2024 19:17 next collapse

$100 less IS the advantage.

NIB@lemmy.world on 22 Jan 2024 20:13 collapse

It’s not enough though, and the sales are showing it. The 7800 XT is a decent card, but it isn’t an amazing offer, just a good one. For some people it is a slightly better value-for-money option. But those Nvidia things have value too. So the value proposition isn’t as clear-cut, even though it should be, considering that AMD is behind.

The Steam stats should tell you what consumers think. And while consumers are not infallible, they are a pretty good indicator. The most popular AMD card is the 580, which is arguably one of the best cards of all time. Except it came out 6 years ago. Did AMD have better marketing back then? No. Did they have the performance crown? Nope. But that didn’t stop the 580 from being an amazing card.

The 7800 XT could have been the new 580: a mid/high-end card with decent VRAM. Except you could get the 580 for 200€, while the 7800 XT costs literally three times as much. When your “good” card is so expensive, customers have higher expectations. It isn’t just about running games well (cheaper cards can do that too), it is about luxury features like ray tracing and upscaling tech.

Imagine if the 7800 XT was 400€. We wouldn’t even have this conversation. But it isn’t. In fact, in Europe it launched at basically the same price as a 4070. Even today, it is 50€-80€ cheaper. If Nvidia is scamming us with inferior offers, why aren’t AMD’s offers infinitely better in value? Because AMD is also scamming us, just very slightly less so.

YeetPics@mander.xyz on 22 Jan 2024 21:02 next collapse

I disagree.

daq@lemmy.sdf.org on 23 Jan 2024 02:57 collapse

$100 sure feels much more solid than RTX, which a ton of games don’t even support. There are a bunch of people who just want to play in 4K and couldn’t care less about features you call luxury.

That requires more VRAM, and the 7800 XT and XTX deliver that perfectly.

potustheplant@feddit.nl on 23 Jan 2024 03:11 collapse

A ton? Try “most”.

daq@lemmy.sdf.org on 23 Jan 2024 02:52 next collapse

D4 on Linux. Literally the only bottleneck is that it eats 11GB of my 1080 Ti’s VRAM for breakfast and then still wants lunch and dinner. It plays 4K on high with perfect fps otherwise, then starts glitching like crazy once VRAM is exhausted after 10-15 minutes.

Zero issues on a 20GB card. I understand that shitty code in a single game is not exactly a universal example, but it is a valid reason to want more VRAM.

Kazumara@feddit.de on 24 Jan 2024 00:00 collapse

Is this indicative of future potential bottlenecks? Maybe but i wouldnt be so sure.

This is exactly what I expect. I have seen what happened to my friends with their GTX 970 when 3.5GB of VRAM wasn’t enough anymore. Even though the cards were still rasterizing quickly enough, they weren’t useful for certain games anymore. Therefore I now make sure I go for enough VRAM to extend the useful service life of my cards.

And I’m not just talking about buying AMD, I actually do buy them. I first had the HD 5850 with 1GB, then got my friend’s HD 5870, also with 1GB (don’t remember if I used it in CrossFire or just replaced the 5850), then two of my friends each sold me their HD 7850 with 2GB for cheap and I ran CrossFire, then I bought a new R9 380 with 4GB when a game that was important to me at the time couldn’t deal with CrossFire well, then I bought a used RX 580 with 8GB and finally the RX 6800 with 16GB two years ago.

At some point I also bought a used GTX 960 because we were doing some CUDA stuff at University, but that was pretty late, when they weren’t current anymore, and it was only used in my Linux server.

Anti_Face_Weapon@lemmy.world on 22 Jan 2024 17:31 collapse

But my rtx :(

trackcharlie@lemmynsfw.com on 21 Jan 2024 23:19 next collapse

Less than 20GB of VRAM in 2024?

The entire 40-series line of cards should be used as evidence against Nvidia in a lawsuit over the intentional creation of e-waste.

BorgDrone@lemmy.one on 22 Jan 2024 01:52 collapse

The real tragedy is that PCs still have to make do with discrete graphics cards that have separate VRAM.

mlg@lemmy.world on 22 Jan 2024 01:45 next collapse

insert linus torvalds nvidia clip here

Dra@lemmy.zip on 22 Jan 2024 12:57 next collapse

I haven’t paid attention to GPUs since I got my 3080 on release day back during COVID.

Why has the acceptable level of VRAM suddenly doubled vs. 4 years ago? I don’t struggle to run a single game on max settings at high frames @ 1440p. What’s the benefit that justifies the cost of 20GB of VRAM outside of AI workloads?

rndll@lemm.ee on 22 Jan 2024 13:45 next collapse

GPU rendering and AI.

Asafum@feddit.nl on 22 Jan 2024 15:11 next collapse

Lmao

We have your comment: what am I doing with 20gb vram?

And one comment down: it’s actually criminal there is only 20gb vram

Dra@lemmy.zip on 22 Jan 2024 15:50 collapse

Lol

AlijahTheMediocre@lemmy.world on 22 Jan 2024 16:54 next collapse

If only game developers optimized their games…

The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.

Eccitaze@yiffit.net on 22 Jan 2024 17:17 next collapse

An actual technical answer: Apparently, it’s because while the PS5 and Xbox Series X are technically regular x86-64 architecture, they have a design that allows the GPU and CPU to share a single pool of memory with no loss in performance. This makes it easy to allocate a shit load of RAM for the GPU to store textures very quickly, but it also means that as the games industry shifts from developing for the PS4/Xbox One X first (both of which have separate pools of memory for CPU & GPU) to the PS5/XSX first, VRAM requirements are spiking up because it’s a lot easier to port to PC if you just keep the assumption that the GPU can handle storing 10-15 GB of texture data at once instead of needing to refactor your code to reduce VRAM usage.

Dra@lemmy.zip on 23 Jan 2024 10:47 collapse

Perfect answer thank you!

Blackmist@feddit.uk on 22 Jan 2024 17:36 next collapse

Current gen consoles becoming the baseline is probably it.

As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU.

That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.

Obi@sopuli.xyz on 22 Jan 2024 18:05 next collapse

Personally I need it for video editing & 3D work but I get that’s a niche case compared to the gaming market.

Space_Racer@lemm.ee on 22 Jan 2024 20:22 next collapse

I’m maxed on VRAM in VR for the most part with a 3080. It’s my main bottleneck.

Hadriscus@lemm.ee on 23 Jan 2024 03:03 collapse

Perhaps not the biggest market, but consumer cards (especially Nvidia’s) have been the preferred hardware in the offline rendering space - i.e. animation and VFX - for a good few years now. They’re the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.

hark@lemmy.world on 22 Jan 2024 20:36 next collapse

So many options, with small differences between them, all overpriced to the high heavens. I’m sticking with my GTX 1070 since it serves my needs and I’ll likely keep using it a few years beyond that out of spite. It cost $340 at the time I bought it (2016) and I thought that was somewhat overpriced. According to an inflation calculator, that’s $430 in today’s dollars.

dangblingus@lemmy.dbzer0.com on 23 Jan 2024 01:49 collapse

1060 6gb here. Loving life atm.

Blackmist@feddit.uk on 23 Jan 2024 01:53 next collapse

It’ll do for the few PC games I play. FFXIV doesn’t need much to run. It even handles HL Alyx.

systemglitch@lemmy.world on 23 Jan 2024 02:11 collapse

Ditto. It’s a great card and I don’t feel I’m missing out over what newer cards offer.

randon31415@lemmy.world on 22 Jan 2024 20:53 next collapse

Is this the one that they nerfed so that they could sell them in China around the US AI laws?

potustheplant@feddit.nl on 23 Jan 2024 03:08 collapse

Nope, that’s the 4090.

DingoBilly@lemmy.world on 23 Jan 2024 02:25 collapse

What’s going on? It’s overpriced and completely unnecessary for most people. There’s also a cost of living crisis.

I play every game I want to on high graphics with my old 1070. Unless you’re working with very graphically intensive apps or you’re a PC master race moron, there’s no need for new cards.

chiliedogg@lemmy.world on 23 Jan 2024 02:30 next collapse

I still game at 1080p and it looks fine. I’m not dropping 2,500 bucks on a 4K monitor and a video card to run it when I won’t even register the difference during actual gameplay.

n3m37h@sh.itjust.works on 23 Jan 2024 03:27 collapse

It was a night-and-day difference going from a 1060 6GB to a 6700 XT. The prices are still kinda shit, but that goes for everything.