Chips aren’t improving like they used to, and it’s killing game console price cuts (arstechnica.com)
from j4p@lemm.ee to technology@lemmy.world on 04 May 15:55
https://lemm.ee/post/63049008

“These price increases have multiple intertwining causes, some direct and some less so: inflation, pandemic-era supply crunches, the unpredictable trade policies of the Trump administration, and a gradual shift among console makers away from selling hardware at a loss or breaking even in the hopes that game sales will subsidize the hardware. And you never want to rule out good old shareholder-prioritizing corporate greed.

But one major factor, both in the price increases and in the reduction in drastic “slim”-style redesigns, is technical: the death of Moore’s Law and a noticeable slowdown in the rate at which processors and graphics chips can improve.”

#technology


Diplomjodler3@lemmy.world on 04 May 16:06 next collapse

Consoles are just increasingly bad value for consumers compared to PCs.

gravitas_deficiency@sh.itjust.works on 04 May 16:10 next collapse

Tbh the only consoles I’ve been really interested in lately are the Switch and the Steam Deck, simply because they’re also mobile devices.

Diplomjodler3@lemmy.world on 04 May 18:07 next collapse

The Steam Deck is basically a PC. You can get mini PCs with APUs of similar performance for very low prices these days. One won’t perform like a current-gen console, but it’s a cheap gaming machine with a huge selection of low-cost games, and you won’t have to pay for multiplayer.

can@sh.itjust.works on 04 May 19:56 collapse

And those mini PCs are mobile with built-in screens and controls?

Diplomjodler3@lemmy.world on 04 May 20:10 collapse

That would be handhelds. Mini PCs are desktop devices. They often use the same processors as handhelds and laptops, though.

can@sh.itjust.works on 04 May 22:35 collapse

Oh man you’re so close

JayGray91@piefed.social on 05 May 07:37 collapse

I mean, after Valve released the Steam Deck, Asus, Lenovo, and MSI (to name just a few) followed suit with at least an iteration each. Asus has the Ally and Ally X, Lenovo now has 3 models, and MSI only has 1.

I can't recall if there are any other big-brand handheld PCs, but there are definitely Chinese ones.

cmnybo@discuss.tchncs.de on 04 May 20:31 collapse

The Steam Deck is the only decent console because it’s not locked down.

sugar_in_your_tea@sh.itjust.works on 05 May 03:07 collapse

That’s because it’s not a console.

zerofatorial@lemm.ee on 04 May 16:22 next collapse

Are they tho? Have you seen graphics card prices?

Toneswirly@lemmy.world on 04 May 16:36 next collapse

A 2060 Super for $300, and then another $200 for a decent processor, puts you ahead of a PS5 for a comparable price. Games are cheaper on PC too, with a broader selection. pcpartpicker.com/list/zYGmJn is a mid-tier build for $850; you could cut the processor down, install Linux for free, and I’m sure you’ve got a computer monitor lying around somewhere… the only thing stopping you is inertia.

tomalley8342@lemmy.world on 04 May 18:18 next collapse

A 2060 Super for $300, and then another $200 for a decent processor, puts you ahead of a PS5 for a comparable price.

You’re going to have to really scrounge for deals in order to get a PSU, storage, memory, motherboard, and a case with your remaining budget of $0.

pcpartpicker.com/list/zYGmJn is a mid-tier build for $850

This is $150 more expensive, and the GPU is half as performant as the reported PS5 Pro equivalent.

sp3ctr4l@lemmy.dbzer0.com on 04 May 22:50 next collapse

Ok so, for starters, your ‘reported equivalent’ source is wrong.

eurogamer.net/digitalfoundry-2024-playstation-5-p…

The custom AMD Zen2 APU (combined CPU + GPU, as is done in laptops) of a PS5Pro is 16.7 TFLOPs, not 33.
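
(A rough sketch of where both numbers come from; the 60 CU / ~2.17 GHz figures below are the commonly reported PS5 Pro specs, not official documentation, so treat them as assumptions. The ~33 number is the double-rate FP16 figure, and FP32 is half that.)

```python
# Back-of-the-envelope GPU TFLOPs: CUs x 64 lanes x 2 FLOPs/clock (FMA) x clock.
# 60 CUs and 2.17 GHz are the commonly reported PS5 Pro figures (assumptions).

def tflops_fp32(cus: int, clock_ghz: float) -> float:
    lanes_per_cu = 64          # shader ALUs per compute unit (RDNA-style)
    fma = 2                    # one fused multiply-add counts as 2 FLOPs
    return cus * lanes_per_cu * fma * clock_ghz / 1000.0

fp32 = tflops_fp32(60, 2.17)   # ~16.7 TFLOPs (FP32)
fp16 = fp32 * 2                # ~33.3 "TFLOPs" if you count double-rate FP16
print(f"FP32: {fp32:.1f} TFLOPs / FP16 dual-rate: {fp16:.1f} TFLOPs")
```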

So the PS5 Pro is actually roughly equivalent to that posted build… at least by your ‘methodology’, though it’s utterly unclear to me what your actual methodology for the performance comparison even is.

The PS5 Pro uses 2 GB of DDR5 RAM, and 16 GB of GDDR6 RAM.

This is… wildly outside the realm of being directly comparable to a normal desktop PC, which, bare minimum these days, has 16 GB of DDR4/5 RAM. The GDDR6 RAM would be part of the detachable GPU board itself, and would be anywhere between 8 GB… and all the way up to 32 GB if you get an Nvidia 5090. But the consensus seems to be that 16 GB of GDDR6/7 is probably what you want as a minimum, unless you want to be very reliant on AI upscaling/framegen, and the input lag and whatnot that comes with using that on an underpowered GPU.

Short version: the PS5 Pro would be a wildly lopsided, nonsensical architecture to try to replicate one-to-one in a desktop PC… 2 GB of system RAM will run lightweight linux OSes, but there’s not a chance in hell you could run Windows 10 or 11 on that.

Fuck, even getting Windows 7 to work with 2 GB of RAM would be cutting it close… 64-bit 7 officially listed 2 GB as its bare minimum.

The closest AMD chip to the PS5 Pro that I see, in terms of TFLOP output… is the Radeon 7600 Mobile.

((… This is probably why Cyberpunk 2077 did not (and will never) get a ‘performance patch’ for the PS5 Pro: CP77 can only pull high (by console standards) framerates at high resolutions with raytracing/path tracing on Nvidia-class hardware, which the PS5 Pro doesn’t use.))

But let’s use the PS5 Pro’s ability to run CP77 at 2K 60fps on… what PC players would recognize as a mix of medium and high settings… as our benchmark for a comparable standard PC build. Let’s be nice and just say it’s the high preset.

(a bunch of web searching and performance comparisons later…)

Well… actually, the problem is that basically nobody makes or sells desktop GPUs that underpowered anymore; you’d have to go to the used market, or find some old unpurchased stock someone has had lying around for years.

The RX 6600 in the partpicker list is fairly close in terms of GPU performance.

Maybe pair it with an AMD 5600X processor, if you… can find one? Or a 4800S, which supposedly was just rejects/run-off from the PS5 and Xbox Series X/S chips, rofl?

Yeah, legitimately, the problem with trying to build a PC in 2025 to the performance specs of a PS5 Pro… is that for current and last-gen standard PC architecture, they just don’t even make hardware that weak anymore.

EDIT:

oh, final addendum: if your TV has an HDMI port, kablamo, that’s your monitor, you don’t strictly need a new one.

And there are also many ways to get a wireless or wired console style controller to work in a couch pc setup.

tomalley8342@lemmy.world on 04 May 23:59 collapse

Short version: the PS5 Pro would be a wildly lopsided, nonsensical architecture to try to replicate one-to-one in a desktop PC… 2 GB of system RAM will run lightweight linux OSes, but there’s not a chance in hell you could run Windows 10 or 11 on that.

Fuck, even getting Windows 7 to work with 2 GB of RAM would be cutting it close… 64-bit 7 officially listed 2 GB as its bare minimum.

It’s shared memory, so you would need to guarantee access to 16 GB on both ends.

The RX 6600 in the partpicker list is fairly close in terms of GPU performance.

I don’t know how you could arrive at such a conclusion, considering that the base PS5 has been measured to be comparable to the 6700.

sp3ctr4l@lemmy.dbzer0.com on 05 May 01:20 collapse

It’s shared memory, so you would need to guarantee access to 16 GB on both ends.

So… standard desktop CPUs can only talk to DDR.

‘CPUs’ can only utilize GDDR when they are actually a part of an APU.

Standard desktop GPUs can only talk to GDDR, which is part of their own separate board.

GPU and CPU can talk to each other, via the mainboard.

Standard desktop PC architecture does not have a way for the CPU to directly utilize the GDDR RAM on the standalone GPU.

In many laptops and phones, a different architecture is used, which uses LPDDR RAM, and all the LPDDR RAM is used by the APU, the APU being a CPU+GPU combo in a single chip.

Some laptops use DDR RAM, but… in those laptops, the DDR RAM is only used by the CPU, and those laptops have a separate GPU chip, which has its own built-in GDDR RAM… the CPU and GPU cannot and do not share these distinct kinds of RAM.

(Laptop DDR RAM is also usually a different pin count and form factor than desktop PC DDR RAM; you usually can’t swap RAM sticks between them.)

The PS5Pro appears to have yet another unique architecture:

Functionally, the 2 GB of DDR RAM can only be accessed by the CPU part of the APU, and acts as a kind of reserve: a minimum baseline of CPU-only RAM set aside for certain CPU-specific tasks.

The PS5Pro’s 16 GB of GDDR RAM is sharable and usable by both the CPU and GPU components of the APU.

So… saying that you want to have a standard desktop PC build… that shares all of its GDDR and DDR RAM… this is impossible, and nonsensical.

Standard desktop PC motherboards, and compatible GPUs and CPUs… they do not allow for shareable RAM, instead going with a design paradigm where the GPU has its own onboard GDDR RAM that only it can use, and the CPU has DDR RAM that only it can use.

You would basically have to tear a high-end, more modern laptop board with a soldered-on APU out of its chassis… and then install that into a ‘desktop PC’ case… to have a ‘desktop PC’ that shares memory between its CPU and GPU components… both of which would be encapsulated in a single APU chip.

This concept, roughly, is what’s generally called a MiniPC; it’s a fairly niche thing, and not the kind of thing an average prosumer can assemble themselves like a normal desktop PC.

All you can really do is swap out the RAM (if it isn’t soldered) and the SSD… maybe, I guess, transplant it and the power supply into another case?

I don’t know how you could arrive at such a conclusion, considering that the base PS5 has been measured to be comparable to the 6700.

I can arrive at that conclusion because I can compare actual benchmark scores from the nearest TFLOP-equivalent, more publicly documented, architecturally similar AMD APU… the 7600M. I specifically mentioned this in my post.

This guy in the article here … well he notes that the 6700 is a bit more powerful than the PS5Pro’s GPU component.

The 6600 is one step down in terms of mainline desktop PC hardware, and arguably the PS5Pro’s performance is… a bit better than a 6600, a bit worse than a 6700; at that level, all of the other differences in the PS5Pro’s architecture amount to a margin of error when trying to precisely dial in whether a 6700 or a 6600 is the closer match.

You can’t do apples to apples spec sheet comparisons… because, as I have now exhaustively explained:

Standard desktop PCs do not share RAM between the GPU and CPU. They also do not share memory interface busses and bandwidth lanes… in standard PCs, these are distinct and separate, because they use different architectures.

I got my results by starting with the (correct*) TFLOPs output of a PS5Pro, finding the nearest-equivalent APU with PassMark benchmark scores (reported by hundreds, thousands, or tens of thousands of users), then comparing those PassMark APU scores to PassMark conventional GPU scores, and ended up with ‘fairly close to an RX 6600’.

  • The early, erroneous reporting of the TFLOPs score as roughly 33, when it was actually closer to 16 or 17… that stemmed from reporting the 16-bit (FP16) FLOP figure, when the more standard convention is to list the 32-bit (FP32) figure.

You, on the other hand, just linked to a Tom’s Hardware review of currently-in-production desktop PC GPUs… which did not make any mention of the PS5Pro… and then you also acted as if a 6600 was half as powerful as a PS5Pro’s GPU component… which is wildly off.

A 6700 is nowhere near 2x as powerful as a 6600.

2x as powerful as an AMD RX 6600… would be roughly an AMD RX 7900 XTX, the literal top-end card of AMD’s previous GPU generation… which is currently selling for something like $1250 +/- $200, depending on which retailer you look at.

Aceticon@lemmy.dbzer0.com on 05 May 07:49 next collapse

Just to add to this, the reason you only see shared memory setups on PCs with integrated graphics is that it lowers performance compared to dedicated memory. That’s less of a problem if your GPU is only being used in 2D mode, such as when doing office work (mainly because that uses little memory), but more of a problem in 3D mode (such as in most modern games), which is how the PS5 is meant to be used most of the time.

So the PS5 having shared memory is not a good thing; it actually makes it inferior to a PC with a GPU and CPU of similar processing power using the dominant gaming-PC architecture (separate memory).

sp3ctr4l@lemmy.dbzer0.com on 05 May 11:33 next collapse

Basically this is true, yes, without going into an exhaustive level of detail as to very, very specific subtypes and specs of different RAM and mobo layouts.

Shared memory setups are generally less powerful, but they also usually end up being cheaper overall, as well as having a lower power draw… and running cooler, temperature-wise.

Which are all legitimate reasons those kinds of setups are used in smaller form factor ‘computing devices’, where heat management and airflow requirements… basically rule out using a traditional architecture.

Though, recently, MiniPCs are starting to take off… and I am actually considering doing a build based on the Minisforum BD795i SE… which could be quite a powerful workstation/gaming rig.

Aside about an interesting non-standard 'desktop' potential build

This is a mobo with a high-end integrated AMD mobile CPU (the 7945HX)… that, all together, costs about $430. And the CPU in this thing has a PassMark score of about the same as an AMD 9900X… which itself, the CPU alone, MSRPs for about $400. So that is kind of bonkers: a high-end mobo and CPU… for the price of a high-end CPU.

Oh, I forgot to mention: this BD795i SE board? Yeah, it just has a standard PCIe x16 slot. So… you can plug any 2-slot-width standard desktop GPU into it… and all of this either literally is, or basically is, the ITX form factor. So you could make a whole build out of this that would be ITX form factor and also absurdly powerful, or a budget version with a dinky GPU.

I was talking in another thread a few days ago, and someone said PC architecture may be headed toward… basically, you have the entire PC, and the GPU, and that’s the new paradigm, instead of the old-school view of: you have a mobo, and you pick it based on its capability to support future CPUs in the same socket type, future RAM upgrades, etc… And this intrigued me, I looked into it, and yeah, this concept does have cost-per-performance merit at this point.

So this uses a split between the GPU having its GDDR RAM and the… CPU using DDR SODIMM (laptop form factor) RAM. But it’s also designed such that you can actually fit huge standard-PC-style cooling fans… into quite a compact form factor.

From what I can vaguely tell as a non-Chinese speaker… it seems like many people over in China have been making high-end, custom, desktop gaming rigs out of this laptop/mobile-style architecture for a decent while now, and only recently has this concept even really entered the English-speaking world/market, that you can actually build your own rig this way.

lka1988@lemmy.dbzer0.com on 05 May 15:25 collapse

Fascinating discourse here. Love it.

What about a Framework laptop motherboard in a mini PC case? Do they ship with AMD APUs equivalent to that?

sp3ctr4l@lemmy.dbzer0.com on 05 May 17:24 collapse

Hrm, uh… Framework laptops… seem to be configurable with a mobile-grade CPU with integrated graphics… and also an optional, additional mobile-grade dedicated GPU.

So, not really an APU… unless you really want to haggle over definitions and say ‘technically, a CPU with pathetic integrated graphics still counts as a GPU and is thus an APU’.

Framework laptop boards don’t have the PCIe x16 slot for a traditional desktop GPU. As far as I am aware, Minisforum are the only people that do that, along with a high-powered mobile CPU.

Note that on the Minisforum mobo model I am talking about, the AMD chip is not really an APU either; it’s also just a CPU with integrated graphics. The iGPU is a Radeon 610M, basically the bare minimum to be able to render and output very basic 2D graphics.

True APUs are … things like what more modern consoles use, what a steam deck uses. They are still usually custom specs, proprietary to their vendor.

The Switch 2 will have a custom Nvidia APU, which is the first Nvidia APU of note to my knowledge, and it will be very interesting to learn more about it from teardowns and benchmarks.

Currently, the most powerful, non-custom, generally publicly available one that’s compatible with standard PC mobos… arguably an APU, arguably not… is the AMD 8700G.

It’s about $315, and is a pretty decent CPU, but as a GPU… it’s less powerful than a standard desktop RX 6500 from AMD… which is the absolute lowest-tier AMD GPU from now two generations back.

You… might be able to run … basically games older than 5ish years, at 1080p, medium graphics, at 60fps. I guess it would maybe be a decent option if you… wanted to build a console emulator machine, roughly for consoles … N64/PS1/Dreamcast, and older, as well as being able to play older PC games, or PC games at lower settings/no more than 1080p.

I am totally just spitballing with that though, trying to figure out all that exactly would be quite complicated.

But now, back to Framework.

Framework is soon to be releasing the Framework Desktop.

This is a small form factor PC… which uses an actual, proper APU: either the AMD Ryzen AI Max 385 or the AI Max+ 395.

It’s listed at an MSRP of $1100, and they say it can run Cyberpunk at 1440p on high settings at about 75 fps… that’s with no ray tracing, no framegen… and, I think, no frame upscaling being used.

So, presumably, if you turned on upscaling and framegen, you’d be able to get similar fps at ultra and psycho settings, and/or some amount of raytracing.

There are also other companies that offer this kind of true APU, MiniPC style architecture, such as EvoTek, though it seems like most of them are considerably more expensive.

wccftech.com/amd-ryzen-ai-max-395-strix-halo-mini…

… And finally, looks like Minisforum is sticking with the laptop CPU + desktop GPU design, and is soon going to be offering even more powerful CPU+Mobo models.

wccftech.com/minisforum-ryzen-9-9955hx-x870m-motd…

So yeah, this is actually quite an interesting time of diversification away from … what have basically been standard desktop mobo architectures… for … 2, 3? decades…

…shame it all also coincides with Trump throwing a literally historically unprecedented senile temper tantrum, and fucking up prices and logistics for… basically the whole world, though of course much, much more seriously for the US.

addie@feddit.uk on 05 May 15:39 collapse

You’ve got that a bit backwards. Integrated memory on a desktop computer is more “partitioned” than shared - there’s a chunk for the CPU and a chunk for the GPU, and it’s usually quite slow memory by the standards of graphics cards. The integrated memory on a console is completely shared, and very fast. The GPU works at its full speed, and the CPU is able to do a couple of things that are impossible to do with good performance on a desktop computer:

  • load and manipulate models which are then directly accessible by the GPU. When loading models, there’s no need to read them from disk into CPU memory and then copy them onto the GPU - they’re just loaded and accessible (see the sketch below).
  • manipulate the frame buffer using the CPU. Often used for tone mapping and things like that, and a nightmare for emulator writers. Something like RPCS3 emulating Dark Souls has to turn this off; a real PS3 can just read and adjust the output using the CPU with no frame hit, but a desktop would need to copy the frame from the GPU to main memory, adjust it, and copy it back, which would kill performance.
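
A toy sketch of the difference (every function here is a stub with an invented name; real engines go through graphics APIs, not Python, but the number of copies is the point):

```python
# Stubs standing in for disk reads and decompression (names invented).
def read_from_disk(path: str) -> bytes:
    return b"compressed-model-bytes"          # stand-in for an SSD read

def cpu_decompress(raw: bytes) -> bytes:
    return raw.upper()                        # stand-in for real decompression

# Discrete desktop PC: two address spaces, so the asset is copied twice.
def load_model_discrete(path: str) -> bytes:
    raw = read_from_disk(path)                # disk -> system (DDR) RAM
    model = cpu_decompress(raw)               # CPU works in its own RAM
    vram_copy = bytes(model)                  # explicit copy over PCIe to GDDR
    return vram_copy

# Unified-memory console: one allocation, visible to CPU and GPU alike.
def load_model_unified(path: str) -> bytes:
    model = cpu_decompress(read_from_disk(path))  # lands once, in shared GDDR
    return model                              # GPU reads the same bytes in place
```
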
Aceticon@lemmy.dbzer0.com on 05 May 18:20 next collapse

When two processing devices try to access the same memory, there are contention problems, since the memory cannot be accessed by two devices at the same time (well, sorta: parallel reads are fine; it’s when one side is writing that there can be problems). So one of the devices has to wait, which makes it slower than dedicated memory, but the slowdown is not constant, since it depends on the memory access patterns of both devices.

There are ways to improve this: for example, if you have multiple channels on the same memory module then contention issues are reduced to the same memory block, which depends on the block-size, though this also means that parallel processing on the same device - i.e. multiple cores - cannot use the channels being used by a different device so it’s slower.

There are also additional problems with things like memory caches in the CPU and GPU - if an area of memory cached in one device is altered by a different device that has to be detected and the cache entry removed or marked as dirty. Again, this reduces performance versus situations where there aren’t multiple processing devices sharing memory.

In practice the performance impact is highly dependent on if and how the memory is partitioned between the devices, as well as on the amount of parallelism in both processing devices (the latter because, per my point above, memory modules have a limited number of memory channels, so multiple parallel accesses to the same memory module from both devices can lead to stalls in the cores of one or both devices when not enough channels are available for both).
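
As a toy illustration of the contention point (the demand numbers are invented; the ~448 GB/s is the commonly cited base PS5 memory bandwidth):

```python
# Toy contention arithmetic: when CPU + GPU together want more bandwidth than
# the shared bus provides, both get throttled proportionally.

def served_bandwidth(cpu_want: float, gpu_want: float, bus: float):
    total = cpu_want + gpu_want
    if total <= bus:                 # no contention: both fully served
        return cpu_want, gpu_want
    scale = bus / total              # contention: proportional stalling
    return cpu_want * scale, gpu_want * scale

# ~448 GB/s of shared GDDR6 (roughly the base PS5 figure) under heavy load:
cpu_got, gpu_got = served_bandwidth(cpu_want=60.0, gpu_want=420.0, bus=448.0)
print(f"CPU gets {cpu_got:.0f} GB/s, GPU gets {gpu_got:.0f} GB/s")
# -> each side is stalled ~7% of the time it wants memory
```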

As for the examples you gave, they’re not exactly great:

  • First, when loading models into the GPU memory, even with SSDs the disk read is by far the slowest part and hence the bottleneck, so as long as things are being done in parallel (i.e. whilst the data is loaded from disk to CPU memory, already loaded data is also being copied from CPU memory to GPU memory) you won’t see that much difference between loading to CPU memory and then from there to GPU memory and direct loading to GPU memory. Further, the manipulation of models in shared memory by the CPU introduces the very performance problems I was explaining above, namely contention problems from both devices accessing the same memory blocks and GPU cache entries getting invalidated because the CPU altered that data in the main memory.
  • Second, if I’m not mistaken tone mapping is highly parallelizable (as pixels are independent - I think, but not sure since I haven’t actually implemented this kind of post processing), which means that the best by far device at parallel processing - the GPU - should be handling it in a shader, not the CPU. (Mind you, I might be wrong in this specific case if the algorithm is not highly parallelizable. My own experience with doing things via CPU or via shaders running in the GPU - be it image shaders or compute shaders - is that in highly parallelizable stuff, a shader in the GPU is way, way faster than an algorithm running in the CPU).

I don’t think that direct access by the CPU to manipulate GPU data is at all a good thing (for the reasons given above), and to get proper performance out of a shared memory setup, at the very least the programming must be done in a special way that tries to reduce collisions in memory access, or the whole thing must be set up by the OS like it’s done on PCs with integrated graphics, where a part of the main memory is reserved for the GPU by the OS itself when it starts, and the CPU won’t touch that memory after that.

sp3ctr4l@lemmy.dbzer0.com on 06 May 04:49 collapse

Can you explain to me what the person you are replying to meant by ‘integrated memory on a desktop pc’?

I tried to explain why this phrase makes no sense, but apparently they didn’t like it.

…Standard GPUs and CPUs do not share a common kind of RAM that gets balanced between space reserved for CPU-ish tasks and GPU-ish tasks… that only happens with an APU that uses LPDDR RAM… which isn’t at all a standard desktop PC.

It is as you say, a hierarchy of assets being called into the DDR RAM by the CPU, then streamed or shared into the GPU and its GDDR RAM…

But the GPU and CPU are not literally, directly using the actual same physical RAM hardware as a common shared pool.

Yes, certain data is… shared… in the sense that it is, or can be, to some extent, mirrored, parallelized, between two distinct kinds of RAM… but… not in the way they seem to think it works, with one RAM pool just being directly accessed by both the CPU and GPU at the same time.

… Did they mean ‘integrated graphics’ when they … said ‘integrated memory?’

L1 or L2 or L3 caches?

???

I still do not understand how any standard desktop PC has ‘integrated memory’.

What kind of ‘memory’ on a PC… is integrated into the MoBo, unremovable?

???

Aceticon@lemmy.dbzer0.com on 06 May 09:03 collapse

Hah, now you made me look that stuff up, since I was talking anchored in my knowledge of systems with multiple CPUs and shared memory; that was my expectation for the style of system architecture of the PS5, because in the past that’s how they did things.

So, for starters I never mentioned “integrated memory”, I wrote “integrated graphics”, i.e. the CPU chip comes together with a GPU, either as two dies in the same chip package or even both on the same die.

I think that when people talk about “integrated memory” what they mean is main memory which is soldered on the motherboard rather than coming as discrete memory modules. From the point of view of systems architecture it makes no difference, however from the point of view of electronics, soldered memory can be made to run faster because soldered connections are much closer to perfect than the mechanical contact connections you have for memory modules inserted in slots.

(Quick explanation: at very high clock frequencies the electronics side starts to behave in funny ways. The frequency of the signal travelling on the circuit board gets so high, and hence the wavelength gets so small - down to centimeters or even millimeters, around the scale of the length of circuit board traces - that you start getting effects like signal reflections and interference between circuit lines (because they work as mini antennas and can induce effects on nearby lines), so it’s all a lot messier than if the thing was just running at a few MHz. Wave reflections can happen at connections which aren’t perfect, such as the mechanical contacts of memory modules inserted into slots, so at higher clock speeds the signal integrity of the data travelling to and from the memory is worse than with soldered memory, whose connections are much closer to perfect.)
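
The arithmetic behind that, as a quick sketch (the 0.5 velocity factor for FR-4 board material is an assumption, but in the right ballpark):

```python
# Wavelength on a PCB trace: lambda = (c * velocity_factor) / frequency.
C = 299_792_458            # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.5      # assumed: signals on FR-4 travel at roughly c/2

def wavelength_cm(freq_hz: float) -> float:
    return C * VELOCITY_FACTOR / freq_hz * 100.0

print(f"{wavelength_cm(100e6):.0f} cm at 100 MHz")  # ~150 cm: far longer than any trace
print(f"{wavelength_cm(3.2e9):.1f} cm at 3.2 GHz")  # ~4.7 cm: comparable to trace lengths
```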

As far as I know, nowadays the L1, L2 and L3 caches are always part of the CPU/GPU die, though I vaguely remember that in the old days (80s, 90s) the memory cache might be in the form of dedicated SRAM modules on the motherboard.

As for integrated graphics, here’s some reference for an Intel SoC (system on a chip, in this case with the CPU and GPU together in the same die). If you look at page 5 you can see a nice architecture diagram. Notice how memory access goes via the memory controller (lower right, inside the System Agent block) and then the SoC Ring Interconnect which is an internal bus connecting everything to everything (so quite a lot of data channels). The GPU implementation is the whole left side, the CPU is top and there is a cache slice (at first sight an L4 cache) at the bottom shared by both.

As you see there, in integrated graphics the memory access doesn’t go via the CPU, rather there is a memory controller (and in this example a memory cache) for both and memory access for both the CPU and the GPU cores goes through that single controller and shares that cache (but lower level caches are not shared: notice how the GPU implementation contains its own L3 cache - bottom left, labelled “L3$”)

With regards to the cache-dirtying problems I mentioned in the previous post: at least that higher-level (L4) cache is shared, so instead of cache entries being made invalid because the main memory was changed behind the cache’s back, what you get is a different performance problem, where there is competition for cache usage between the areas of memory used by the CPU and the areas used by the GPU. (The cache is much smaller than the actual main memory, so it can only hold copies of part of it; if two devices are using different areas of main memory, they’re both causing those areas to get cached, but the cache can’t fit both, so depending on the usage pattern it might constantly be ejecting entries for one area of memory to make room for entries for the other and back, which in practice makes it as slow as not having any cache there. There are lots of tricks to make this less of a problem, but it’s still slower than having just one processing device using that cache, as you get when each processing device has its own cache and its own memory.)

As for contention problems: there are generally way more data channels in an internal interconnect like the one you see there than in the data bus to the main memory modules, plus that internal interconnect will be way faster, so contention in memory access will be lower for cached memory. But on cache misses (memory locations not in cache, which have to be loaded from main memory), that architecture will still suffer from two devices sharing the main memory, and hence having to share that memory’s data channels.
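
A toy simulation of that cache-competition effect (sizes invented, and a real cache is set-associative rather than a plain LRU list, but the thrashing pattern is the same idea):

```python
from collections import OrderedDict

def hit_rate(accesses, cache_lines: int = 256) -> float:
    """Simulate a plain LRU cache and return the fraction of hits."""
    cache, hits = OrderedDict(), 0
    for addr in accesses:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)        # refresh LRU position
        else:
            cache[addr] = True
            if len(cache) > cache_lines:
                cache.popitem(last=False)  # evict the least recently used line
    return hits / len(accesses)

cpu = [("cpu", i % 200) for i in range(10_000)]  # CPU working set: 200 lines
gpu = [("gpu", i % 200) for i in range(10_000)]  # GPU working set: 200 lines
mixed = [a for pair in zip(cpu, gpu) for a in pair]  # both sharing one cache

print(f"one device alone:   {hit_rate(cpu):.0%}")    # ~98%: working set fits
print(f"two devices shared: {hit_rate(mixed):.0%}")  # ~0%: constant eviction
```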

sp3ctr4l@lemmy.dbzer0.com on 06 May 10:44 collapse

addie said:

Integrated memory on a desktop computer is more “partitioned” than shared

Then I wrote my own reply to them, as you did.

And then I also wrote this, under your reply to them:

Can you explain to me what the person you are replying to meant by ‘integrated memory on a desktop pc’?

And now you are saying:

So, for starters I never mentioned “integrated memory”, I wrote “integrated graphics”, i.e. the CPU chip comes together with a GPU, either as two dies in the same chip package or even both on the same die.

I mean, I do genuinely appreciate your detailed, technical explanations of these systems and hardware and their inner functions…

But also, I didn’t say you said integrated memory.

I said the person you are replying to, addie, said integrated memory.

I was asking you to perhaps be able to explain what they meant… because they don’t seem to know what they’re trying to say.

But now you have misunderstood what I said, what I asked, lol.

You replied to addie … I think, as if they had written ‘integrated graphics’. But they didn’t say that. They said ‘integrated memory’.

And… unless I am… really, really missing something… standard desktop PCs… do not have any kind of integrated memory, beyond like… very, very small areas where the mobo BIOS is stored, but that is almost 100% irrelevant to a discussion about video game rendering capabilities.

As you say, you have to go back 20+ years to find desktop PCs with mobos that have their own SRAM… everything else is part of the GPU or CPU die, and thus… isn’t integrated, as GPUs and CPUs are removable, swappable, on standard desktop PCs.

Either way, again, I do appreciate your in-depth technical info!

Aceticon@lemmy.dbzer0.com on 06 May 11:07 collapse

Well, I wasn’t sure if you meant that I did say that or if you just wanted an explanation, so I both clarified what I said and I gave an explanation to cover both possibilities :)

I think the person I was replying to just got confused when they wrote “integrated memory” since as I explained when main memory is “integrated” in systems like these, that just means it’s soldered on the motherboard, something which really makes no difference in terms of architecture.

There are processing units with integrated memory (pretty much all microcontrollers), which means they come with their own RAM (generally both Flash RAM and SRAM) in the same integrated circuit package, or even on the same die, but that’s at the very opposite end of processing power from a PC or PS5, and the memory amounts involved tend to be very small (a few MB or less).

As for the “integrated graphics” bit, that’s actually the part that matters when it comes to the performance of systems with dedicated CPU and GPU memory vs systems with shared memory (integrated into the motherboard or otherwise, since being soldered on the motherboard rather than coming as modules doesn’t really change the limitations of each architecture), which is what I was talking about back in the original post.

sp3ctr4l@lemmy.dbzer0.com on 06 May 11:43 collapse

Sorry, I … well, I was recently diagnosed with PTSD.

And a significant part of that… is I am so, so very used to people just misinterpreting what I actually said; in their heads they heard /something else/, and then they respond to /something else/, they continue to believe I said /something else/, even after I explain to them that isn’t what I said, and then they tell everyone else that I said /something else/.

(there are many other things that go into the PTSD, but they are waaaay outside of the scope of this discussion)

I, again, realize and appreciate that you responded to both interpretations…

But I am just a bit triggered.

I am so, so, very used to being gaslit by… most of my family, and many, many other people in my life, who just seemingly willfully misinterpret me consistently, or are literally incapable of hearing/reading without just inventing and inserting their own interpretation.

… Whole lot of my family has very serious mental health disorders, and I’ve also happened to have a very bad run of many bosses and former friends and ex partners who just do the same thing, all the time.

Took me a long time to just… get away from all these toxic situations, and finally be able to pursue mental health evaluation/treatment on my own accord.

I’m not saying you ‘intentionally triggered me’ or anything like that, that would be a ridiculous judgement from me, and you have been very polite, and informative… I’m just trying to explain myself, lol.

As to the actual technical info: yes, everything you are saying lines up with my understanding. It’s nice to know I know what these words and terms mean in this context, and that my understanding is… in line with reality.

Aceticon@lemmy.dbzer0.com on 06 May 12:50 collapse

Well, this being the Internet, it’s natural to expect less than impeccable truth from strangers here: a lot of people just want to feel like they “won” the argument no matter what, so they’ll bullshit their way into a “win”; most people aren’t really trained in the “try to be as complete and clear as possible” mental processes the way Engineers and Scientists are (so a lot of “I think this might be such” gets passed off as “it is such”); and it simply feels bad to be wrong, so most people don’t want to accept it when somebody else proves them wrong, and react badly to it.

I’m actually a trained Electronics Engineer, but since I don’t actually work in that domain and studied it decades ago, some of what I wrote are informed extrapolations based on what I learned and what I’ve read over the years, rather than me being absolutely certain that’s how things are done nowadays (which is why looking up and reading that Intel spec was very interesting, even if it turned out things are mainly as I expected).

Also, I’m sorry for triggering you; you don’t need to say sorry for your reaction, and I didn’t really take it badly. As I said, this is the Internet and a lot of people are argumentative for the sake of “winning” (probably the same motivation as most gaslighters), so I expect everybody to be suspicious of my motivations, same as they would be of all other people, since from their point of view I’m just another random stranger ;)

Anyways, cheers for taking the trouble to explain it and making sure I was okay with our interaction - that’s far nicer and more considerate than most random internet strangers.

sp3ctr4l@lemmy.dbzer0.com on 05 May 18:26 collapse

I… uh… what?

Integrated memory, on a desktop PC?

Genuinely: What are you talking about?

Typical PCs (and still many laptops)… have a CPU that uses the DDR RAM that is… plugged into the Mobo, and can be removed. Even many laptops allow the DDR RAM to be removed and replaced, though working on a laptop can often be much, much more finnicky.

GPUs have their own GDDR RAM, either built into the whole AIB in a desktop, or inside of or otherwise a part of a laptop GPU chip itself.

These are totally different kinds of RAM, they are accessed via distinct busses, they are not shared, they are not partitioned, not on desktop PCs and most laptops.

They are physically and by design distinct, set aside, and specialized to work with their respective processors.

The kind of RAM you are talking about, that is shared and partitioned, is LPDDR RAM… and it is incompatible with 99% of desktop PCs.

Also… anything, on a desktop PC, that gets loaded and processed by the GPU… does at some point, have to go through the CPU and its DDR RAM first.

The CPU governs the actual instructions to, and output from, the GPU.

A GPU on its own cannot like, ask an SSD or HDD for a texture or 3d model or shader.

Normally, compressed game assets are loaded from the SSD to RAM via the Win32 API. Once in RAM, the CPU then decompresses those assets. The decompressed game assets are then moved from RAM to the graphics card’s VRAM (ie, GDDR RAM), priming the assets for use in games proper.

(addition to the quote is mine)

Like… there is DirectStorage… but basically nothing actually uses it.

pcworld.com/…/what-happened-to-directstorage-why-…

Maybe it’ll take off someday, maybe not.

Nobody does dual GPU SLI anymore, but I also remember back when people thought multithreading and multicore CPUs would never take off, because coding for multiple threads is too haaaaarrrrd, lol.

Anyway, the reason that emulators have problems doing the things you describe consoles as good at… is because consoles have fine-tuned drivers that work with only a specific set of hardware, and emulators have to reverse engineer ways of doing the same that will work on all possible PC hardware configurations.

People who make emulators generally do not have direct access to the actual proprietary driver code used by console hardware.

If they did, they would much, much more easily be able to… emulate… similar calls and instruction sets on other PC hardware.

But they usually just have to make this shit up on the fly, with no actual knowledge of how the actual console drivers do it.

Reverse engineering is astonishingly more difficult when you don’t have the source code, the proverbial instruction manual.

It’s not that desktop PC architecture… just literally cannot do it.

If that were the case, all the same issues you bring up that are specific to emulators… would also be present with console games that have proper ports to PC.

While occasionally, yes, this is the case for some specific games with poor quality ports… generally no, this is not true.

Try running, say, an emulated Xbox version of Deus Ex: Invisible War, a game notoriously handicapped by its console-centric design… compare the PC version of that, on a PC… to the same game, emulating the Xbox version, on the same exact PC.

You will almost certainly, for almost every console game with a PC port… find that the proper PC version runs better, often much, much better.

The problem isn’t the PC’s hardware capabilities.

The problem is that emulation is inefficient guesswork.

Like, no shade at emulator developers whatsoever, its a miracle any of that shit works at all, reverse engineering is astonishingly difficult, but yeah, reverse engineering driver or lower level code, without any documentation or source code, is gonna be a bunch of bullshit hacks that happen to not make your PC instantly explode, lol.

tomalley8342@lemmy.world on 06 May 19:54 collapse

People are not trying to replicate the Sony’s console hardware layout, they are trying to have the same gaming experience. I’m not sure how I can make it any clearer than pointing out an article which showed that the 6700 has similar performance across 7 separate games to the base Playstation 5. Not the Playstation 5 Pro, the base Playstation 5. If you believe that the tflop numbers are more credible than just running games to see how they perform, then I believe your priorities are misplaced with respect to the end goal of playing video games on this hypothetical PC build.

sp3ctr4l@lemmy.dbzer0.com on 06 May 21:46 collapse

TFLOPs generally correlate to actual, general game performance quite well.

A very demanding game will do poorly on a gpu with a given TFLOPs score, but a less demanding one may do fairly well.

So, if you can roughly dial things into a TFLOPs value, you can get a result that is then generally applicable to all games.

The most graphically demanding games require the most TFLOPs out of a GPU to render well and fast, but less demanding games can be fine with a lower TFLOPs GPU.

Like… a GPU with a medium-high TFLOPs score may only be able to push about 50 fps in CP77 at ultra w/o RT, but it could likely easily hit something like 200ish fps in, say, Counter-Strike 2, both games at 4k.

resetera.com/…/all-games-with-ps5-pro-enhancement…

A PS5Pro can render CP77 in ‘quality mode’… which is roughly the ‘high’ graphical quality preset of CP77 on PCs, with, I believe, all raytracing off other than raytraced shadows…

… at 2k 30fps, or it can render at a dynamically scaling resolution between 2k and 1080p at a locked 60fps.

gpu-monkey.com/…/benchmark-amd_radeon_rx_6700-cyb…

An RX 6700 can render CP77 at 2k, 43 fps avg.

This site isn’t 100% clear as to whether these GPU scores are done at the ‘high’ preset, or the ‘ultra’, all settings maxed out preset… but with all raytracing off.

gpu-monkey.com/…/benchmark-amd_radeon_rx_6600-cyb…

A 6600 can do the same, high or ultra preset, with no raytracing, at 2k, with 31 fps.

1080p at 51 fps.

So uh yeah, even with the top of the line PS5 variant, the PS5Pro’s graphical capabilities are a bit worse than a 6700, and a bit better than a 6600.

I have used many AMD cards, and yeah basically, if you only use raytraced local shadows?

At 1080p, or 2k?

Very roughly, you’ll get about the same fps with a ‘high’ preset and only raytraced local shadows as you would with no RT local shadows but the game’s graphical preset bumped up to ‘ultra’.

There are now like 6 different raytracing options in the PC version of CP77.

They only work well with Nvidia cards… and the latest iteration of AMD cards, that now have basically their equivalent of RT cores, though they’re not as powerful, nor as expensive, as Nvidia cards.

Needless to say, PS5… normals, or whatever, are going to be even weaker than the ‘somewhere between an rx 6600 and rx 6700’ rough equivalence.

Playstation users only even see about 10% of the graphical settings options that PC players see - really just the 4 post-processing effects and motion blur. PC players get many, many more settings they can tweak.

Also, PS5 users generally seem to think CP77 is being rendered at 4k, in quality mode.

Uh, no, it isn’t. It’s rendered at 2k max, and that is likely using FSR 2.1, with a bit of dynamic resolution scaling within CP77, meaning the actual render resolution is roughly 5% to 15% lower than 2k; FSR then upscales to 2k… and then the PS5 does its own upscaling outside of the game, likely via an emulator-style simple integer upscale algo, to display at 4k on a 4k TV.

… I am saying all of this with confidence because I have spent a considerable amount of time fucking with customizing dynamic resolution scaling, and other graphical settings, in CP77, on different AMD cards, on PCs.

The PS5Pro is using the same dynamic resolution scaling that exists in CP77 on PCs. PCs generally do best with an ‘Auto’ mode for various frame upscalers, but the dynamic setting on PC basically lets you throw out the auto settings and fuck about with your own custom tolerances for max and min resolutions and frame rates.
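
For what it’s worth, the control loop behind dynamic resolution scaling is conceptually tiny. A minimal sketch (thresholds and step sizes are invented, not CDPR’s actual values):

```python
# Nudge the internal render scale each frame to chase a target frame time;
# the upscaler (e.g. FSR) then reconstructs the output resolution from it.

TARGET_MS = 16.7                    # frame budget for a locked 60 fps
SCALE_MIN, SCALE_MAX = 0.75, 1.0    # e.g. ~1080p..1440p internal, pre-FSR

def next_render_scale(scale: float, last_frame_ms: float) -> float:
    if last_frame_ms > TARGET_MS * 1.05:    # over budget: render smaller
        scale -= 0.02
    elif last_frame_ms < TARGET_MS * 0.90:  # headroom: render larger
        scale += 0.01
    return max(SCALE_MIN, min(SCALE_MAX, scale))

scale = 1.0
for frame_ms in (18.2, 17.9, 17.5, 16.0, 15.1):   # made-up frame times
    scale = next_render_scale(scale, frame_ms)
print(f"converged render scale: {scale:.2f}")
```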

tomalley8342@lemmy.world on 06 May 22:40 collapse

Needless to say, PS5… normals, or whatever, are going to be even weaker than the ‘somewhere between an rx 6600 and rx 6700’ rough equivalence.

There is no need to speculate or dial anything in, the article above literally compared that exact game side by side with the base model Playstation 5.

Cyberpunk 2077 in the 30fps RT mode on PS5, but a part of the market in the Phantom Liberty expansion definitely has issues - but not as many issues as the RX 6700. The PS5 outperforms it by 45 percent - a remarkable state of affairs that is not matched in performance mode, where the 6700 seems much the same as PlayStation 5.

[deleted] on 06 May 03:55 collapse

.

FreedomAdvocate@lemmy.net.au on 05 May 01:38 next collapse

$850 is way more expensive than a PS5 though lol. Linux also means you can’t play the games that top the most played charts on the PS5 every single month of every single year.

sp3ctr4l@lemmy.dbzer0.com on 05 May 02:52 next collapse

metacritic.com/…/best-playstation-games-of-2024/

Works on Linux:

Prince of Persia, the Lost Crown

Silent Hill 2 (Remake)

Marvel vs Capcom: Arcade Classics

Shin Megami Tensei (V)engeance

Persona 3 Reload

HiFi Rush

Animal Well

Castlevania Dominus Collection

Like A Dragon: Infinite Wealth

Tekken 8

The Last of Us Part II (Remaster)

Balatro

Dave the Diver

Slay the Princess: Pristine Cut

Metaphor: ReFantazio

Elden Ring: Shadow of the Erdtree (and base game)

Does not work on Linux:

Unicorn Overlord (Console Exclusive, No PC Port Allowed by Publisher Vanillaware)

Destiny 2 (Kernel Level Anti Cheat)

FF VII Rebirth (PS Exclusive)

Astro Bot (PS Exclusive)

Damn, yeah, still consoles gotta hold on via exclusives, I guess?

And then there’s the mismanaged shitshow that is Destiny 2…

…who can’t figure out how to do AntiCheat without installing a rootkit on your PC, despite functional, working AntiCheats having worked on linux games for at least half a decade at this point, if not longer…

…nor can they figure out how to write a storyline that rises above ‘everyone is always lore dumping instead of talking, and also they talk to you like you’re a 10 year old while doing so.’

Last I heard, a whole bunch of hardcore D2 youtubers and streamers were basically all quitting out of frustration and feeling let down or betrayed by Bungie.

Maybe we should advocate for some freedom of platform porting/publishing for all games, eh FreedomAdvocate?

FreedomAdvocate@lemmy.net.au on 05 May 17:49 collapse

Highest rated != most played or most popular.

COD MP.

Warzone.

Fortnite.

GTA Online.

Not on Linux.

sp3ctr4l@lemmy.dbzer0.com on 05 May 19:04 collapse

Most Call of Duty games work on linux; you’re gonna have to be more specific as to which particular one of like 25 you mean by ‘COD’.

The ones that don’t, they don’t work because the devs are too lazy or incompetent (or specifically told not to by their bosses) to make an AntiCheat that isn’t a rootkit with full access to your entire PC.

I used to play GTA V Online (and RDR2, and FiveM, and RedM…) on linux all the time, literally for years… until they just decided to ban all linux players.

IMO they owe me money for that, but oh well I guess.

Again, there are many AntiCheats that work on linux, and have worked on linux for years and years now.

Easy Anti-Cheat and BattlEye even offer linux support to game devs. There are some games with these ACs that actually do support linux.

But many game devs/studios/publishers just don’t use this support… because then there wouldn’t be any reason to actually use Windows, and MSFT pays these studios a lot of money… or they just literally own them (Activision/Blizzard = MSFT).

Kernel Anti Cheat that only works on Windows?

Yep, that’s just a complicated way to enforce Windows exclusivity in PC games.

Go look up how many hacks and trainers you can find for one of these games you mention.

You may notice that they are all designed for, and only work on… Windows.

The idea that all linux gamers are malicious hackers is a laughable, obviously false idea… but game company execs understand the power of rabid irrational fandoms.

You are right that you can’t run games with rootkit anticheats on linux though, so if those heavily monetized and manipulative games with toxic playerbases are your addiction of choice, yep, sorry, linux ain’t your hookup for those.

Again, this is another game platform freedom advocacy issue, and also a personal information security advocacy issue, not a ‘something is wrong with linux’ issue.

Game companies have gotten many working anticheat systems to work with linux. The most popular third party anticheat systems also support linux.

But the industry is clever at keeping people locked into their for profit, insecure OSs that spy on their entire system.

FreedomAdvocate@lemmy.net.au on 06 May 02:04 collapse

Most Call of Duty games work on linux; you’re gonna have to be more specific as to which particular one of like 25 you mean by ‘COD’.

I was more specific - I said COD MP, as in multiplayer, as in the current COD multiplayer games that people play, all of which have anti-cheat that doesn’t work on Linux. Warzone, again, doesn’t work on Linux.

The ones that don’t, they don’t work because the devs are too lazy or incompetent (or specifically told not to by their bosses) to make an AntiCheat that isn’t a rootkit with full access to your entire PC.

Because without full access to your PC, anti-cheat is essentially useless and easily bypassed by cheaters.

I used to play GTA V Online (and RDR2, and FiveM, and RedM…) on linux all the time, literally for years… until they just decided to ban all linux players.

Because of cheaters.

Kernel Anti Cheat that only works on Windows?

Yep, that’s just a complicated way to enforce Windows exclusivity in PC games.

It’s also one of the only ways to try to stop cheaters.

The idea that all linux gamers are malicious hackers is a laughable, obviously false idea

That’s not an idea that anyone is saying though, other than you right now. The idea is that without that kernel level protection you can’t even hope to stop a high percentage of cheats.

You are right that you can’t run games with rootkit anticheats on linux though, so if those heavily monetized and manipulative games with toxic playerbases are your addiction of choice, yep, sorry, linux ain’t your hookup for those.

So like I said, the most popular, most played games on every platform (apart from linux) every year.

Again, this is another game platform freedom advocacy issue, and also a personal information security advocacy issue, not a ‘something is wrong with linux’ issue.

It is a “something is wrong with linux” issue if Linux doesn’t allow/provide for something that game developers - and game players - want, which is anti-cheat that does the absolute best it can to stop cheaters.

sp3ctr4l@lemmy.dbzer0.com on 06 May 04:10 collapse

Because without full access to your PC, anti-cheat is essentially useless and easily bypassed by cheaters.

This is false.

Many functional AntiCheats work well without Kernel Level access… and many Kernel Level AntiCheats… are routinely bypassed by common, easily purchasable hacks… which, again, only work on Windows.

I used to play GTA V Online (and RDR2, and FiveM, and RedM…) on linux all the time, literally for years… until they just decided to ban all linux players.

Because of cheaters.

That’s not an idea that anyone is saying though, other than you right now.

Uh… you are also basically saying this, with that combination of statements.

So… please refrain from obviously contradictory, gaslighting arguments, thanks!

Anyway: GTAV uses BattlEye.

BattlEye works on Linux.

Rockstar just … chose not to use that Linux support.

It’s also one of the only ways to try to stop cheaters.

There are many other ways to stop cheaters that are quite effective; namely, actually designing your game more competently and more cleverly, with less client-side authority and more server-side authority, a less system-intrusive client-side AC that relies more on randomized realtime logging and verification of game files, server-side heuristics that pick up ‘impossible’ player input patterns, etc.

You know, all the other methods that have been used for decades, and still work.
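
To make ‘impossible input patterns’ concrete, here’s a minimal sketch of one such server-side check (the threshold is invented; real systems combine many weaker signals before flagging anyone):

```python
# Flag aim movement faster than any human flick: if the view yaw jumps more
# than max_human_deg_per_ms within one server tick, something is steering it.

def looks_inhuman(yaw_samples_deg: list[float], tick_ms: float = 16.7,
                  max_human_deg_per_ms: float = 3.0) -> bool:
    for a, b in zip(yaw_samples_deg, yaw_samples_deg[1:]):
        if abs(b - a) / tick_ms > max_human_deg_per_ms:
            return True                 # snapped too far in a single tick
    return False

print(looks_inhuman([10.0, 12.0, 11.5]))  # normal tracking -> False
print(looks_inhuman([10.0, 175.0]))       # 165 degrees in one tick -> True
```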

No AntiCheat method will ever be 100% effective.

As I already mentioned, Kernel Level AntiCheats are defeated all the time, and you can easily find and purchase such cheats/hacks… which only work on Windows… after maybe 30 minutes of web searching or jumping around discord communities.

Beyond that, it’s not that hard or expensive to set up your own, or just purchase, a tiny microcomputer that plugs into your PC; you plug your mouse/keyboard into that, and then the microPC middleman performs aim hacks and otherwise-impossible movement macros like stutter-steps and such.

Kernel ACs are routinely defeated by competent executions of this concept.

You can never stop all hackers.

It is always a trade off of exactly how much you inconvenience and degrade the system integrity/stability/security of the user, versus how many hackers you believe you are likely to stop.

Kernel Level AntiCheat basically takes you from previous methods being 99.9% effective to 99.99% effective… and the cost is that you are now literally installing a rootkit on your own system that could very well be reading all your web history and saved logins and passwords.

The code is black box, and tech companies lie all the time about how much data they gather from you… and then sell to every data broker they can.

The only actual numbers and statistics anyone has to work with, when justifying or arguing against effectiveness levels of different kinds of AC… are the claims put out by AC companies.

And even then, most people, such as yourself, aren’t even aware of or refuse to acknowledge that AntiCheats have worked on linux for years.

It is a “something is wrong with linux” issue if Linux doesn’t allow/provide for something that game developers - and game players - want, which is anti-cheat that does the absolute best it can to stop cheaters.

I see how you just completely did not address my point that EAC and BattlEye both support linux, and that other ACs have and still do as well… certain game publishers just don’t use these features, which have existed for years.

Valve Anti Cheat, for example?

You can find more info if you look, but I’m guessing you won’t.

You just have an idea, of ‘the idea’.

Have you ever written a hack?

Written game netcode, and other client/server game code?

… I have! … back when I still used Windows, ironically.

Best way to test your own design is to try to defeat it.

Installing a rootkit onto your personal computer… to protect you from hackers in a game… is like trying to fight a stomach flu you got from Taco Bell by intentionally infecting yourself with Covid.

Oh, and uh, after the whole… CrowdStrike fiasco, where Microsoft just allowed CrowdStrike to write and update kernel-level code without actually doing their own testing or validation… and then they pushed a bad update… and that took out basically 1/4 of the world’s enterprise Windows machines for a week or two?

Yeah… Windows is now removing direct kernel level access from third party software.

They’re making everything move up a level or two, kind of.

Psythik@lemm.ee on 06 May 01:16 collapse

If you’re willing to get a base model, sure. The PS5 Pro is a $700 console, and that’s not including the subscription fee for multiplayer (which doesn’t exist on PC unless you’re into MMOs).

Edit: Also every Playstation (and Xbox) game eventually comes to PC, so unless you’re so impatient that you have to play the latest games right fucking now, there’s no reason to own a console. Even Switch games are fully playable on PC, at higher resolutions and framerates as well. I sold my Switch because the games look and run so much better on my gaming rig.

FreedomAdvocate@lemmy.net.au on 06 May 01:58 collapse

Also every Playstation (and Xbox) game eventually comes to PC, so unless you’re so impatient that you have to play the latest games right fucking now, there’s no reason to own a console.

This isn’t true yet (the games thing). Playstation haven’t brought, or even suggested that they’ll bring, every game to PC. Microsoft are, sure, which is amazing - but Sony very much still want to protect their walled garden of consoles. Playstation - more specifically, the 30% cut they get of all game sales, and their online subscription fees - is the only thing keeping Sony afloat. If Sony were to go full PC like MS, Playstation and Sony would go down like the Titanic.

As for the “there’s no reason to own a console” bit, eh, I see the point and I agree to an extent, but I also disagree. My Series X’s UI is just so much better than any PC UI, not to mention the features. Quick Resume, for one, is an absolute game changer, and it’s not on PC. Everything that you could ever want to do while playing online is just so much easier to do on a console than on a PC. I say this as someone who plays more and more on PC these days, and wishes I could boot my PC into an Xbox OS-type UI.

Psythik@lemm.ee on 06 May 01:14 collapse

Regarding that last point: consoles don’t come with TVs either, so you don’t even have to factor a display into the cost of a gaming PC.

Furthermore, many modern TVs are now being designed with gaming in mind, and thus have input lag comparable to a good gaming monitor (like LG OLEDs and most Samsungs), so the whole concept of needing a dedicated monitor just for your PC is somewhat outdated now. If your TV is good enough for console gaming, then chances are it’s good enough for PC gaming too, so long as you did your research before buying and didn’t just buy whatever had a good picture on the showroom floor.

Also there’s the fact that multiplayer tends to be free on PC, so no subscription fees to worry about. The accessories tend to be cheaper as well.

Diplomjodler3@lemmy.world on 04 May 18:10 next collapse

You don’t need a graphics card. You can get mini PCs with decent gaming performance for cheap these days.

YouAreLiterallyAnNPC@lemmy.world on 04 May 18:16 next collapse

That sounds kind of like a console, no?

Edit: I mean, if the intent is gaming and only gaming, it feels like there’s a lot of overlap. It’s just that the PC trades official support for more freedom.

paraphrand@lemmy.world on 04 May 18:22 next collapse

Interesting point. Then you understand why Apple is making moves to try to be a real player in gaming.

All three of us see how gaming performance is plateauing across various hardware types to the point that a modern game can run on a wide range of hardware. With settings changes to scale across the hardware, of course.

Or are you going to be a bummer and claim it’s only mini PCs that get this benefit? Not consoles, not VR headsets, not Macs, not Linux laptops.

There really is a situation going on where there is a large body of hardware in a similar place on the performance curve in a way that wasn’t always true in the past. Historically, major performance gains were made every few years. And various platforms were on very different and less interoperable hardware architectures, etc.

The Steam Deck’s success alone proves my point, and yours too.

The thing is, people don’t wanna hear it. They wanna focus on the very high end. Or super high refresh rates. Or they wanna complain about library sizes.

Ulrich@feddit.org on 04 May 18:23 next collapse

The ones with capable GPUs cost as much as a PS5 Pro.

MonkderVierte@lemmy.ml on 04 May 18:49 collapse

There are CPUs with quite capable iGPUs that fit in a mini-PC. All in all, maybe $500.

And yeah, sure, the article mentioned that consoles are subsidized by game prices.

Ulrich@feddit.org on 04 May 19:42 collapse

Go on then. Which ones?

treyf711@lemm.ee on 05 May 06:43 next collapse

Knowing the usefulness that we’ve gotten at our house out of having them, I would probably say that if I didn’t have the PS5 I would get a Steam Deck at this point. A refurbished one from Valve when they’re on sale would be my pick. Plus, it works with my 20-year catalog of games.

MonkderVierte@lemmy.ml on 05 May 08:11 collapse

I have a Ryzen 7 5700G in my DeskMini X300, but that one is a generation old now. Still, it can play almost all games at 3440x1440 on medium settings.

In case you have seen my “string and tape” case mod to fit the cooler: that was done to support Turbo for video encoding. The Noctua NH-L9a-AM4 fits nicely.

FreedomAdvocate@lemmy.net.au on 05 May 01:37 next collapse

By decent you meant significantly worse than console gaming performance though.

Consoles are still the king of value in gaming, even with their increasing prices.

chunes@lemmy.world on 05 May 08:50 collapse

Can confirm. I wouldn’t recommend it unless you mostly play indie games, though.

sugar_in_your_tea@sh.itjust.works on 05 May 02:47 next collapse

You don’t need a top-end card to match console specs, something like a 6650 XT or 6700 XT is probably enough. Your initial PC build will cost about 2x a console if you’re matching specs (maybe 3x if you need a monitor, keyboard, etc.), but you’ll make it up with access to cheaper games and being able to upgrade the PC without replacing it, not to mention the added utility a PC provides.

So yeah, think of PC vs console as an investment into a platform.

If you only want to play 1-2 games, console may be a better option. But if you’re interested in older or indie games, a PC is essential.

SoftestSapphic@lemmy.world on 05 May 15:58 next collapse

My 4070 cost $300 and runs everything.

The whole PC cost around $1000, and I have had it since the Xbox One released.

You can get similar performance from a $400 Steam Deck, which is a computer.

Blackmist@feddit.uk on 06 May 07:10 collapse

On what planet does a Steam Deck give 4070 performance?

And on which does a 4070 cost $300 for that matter? They cost more than a whole PS5.

SoftestSapphic@lemmy.world on 06 May 08:23 collapse

I took a gamble and bought used from eBay cuz I saw the deal on UserBenchmark, and it’s been working great so far.

If you have a card you want, search for it on there; sometimes you can get some great finds.

www.userbenchmark.com

ColeSloth@discuss.tchncs.de on 06 May 00:14 collapse

I can get PS5 graphics with a $280 video card, games are often way cheaper, I can hook the PC up to my TV, and still play with a PS5 or Xbox controller, or mouse and keyboard.

I suspect next gen there will be a PS6, and Xbox will make a cheap cloud gaming box and just go subscription-only.

Nikelui@lemmy.world on 06 May 06:46 next collapse

Didn’t Google Stadia do the cloud thing and fail miserably?

ColeSloth@discuss.tchncs.de on 07 May 01:43 collapse

Microsoft’s cloud gaming is already profitable. Also, they got their ass kicked so badly against the PS5 that there’s no profitable avenue in developing and trying to sell a future console. They’re better off concentrating on PC games and cloud gaming. Sony can’t really compete against them in that market, just like Microsoft is unlikely to find it worthwhile to compete against Sony in consoles.

Robust_Mirror@aussie.zone on 06 May 07:12 collapse

The internet isn’t good enough globally to do that, and still won’t be by 2030 after the ps6/nextbox is out. Maybe the gen after next. But even then, there’s a lot of countries I could see still being patchy. Right now in Australia, Sony won’t even let you access the PS3 streaming games because they know it won’t work well enough.

ColeSloth@discuss.tchncs.de on 07 May 01:48 collapse

You overestimate how much Microsoft would care about people with bad internet. They’ll opt for smaller numbers paying a subscription per month/year on cheap hardware. No point in losing money against Sony directly.

Skyline969@lemmy.ca on 04 May 16:27 next collapse

I mean, for the price of a mid range graphics card I can still buy a whole console. GPU prices are ridiculous. Never mind everything else on top of that.

turbowafflz@lemmy.world on 04 May 16:33 next collapse

Yeah, but remember to factor in that you probably already need a normal computer for non-gaming purposes, so if you also use that for games you only have to buy one device, not two.

Fondots@lemmy.world on 04 May 17:13 collapse

I just built a PC after not having a computer for about 5+ years.

Built it for games, did not feel like I was missing out on anything in particular except games by not having a computer. There’s a lot of things I’d rather use a computer for but these days most of what I used to do on a computer can be done just fine from a phone or tablet.

During those 5 or so years, I maybe needed to use a computer about a dozen times, and if my wife didn’t have a computer I could have just swung by a library for a bit to take care of it.

taladar@sh.itjust.works on 04 May 17:15 collapse

To me, tablets feel like the most useless devices ever invented. Too large to carry around with you, but just as stupidly limited as a phone compared to a real computer, where you can actually automate some of your tasks, type on a decent keyboard, and have a decent-sized screen that doesn’t ruin your wrists with the weight of holding it up.

Diplomjodler3@lemmy.world on 04 May 18:11 next collapse

You can build a pretty capable PC for about $600. And you won’t have to pay for multiplayer.

Grangle1@lemm.ee on 04 May 18:15 next collapse

“Pretty capable” will get you dunked on in the PC gaming world. For what I’ve seen PC gamers actually recommend, I could buy 2-3 modern consoles.

Diplomjodler3@lemmy.world on 04 May 18:58 collapse

That’s just nonsense. Maybe some 13 year olds with rich parents think like that.

Skyline969@lemmy.ca on 05 May 01:06 next collapse

Along with paying for multiplayer I get access to a large catalog of games as well as additional games every month. Yes they’re inaccessible if I stop paying, but that’s not really a big deal. Even all that aside, I pretty much play single player games anyway.

Also, when a game comes out I know it’ll work. No driver bugs, no messing with settings, no checking minimum and recommended specs, it just works. And it works the same for everyone on the platform. I don’t have any desire to spend a bunch of time tweaking settings to get things just right, only to have the game crash for some esoteric reason or another.

ano_ba_to@sopuli.xyz on 05 May 23:50 collapse

If I’m building a PC for gaming, I wouldn’t limit myself to $600. Would you? I’ve never not had PCs or laptops since I first got one in the 90s. I’m building again now to go Linux. A 7800 XT and a 2 TB SSD cost as much as a PS5 Pro in my part of the world. I only started getting into consoles because I can afford it now, and for physical games. I don’t really get why today it’s PC vs. consoles. I was into PCs but never judged consoles as inferior, just different.

sp3ctr4l@lemmy.dbzer0.com on 05 May 00:14 next collapse

GPU prices are ridiculous, but those GPUs are also ridiculously more powerful than anything in any console.

The rough equivalent of a PS5 Pro’s GPU is… not current gen, not last gen, but the gen before that: take AMD’s weakest GPU model in the RX 6000 series, the RX 6600, and that is roughly the same performance as the GPU portion of a PS5 Pro.

The Switch 2 may have an interesting custom mobile-grade Nvidia APU, but at this point it’s not out yet, so no benchmarks, etc.

Oh right, also: if GPU prices for PCs remain elevated… well, any future consoles will also have elevated prices. Perhaps not to the same degree, but that will be because a console is basically fairly low-tier compared to the range of PC hardware… and console manufacturers can subsidize console costs with game sales… and they get discounts on the components that go into their consoles by ordering in huge bulk volumes.

sugar_in_your_tea@sh.itjust.works on 05 May 03:06 collapse

Yeah, GPU prices are kinda ridiculous, but a 7600 is probably good enough to match console quality (essentially the same as the 6650 XT, so get whatever is cheaper), and I see those going for $330. It should be more like $250, so maybe you can find it closer to that amount when there’s a sale. Add $500-600 for mobo, CPU, PSU, RAM, storage, and a crappy case, and you have a decent gaming rig. Maybe I’m short by $100 or so, but that should be somewhere in the ballpark.

So $900-1000 for a PC. That’s about double a console, plus extra if you need a keyboard, monitor, etc. Let’s say that’s another $500. So now we’re at 3x a console.

Entry cost is certainly higher, so what do you get in return?

  • deeper catalogue
  • large discounts on older games (anything older than a year or so)
  • emulation and other PC tasks
  • can upgrade piecemeal - next console gen, just need a new CPU + GPU, and if you go AMD, you can probably skip a gen on your mobo + RAM
  • can repurpose old PC once you rebuild it (my old PC is my NAS)
  • generally no need to pay a sub for multiplayer

Depending on how many and what types of games you play, it may or may not be cheaper. I play a ton of indies and rarely play AAA new releases, so a console would be a lot more expensive for me. I also have hundreds of games, and probably play 40 or so in a given year (last year was 50 IIRC). If I save just $10 per game, it would be the same price as a console after 2 years, but I save far more since I wait for sales. Also, I’ll have a PC anyway, so technically I should only count the extra stuff I buy for playing games, as in my GPU.
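(A back-of-envelope version of that break-even math, using my rough numbers above as assumptions:)

```python
# Break-even sketch; every number here is a rough assumption from above.
pc_premium = 800       # extra up-front cost of a PC + peripherals vs a console, in $
saving_per_game = 10   # conservative per-game discount on PC
games_per_year = 40    # roughly how many games I play in a year

years = pc_premium / (saving_per_game * games_per_year)
print(f"break even after {years:.1f} years")  # 2.0 years at these numbers
```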

Skyline969@lemmy.ca on 05 May 06:04 collapse

You do make some decent points, but the console has one major aspect that PC simply does not have: convenience. I install a game and I’m playing it. No settings to tweak, no need to make sure my drivers are up to date, no need to make sure other programs I’m running aren’t interfering with the game, none of that. If I get a game for my console I know it absolutely will work, with the exception of a game that’s simply shitty, which happens on PC too.

The other thing I wanted to touch on was the cheap games. That’s just as relevant on console nowadays. For example, I’ve been slowly buying the Yakuza games for $10-$15 each. That’s the exact same discounts I’ve seen on Steam.

For backwards compatibility, it depends on your console. Xbox is quite impressive - if you have an Xbox Series X you can play any game ever released for any Xbox all the way back to the original. Just stick in the disc. With PlayStation, it’s just PS4 games that the PS5 is backwards compatible with. Sony needs to do better. And with Nintendo… lol.

Yeah, with a PC you can do other things than gaming. For most of that you can get a cheap laptop. There are definitely edge cases where a powerful PC is needed such as development, CAD, AI, etc. But on average a gaming-spec PC is not necessary. I’m saying that as a developer and systems administrator for the past 14 years.

sugar_in_your_tea@sh.itjust.works on 05 May 13:15 collapse

No settings to tweak, no need to make sure my drivers are up to date, no need to make sure other programs I’m running are interfering with the game, none of that.

I also do almost none of that on my PC. I do install updates, but that’s pretty much in the background. Then again, I use Linux, so maybe it’s different on Windows these days? I doubt it.

Most people tweak settings and whatnot because they want to, not because they need to in order to get a decent experience. I use my PC and Steam Deck largely as a console: install games and then play. That’s it.

I’ve been slowly buying the Yakuza games for $10-$15 each

Steam isn’t the only store for buying games on PC, so the chance that you can buy a given game on sale on a given day is quite a bit higher vs console, where there’s only one store. I’ve picked games up on Steam, Fanatical, or Humble Bundle, and there are several others if you’re interested in looking.

For example, here’s Yakuza 0’s price history on PC: it has been $10 somewhere for almost a year. On PlayStation, it looks like it’s been $20 most of the year. I actually got it for a little under $5 about 5 years ago, and I paid more than $10 for only one Yakuza game (most were $7-8).

Tons of games show up in bundles as well. I have picked up tons of games for $2-5 each (sometimes less) as part of a bundle, and that’s just not really a thing on consoles.

if you have an Xbox Series X you can play any game ever released for any Xbox all the way back to the original

Interesting, that’s pretty cool!

gaming-spec PC

Honestly, the difference between a “gaming spec” PC and one targeting only typical tasks is pretty minimal outside the GPU, assuming you’re targeting console quality. You really don’t need a high end CPU, RAM, or mobo to play games, you can match CPU perf w/ something mid-range, so $150-ish for the CPU. Likewise for the GPU, you can get comparable quality for something in the $300-400 range, probably less now since the PS5 and XBox Series consoles are kind of old.

But that’s assuming you need console quality. You can get by with something a bit older if you turn the settings down a bit.

If you want to save cash, you have a lot more options on PC vs consoles. If you want to go all out, you have a lot more options on PC vs consoles for maxing out performance. PC gaming is as expensive as you make it. I used the same PC for playing games for something like 10 years before getting an upgrade (upgraded the GPU once), because it played all the games I wanted it to. If I have a console, chances are the newer games will stop supporting my older console a year or so after the new one launches, so I don’t have any options once the console goes out of support outside of buying a new one.

That said, there are a ton of caveats:

  • don’t buy laptops for gaming, they are way too expensive and can’t really be upgraded (Framework exists, sure, and I think eGPUs still do, but that’s going to be expensive)
  • don’t buy a pre-built PC if you want to save money - if you DIY your PC, you can save a bit of cash, but more importantly, you’re more likely to upgrade it vs replace it later on
  • you can spend a ton on PC gaming, if you follow whatever the influencer trends are (everyone needs a top-end GPU for $2k or whatever, plus a monitor above 200 Hz)
  • consoles have a much better couch co-op experience

I have a Switch for the couch co-op experience, as well as ease of use for my kids to just put in a game and play, and a PC for most of my personal gaming time (I also have a Steam Deck so I can play in bed or on vacation). I have something like 20 Switch games and hundreds of PC games.

pewgar_seemsimandroid@lemmy.blahaj.zone on 04 May 21:00 collapse

they can be portable computers built for gaming

kalipixel@reddthat.com on 04 May 16:56 next collapse

Consoles, unless you root or jailbreak them, are too restrictive anyway. For older games you can just use an emulator on your PC or mobile.

Auntievenim@lemmy.world on 04 May 17:31 next collapse

Is it Moore’s law failing, or have we finally reached the point where capitalists are not even pretending to advance technology in order to charge higher prices? Like, are we actually not able to make things faster and cheaper anymore, or is the market controlled by a monopoly that sees no benefit in significantly improving its products? My opinion has been leaning more and more towards the latter since the pandemic.

SaltySalamander@fedia.io on 04 May 18:10 next collapse

This has little to do with "capitalists" and everything to do with the fact that we've basically reached the limit of silicon.

sunzu2@thebrainbin.org on 04 May 19:04 next collapse

Fine we reached the limit... But why the price gouging lol

MrVilliam@lemm.ee on 04 May 21:46 collapse

Because people continue to accept that price by agreeing to pay it. The price of a product is dictated by what people are willing to pay for it. If the price is so low that the seller isn’t happy with it, they don’t sell it and stop making it.

In other words, if you think Nintendo prices are bullshit price gouging, then vote with your wallet. With enough votes, the prices come down or the company goes under. You don’t have that luxury of choice when it comes to groceries or shelter, but you absolutely do when it comes to luxury entertainment expenses. Make them earn your money.

TwinTitans@lemmy.world on 05 May 01:14 collapse

I wish people would apply this to many other industries as well. A company will rip people off the first chance that they get.

FreedomAdvocate@lemmy.net.au on 05 May 01:35 collapse

What do you classify as “ripping people off” when it comes to pricing?

MrVilliam@lemm.ee on 05 May 04:02 collapse

Not OP, but probably price gouging? Especially regarding things where you aren’t afforded the reasonable opportunity to make an informed decision (healthcare, baby formula plus necessary clean water). Also maybe regional monopolies (internet service) or pretty much anything involving an event or venue (ticket pricing or cost of a slice of pizza or a can of beer at a festival).

In all of these examples, you likely don’t have a heads-up or the chance to choose something else. Admittedly, most of the examples off the top of my head were unnecessary luxury spending, but how in the blue fuck is it okay that any of them are literally a situation of “pay me whatever price I decide or else a person will die”?

Pretty fucked up if you ask me.

FreedomAdvocate@lemmy.net.au on 05 May 17:45 collapse

I agree with your examples, and my issue is when people call pricing a game console at $450, or a game at $80 “price gouging”.

It’s not, in any way.

sunzu2@thebrainbin.org on 06 May 17:22 collapse

Up there I was talking about silicon specifically. But on this topic...

Sure, this is 100% discretionary spend and I am deff not buying, but I am also a Linux user and will use emulators for my kids, because fuck Nintendo and their parasitic business practices

But you have to see how a less sophisticated consumer is being price gouged? We are talking about games for kids at adult man with a job prices.

Or is u "efficient markets" typa a guy?

I agree with your fundamental premise but behaviour is scammy IMHO

FreedomAdvocate@lemmy.net.au on 06 May 18:34 collapse

No one is being price gouged by Nintendo. It’s a luxury technology device. Gaming is more for adults than kids these days, and has been for a long time. The average age of gamers has been increasing for decades and is around 30 years old.

sunzu2@thebrainbin.org on 06 May 18:39 collapse

So we should charge this idiots adult working man prices 🤡

Good job bootlicking, champ

FreedomAdvocate@lemmy.net.au on 06 May 18:46 collapse

Once more but in English please?

sunzu2@thebrainbin.org on 06 May 21:28 collapse

Sure but why is u mad tho

FreedomAdvocate@lemmy.net.au on 06 May 22:22 collapse

You’re the one that seems mad. I understand that not being able to afford something doesn’t mean that there is “price gouging” going on. I understand that I’m not entitled to everything I want being affordable to me with pocket change.

sunzu2@thebrainbin.org on 06 May 22:35 collapse

Well good thing there is piracy to check these bootlicker attitudes at least

FreedomAdvocate@lemmy.net.au on 06 May 22:53 collapse

Bootlicker, another term so misused by the left that it now means nothing.

I don’t think the Switch 2 emulation scene is going to go anywhere after what Nintendo did to the Yuzu devs. Not to mention that any emulator would now have to emulate CUDA cores, which have never been in a console before.

Just save your pocket money for a few years, maybe ask your parents what other little jobs you can do to earn a bit extra, and you’ll get a switch 2 eventually.

sunzu2@thebrainbin.org on 06 May 23:02 collapse

"The left" 🤡

FreedomAdvocate@lemmy.net.au on 07 May 01:52 collapse

Generally the people that throw around the term “bootlicker” in situations like this, who feel entitled to others’ work for free or at prices they choose, are leftists.

sunzu2@thebrainbin.org on 07 May 04:49 collapse

I think you are suffering from a bias issue here tbh

I am not sure how your political opinions tie to this but sure have fun with it ;)

FreedomAdvocate@lemmy.net.au on 07 May 08:15 collapse

OK, you’re not a lefty, you just use all their favourite terms, make all the same arguments for why the things you want to buy should be cheaper, and jump straight to the same old insults as quickly as leftists do.

If it quacks like a duck and all that…

Diplomjodler3@lemmy.world on 04 May 19:09 next collapse

While blaming anything and everything on “capitalism” is disingenuous, it really does have to do with a lack of competition in the space. None of the incumbents have any incentive to really put much effort into improving the performance of gaming GPUs. PC CPUs face a similar issue. They’re good enough for the vast majority of users. There is no sizable market that would justify spending huge amounts of money on developing new products. High end gaming PCs and media production workstations are niche products. The real money is made in data centre products.

Lesrid@lemm.ee on 05 May 16:43 collapse

I mean, when the definition of economy can be “how the species produces what it needs”, then the answer to a problem is probably capitalism, even if that answer explains very little.

sunzu2@thebrainbin.org on 06 May 17:24 collapse

Capitalism is an oligarchy owning the means of production.

I think what you are aiming for specifically here is a free market, i.e. a demand-driven economy.

People conflate the two regularly... But we don't need capitalism, we need a free market.

But what we got is oligarchy without a free market

nuko147@lemm.ee on 05 May 02:09 collapse

I don’t agree. It is capitalism, but not in a bad way. Simply put, it is economic logic. The chip market has shifted from the consumer market to the enterprise market.

So because the supply is limited, the demand has gone way up, and the enterprise market has a lot, I mean a lot, of money to spare, because for them it is an investment and not entertainment.

There are also some bad capitalist tactics in other areas, hard drives for example, where the big players reduced production to keep prices from falling. They contribute to the problem, but they are not the major factor.

Coyote_sly@lemmy.world on 04 May 21:10 next collapse

¿Por qué no los dos? (Why not both?)

ICastFist@programming.dev on 04 May 21:42 collapse

Moore’s law started faltering in the mid-2000s, when single-core clock speeds peaked, leading to multi-core processors ever since. Memory and storage still had a ways to go. Now, the current leading-edge processes (5nm and below) are very close to the limits imposed by the laws of physics, both in how finely lithography can pattern features and in how small a controlled chemical reaction can be. Unless someone can figure out a way to do the whole chip fabrication process in fewer steps, or with higher yield, or with cheaper machines or materials, even at 50nm or larger, don’t expect prices to drop.

Granted, if TSMC stopped working in Taiwan, we’d be looking at roughly 70% of all production going poof, so that can be considered a monopoly (it is also their main defense against China, the “Silicon Shield”, so there’s more than just capitalistic greed at play for them)

www.youtube.com/watch?v=po-nlRUQkbI - How are Microchips Made? 🖥️🛠️ CPU Manufacturing Process Steps | Branch Education

Auntievenim@lemmy.world on 05 May 05:07 collapse

Very interesting! I was aware of the 5nm advancements and of chip features approaching the physical limits of the material, but I had been assuming that, since we worked around the single-core issue, a similar innovation would appear for this bottleneck. It seems like the focus instead turned towards integrating AI into the GPU architecture and cranking up the power consumption for marginal gains in performance, rather than working towards a paradigm shift. Thanks for the in-depth explanation though, I always appreciate an opportunity to learn more about this type of stuff!

ABetterTomorrow@lemm.ee on 04 May 22:09 next collapse

Wtf, that headline is fucking backwards thinking and capitalistic. If you’re greedy and demand unnecessarily high specs that don’t actually make a game better, you’re the problem. Sorry not sorry, but gamer demand and the companies are at fault here.

heyWhatsay@slrpnk.net on 04 May 22:23 next collapse

This article doesn’t factor in the new demand that is gobbling up all the CPU and GPU production: AI server farms. For example, Nvidia, which was once only making graphics cards for gamers, has been trying to keep up with global demand for AI. The whole market is different; then toss tariffs and the rest on top.

I wouldn’t blame the death of Moore’s law; technology is still advancing, but, per usual, based on demand.

FreedomAdvocate@lemmy.net.au on 05 May 01:29 next collapse

AI has nothing to do with it. Die shrinks were the reason for “slim” consoles and big price drops in the past. Die shrinks are basically a thing of the past now.

sugar_in_your_tea@sh.itjust.works on 05 May 02:42 collapse

Not exactly, but smaller nodes are getting really expensive. So they could make a “slim” version with a lower power unit, but it would likely cost more than the original.

nlgranger@lemmy.world on 05 May 08:01 collapse

technology is still advancing

Actually, not really: performance per watt of the high-end stuff has been stagnating since the Ampere generation. Nvidia hides it by changing models in its benchmarks or advertising raw performance without power figures.

heyWhatsay@slrpnk.net on 05 May 20:28 collapse

Idk, seems like Germany is making progress.

FreedomAdvocate@lemmy.net.au on 05 May 01:28 next collapse

It’s not that they’re not improving like they used to, it’s that the die can’t shrink any more.

Price cuts and “slim” models used to be possible due to die shrinks. A console might have released on 100nm, and then a process improvement comes along that means it can be made on 50nm, meaning roughly four times as many chips per wafer (die area scales with the square of the linear shrink) and far lower power usage and heat generation. This allowed smaller and cheaper revisions.
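(A rough back-of-envelope sketch of that wafer math; the die sizes are invented, and yield and edge losses are ignored:)

```python
# Rough sketch: halving the linear feature size quarters the die area,
# so roughly 4x as many dies fit on one wafer. Die sizes are invented.
import math

WAFER_DIAMETER_MM = 300  # standard wafer size

def dies_per_wafer(die_area_mm2: float) -> int:
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)  # ignores edge and yield losses

old_die = 300.0                 # hypothetical die at the "100nm" node, in mm^2
new_die = old_die * (0.5 ** 2)  # 2x linear shrink -> 4x smaller area

print(dies_per_wafer(old_die))  # ~235 dies per wafer
print(dies_per_wafer(new_die))  # ~942 dies per wafer, about 4x as many
```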

Now that the current ones are already on like 4nm, there’s just nowhere to shrink to.

toastmeister@lemmy.ca on 05 May 04:04 next collapse

Which is itself a bit of a gimmick: they’ve largely just made the gates taller (FinFETs, and now gate-all-around), since electron leakage would happen otherwise.

dai@lemmy.world on 05 May 09:00 collapse

“nm” has been a marketing gimmick since Intel’s long-standing 14nm node. Actual transistor density is all over the place depending on which fab you compare.

It’s now just the name of a process, not a measure of how small the transistors actually are.

I’ve not paid for a CPU upgrade since 2020, and before that I was using a 22nm CPU from 2014. The market isn’t exciting to me anymore, and I don’t even want to talk about the GPUs.

Back in the late 90s and early 2000s, upgrades felt substantial and exciting; now it’s all same-same with some minor power-efficiency gains.

lka1988@lemmy.dbzer0.com on 05 May 14:52 next collapse

This is why I’m more than happy with my 5800X3D/7900XTX; I know they’ll perform like a dream for years to come. The games I play run beautifully on this hardware under Linux (BeamNG.Drive runs faster than on Windows 10), and I have no interest in upgrading the hardware any time soon.

Hell, the 4790k/750Ti system I built back in 2015 was still a beast in 2021, and if my ex hadn’t gotten it in the divorce (I built it specifically for her, so I didn’t lose any sleep over it), a 1080Ti upgrade would have made it a solid machine for 2025. But here we are - my PC now was a post-divorce gift for myself. Worth every penny. PC and divorce.

FreedomAdvocate@lemmy.net.au on 05 May 17:43 collapse

There’s no world in which a 750Ti or even 1080Ti is a “solid machine” for gaming in 2025 lol.

lka1988@lemmy.dbzer0.com on 05 May 17:48 next collapse

For what I do? It would be perfectly fine. Maybe not for AAA games, but for regular shit at ~40fps and 1080p, it would be perfectly fine.

Gotta remember that some of us are reaching 40 years old, with kids, and don’t really give a shit about maxing out the 1% lows.

FreedomAdvocate@lemmy.net.au on 06 May 02:28 collapse

but for regular shit at ~40fps and 1080p, it would be perfectly fine.

That’s not “perfectly fine” to most people, especially PC players.

Gotta remember that some of us are reaching 40 years old, with kids, and don’t really give a shit about maxing out the 1% lows.

Already there myself. I don’t care about maxing out the 1% lows, but I care about reaching a minimum of 60fps average at the bare minimum, preferably closer to 100 - and definitely higher than 1080p. Us oldies need more p’s than that with our bad eyesight haha

ZC3rr0r@lemmy.ca on 05 May 17:49 collapse

Depends on your expectations. If you play mainly eSports titles at 1080p, it would probably still have been quite sufficient.

But I agree it’s a stretch as an all-rounder system in 2025. My 3090 is already showing signs of its age, and a card that’s two generations older would certainly be struggling today.

FreedomAdvocate@lemmy.net.au on 05 May 17:42 collapse

Now, maybe, but like I said: in the past this WAS what let consoles get big price cuts and size revisions. We’re not talking about since 2020, we’re talking about things like the PS1 -> PSOne and the PS2 -> PS2 Slim.

Buddahriffic@lemmy.world on 05 May 16:58 next collapse

Not to mention that even when some components do shrink, it’s not uniform for all components on the chip, so they can’t just do 1:1 layout shrinks like in the past, but pretty much need to start the physical design portion all over with a new layout and timings (which then cascade out into many other required changes).

Porting to a new process node (even at the same foundry company) isn’t quite as much work as a new project, but it’s close.

Same thing applies to changing to a new foundry company, for all of those wondering why chip designers don’t just switch some production from TSMC to Samsung or Intel since TSMC’s production is sold out. It’s almost as much work as just making a new chip, plus performance and efficiency would be very different depending in where the chip was made.

SirEDCaLot@lemmy.today on 06 May 03:36 next collapse

This is absolutely right. We are getting to the point where circuit pathways are hundreds or even dozens of atoms wide. The fact that we can even make circuits that small in quantity is fucking amazing. But we are rapidly approaching laws-of-physics-type limits on how much smaller we can go.

Plus let’s not forget an awful lot of the super high-end production is being gobbled up by AI training farms and GPU clusters. Companies that will buy 10,000 chips at a time are absolutely the preferred customers.

MDCCCLV@lemmy.ca on 06 May 06:05 collapse

Did you read the article? That’s exactly what it said.

theotherbelow@lemmynsfw.com on 05 May 02:09 next collapse

No, it turns out that lying to the consumer about old tech is profitable.

doodledup@lemmy.world on 05 May 11:57 collapse

Hatebait. Adds nothing informative to the thread.

Jakeroxs@sh.itjust.works on 05 May 17:48 next collapse

Ironic that the image is of a Switch, as if Nintendo has been on the cutting edge at all in the last 20+ years.

Shanmugha@lemmy.world on 05 May 18:06 next collapse

So now we can finally go back to good old code optimization, right? Right? (Padme.jpg)

lagoon8622@sh.itjust.works on 05 May 19:34 collapse

We’ll ask AI to make it performant, and when it breaks, we’ll just go back to the old version. No way in hell we are paying someone

Shanmugha@lemmy.world on 05 May 19:41 collapse

Damn. I hate how it hurts to know that’s what will happen

VerticaGG@lemmy.blahaj.zone on 05 May 23:12 next collapse

Game graphics and design peaked in 2008. The N64 was more optimized than anything that came after. I’m so over current gen, and last gen, and the gen before that too. Let it all burn. :)

Edit: Furthermore,

…blahaj.zone/…/222c26df-19d9-4fce-9ce3-2f3dcffefc…

Talonflame@lemmy.cafe on 05 May 23:20 collapse

Was about to say this too. Can’t tell a difference between most games made in 2013 vs 2023.

Amir@lemmy.ml on 06 May 00:02 collapse

Battlefield 1 still beats 99% of games releasing now

Guidy@lemmy.world on 06 May 04:38 next collapse

That’s why I play using a PC and not a console. Though PC components have also been overpriced for years.

umbrella@lemmy.ml on 06 May 06:21 next collapse

Is it just me, or is this title weird?

orcrist@lemm.ee on 06 May 08:30 collapse

It’s not just you. The title gets causation totally wrong. If people made bad assumptions about how technology would change in the future, it’s their assumptions that are the problem, not reality.

NigelFrobisher@aussie.zone on 06 May 08:17 next collapse

Also they’re not going to play Silksong any better than a ten year old console.

_core@sh.itjust.works on 06 May 11:43 collapse

Man, they are going to ride the pandemic as a cause for high prices until it’s a skeleton just skidding on the ground. It’s been four years since the pandemic supply issues; pretty sure those are over now. Unless they mean the price gouging that happened then, which hasn’t gone down.