You’d think since 2005 they’d be experienced enough to know how to work with UE but noooo…
ICastFist@programming.dev
on 16 Sep 00:19
To be fair, it seems only Epic themselves know how to fuck around with UE5 nowadays. That Nanite lighting, temporal antialiasing, and dithering makes everything look like ass. Games from 2016 look better
the_riviera_kid@lemmy.world
on 15 Sep 17:38
Randy Bitchford cries about a problem he created. More at 11.
sp3ctr4l@lemmy.dbzer0.com
on 15 Sep 22:47
I can’t believe I’ve somehow never seen ‘Randy Bitchford’, but that’s his new name now. Christ, what a fucking diva.
the_riviera_kid@lemmy.world
on 15 Sep 23:16
I’ve been calling him that since he got all whiny about the Duke Nukem Forever flop. Glad that there are now 2 of us.
sigmaklimgrindset@sopuli.xyz
on 16 Sep 03:59
Make that three, I also call him that whenever he comes up.
scrubbles@poptalk.scrubbles.tech
on 15 Sep 18:11
I’ve read reports that people can’t get more than 30fps on low settings on 4000 series cards. I’m definitely not one to expect sweet 120fps on ultra on launch day, but a 4000 card not even getting low settings? They failed. Hard.
sp3ctr4l@lemmy.dbzer0.com
on 15 Sep 22:49
Hey guys, you know what we really need for an IP that is famous for a rather simplistic cel-shaded art style?
The most complicated realtime lighting and rendering engine in the world.
… I could smack him with a fucking book, holy shit this is so stupid.
scrubbles@poptalk.scrubbles.tech
on 15 Sep 23:16
Don’t you know that gamers demand hyper realistic graphics? Even when there is a defined art style it doesn’t matter. Realism.
It’s famously why comic books failed to captivate anyone, and cartoons are never successful.
ICastFist@programming.dev
on 16 Sep 00:20
2D pixel games without realtime raytracing and perfect light simulations never ever sold
I can’t believe no one bought Silksong and Cuphead because they’re 2D and not hyperrealistic. Sorry Team Cherry and MDHR, you guys failed.
NuXCOM_90Percent@lemmy.zip
on 16 Sep 13:59
I mean, Cuphead is one of the more visually impressive games… ever. It has an art style and it works it to the fullest extent. And Silksong is also quite gorgeous.
This comes up every other year or so, it seems. People think 2D means “low effort” and get angry that so many fighting games moved on to 3D. But the reality is that actually making sprites is a VERY labor-intensive process that often requires a deep understanding of the entire rendering pipeline (including the hardware it is displayed on). At this point, “most” online people are aware that many of the NES/SNES sprites were specifically made with CRT “blurring” in mind, but it goes way beyond that. That is why franchises like Street Fighter just have “generic” 3D models.
And ArcSys more or less built their entire look around (wait for it) simple-ish 3D models with cel shading and very specific lighting systems, to appear 2D even though they aren’t. Which is why stuff like the super attacks always look so impressive as you do the zoom and spin around.
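For the curious, the core of that cel-shaded look is just quantizing the lighting term into a few flat bands instead of a smooth ramp. A toy sketch in Python (in an engine this runs per-pixel in a fragment shader; the function and band count here are illustrative, not ArcSys’s actual pipeline):

```python
import math

def cel_shade(n_dot_l: float, bands: int = 3) -> float:
    """Quantize a diffuse lighting term (N·L) into flat bands.

    In a real engine this lives in a shader; plain Python here
    just to show the idea.
    """
    clamped = max(0.0, min(1.0, n_dot_l))
    # Snap the continuous value to a band, producing the hard
    # "toon" transitions instead of a smooth gradient.
    band = min(math.floor(clamped * bands), bands - 1)
    return band / (bands - 1)

# A smooth 0..1 lighting ramp collapses into just three flat tones:
print([cel_shade(x / 10) for x in range(11)])
```

Layer a hard rim/outline pass on top of that and you get most of the “anime” look.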
I can’t speak for blands 4, since 3 (and the pre-sequel…) were so aggressively obnoxious that I just replay 2 every 3 or 4 years. But just look at this thread, where you have weirdos saying that UE5 games look like they are from 2016. People are deeply stupid and games need to pop and sizzle to stand out. And while I don’t know (or care) if blands 4 succeeded… just look at ArcSys for how you can use modern engines to make cel-shaded 3D models look AMAZING.
It is very specifically for actual 2D games, but check out some of Cobra Code’s videos on YouTube. They have put a LOT of work into how to use UE5 to make sprite-based 2D games look GOOD and it is actually fascinating. And that is still sidestepping the initial sprite work for the most part.
And as another example of why sprites are actually a ridiculous amount of work to get right: Mina the Hollower (?) is the latest game from the Shovel Knight devs. And… most of us who tried the demo on a high-resolution display felt REALLY weird because of the way the sprites and animations looked upscaled (I saw a breakdown of why. I did not understand it). The devs are putting in the work to fix that ahead of launch, but it really speaks to the kinds of problems that come up when you go from testing on a Steam Deck or in a debug window to stretched across a 1440p display at 120-ish Hz.
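I don’t know the breakdown they mention, but a common cause of pixel art looking wrong when upscaled is non-integer scale factors: with nearest-neighbor scaling, source pixels end up covering uneven numbers of screen pixels. A toy illustration (purely a guess at the kind of issue involved):

```python
def pixel_widths(src_width: int, scale: float) -> list[int]:
    """How many screen pixels each source pixel covers after
    nearest-neighbor scaling by `scale`."""
    # Where each source-pixel boundary lands on the screen grid.
    edges = [round(i * scale) for i in range(src_width + 1)]
    return [edges[i + 1] - edges[i] for i in range(src_width)]

print(pixel_widths(8, 3.0))  # integer scale: every pixel uniformly 3 wide
print(pixel_widths(8, 2.5))  # fractional scale: an uneven mix of 2- and 3-wide pixels
```

That unevenness is invisible on a Steam Deck at native-ish scale but very visible stretched across a 1440p panel.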
shadowedcross@sh.itjust.works
on 16 Sep 11:17
On 3440x1440 with a 4080 I get around 50-60 FPS on very high, with DLSS Quality.
scrubbles@poptalk.scrubbles.tech
on 16 Sep 16:52
They’ve released some updates; I wouldn’t be surprised if it’s gotten better
shadowedcross@sh.itjust.works
on 17 Sep 11:09
I’ve been playing since the 12th, it hasn’t gotten better since then and my friend who got it day one hasn’t said anything about updates improving the performance.
I’m definitely not one to expect sweet 120fps on ultra on launch day
I fucking am. I didn’t pay $1700 for a graphics card to have potato graphics. I’m glad there’s been plenty of bad press on this, not that I would have bought it this close to launch anyway.
scrubbles@poptalk.scrubbles.tech
on 16 Sep 18:50
No, I disagree with that. It’s always been perfectly normal to have ultra graphics a bit out of reach so that the game will look great on future graphics cards, and 120fps is a ridiculously high number that should never be expected even with top of the line graphics cards for a brand new release. (Assuming 2 or 4k)
However, a one-generation-old graphics card should be able to easily play most things on High settings at a decent framerate (aiming for 60) at 4k, which Borderlands failed at horribly. Medium to low settings on a 4000 series card sounds like a gut punch to me.
NuXCOM_90Percent@lemmy.zip
on 15 Sep 18:36
Obligatory dril tweet about how you never “Gotta hand it to them”
But yeah. It is infuriating how often people just spew nonsense about “it is Unreal so it runs like shit”, just like it was when “it is Unity so it runs like shit” and so forth. I very much do not think people need to “code (their) own engine” to understand things but… it would be a good idea to do some basic research on the topic.
I can’t speak for blands 4 specifically. But with a lot of these games? People don’t realize that the game is very much being optimized with upscaling and even framegen in mind. You can say you hate that if you want to. But these studios aren’t targeting 120 FPS at 4k. They are targeting 120 FPS at 2k or even 60 FPS at 2k. In large part because… there still isn’t a lot of value in actually targeting 4k on any console or even most PCs. So it gives them more or less a “single” target. And some techniques don’t scale anywhere near as well (also 2k to 4k is not linear).
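Since the “2k to 4k is not linear” point trips people up, the raw pixel counts make it concrete (assuming “2k” here means 2560x1440, which is itself a loose usage):

```python
# Pixel counts behind the "2k to 4k is not linear" point.
# Assumes "2k" means 2560x1440 (QHD).
qhd = 2560 * 1440  # 3,686,400 pixels
uhd = 3840 * 2160  # 8,294,400 pixels

# Going from 1440p to 4K means shading 2.25x as many pixels per
# frame, even though "2k -> 4k" sounds like one small step up.
print(uhd / qhd)  # 2.25
```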
It is just that people grew up in the golden age where everything was targeting a PS4 and kind of has been for almost a decade at this point. So the idea of futzing with your graphics settings and checking the different AA (now upscaling) models is kind of anathema to a lot of users. And consoles STILL try to hide all of that.
SalamenceFury@lemmy.world
on 15 Sep 19:19
If I buy a 1000 dollar graphics card I’m expecting it to run any game at max settings with NO upscaling. Because that was the norm for over 35 years.
Upscaling is supposed to be used to EXTEND the life of lower end graphics cards, not be the new fucking normal.
NuXCOM_90Percent@lemmy.zip
on 15 Sep 19:53
You can expect whatever you want. That isn’t what is being sold. That’s why so much of the Nvidia bullshit insists that reviewers and partners report upscaling and framegen as the “normal” results (and why outlets like Gamers Nexus are on their shitlist).
And the “norm for over 35 years” was that games targeted a specific range of hardware “power”. That is why so much work was put into engines to support higher-res models in first-person view that would only be one per scene, and lower-res models for the other characters. And scaling detail based on distance and so forth. But it also meant that we didn’t have the ridiculously high-res hero models that might only exist for cutscenes, because no computer of the era would be able to run that performantly in real time.
Personally? I would LOVE if more effort were put in for scalability (which Unreal is ironically really good at because it targets film). But that gets back into what the target audience is and if it is worth putting the effort in for “Mega Ultra Super Epic” settings when nobody is going to use them. And… considering that one of the biggest issues is people refuse to put more research in than “I set it to High”… yeah.
But also… that isn’t NOT what we have now. We have the games that are designed to look good at 4k asset wise. And, in a couple years (maybe longer, maybe never) we will have the oomph to actually run those natively at 4k/120 and it will be BEAUTIFUL (and have less weird fur textures appearing out of nowhere or weird cape physics when the model barfs). Which isn’t TOO different than building a new PC, playing Dwarf Fortress because it is funny, and then booting up a Total War and losing your mind for a bit.
SalamenceFury@lemmy.world
on 16 Sep 03:39
That isn’t what is being sold
That’s what was being sold for over three decades. I give 1000 dollars worth of graphics card, I can run anything at max settings with no compromises. Nvidia and videogame devs enshittified this shit so much that you require stuff that should be reserved for LOW END graphics cards to run the game properly. Fuck that.
NuXCOM_90Percent@lemmy.zip
on 16 Sep 16:12
Hey buddy. I got a 2070 that I’ll give you a real bargain on. Only 900 USD plus s&h. It’ll play Crysis at max settings with absolutely no compromises. Hit me up.
But, in all seriousness: This is why (minimally biased) reviews are a thing. Outlets like Gamers Nexus or Hardware Unboxed do a spectacular job of actually benchmarking against common games of the current era (and a few oldies) and comparing them against a large database of other cards. Don’t just assume that you can give Jensen a grand and get what you want. Make informed decisions.
Also I am not actually sure if 900 is ACTUALLY a steal for a 2070 at this point and didn’t care enough to check for my joke. Similarly, I want to say a grand is closer to a 4070 than a 4090 at this point? Shit is REAL fucked, which is even more reason why you should actually look at what you are buying rather than “30 years ago if I gave someone a thousand bucks they would suck me off AND let me play every game I could ever want at a full 640*480 and 30 FPS!”
Also I want to say Crysis is actually more CPU bound these days since it was the height of the Pentiums and single core performance is nowhere near the priority it used to be. But too lazy to check.
SalamenceFury@lemmy.world
on 16 Sep 21:31
I can’t decipher whatever word salad you’re trying to say here and I don’t care. There is no excuse for a top of the line GPU from LAST YEAR to not get at least 80 FPS on High. Most gamers are poor and have stuff like 3060s, 4060s, and RX 6600s. I’m not a rich fucker who can burn 2000 dollars every year on a 600 watt GPU to get a pitiful 50 FPS.
jjjalljs@ttrpg.network
on 15 Sep 20:03
I feel like they’d make more money if they targeted lower end machines. My friend has a potato laptop and would enjoy borderlands, so they’re out of luck. They’re not going to spend any money on a new gaming toy because it’s not that big a hobby for them. I imagine there are many such people.
Bentdreadnot@lemmy.world
on 15 Sep 20:03
Cheaping out on optimization is one thing, telling customers to basically eat shit if they have a problem with it is… something that‘s not gonna have any major consequences nowadays, I guess lol
TwinTitans@lemmy.world
on 15 Sep 20:25
Seems fine in performance mode on PS5. Windows users having trouble?
It’s funny given they use unreal.
CaptainBlinky@lemmy.myserv.one
on 15 Sep 20:48
I run a 7800X3D on an X870 mobo, 64GB RAM and an RTX 3090, and couldn’t run it at my native 7680x2160 resolution with a playable framerate. It was fine at 3840x1080, but randomly crashed to desktop (at 1/4 my normal resolution, which I can even play Star Citizen at).
I refunded it after 3 hours. This was before Randy’s ridiculous response to the criticism. I was planning on buying again later when it’s fixed, but now I’m not so sure. The man is his own worst enemy.
He’s a bell end. He’s always been a bell end. I had the dubious pleasure of meeting him when I was working as a games press outfit’s tech monkey over a decade ago at Gamescom in Cologne.
Every other word out of his mouth was a business buzzword, he’s smarmy and smug, my skin crawled the whole time he was present.
It looks like he’s as high on his own farts now as he was back then.
wafflesies@infosec.exchange
on 15 Sep 20:36
@inclementimmigrant borderlands 4 running like dogshit on a 5090 is going to make me crash out in the most legendary way
Flamekebab@piefed.social
on 15 Sep 20:39
Randy, you molest your employees with that mouth?!
Anomnomnomaly@lemmy.org
on 15 Sep 22:29
Takes me back to 2008 and Rockstar claiming that GTA 4… was designed for hardware that didn’t exist yet, after that shit show of a shitty port to PC… It would be a further 6 years before I could play that game at a reliable 19080p, 60fps… then GTA 5 came out a year later.
Drbreen@sh.itjust.works
on 15 Sep 22:56
No wonder you couldn’t play it. 19080p, mother of God!
19080p in 16:9 is 33,920 × 19,080 = 647,193,600 pixels, i.e. 647 megapixels. 8K is 33.2 megapixels. So this dude is playing GTA 4 on a screen that has almost 20 times more pixels than an 8K screen.
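The arithmetic actually checks out; a quick sanity check, assuming a 16:9 screen:

```python
# Sanity check on the "19080p" joke math, assuming 16:9.
height = 19080
width = height * 16 // 9  # 33,920
pixels = width * height   # 647,193,600 (~647 MP)
eightk = 7680 * 4320      # 33,177,600 (~33.2 MP)

print(width, pixels)
print(round(pixels / eightk, 1))  # ~19.5x the pixels of 8K
```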
Drbreen@sh.itjust.works
on 16 Sep 00:14
I’m not sure if they can anymore. Civ 7 broke me on how it shoe-horned in systems to make money that ultimately broke what was a tried and tested formula.
aesthelete@lemmy.world
on 15 Sep 23:20
No
e8d79@discuss.tchncs.de
on 15 Sep 23:42
I don’t need to be a three star Michelin chef to realize that the plate of shit I have just been served is in fact a plate of shit. Similarly, I also don’t need to be a game developer to see that the buggy mess of a game gearbox just released is not worth my time.
Take your own advice? What games do they have that aren’t on some other developer’s engine? Other than their initial Half-Life expansions, which are basically mods for Half-Life, all the games I know they’ve done were built on various iterations of Unreal Engine.
MoreZombies@piefed.au
on 16 Sep 05:31
I'm ready for the article announcing Randy takes back his statements. As is the cycle.
Greasy bastard.
RizzRustbolt@lemmy.world
on 16 Sep 05:43
Coffee Stain managed to get UE5 to keep window settings between sessions, but what do they know? Certainly not how to not fix an inventory screen bug that’s been in 4 games at this point.
Shade aside, I do think more developers should make their own engine. Yes it takes time and resources, but those are spent on exactly what you need instead of on getting what you want out of an engine that was made to do everything but focused on nothing.
I mean, I sort of agree, but I’ve both used custom engines and seen people trying to use custom engines, and you run into this problem where the engine was designed for a game, rather than for any game. So if the original game didn’t have a particular feature, the engine has no capacity to do that thing, and every time you want to make a new game in that engine you basically have to rewrite the engine.
It works if you build an engine to be an engine, but as you say that’s extremely expensive and time consuming and you probably aren’t going to get any benefit out of it. You could try selling the engine, but you’re unlikely to make much progress unless it’s a significant improvement over the other options already available.
This kind of thinking is what caused all the issues for Cyberpunk 2077’s release. You need people who can actually use the engine. If you have to train every dev that onboards on your custom engine and all its quirks and customizations, you’ll never be able to release on time. It’s why they’ve switched to working with Unreal. You can actually get people who are familiar with all the systems without spending weeks or months training them.
It also sucks for the devs trying to leave, since you’ve now spent months or years working on systems and code that don’t work anywhere else. So whoever hires you will again have to retrain you on their systems.
TheGreenWizard@lemmy.zip
on 16 Sep 10:17
God this guy is so impossible, he’s a fucking cartoon character.
In short, there was a legal dispute between Pitchford and a former counsel for Gearbox. As part of a pattern of suit and countersuit, the former employee alleged that Pitchford had left a USB stick at a local restaurant which contained proprietary company info as well as underage pornography. Pitchford confirmed all of the above, with the notable exception of the “underage” part. Given nothing came of it, and he was remarkably candid about what type of porn was actually on the USB, I’m inclined to believe him.
IpsumLauren@lemmy.world
on 16 Sep 11:17
this is unreal
But_my_mom_says_im_cool@lemmy.world
on 16 Sep 11:52
I remember when 3 came out, I still had a PS4. The game was literally unplayable, like literally crashing every 2 minutes. Couldn’t get past the first 10 minutes of the game. I’m not surprised that even on PS5 the new one is crap.
It’s like they gave up after part 2
I don’t understand how Sony would allow a game on their platform that doesn’t actually run. Surely they require some kind of advance copy to review?
I’m pretty sure they allow anything that hits 30fps.
the_artic_one@programming.dev
on 17 Sep 14:43
There’s a process called certification you need to pass in order to release on a console, where they test your game against a list of criteria outlined in the developer agreement to validate stability, minimum performance, and conformance with platform standards. Nintendo pioneered this process (the Nintendo Seal of Quality) in response to unauthorized developers releasing cartridges which ran poorly and could freeze or even cause damage to consoles.
For a while, console manufacturers were pretty strict about certification requirements, but as time goes on they’ve been granting more and more exceptions to large publishers willing to pay fees and pinky-promise to fix the issues post-launch.
ArchmageAzor@lemmy.world
on 16 Sep 13:51
Now watch as some determined lunatic codes a new engine and somehow ports all of BL4 to it.
Have to partly disagree. The loading screens, whilst usually brief, were very annoying.
Elite Dangerous, No Man’s Sky or Space Engineers accomplish similar tasks without needing those, or having to limit the accessible planet area.
I assume those were limitations of the engine, but that’s just speculation on my part.
nutsack@lemmy.dbzer0.com
on 16 Sep 15:46
Does borderlands really “need” some bleeding edge engine?
It’s not a terribly complex game? And nobody gives a shit if it’s a bit janky.
Borderlands, like its movie, is the kid who gets an ant farm for his birthday, pulls out the tubes, shoves one end up his ass and starts sucking the farts out with the other.
Yes, it’s entertaining, but nobody expects it to be high-class.
Fyrnyx@kbin.melroy.org
on 17 Sep 13:50
He's just mad that he's not as good at making game engines as id, Valve, and even Epic are.
MangoPenguin@lemmy.blahaj.zone
on 17 Sep 21:36
Maybe I’m just wildly crazy here, but I think a $70 game from a big studio should run well.
We’re not talking some $15 early access title here that lacks optimization because it’s like 2 guys making it in their free time.
Randy Pitchford is a piece of shit that needs to stfu
Oh, I was unaware Randy coded Unreal Engine all by himself. I take back every negative thing I ever said about the fucking twat
Yeah, fucking what?
What the fuck does that even mean?
Randy is… what… when was the last time this asshat even wrote any code, much less an engine?
20 years?
No, seriously, when actually was the last time he wrote any code?
He founded Gearbox in ’99… and by 2005, they licensed UE3… prior to that, they’d been working with UE2… and Source…
I think you have to go back prior to Gearbox, to Duke Nukem 3D and Lo Wang, to find somewhere Randy may actually have contributed to game engine code.
In conclusion: Fuck you Randy.
It’s a bit misleading to say “4000 series” like that doesn’t cover 15 different products over a drastic range of intended performance levels
I don’t need to personally be an engineer to call out a train wreck when I see one…
I’m not a pilot, but if I see a helicopter upside down in a tree I’m pretty sure someone fucked up.
Randy isn’t even an engineer, he just owns a train.
It’s a PC. It could run one thing great and another like total dog shit. A game’s performance hinges on so much more than just a graphics card.
Yeah, like the optimisation of the game, for starters.
Yo, Randy, eat your own shit! Don't forget to swallow!
Yeah I’m pretty sure for $70 a functional engine should be the bare minimum.
What a fucking chode.
I wasn't going to buy this game at all, but now I really won't buy it.
What an ass.
I was gonna buy it when it was 50% off with the dlcs included. Now, gotta see that performance first.
My friend with a RX 7900 XT can barely run it.
At 1080p.
On Medium.
Shove it up your ass, Gearbox.
Is he running with antialiasing on?
Looking at their system settings page:
borderlands.2k.com/…/amd-optimization/
Their settings for every single GPU listed there seem to have antialiasing off.
It may be that current hardware can’t do that at a reasonable clip
They have AA off because they recommend upscaling on literally everything. Upscaling sorta acts like AA and they are not used together.
I feel like they’d make more money if they targeted lower end machines. My friend has a potato laptop and would enjoy borderlands, so they’re out of luck. They’re not going to spend any money on a new gaming toy because it’s not that big a hobby for them. I imagine there are many such people.
I mean… You first?
I think that’s exactly his point.
Cheaping out on optimization is one thing, telling customers to basically eat shit if they have a problem with it is… something that‘s not gonna have any major consequences nowadays, I guess lol
What a time we live in
The ‘true fans’ will find a way, apparently.
Seems fine in performance mode on PS5. Windows users having trouble?
It’s funny given they use unreal.
I run a 7800X3D on an X870 mobo, 64GB RAM, and an RTX 3090, and couldn’t run it at my native 7680x2160 resolution with a playable framerate. It was fine at 3840x1080, but randomly crashed to desktop (at 1/4 my normal resolution, which I can even play Star Citizen at).
I refunded it after 3 hours. This was before Randy’s ridiculous response to the criticism. I was planning on buying again later when it’s fixed, but now I’m not so sure. The man is his own worst enemy.
He’s a bell end. He’s always been a bell end. I had the dubious pleasure of meeting him when I was working as a games press outfit’s tech monkey over a decade ago at Gamescom in Cologne.
Every other word out of his mouth was a business buzzword, he’s smarmy and smug, my skin crawled the whole time he was present.
It looks like he’s as high on his own farts now as he was back then.
@inclementimmigrant borderlands 4 running like dogshit on a 5090 is going to make me crash out in the most legendary way
Randy, you molest your employees with that mouth?!
Takes me back to 2008 and Rockstar claiming that GTA 4… was designed for hardware that didn’t exist yet after that shit show of a shitty port to PC… It would be a further 6yrs before I could play that game at a reliable 19080p, 60fps… then GTA5 came out a year later.
No wonder you couldn’t play it. 19080p, mother of God!
19080p in 16:9 is 33,920X19,080 =647,193,600 pixels, 647 megapixels. 8K is 33.2 megapixels. So this dude is playing GTA 4 on a screen that has almost 20 times more pixels than an 8K screen
My bro’s gaming on a freakin Jumbotron
A game designed for hardware that doesn’t exist yet on a monitor that doesn’t exist yet.
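For anyone who wants to double-check the pixel math above, here’s a quick sanity-check sketch in Python. It assumes a 16:9 aspect ratio and the standard 8K UHD resolution of 7680x4320; the helper function name is just for illustration.

```python
# Sanity check for the "19080p" joke math: total pixels of a 16:9
# display at a given vertical resolution, compared against 8K UHD.

def pixels_16x9(vertical: int) -> int:
    """Total pixel count for a 16:9 display with this vertical resolution."""
    return (vertical * 16 // 9) * vertical

joke_res = pixels_16x9(19080)   # 33,920 x 19,080
eight_k = 7680 * 4320           # standard 8K UHD

print(f"19080p: {joke_res:,} pixels (~{joke_res / 1e6:.0f} MP)")
print(f"8K:     {eight_k:,} pixels (~{eight_k / 1e6:.1f} MP)")
print(f"ratio:  {joke_res / eight_k:.1f}x")
```

Which comes out to roughly 647 megapixels versus 33.2 megapixels, so the “almost 20 times an 8K screen” figure checks out.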
Gonna keep repeating it until it sticks:
Don’t touch Gearbox shit. They showed their true colours when they embezzled the Colonial Marines budget.
Pitchford’s brain dead takes killed any chance that I was going to play Borderlands 4.
I think it should be taken a bit further and just don’t buy anything from 2K if you can.
2K can actually release a functional game though. The problem is they get greedy after the fact.
I’m not sure if they can anymore. Civ 7 broke me on how it shoe-horned in systems to make money that ultimately broke what was a tried and tested formula.
No
I don’t need to be a three star Michelin chef to realize that the plate of shit I have just been served is in fact a plate of shit. Similarly, I also don’t need to be a game developer to see that the buggy mess of a game gearbox just released is not worth my time.
<img alt="" src="https://lemmy.zip/pictrs/image/3991dda7-7238-4051-89da-7abfdb964d1e.webp">
Don’t you guys have ~~phones~~ 90-series graphics cards?
I have. Oddly enough, I don’t remember any Gearbox game with a Gearbox engine?
Take your own advice? What games do they have that aren’t on some other developer’s engine? Other than their initial expansions for Half-Life, which are basically mods, every game I know they’ve done was built on various iterations of Unreal Engine.
They did port Halo CE to PC with its own engine. They fucked up a lot of it in the process though.
Is he wearing a barong Tagalog?
I think so too
Bai!
I'm ready for the article announcing Randy takes back his statements. As is the cycle.
Greasy bastard.
Coffee Stain managed to get UE5 to keep window settings between sessions, but what do they know? Certainly not how to not fix an inventory screen bug that’s been in 4 games at this point.
Shade aside, I do think more developers should make their own engine. Yes it takes time and resources, but those are spent on exactly what you need instead of on getting what you want out of an engine that was made to do everything but focused on nothing.
I mean, I sort of agree, but I’ve both used custom engines and seen people trying to use custom engines, and you hit this problem where the engine was designed for *a* game rather than for *any* game. If the original game didn’t have a particular feature, the engine has no capacity to do that thing, so every time you want to make a new game in that engine you basically have to rewrite the engine.
It works if you build an engine to be an engine, but as you say that’s extremely expensive and time consuming, and you’re probably not going to get any benefit out of it. You could try selling the engine, but you’re unlikely to make much progress unless it’s a significant improvement over the other options already available.
This kind of thinking is what caused all the issues with Cyberpunk 2077’s release. You need people who can actually use the engine. If you have to train every dev that onboards on your custom engine and all its quirks and customizations, you’ll never release on time. It’s why they’ve switched to working with Unreal: you can actually hire people who are familiar with all the systems without spending weeks or months training them.
It also sucks for the devs trying to leave, since you’ve now spent months or years working on systems and code that don’t exist anywhere else. So whoever hires you will again have to retrain you on their systems.
God, this guy is so impossible, he’s a fucking cartoon character.
Anything ever come of this creep and his “favorite magic tricks” usb drive?
I think I’m out of the loop on something here…
more info than you could ever want about Pitchford’s porn habits
In short, there was a legal dispute between Pitchford and a former counsel for Gearbox. As part of a pattern of suit and countersuit, the former employee alleged that Pitchford had left a USB stick at a local restaurant which contained proprietary company info as well as underage pornography. Pitchford confirmed all of the above, with the notable exception of the “underage” part. Given that nothing came of it, and he was remarkably candid about what type of porn was actually on the USB, I’m inclined to believe him.
this is unreal
I remember when 3 came out, I still had a PS4. The game was literally unplayable, like literally crashing every 2 minutes. I couldn’t get past the first 10 minutes of the game. I’m not surprised that even on PS5 the new one is crap. It’s like they gave up after part 2.
I don’t understand how Sony would allow a game on their platform that doesn’t actually run. Surely they require some kind of advance copy for them to review?
I’m pretty sure they allow anything that hits 30fps.
There’s a process called certification you need to pass in order to release on a console, where they test your game against a list of criteria outlined in the developer agreement to validate stability, minimum performance, and conformance with platform standards. Nintendo pioneered this process (the Nintendo Seal of Quality) in response to unauthorized developers releasing cartridges which ran poorly and could freeze or even cause damage to consoles.
For a while, console manufacturers were pretty strict about certification requirements, but as time goes on they’ve been granting more and more exceptions to large publishers willing to pay fees and pinky-promise to fix the issues post-launch.
Now watch as some determined lunatic codes a new engine and somehow ports all of BL4 to it.
And gets hit with a C&D the day before release
If your engine is crap then you don’t get brownie points because it’s custom.
After all, Starfield is on a custom engine, and it had exactly the same problems as Borderlands 4.
Starfield ran pretty well. The game was just not fun. I tried having fun for 120 hours. I really did try. The shipbuilding was nice, though.
Have to partly disagree. The loading screens, whilst usually brief, were very annoying.
Elite Dangerous, No Man’s Sky or Space Engineers accomplish similar tasks without needing those, or having to limit the accessible planet area.
I understand those were engine limitations, but that could just be speculation on my part.
I’m not doing anything that chat GPT can’t do
I did
^It^ ^sucked,^ ^but^ ^I^ ^won’t^ ^buy^ ^your^ ^game^ ^regardless^
unreal engine game developer here: it’s your fuckin’ responsibility to debug and optimize your game, not unreal engine’s.
You gave them a hammer and their nail has not gone in straight?
Clearly your fault. And the hammer’s.
Yeah, I could. Or you could optimize your assets and figure out how to add performance profiles instead.
Randy is such a bitch boy chode meister.
for real though why don’t more companies license idTech
Lol. I bet he has no idea how many people actually can do that
hmmm, that kind of statement makes me long for my ol’ eyepatch
Does borderlands really “need” some bleeding edge engine?
It’s not a terribly complex game, and nobody gives a shit if it’s a bit janky.
Borderlands, like its movie, is the kid who gets an ant farm for his birthday, pulls out the tubes, shoves one end up his ass and starts sucking the farts out with the other.
Yes, it’s entertaining, but nobody expects it to be high-class.
He’s just mad that he’s not as good at making game engines as id, Valve, and even Epic are.
Maybe I’m just wildly crazy here, but I think a $70 game from a big studio should run well.
We’re not talking some $15 early access title here that lacks optimization because it’s like 2 guys making it in their free time.