anamethatisnt@lemmy.world
on 03 Dec 15:38
nextcollapse
I can’t wait for the release in 2044! I hope 1440p is still all the rage when it launches!
Serious note: I hope Intel stays in the dGPU market; we could use another player in the space.
inclementimmigrant@lemmy.world
on 03 Dec 15:58
nextcollapse
Yeah, very glad that Intel has stayed in the market.
It’s very refreshing to see a company release a reasonable (though that term has been skewed a lot over the years) budget GPU that doesn’t completely suck and actually tries to run current-gen games.
For now, anyway. Nobody knows if the discrete GPU division will survive the leadership shakeup and the new CEO (when they find one).
fuckwit_mcbumcrumble@lemmy.dbzer0.com
on 03 Dec 17:41
collapse
It would be incredibly stupid for Intel to abandon the dGPU market after spending all this money on it. As long as Battlemage turns out alright (basically its only goal), I doubt it will go away.
They cut the die size nearly in half, so they’re no longer blowing a fuck ton of money on a $200 GPU. As long as utilization of the silicon goes up, it should be fine.
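The silicon economics behind this comment come down to dies per wafer: wafer cost is roughly fixed, so a smaller die means more sellable chips per wafer. A back-of-envelope sketch using the classic first-order estimate (the die areas and 300 mm wafer here are illustrative numbers, not Intel’s actual figures):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order estimate: gross wafer area over die area, minus an
    edge-loss term for partial dies around the wafer's rim."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# Illustrative only: a ~400 mm^2 die vs a ~270 mm^2 die on a 300 mm wafer.
for area in (400, 272):
    print(f"{area} mm^2 -> {dies_per_wafer(area)} dies per wafer")
```

Shrinking the die both raises the per-wafer count and (all else equal) improves yield, which is why a smaller chip at the same price point is so much better for margins.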
DarkThoughts@fedia.io
on 03 Dec 17:56
nextcollapse
Yeah, I think giving up after just two generations would be a weird move. It's not an easy market to enter and Intel knew that beforehand.
mnemonicmonkeys@sh.itjust.works
on 04 Dec 01:17
collapse
Apparently Intel is replacing Gelsinger because his plan to turn the company’s fortunes around is taking too long. My guess is the new CEO will sell off major parts of the company, and I doubt the dGPU division will be kept.
iAmTheTot@sh.itjust.works
on 03 Dec 16:17
nextcollapse
This is a strange comment when the article is about the launch on December 12th. Maybe the joke went over my head?
dogslayeggs@lemmy.world
on 03 Dec 16:44
nextcollapse
Read the text from the post, not the article. OP said it is releasing in 2044 instead of 2024.
I haven’t been a huge fan of Intel’s CPUs for some time now, but I agree, there needs to be more GPU competition out there. I’ve been wanting to try out an Arc for a while; I’m just hoping the dGPU drivers are better than what they ship for their integrated chips.
dinckelman@lemmy.world
on 03 Dec 15:54
nextcollapse
Hopefully it will bring some decent generational improvements. The only thing I’m not a huge fan of is the 45% price increase over last gen, and that isn’t even taking used or discounted cards into consideration.
The A580 launched at $175 + tax. We are not talking about the same card.
fuckwit_mcbumcrumble@lemmy.dbzer0.com
on 04 Dec 02:12
collapse
…I completely forgot that was even a thing. It came out and nobody really cared. Only the 770/750 got love, and the A380 saw some appreciation as the QuickSync add-on card.
Well, regardless, it’s the same MSRP as the A770 post-price-drop, and it still outperforms it.
Slightly better performance than a 4060 and $50 cheaper. But the 4060 is about to be replaced with a newer model at this point, so is it actually a good deal? Questionable.
And it’ll come out in 2026, after Nvidia finally gives up trying to convince the market that their $600 5070 is the low end of the GPU market.
iAmTheTot@sh.itjust.works
on 03 Dec 16:17
nextcollapse
Might be a year before Nvidia’s next 60 class card is out. They usually release from highest spec to lowest.
Blue_Morpho@lemmy.world
on 03 Dec 16:23
nextcollapse
Nvidia says the $2000 5090 comes out in January. They haven’t even hinted at an announcement for the 5060 so it will be a very long time before it comes out.
They'll sell hundreds of thousands of them anyway. Neither price nor power consumption matter anymore
mnemonicmonkeys@sh.itjust.works
on 04 Dec 01:19
collapse
Fools are easily parted from their money
inclementimmigrant@lemmy.world
on 03 Dec 16:33
nextcollapse
Given that Nvidia is said to be upping the prices of its 50XX series, and the current 4060 and 7600 cards only offer 8 GB of VRAM (which is honestly insufficient for modern games, and overpriced), yeah, I do think this will offer decent value to budget gamers.
The 20xx series was expensive, so I skipped the 30xx/40xx generations and went back to AMD. Even though I got my 7900 XTX on sale, it was still insanely expensive for a GPU… where have the $500 flagship GPUs gone?
themoonisacheese@sh.itjust.works
on 03 Dec 17:01
nextcollapse
Number for number, sure, if it’s actually available at that price.
The problem is that Intel’s drivers sucked in the past, so they definitely have to prove themselves with this launch. I definitely wouldn’t be buying it release day if I needed a GPU.
With its 8 GB, the 4060 performs quite poorly when scaling up the resolution. There’s a great video by Hardware Unboxed showing how limiting 8 GB is at 1440p:
www.youtube.com/watch?v=ecvuRvR8Uls
I just can’t imagine the extra VRAM making such a difference in performance that it’s enough to play at 1440p, let alone on ultra. I have a 6650 XT, which is slightly slower than the targeted 4060 / 7600, and that thing struggles even at 1080p.
Check the video. It clearly shows how performance drops significantly the moment you run out of VRAM. It doesn’t mean performance will be perfect at 1440p; it means Intel is using 1440p as a competitive battleground, something the 8 GB cards fail at. Maybe Intel’s GPU isn’t great, but the 12 GB will probably make a difference (and Intel is maybe being quiet about 1080p because they’re likely to perform worse there).
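To make the resolution argument concrete, here’s a rough back-of-envelope of how per-pixel buffer sizes grow from 1080p to 1440p (illustrative only; real games allocate far more for textures and asset streaming):

```python
def target_mib(width, height, bytes_per_pixel=4):
    """Size of one RGBA8 render target in MiB."""
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    print(f"{name}: {target_mib(w, h):.1f} MiB per RGBA8 target")

# 1440p has ~78% more pixels than 1080p, so every per-pixel buffer
# (G-buffers, depth, post-processing targets) grows by the same factor.
scale = (2560 * 1440) / (1920 * 1080)
print(f"pixel-count scale factor: {scale:.2f}")
```

Each individual target is small, but a modern deferred renderer keeps many of them, so the ~1.78× pixel-count factor is what pushes a game that fits in 8 GB at 1080p over the edge at 1440p.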
I mostly play BG3 now but I was hard into Destiny 2. As long as I capped my FPS to match my monitor (so 120), I could crank it up to pretty much max. BG3 and Last Epoch I max out (still fps capped). Cyberpunk 2077 I didn’t bother with and play it on GeForce Now. Most other games I play are AA or indie and the 1080ti at 1440p handles them easily.
Space Marine II is another that’s going on GeForce Now just because I want it on Ultra everything. So literally 95%+ of my library runs maxed at 1440p/120 on a 1080ti.
The comments I’ve read from current-generation Arc owners have given the impression that their Linux drivers are catching up to AMD’s. Here’s the latest info:
www.phoronix.com/…/intel-arc-b580-battlemage
kippinitreal@lemmy.world
on 03 Dec 18:22
nextcollapse
As an aside: most companies working in embedded technology develop on, or at least lean heavily on, Linux. Why then are Linux drivers so difficult to come by? Lack of customers seems unlikely, since they mostly have everything ready, right? Or is it cost cutting to avoid lengthy QA on another platform? That would be easy to sidestep by shipping a no-warranty driver version.
blackstrat@lemmy.fwgx.uk
on 03 Dec 19:37
nextcollapse
They keep getting removed from the kernel.
fuckwit_mcbumcrumble@lemmy.dbzer0.com
on 03 Dec 23:50
collapse
Most of the demand is on Windows. So your choice is either to spend resources (money) where the demand is, or hope that you can create demand where there currently isn’t any.
It’s been a while, but I played around with the A770 on Arch for a few months. It didn’t play nice with Proton, and even native games were hit and miss. Better support from Intel than Nvidia gives, but it was a new platform, and Linux development was definitely taking a back seat to the Windows drivers, which were also a buggy mess.
And basically nobody had the cards so if something didn’t work your options were to give up or become a computer graphics programming wizard and fix it all yourself from scratch.
To answer the question: not really, no. The drivers themselves may have been fine, but who knows how any given software will handle a brand new GPU architecture.
KingThrillgore@lemmy.ml
on 03 Dec 18:22
nextcollapse
I can’t wait for these to be EOLed due to the exec shakeup.
Isn’t this the same architecture that is also in their iGPUs? That should help keep them motivated to improve drivers even if they lose interest in dGPUs.
If this turns out to be a solid performer, the price could make it the best midrange value since AMD’s Polaris (RX 480). Let’s hope Intel’s build quality has improved since the A770:
www.youtube.com/watch?v=N371iMe_nfA
doggle@lemmy.dbzer0.com
on 04 Dec 00:54
nextcollapse
Sick. I got an A770 LE when they launched. Buggy AF, but not bad performance when it decided to work. It currently lives as a dedicated AV1 encoder in a Plex server.
secret300@lemmy.sdf.org
on 04 Dec 01:43
nextcollapse
I just bought an A750 and honestly I’m loving it. So far I’ve only had issues with shadows in Skyrim, and in Vermintide 2.
rickyrigatoni@lemm.ee
on 04 Dec 03:15
nextcollapse
How these go Linux? Vroom or doom?
thedeadwalking4242@lemmy.world
on 04 Dec 03:18
collapse
I believe vroom; Intel usually has good Linux support, even maintaining their own optimized distribution.
The article says 2044 between the headline and the first picture.
I don’t see any text for this post, it’s just a link on my end.
Aaah it’s in the article itself! Didn’t see that, thanks.
The last equivalent card was ~~$112~~ $172? Edit: I’ve been corrected. That still seems incredibly low.
$249 is 45% more than $172.
Haha, you’re right. Brain fart.
The original MSRP of the A770 was $330, so that is a big improvement. I assume Intel is sticking with a reasonable launch MSRP to set expectations right.
tomshardware.com/…/intel-arc-a750-a770-full-prici…
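For what it’s worth, double-checking the percentage math with the prices quoted in this thread:

```python
a580_launch = 172  # A580 launch price quoted upthread (before tax)
b580_msrp = 249    # B580 MSRP

increase_pct = (b580_msrp - a580_launch) / a580_launch * 100
print(f"{increase_pct:.1f}% increase")  # prints: 44.8% increase (the ~45% mentioned upthread)
```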
If it can ship before tariffs become an issue, maybe.
We both know the 5060 will be $300 or even more.
$2k, jfc.
I game at 1440p on a 1080ti. So what this tells me is that I don’t need to upgrade. Cool.
What do you play and on what settings? I know the 1080ti was a beast but it must surely be showing its age.
So excited. Can’t wait.
Do these cards have good open-source Linux drivers?
Edit: it may have trouble with older titles
I love playing older titles…
This looks insanely good.