Add a discrete GPU or not?
from Shimitar@feddit.it to selfhosted@lemmy.world on 08 Nov 19:05
https://feddit.it/post/12259469

In my server I currently have an Intel i7 9th gen CPU with integrated Intel video.

I don’t use or need AI or LLM stuff, but we use Jellyfin extensively in the family.

So far Jellyfin has always worked perfectly fine, but I could add an NVIDIA 2060 or a 1060 for free. Would it be worth it?

And as for power consumption, would the increase be noticeable? Should I do it or pass?

#selfhosted


exu@feditown.com on 08 Nov 19:15 next collapse

QuickSync is usually plenty for transcoding. You will get more performance with a dedicated GPU, but the power consumption will increase massively.

Nvidia also has a limit on how many streams can be transcoded at the same time. There are driver patches to circumvent that.

Player2@lemm.ee on 08 Nov 19:28 next collapse

If it is working for you as is, there’s no need to make a change.

precarious_primes@lemmy.ml on 08 Nov 20:48 next collapse

I ran a 1650 super for a while. At idle it added about 10W and would draw 30-40W while transcoding. I ended up taking it out because the increased power wasn’t worth the slight performance increase for me.
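As a rough sanity check on what that idle draw adds up to over a year (using the ~10 W figure above; the electricity rate is an illustrative assumption, substitute your own):

```python
# Rough yearly cost of a GPU sitting idle, using the ~10 W figure quoted above.
# The electricity rate is an assumed example value; rates vary a lot by country.
IDLE_WATTS = 10
RATE_EUR_PER_KWH = 0.30  # assumption for illustration

kwh_per_year = IDLE_WATTS * 24 * 365 / 1000  # watts -> kWh over a year
cost_per_year = kwh_per_year * RATE_EUR_PER_KWH

print(f"{kwh_per_year:.1f} kWh/year, ~{cost_per_year:.2f} EUR/year just idling")
# -> 87.6 kWh/year, ~26.28 EUR/year just idling
```

Not ruinous, but it is a constant cost for a card that mostly sits unused.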

Shimitar@feddit.it on 08 Nov 20:57 collapse

Yeah, looks like a lot… probably not worth it.

kevincox@lemmy.ml on 08 Nov 20:50 next collapse

Most Intel GPUs are great at transcoding: reliable, widely supported, and quite a bit of transcoding capability for very little electrical power.

I think the main thing I would check is what formats are supported. If the other GPU can support newer formats like AV1 it may be worth it (if you want to store your videos in these more efficient formats or you have clients who can consume these formats and will appreciate the reduced bandwidth).

But overall I would say if you aren’t having any problems no need to bother. The onboard graphics are simple and efficient.

variants@possumpat.io on 08 Nov 21:42 next collapse

Host steam-headless and use the GPU for that, so you can have remote gaming on your phone anywhere you have 5G.

[deleted] on 08 Nov 21:51 next collapse

.

Shimitar@feddit.it on 09 Nov 06:56 collapse

Cool idea!

sugar_in_your_tea@sh.itjust.works on 08 Nov 22:22 next collapse

I only have a GPU because my CPU doesn’t have any graphics. I don’t use the graphics anyway, but I need it to boot. So I put our crappiest spare GPU in (GTX 750 Ti) and call it good.

I wouldn’t bother. If you end up needing it, it’ll take like 15 min to get it installed and drivers set up and everything. No need to bother until you actually need it.

OneCardboardBox@lemmy.sdf.org on 09 Nov 01:54 next collapse

Look up the GPU on these charts to find out what codecs it will support: …nvidia.com/video-encode-and-decode-gpu-support-m…

NVENC support will tell you what codecs your GPU can generate for client devices, and NVDEC support determines the codecs your GPU can read.

Then compare it with the list of codecs that your Intel can handle natively.
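For the specific cards in this thread, my reading of NVIDIA’s support matrix and Intel’s Quick Sync docs boils down to something like the sketch below. Treat the table as an assumption to verify against the official charts, not gospel (Turing mainly improves encode quality over Pascal rather than adding formats at this granularity):

```python
# Hardware codec support sketch for the GPUs discussed in this thread.
# Based on my reading of NVIDIA's NVENC/NVDEC support matrix and Intel's
# Quick Sync documentation -- double-check against the official charts.
SUPPORT = {
    "GTX 1060 (Pascal)": {
        "encode": {"h264", "hevc"},
        "decode": {"h264", "hevc", "vp9"},
    },
    "RTX 2060 (Turing)": {
        "encode": {"h264", "hevc"},
        "decode": {"h264", "hevc", "vp9"},
    },
    "UHD 630 (i7 9th gen)": {
        "encode": {"h264", "hevc"},
        "decode": {"h264", "hevc", "vp9"},
    },
}

def can_transcode(gpu: str, src: str, dst: str) -> bool:
    """True if the GPU can decode `src` and encode `dst` in hardware."""
    caps = SUPPORT[gpu]
    return src in caps["decode"] and dst in caps["encode"]

# None of the three touch AV1, so the iGPU already covers the same ground.
print(can_transcode("RTX 2060 (Turing)", "av1", "h264"))        # False
print(can_transcode("UHD 630 (i7 9th gen)", "hevc", "h264"))    # True
```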

Shimitar@feddit.it on 09 Nov 06:59 collapse

Thanks!

Neither the 2060 nor the 1060 supports AV1 either way, so I guess it’s pointless for me.

eskuero@lemmy.fromshado.ws on 09 Nov 02:50 next collapse

For an old Nvidia card it might be too much of an energy drain.

I was also using the integrated Intel for video re-encodes, and I got an Arc A310 for 80 bucks, which is about the cheapest new card you’ll get with AV1 support.

InverseParallax@lemmy.world on 09 Nov 20:06 collapse

Yeah, I’m using the A750 the same way.

Can’t wait for next-gen Arc with VVC (H.266) support.

eskuero@lemmy.fromshado.ws on 09 Nov 21:22 collapse

Is H.266 actually taking off? With all the AOMedia members that control graphics hardware (AMD, Intel, Nvidia) together, it feels like MPEG will need to gain a big partner to stay relevant.

InverseParallax@lemmy.world on 09 Nov 22:09 collapse

Google is pushing av1 because of patents, but 266 is just plain better tech, even if it’s harder to encode.

This same shit happened with 265 and vp9, and before that with vorbis/opus/aac.

They’ll come back because it’s a standard, and has higher quality.

Maybe this is the one time av1 somehow wins out on patents, but I’m encoding av1 and I’m really not impressed; it’s literally just dressed-up hevc, maybe a 10% improvement max.

I’ve seen vvc and it’s really flexible: it shifts gears on a dime between high motion and deep detail, which is basically what your brain notices most. av1 is actually kind of worse than hevc at that to me; it’s sluggish at the shifts, even if it is better overall.

monkeyman512@lemmy.world on 09 Nov 17:20 next collapse

If the iGPU is getting the job done, I would leave that alone. You could add a GPU and pass it through to a gaming VM. But that is an entirely different project.

Shimitar@feddit.it on 09 Nov 18:13 collapse

Could be an interesting project though, will definitely think about that. Not top priority, but why not, since the hardware is free?

lowdude@discuss.tchncs.de on 10 Nov 09:29 next collapse

I would avoid it, if you care at all about availability and downtime. The result will probably not be great, you need to ensure the server side gets enough resources under load, and setting it up may require constant restarts if things aren’t immediately working as expected.

Nonetheless, here is a link where someone did essentially exactly that on NixOS: astrid.tech/2022/09/22/0/nixos-gpu-vfio/

Deway@lemmy.world on 10 Nov 12:02 collapse

Power consumption, if you care about that.

Shimitar@feddit.it on 10 Nov 14:06 collapse

Yes, but if I can stream games to my mobile device that could be an acceptable tradeoff, if the card doesn’t drain too much when idle.

interdimensionalmeme@lemmy.ml on 09 Nov 18:47 next collapse

THE FOLLOWING IS WRONG!!!

EMBY BAD, JELLYFIN IS OPEN SOURCE, NOT EMBY


Switch to emby

Shimitar@feddit.it on 09 Nov 21:08 collapse

Why?

interdimensionalmeme@lemmy.ml on 09 Nov 21:27 collapse

THE FOLLOWING IS WRONG!!!

EMBY BAD, JELLYFIN IS OPEN SOURCE, NOT EMBY


It’s the open source equivalent and won’t enshittify on you, like Plex did, like Jellyfin will.

Shimitar@feddit.it on 09 Nov 21:34 collapse

Is Jellyfin getting enshittified? Why do you say that? Doesn’t seem like it’s following Plex.

interdimensionalmeme@lemmy.ml on 10 Nov 03:14 collapse

THE FOLLOWING IS WRONG!!!

EMBY BAD, JELLYFIN IS OPEN SOURCE, NOT EMBY


As long as it’s not released under a copyleft license, it remains a possibility. That’s not to say they’ve actually done anything untoward, but this is a warning to the people who have been bit by this dynamic one too many times

potentiallynotfelix@lemmy.fish on 10 Nov 06:28 next collapse

Jellyfin is under the GPLv2 (GitHub)

interdimensionalmeme@lemmy.ml on 10 Nov 11:34 collapse

You are right, I have fucked up!! All posts edited.

Shimitar@feddit.it on 10 Nov 14:07 collapse

Don’t worry, it happens, but you made me doubt my memory eheheh

Shimitar@feddit.it on 10 Nov 06:53 collapse

If and when, I will have plenty of time to migrate. I hate plexification and will indeed switch to emby IF and WHEN.

So far, it seems Jellyfin is under a copyleft license, so unless they break it, I think I’m fine.

And sincerely, jellyfin totally rocks.

interdimensionalmeme@lemmy.ml on 10 Nov 11:31 collapse

I have fucked up!!! It is emby that is closed source fuckkkkkk!!!

“Emby has been relicensed and is now closed-source, while open source components will be moved to plugins. Due to this, a free open source fork of Emby was created called Jellyfin.”

possiblylinux127@lemmy.zip on 09 Nov 19:50 next collapse

The Intel GPU is probably better

InverseParallax@lemmy.world on 09 Nov 20:05 collapse

Intel has excellent transcode, even in their iGPUs.

I use an Arc A750 specifically for transcode; AV1 runs at ludicrous speeds. But don’t do an Nvidia, they kind of suck because they don’t support VAAPI, only NVENC/NVDEC and VDPAU.