Debian 13 burning 10W playing 4K YouTube video on a Framework with max brightness 🫨
from lightrush@lemmy.ca to linux@lemmy.ml on 12 Aug 03:40
https://lemmy.ca/post/49628982

A screenshot showing 4K YouTube playing with power consumption of 10W

#linux

lightrush@lemmy.ca on 12 Aug 03:41 next collapse

It fluctuated between 8.8W and 10.3W.

solrize@lemmy.ml on 12 Aug 03:49 next collapse

Is that good or bad? What cpu? How big is the screen? What encoding?

lightrush@lemmy.ca on 12 Aug 04:23 next collapse

It’s a Framework with 11th gen Intel i5. I’ve never seen it below 11W while doing this. I don’t recall the exact number I got in Debian 12 but I think it was in the 11-13W range. The numbers were similar with Ubuntu LTS which I used till about a year ago. Now I see 9-10W. The screen is 3:2 13". Not sure about the encoding but I have GPU decoding working in Firefox.
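
For reference, a quick way to sanity-check that the iGPU is really doing the decoding (just a sketch, assuming the libva-utils and intel-gpu-tools packages are installed):

    # List VA-API profiles/entrypoints; VAEntrypointVLD lines indicate hardware decode support
    vainfo

    # While the video plays, the Video engine should show activity
    # if the browser is actually using the hardware decoder
    sudo intel_gpu_top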

Kazumara@discuss.tchncs.de on 12 Aug 12:09 collapse

Not sure about the encoding

Right click on video -> Stats for Nerds
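
The Codecs line there shows what you're actually being served; it looks something like this (values illustrative):

    Codecs: av01.0.12M.08 (399) / opus (251)     <- AV1 video
    Codecs: vp09.00.50.08 (248) / opus (251)     <- VP9 video
    Codecs: avc1.640028 (137) / mp4a.40.2 (140)  <- H.264 video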

fuckwit_mcbumcrumble@lemmy.dbzer0.com on 12 Aug 04:55 next collapse

It’s a YouTube video, so whatever YouTube uses these days. I tested with this M1 MacBook Pro and it was using about 7 watts, so 3 watts more is pretty good for pretty much anything. I think my 12th Gen laptop typically draws about 13-15 watts doing the same thing, but with a much dimmer screen.

MonkderVierte@lemmy.zip on 12 Aug 09:02 collapse

The screen was not measured.

ryannathans@aussie.zone on 12 Aug 03:58 next collapse

That’s very good, audio could do with some work

humanrogue@lemmy.ml on 12 Aug 14:19 collapse

I agree the stock tuning could use some work. As a workaround, have you seen these EasyEffects profiles? The sound quality is significantly improved, but there’s still resonance above 60% volume (I think due to the keyboard).

reddit.com/…/improving_perceived_sound_quality_on…

ryannathans@aussie.zone on 12 Aug 21:40 collapse

EasyEffects is fantastic; I use it to remove background noise from my mic

humanrogue@lemmy.ml on 12 Aug 22:25 collapse

Can you share instructions on how to do that? The only thing that I’ve been able to do to improve my situation is lower my mic gain

ryannathans@aussie.zone on 13 Aug 23:28 collapse

I use a gate filter to block sound that isn’t up close and loud; it essentially functions like voice activation with configurable hysteresis. Next I also use what I think is called an RNNoise filter, which is a machine-learning noise filter that blocks unwanted non-voice sounds.

If you need a hand I can take screenshots or something when I am next at my PC?

humanrogue@lemmy.ml on 14 Aug 02:46 collapse

Yes please! Or if you followed a tutorial, a link would suffice too

ryannathans@aussie.zone on 15 Aug 07:33 collapse

<img alt="" src="https://aussie.zone/pictrs/image/13d4fa85-dc9e-4ad8-8bd2-018ec4936b46.png"> <img alt="" src="https://aussie.zone/pictrs/image/90bade73-1dbe-4afe-88c1-575bb4c28d8a.png"> <img alt="" src="https://aussie.zone/pictrs/image/5e4b7378-5bd6-4099-a3df-403a97b99d76.png"> <img alt="" src="https://aussie.zone/pictrs/image/75c2db54-fc69-47dc-9c44-9211444f0007.png"> <img alt="" src="https://aussie.zone/pictrs/image/d2eff516-4cd0-48d6-89e2-557c1d91225b.png">

These are my app, gate, and noise reduction filter settings, and the order they are enabled in for my microphone input. I am using the EasyEffects Flatpak, which seems to include the standard noise reduction ML model.

You might need to study the gate settings for a minute to tweak them to your needs. I suspect you’ll only need to change the curve threshold (-22 dB in my pic) so it picks up your voice when you speak into it, but not background noises. You can use the dB values shown in-app while speaking to help adjust it.

humanrogue@lemmy.ml on 16 Aug 02:32 collapse

Thank you! This is really thorough!

mitch@piefed.mitch.science on 12 Aug 04:52 next collapse

Honestly it's a little staggering how much better web video got after the W3C got fed up with Flash and RealPlayer and finally standardized native, more efficient video playback.

<video> was a revolution.

lightrush@lemmy.ca on 12 Aug 05:02 next collapse

Oh man, I was like a kid in a candy shop when I got my hands on Flash 4… built quite a few sites with it.

mitch@piefed.mitch.science on 12 Aug 14:40 collapse

My unpopular opinion is that Flash was perhaps one of the greatest media standards of all time. Think about it: in 2002, people were packaging entire 15-minute animations with full audio and imagery, all encapsulated in a single file that could play in any browser, for under 10 MB each. Not to mention, it was one of the earliest formats to support streaming. It used vectors for art, which meant that a SWF file would look just as good today on a 4K screen as it did in 2002.

It only became awful once we started forcing it to be stuff it didn't need to be, like a web design platform, or a common platform for applets. That brought more and more advanced versions of scripting, which continually introduced new vulnerabilities.

It was a beautiful way to spread culture back when the fastest Internet anyone could get was 1 MB/sec.

RheumatoidArthritis@mander.xyz on 12 Aug 16:42 collapse

It worked great only on Windows PCs, at a time when the PC and Windows still weren’t the definitive winners of the technological race and people were using all kinds of computers.

savvywolf@pawb.social on 12 Aug 08:23 next collapse

Wasn’t that when WHATWG took over the spec?

mitch@piefed.mitch.science on 12 Aug 14:41 collapse

Ah I am not sure. I just assumed it was W3C.

FauxLiving@lemmy.world on 12 Aug 13:46 collapse

I remember, that was a dramatic change.

Also, most people now don’t remember this, but YouTube was initially popular because their Flash video player was efficient, worked across many different system configurations and browsers, and dynamically changed resolution to match your connection.

At that point you had some people with broadband connections and a lot more with dial-up. So often dial-up users would not be able to watch videos because they were only available in one resolution.

YT had 144p (or less!) videos ready for dial-up users and higher resolution videos for broadband users and it automatically picked the appropriate video for the client. This made it so most people (dial-up users) would look to YT first, because you knew that YT would have a video that you could actually watch.

Then Google bought them.

mitch@piefed.mitch.science on 12 Aug 14:44 next collapse

YouTube blew up the year I went to college and got access to a T3 line. 🤤 My school had pretty robust security, but it was policy-based. Turns out, if you are on Linux and can't run the middleware, it would just go "oh you must be a printer, c'mon in!"

I crashed the entire network twice, so I fished a computer out of the trash in my parents' neighborhood, put Arch and rtorrent on it, and would just pipe my traffic via SSH to that machine. :p

Ah, and the short era of iTunes music sharing... Good memories.

FauxLiving@lemmy.world on 12 Aug 15:24 collapse

Yeah, my high school had a computer lab donated by Cisco to teach their CCNA course. There were like 2 students taking the class and 25 PCs, so we set up one to run WinMX, Kazaa and eDonkey.

They all had CD-RW drives. We were minting music and movie CDs (DivX-encoded SD movies were under 650 MB, so they would fit on a CD) and selling them on campus for $3-5. You could get 100 blank CD-Rs for around $40, so it was very profitable.

Korhaka@sopuli.xyz on 13 Aug 09:23 collapse

I forget that people still had dial-up in the mid-2000s, I always associate it with the '90s

Zykino@programming.dev on 12 Aug 06:12 next collapse

What command do you use to see the watts used?

ominousdiffusion@lemmy.world on 12 Aug 06:37 collapse

Powertop

lefixxx@lemmy.world on 12 Aug 07:42 next collapse

( ͡° ͜ʖ ͡°)

grinceur@programming.dev on 12 Aug 22:11 collapse

and don’t forget the calibration before use
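
Something like this, from memory (just a sketch; calibration cycles the display and devices, so save your work first):

    sudo apt install powertop
    # One-time calibration so the power estimates are more accurate
    sudo powertop --calibrate
    # Then run it and watch the discharge rate on the Overview tab
    sudo powertop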

llii@discuss.tchncs.de on 12 Aug 07:54 next collapse

Me with an older notebook that doesn’t support av1 decoding: 😭

gnuhaut@lemmy.ml on 12 Aug 14:16 next collapse

There’s a browser extension called “Your Codecs.” which can prevent YouTube from serving you AV1-encoded videos.

HiddenLayer555@lemmy.ml on 12 Aug 23:44 collapse

I wish there were more M.2 cards beyond just SSDs and wireless NICs. The idea of a small form factor PCIe interface is underutilized, and things like hardware codec accelerators can keep laptops with older processors usable with new standards for longer. It’s sad how PCMCIA had an entire ecosystem of expansion cards, yet we somehow decided that the much higher bandwidth M.2 is only for storage and networking. Hell, do what sound cards in the 90s/00s did and have M.2 SSDs specifically designed for upgrading older laptops that also have built-in accelerators for the latest media standards. Hardware acceleration is energy efficient and can probably just be bundled into the flash controller like it’s bundled into the processor, and unless you have a top of the line SSD you’re probably not saturating the M.2 interface anyway.

umbrella@lemmy.ml on 13 Aug 03:51 collapse

capitalism underutilizes tech and it's sad. we could be in 2085 already if we didn’t just waste time and materials on shit made to be thrown away in a few years.

Mwa@thelemmy.club on 12 Aug 10:07 next collapse

What cpu architecture is this?

lightrush@lemmy.ca on 12 Aug 14:32 collapse

x86_64

Mwa@thelemmy.club on 12 Aug 14:39 collapse

ngl I expected it to be ARM cause of the low power usage.

ozymandias117@lemmy.world on 12 Aug 22:50 collapse

AMD has been proving that x86_64 can be at least as power efficient as ARM over the last few years (given a floor of performance for like a phone/laptop… I doubt it can get as low power as a little ARM microcontroller)

It seems like x86 was getting so power hungry because of Intel’s laser focus on single core performance

umbrella@lemmy.ml on 13 Aug 04:33 next collapse

intel held the industry back big time for at least a decade.

Mwa@thelemmy.club on 13 Aug 08:14 next collapse

That's pretty cool actually, a balance between performance and power efficiency.

Korhaka@sopuli.xyz on 13 Aug 09:25 collapse

Are there any good mini PCs with AMD CPUs for low spec/power? I'm only really aware of Intel N150s

serenissi@lemmy.world on 12 Aug 11:06 next collapse

I’ve seen 10-12W easily on 4K for SoCs without AV1. Your SoC (Intel 11th gen) should support AV1. Try playing the video in mpv (with yt-dlp integration) with various hardware acceleration options to see if it changes. Probably your browser is software decoding.

Even for SoCs with supported hardware decoding, I noticed 2-3W of extra power usage when playing YouTube from the website compared to mpv or FreeTube. The website seems to be doing inefficient JS stuff, but I haven’t profiled it.
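
Roughly what I mean, as a sketch (assuming mpv and yt-dlp are installed; the URL is a placeholder):

    # Let mpv pick a safe hardware decoder (VA-API on Intel iGPUs)
    mpv --hwdec=auto-safe 'https://www.youtube.com/watch?v=VIDEO_ID'

    # Or force VA-API; press i during playback to see which codec
    # and decoder are actually in use
    mpv --hwdec=vaapi 'https://www.youtube.com/watch?v=VIDEO_ID'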

Mubelotix@jlai.lu on 13 Aug 14:06 collapse

AV1 will probably increase power usage. It’s made to reduce data consumption

serenissi@lemmy.world on 13 Aug 19:25 collapse

On mobile platforms nowadays power is more important than data. OTOH for servers, bandwidth is more important.

Mubelotix@jlai.lu on 14 Aug 10:11 collapse

Yeah. But who builds the apps? The guys running the servers or the final end users?

serenissi@lemmy.world on 14 Aug 11:23 collapse

ofc

fmstrat@lemmy.nowsci.com on 12 Aug 11:22 next collapse

Obligatory: “Use Debian instead of Ubuntu. It’s basically Ubuntu without Snap.”

pupbiru@aussie.zone on 12 Aug 11:45 next collapse

it was always wild to me back in the day when so many container images were based on ubuntu… was like PLEASE debian is functionally identical here at like 1/10th the base container size!

Grass@sh.itjust.works on 12 Aug 11:57 next collapse

I prefer “ubuntu without the bullshit”

lightrush@lemmy.ca on 12 Aug 13:53 next collapse

Mostly yes, but there are functional differences in convenience. For example, the standard upgrade process is completely manual. You have to disable third-party repos. You have to change the repos. You have to check if you have space. You have to remove obsolete packages. And more. On Ubuntu, the software update tool does all that, eliminating a lot of possibility for error. To an experienced user, the Debian process is fine. A novice would have plenty of opportunity for frustration and pain.

fmstrat@lemmy.nowsci.com on 12 Aug 22:17 collapse

What? Software Center is GNOME, not Ubuntu. Discover is KDE, not Ubuntu. Debian updates can be done the same way? I don’t do any of the things you mention. Using SC or just apt upgrade works just fine.

ozymandias117@lemmy.world on 12 Aug 22:47 collapse

They’re talking about a Debian 12 -> Debian 13 upgrade

On Debian, you get release notes on what commands to run.

Ubuntu has their own software update utility, separate from Software Center or Discover, that runs the commands for you
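
On the CLI side that's roughly (if I remember right, the GUI prompt drives the same tool):

    # Ubuntu's release upgrader; handles sources, removals and the upgrade itself
    sudo do-release-upgrade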

fmstrat@lemmy.nowsci.com on 13 Aug 14:01 collapse

Ahhh OK. I’ve always gone fresh for a full upgrade. But does apt dist-upgrade not work? That’s what the docs say to do.

ozymandias117@lemmy.world on 13 Aug 14:47 collapse

You have to at least modify your sources.list.d manually first. For most people, updating sources.list.d and running full-upgrade will probably work fine…

The full instructions are

  1. run dist-upgrade
  2. remove backports
  3. remove obsolete packages
  4. remove non-debian packages
  5. clean up old configuration files
  6. add non-free-firmware (this is 12 -> 13 specific)
  7. remove proposed updates
  8. disable pinning
  9. update sources.list.d to point to the next release
  10. apt upgrade --without-new-pkgs
  11. apt full-upgrade

It takes like an hour, but it’s still not “just press okay.”

Ubuntu’s updater has broken on some upgrades for friends and they had to do the whole Debian process manually, but it does try to automate the removals, disablements, and updating of sources

Edit: instructions taken from Trixie release. I skipped some that aren’t really unique, like make a backup

www.debian.org/releases/…/upgrading.en.html
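
As a very rough sketch of steps 9-11 for a 12 -> 13 upgrade (assuming plain bookworm entries in classic .list files; the release notes above are the real reference):

    # Point the package sources at trixie instead of bookworm
    sudo sed -i 's/bookworm/trixie/g' /etc/apt/sources.list /etc/apt/sources.list.d/*.list

    sudo apt update
    # Minimal upgrade first, then the full upgrade
    sudo apt upgrade --without-new-pkgs
    sudo apt full-upgrade

    # Afterwards, clean out packages that are no longer needed
    sudo apt autoremove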

fmstrat@lemmy.nowsci.com on 14 Aug 00:26 collapse

Ahh yea, that's not too great

deadcream@sopuli.xyz on 12 Aug 16:39 collapse

It has a much slower release cycle and an ancient kernel. For people with new hardware it’s not suitable.

Eggymatrix@sh.itjust.works on 12 Aug 19:37 next collapse

Unless you prototype in a CPU fab it does not matter; Debian 13 came out last week and its kernel is not that old

rocky1138@sh.itjust.works on 12 Aug 19:51 next collapse

Pop_os

fmstrat@lemmy.nowsci.com on 12 Aug 21:33 next collapse

This is why Backports exists. You can get any newer packages or kernels you need by enabling it.

And Ubuntu LTS doesn’t go much farther ahead than base Debian.
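
For example, enabling backports and pulling in a newer kernel looks roughly like this (a sketch; adjust the suite name to your release):

    # Enable the backports repo for the release you're on
    echo 'deb http://deb.debian.org/debian trixie-backports main contrib non-free-firmware' | \
      sudo tee /etc/apt/sources.list.d/backports.list
    sudo apt update

    # Install a specific package (e.g. a newer kernel) from backports
    sudo apt install -t trixie-backports linux-image-amd64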

seralth@lemmy.world on 13 Aug 05:34 next collapse

If you need to rely on backports to have day-to-day function of HARDWARE, then your OS is not suitable to your use case. Backport reliance should not be the norm for your avg user.

fmstrat@lemmy.nowsci.com on 13 Aug 13:59 collapse

I disagree, since this is why Backports was made. That being said, everyone is entitled to their opinion.

dropped_packet@lemmy.zip on 13 Aug 06:24 next collapse

At that point why not just run a rolling release? Debian's whole selling point is stability, which backports kinda ruins.

vandsjov@feddit.dk on 13 Aug 13:41 collapse

I would argue that backporting one package does not ruin everything. If you backport a lot of stuff, then I would agree that changing distro to something more up-to-date should be considered because of the increased potential for problems.

Mubelotix@jlai.lu on 13 Aug 14:02 collapse

A great way to brick your system and enter package versioning conflict hell

Magnum@lemmy.dbzer0.com on 13 Aug 07:30 collapse

Bullshit

ColeSloth@discuss.tchncs.de on 12 Aug 15:00 next collapse

Your entire backlight is only 3w? I feel like my phone is over 3w.

ChaoticNeutralCzech@feddit.org on 12 Aug 20:01 collapse

You can use the Wattz app to monitor current/power flowing into/out of the battery on some Android phones. Yes, 3 W is about the average in normal use. Unfortunately you cannot gauge the power consumption while charging unless you have a USB wattmeter too: the system only measures battery current because it’s required for battery capacity/percentages.

LaLuzDelSol@lemmy.world on 13 Aug 14:43 next collapse

That is very impressive! Although to be honest, I question the accuracy of all those estimated power draws. I would be interested to see an endurance test of your battery: assuming your battery capacity is accurate, your runtime on a full charge should line up with your power draw.

marcie@lemmy.ml on 14 Aug 01:26 collapse

My phone uses like 30 on idle