
Dreadful6644

@[email protected]


NVIDIA dynamic boost wattage

Hi, I have an Asus Vivobook Pro 16X OLED (M7600QC) Optimus laptop with a dedicated Nvidia RTX 3050 Max-Q rated at 50 W. When I try to use the dedicated Nvidia GPU on Linux, the wattage is capped at 35 W. If I enable the “nvidia-powerd” daemon, it goes up to ~40 W. Is there a way to raise the limit to the rated 50 W, like on Windows?...

Dreadful6644 ,

nvidia-smi -pl POWER

This will probably not work with recent driver versions, but might work with 525.
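A sketch of how this would look in practice, assuming the driver exposes a configurable limit on this GPU (many laptop GPUs report "power limit not supported" and the command simply fails):

```shell
# Query the current, default, and maximum power limits the driver reports:
nvidia-smi -q -d POWER

# Try raising the limit to the rated 50 W (requires root):
sudo nvidia-smi -pl 50
```

If the query shows "N/A" for the enforced power limit, the `-pl` route is closed on that driver version and only nvidia-powerd's dynamic boost applies.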

Dreadful6644 ,

Also, if the CPU draws too much power, you will not get the full dynamic boost. Try limiting your CPU wattage. The dynamic boost is up to 15 W, depending on how much power the rest of your system uses.
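On a Ryzen laptop like this one, one way to cap CPU package power is the third-party ryzenadj tool; a minimal sketch, assuming your APU is supported (the limit values, in milliwatts, are illustrative, not tuned recommendations):

```shell
# Cap sustained (stapm/slow) and burst (fast) CPU package power so more
# of the shared thermal/power budget is left for the dGPU's dynamic boost:
sudo ryzenadj --stapm-limit=25000 --fast-limit=30000 --slow-limit=25000

# Verify the applied limits:
sudo ryzenadj --info
```

Note that firmware usually resets these limits after a suspend/resume cycle, so you may want to reapply them from a systemd sleep hook.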

Dreadful6644 ,

Especially on laptops, where VRAM is halved compared to the desktop models for whatever reason.

Dreadful6644 ,

I thought compression would not help much with disk space either. I believe it depends on the use case: after switching to btrfs and enabling zstd compression, my Arch install went from 100 GB to 60 GB of used disk space. Most of the savings came from the documentation shipped with development packages.
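For anyone wanting to try this, a sketch of the usual setup (the mount point and level-3 zstd are examples, not the only options):

```shell
# fstab entry enabling transparent zstd compression for new writes:
# UUID=...  /  btrfs  compress=zstd:3,defaults  0 0

# Recompress files that existed before compression was enabled:
sudo btrfs filesystem defragment -r -czstd /usr

# Measure actual on-disk savings (needs the compsize package):
sudo compsize /usr
```

compsize prints the compressed vs. uncompressed size per algorithm, which is the easiest way to check whether your workload actually benefits.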

Dreadful6644 ,

If your dGPU supports rtd3 power management, it should (almost) completely power off when not in use. For me the battery life difference is huge: something like 2 hr with the GPU on vs 10 hr with it off, which is very noticeable.
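You can check whether RTD3 is actually kicking in via sysfs; a sketch, assuming the dGPU sits at the common laptop address 0000:01:00.0 (check yours with `lspci`):

```shell
# "suspended" here means the dGPU is powered down via runtime PM:
cat /sys/bus/pci/devices/0000:01:00.0/power/runtime_status

# The proprietary Nvidia driver also reports its RTD3 state:
cat /proc/driver/nvidia/gpus/0000:01:00.0/power
```

If it never reaches "suspended", something (often an X server or a monitoring tool polling nvidia-smi) is holding the GPU awake.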

Dreadful6644 ,

Well, most (if not all) FreeSync monitors flicker when the frame rate changes abruptly. You are probably jumping from 48 Hz to 120 Hz or similar. On KDE Plasma you can enable FreeSync only for fullscreen applications, if that works for you.

thank you Linux for giving a damn about Bluetooth headphones (feddit.de)

For context, LDAC is one of the few wireless audio codecs stamped Hi-Res by the Japan Audio Society, and its encoder has been open source since Android 8, so you can see just how long Windows has been sleeping on this. I’m excited about the upcoming next gen called LC3plus; my next pair is definitely gonna have that.

Dreadful6644 ,

Any way to see which bitrate is currently being used? I know you can set it to use only 909kbps, 606kbps or 303kbps in the wireplumber config, but I am curious which bitrate the adaptive mode (usually) uses.
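I haven't found a way to read the live adaptive bitrate either, but for reference, pinning the quality instead of using adaptive mode is done with a WirePlumber config fragment; a sketch for WirePlumber 0.4 (file path and quality value are examples, and `bluez5.a2dp.ldac.quality` is the relevant property):

```
-- ~/.config/wireplumber/bluetooth.lua.d/51-ldac.lua
-- "hq" / "sq" / "mq" pin the high/standard/mobile quality tiers;
-- "auto" restores adaptive bitrate.
bluez_monitor.properties["bluez5.a2dp.ldac.quality"] = "hq"
```

After editing, restart WirePlumber and reconnect the headphones for the setting to take effect.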

Dreadful6644 ,

On my headphones, you can either use LDAC with one device or SBC/AAC with two devices. I can only change it via the app. Is there a similar setting for you?

What modern (gaming) laptops should be avoided for proprietary firmware or whitelists/gate keeping? Also posted Linux GPU telemetry data from Stable Diffusion

I’m looking for a machine to run OpenGPT, Stable Diffusion, and Blender. I’m on the precipice of buying an Alienware w/ Ryzen 9 with a Radeon RX6850m. I’ve never needed anything near this level on Linux and I’m scared TBH. I’d much rather get a System76, but the equivalent hw has Nvidia and costs more than twice as...

Dreadful6644 ,

I have a Dell G15 5525 with an RTX 3060. No issues besides the general Nvidia driver quirks; you can check the Arch Wiki.
