Re-encoding video will always put a lot of strain on your computer, regardless of whether it’s Handbrake or some other tool doing the conversion. With that said, it also depends a lot on the settings you use to transcode.
Can you share what video settings you tried in Handbrake?
Ok, so two things stand out.
First, you’re using placebo as your encoder preset. This is slow, stupidly slow.
Finding comparisons between the different presets that show just how slow it is was harder than expected; most people don’t even test this setting. Luckily I found one comparison here anyway. The first graph (red line) shows how many frames per second were achieved on average. The fastest preset they tested, veryfast, is 68 times faster than placebo.
For reasonable values I’d use medium or slow, which are still 20 or 10 times faster respectively, with minuscule quality differences. Also see the FFmpeg FAQ on why placebo is stupid.
Now for the second thing. I get why you put 0 as the CRF, but that’s not a good idea. You’ll most likely end up with a bigger file than the one you started with.
The Blu-ray itself does not contain enough detail to actually need such a low CRF. 17 or 18 is visually lossless, as in you won’t be able to tell the difference with your eyes. For my encodes I use 20 most of the time, as it’s still more than good enough. If you want a smaller file size, reasonable values go up to ~28.
TLDR: use slow with CRF 20 as a starting point.
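If you ever want to reproduce the same settings outside Handbrake, the equivalent ffmpeg call would look roughly like this (the file names are just placeholders):

```shell
# Re-encode the video track to H.264 with the slow preset and CRF 20;
# the audio is copied untouched. input.mkv/output.mkv are example names.
ffmpeg -i input.mkv -c:v libx264 -preset slow -crf 20 -c:a copy output.mkv
```

Handbrake uses the same x264 encoder under the hood, so the preset and CRF values carry over one to one.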
I hope some of this made sense to you, and sorry if it comes across as too aggressive.
Still, hope this helps you get what you want.
Edit: One more thing I thought about. You can use the hardware encoder in your GPU if you want. However, that will come with worse quality and bigger files than encoding on the CPU. Still something you might want to look at, just to compare.
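For comparison, the hardware path in ffmpeg would be something like the following (assuming an NVIDIA card; the flags differ for Intel QSV or AMD):

```shell
# NVENC hardware encode for comparison; -cq is NVENC's constant-quality
# knob, which is not identical to x264's CRF scale, so expect different
# sizes at the same number. File names are placeholders.
ffmpeg -i input.mkv -c:v h264_nvenc -preset slow -cq 20 -c:a copy output.mkv
```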
I…uh…feel like a dumbass now. I didn’t know a lot of this. When I first started using Handbrake, a lot of articles I read suggested using Placebo / CRF 20 if I wanted no loss in quality. I also do this when I rip DVDs, and the file sizes and everything have always been perfectly reasonable. Though I suppose maybe that’s because DVDs are typically 480p max, which naturally means much smaller file sizes than Blu-rays hold.
I just don’t want a loss in quality, that’s all. :(
Also,
TLDR: use slow with CRF 20 as a starting point.
Thanks. You didn’t come across as aggressive to me. :) I appreciate the information. I’m no noob, as I’ve been ripping DVDs for years using Handbrake, but I am very much a beginner when it comes to ripping Blu-rays, which seem to be a slightly different beast than the former, so I’m glad that everyone is so willing to share tips. :)
To get it out of the way first: There are no financial issues. There are more than enough funds to continue operations as they are for a sufficiently long time.
What is actually happening is that a long-time sponsor has indicated that they (understandably) no longer want to foot the huge bill of hosting the entire archive of binary caches ($9000/mo). Finding a more sustainable setup is what the community is currently concerned with.
There is no risk of operations shutting down any time soon; the NixOS Foundation has funds set aside to continue even this unsustainable setup for at least a year. We just want to be more efficient with our own and others’ resources going forward.
That’s what all the news you might have heard is about.
Btw, even if the binary cache were to go poof, we don’t technically need it. NixOS is a source-based distro like Gentoo, and source hosting is not a concern. The binary cache is immensely helpful though, which is why we’d obviously prefer to keep it.
You aren’t a reputable public hoster with AWS-class uptime. That has a price too. AWS is likely overpriced though, hence the Nix community still looking for better alternatives.
The first Linux servers I installed ran Red Hat 4.2. I stuck with RH until 8.0. Then they stabbed us all in the back by starting to charge for it.
Have you RH users been fooled twice?
I switched to the distro that was then (and still is?) the most strict in its commitment to FOSS - heck, they forked Firefox just because of the logo copyright - Debian.
(RH to Kubuntu at home, because Debian was then (is?) too “enterprise” for home use, and I wanted to stick with the same packaging.)
The only other distro I’ve been using is SUSE (SLES), because that’s what SAP supports for HANA database servers.
SUSE should gradually morph the RH fork into SLES, and always provide an easy, automated way to migrate: a one-way route out of RH.
Some apps automatically pick up your theme, some don’t. For those, I give the specific app access to my theme folder, with a :ro at the end of the path so it’s read-only.
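For reference, granting that read-only access looks something like this (the app ID and theme path here are just examples; use your actual app’s ID):

```shell
# Give one Flatpak app read-only (:ro) access to the user theme folder.
# org.example.App is a placeholder application ID.
flatpak override --user --filesystem=$HOME/.themes:ro org.example.App
```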
IDEs should work ootb. If some extension doesn’t work, it may be due to poor support for Flatpak. 9/10 times you’ll find the issue is that the app is calling the traditional /usr/bin path etc., while Flatpak installations use different paths.
Only one thing: never give up. You’ll get things fixed by copy and paste until one day you’ll have a broken system and think, “wait, I actually know how to fix this, because I’ve been through it five times before.”
Who needs Timeshift when you can just slowly break your system while trying to fix a bug, and then either reinstall the OS or switch to a different distro, because might as well.
AppImages are nice. I just wish there were some hub for them that also let AppImages be updated through it. I know about Zap, for example, but it’s not up to par with Flathub or the Snap Store.