I’m sorry, but if you see a 25% difference in a benchmark, that means your methodology is somehow flawed. A few percentage points in either direction would be believable, but a difference this large would be so comical if true that extra wariness is needed.
There are a few things that look a bit off to me, but most importantly it seems like your OBS settings are wildly different between systems. It’s a bit hard to make out, but it looks like you’re doing CPU-based encoding on Linux and GPU-based encoding on Windows.