Liberalism is a right-wing ideology. Because America's political spectrum is shifted so far right, it's seen there as left wing, or at least as a counterpoint to conservatism, when the two are basically the same thing: deregulation, low taxes, and "free market" capitalist nonsense that's destroying the planet.
So from a left-wing perspective, "liberal" is a derogatory term. Most leftists adopt this more global understanding of the word, which is why even American leftists use it this way.
As for American right-wingers, they just use it as an insult. I doubt they could even explain who they mean by it.