Just curious, as the word seems to have an entirely different meaning in the US than in Europe: what does it mean to be, or to call someone, a liberal in the US?
It often seems to be used as a label for someone who is pro-social justice and pro-government, which is far from the position a liberal would traditionally advocate. An American libertarian, and probably many who consider themselves right-wing conservatives, could fall under the liberal umbrella in its historic definition.
When did being a liberal become such a bad thing?