Left-wing can be relative to the nation. The most meaningful faction of the American "left" is the Democratic Party. The global definition is based on the socialist vs. capitalist ideological split, under which all forms of ideological liberalism are right-wing or, in the case of social democracy, arguably what "centrist" actually means.
The Democratic Party is therefore left-wing internally and right-wing globally, hence people saying "America does not have a left wing."
Leftists are socialists. It is not relative. Democrats are not leftists. Bernie Sanders is, as a democratic socialist. You are not "an American leftist" because you like billionaires existing but don't want to genocide brown and gay people. That's just liberalism as it is supposed to be.
This is also why a leftist would deny that "liberalism" is left-wing. Liberalism is a broad ideological category, so it can be assumed to use the global standard; America does not have sole claim to defining it. So American liberals are the American left, but liberalism itself is right-wing.
This really isn't that complicated if you know the basic meanings of the words in question, which is why liberals find it so confusing. Liberalism is the status quo position of the American electorate, and moving beyond it requires education, while going along with binary party politics does not.
Edit: this was supposed to be a response to the first comment instead of me telling OP things they already know
Why did their comments on liberals offend you so much? lol. American liberals do tend to be quite uninformed on basic political science, just like most Americans are
Also...
Projection (n.): a psychological defense mechanism where a person unconsciously or consciously attributes their own thoughts, feelings, or behaviors onto others