Was originally going to post this in general, but after reading the response from an angry Canadian claiming Canadians aren't American... I thought it'd be better suited here. A little "just in case", so to speak.
I don't think there is much to debate about, but it's nice getting other opinions.
When someone hears the phrase "American", they automatically assume it means a citizen of the USA. American this, American that. Those bloody Americans!
But why is that?
Technically, the USA doesn't have a proper name as a country. It's just a group of states... that are united! Haha. Collectively, it's part of the Americas, which of course means North and South America. In North America we have Mexico, the USA and Canada. Technically, Canadians and Mexicans are American as well, but they're rarely addressed as such. The USA (United States of America) is a mass of land sandwiched between the other two that kinda "stole" the term "America".
In a way, it reminds me of the UK. When someone hears "UK", "United Kingdom" or "British Isles", they tend to think of England by default, even though the UK as a whole includes England, Wales, Scotland and Northern Ireland. The difference is, someone from England can be called either British or an Englishman. Those from the USA are "American", with not much else to choose from outside the informal "Yank".
How does it make you feel that the USA "owns" the term "America"? If you could give the USA a proper country name, what would it be?
I can't really think of another country whose name is so closely linked to its continent, aside from Australia.
This thread is about the American label!