Use the term 'America' to signify just the United States

geography, meaning, meaning-in-context

I write legal marketing materials. Does the term 'America' signify Canada, the USA, Mexico, etc. to readers abroad, or will they know that I'm talking specifically about the USA?

Best Answer

It depends on whom you are writing for. In Europe, Asia, and Oceania, generally speaking, yes: 'America' denotes the USA unless otherwise qualified. However, in North and South America that is not the case, and in fact it can come across as rude to assume so. In those regions a more specific term would be appropriate.

In Britain and Australia, the Wall Street Journal is an American newspaper, but in Canada it is a US newspaper.

EDIT: I should add that it is always correct to say 'US'. The Wall Street Journal is also a US newspaper in Britain and Australia, so 'US' is a safe default.
