I feel that what hua said is correct, but an overly broad generalisation. Traditionally, the Western countries are the European biggies: France, Italy, the UK, Spain, and Portugal, along with the countries they colonized in the so-called western hemisphere (don't ask me where it starts now, I don't know). However, I believe that definition has since changed to include all the countries in the western hemisphere (excluding most, if not all, African countries). Yet the term "Western" is too often used here and elsewhere to designate countries of mainly Latin or Anglo-Saxon heritage, which brings us back to the biggies above, plus the States, Canada, Mexico, and the South American countries conquered and colonized by the Spanish and Portuguese.

The political situation of most of the aforementioned countries since World War I has won them a designation of their own. Russia is a country of the former USSR, Turkey is a Middle Eastern country, Singapore is and always was an Asian state, and South Africa is perceived (even if it's not true) as a third-world country. The third world is another interesting concept in its own right.

I would place Bolivia in the Western lot myself, although I've never really given it any thought, nor have I ever seen Jamaica as anything other than what it is: Jamaica.

I would ask you the same question you asked me: how about Australia? Do you see it as a Western country?