The Western world, also known as the West and the Occident (from Latin occidens, "sunset, West"; as contrasted with the Orient), is a term referring to different nations depending on the context. There is no agreed-upon definition of what all these nations have in common.
Though the term originally had a literal geographic meaning, contrasting Europe with the cultures of the Orient or Asia, today the West does not denote a geographic location, as most of Europe and Oceania, major components of the West, lie in the Eastern Hemisphere.
The concept of the Western part of the world has its roots in the Greco-Roman civilization of Europe and the advent of Christianity. In the modern era, Western culture has been heavily influenced by the traditions of the Renaissance, the Protestant Reformation, and the Enlightenment, and shaped by the expansive colonialism of the 16th to 20th centuries. Its political usage was temporarily informed by a mutual antagonism with the Soviet bloc during the Cold War in the mid-to-late 20th century (1945–1991).