Conversations Ongoing: What is the West?

Definitions for the West:

"West is most commonly a noun, adjective, or adverb indicating direction or geography."

"the countries of (originally) Europe and (now including) North America and South America"

"the region of the United States lying to the west of the Mississippi River"

"a location in the western part of a country, region, or city"

"the West originated in the northern and eastern Mediterranean with ancient Greece and ancient Rome. Over time, their associated empires grew first to the east and south, conquering and absorbing many older great civilizations; later, they grew to the north and west to include Western Europe."

"The exact scope of the West is somewhat subjective in nature, depending on whether cultural, economic, spiritual or political criteria are employed."

With regard to the American West, there is really no consensus on a place, a region, a state of mind, or an exact end of the West. Some place the end of the American West at the defeat of the Populist Party in 1908. But what really fueled the ideals of the West? What drove the people of the West to do what they did and expand across the American continent the way they did? Are we still doing so today, spreading across the world? What has changed about us? Has anything changed?

If the ideas of modernity, scientific progress, and political, social, and economic liberalism fueled the western expansion of early America, who is to say that it is over? I would venture that belief in the three main metanarratives of our culture (scientific progress, modernity, and the national story) drove, and continues to drive, our Western outlook. Absolute belief in these things drives our actions and our beliefs about ourselves, just as it has before and will continue to do indefinitely. My definition of the West would include this notion.
