The Online Encyclopedia and Dictionary
West


West is most commonly a noun, adjective, or adverb indicating direction or geography.

West is the direction in which the sun sets at the equinoxes. It is one of the four cardinal points of the compass, opposite East and at right angles to North and South.

Moving continuously west means following a circle of latitude, which, except in the case of the equator, is not a great circle.
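The shrinking of circles of latitude away from the equator can be sketched numerically. The following is a minimal illustration, assuming a spherical Earth with a mean radius of 6371 km (an approximation; the constant and function name are chosen here for illustration):

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed mean spherical radius of the Earth

def latitude_circle_circumference(lat_deg: float) -> float:
    """Circumference in km of the circle of latitude at lat_deg degrees.

    Only the equator (lat_deg = 0) is a great circle; every other
    circle of latitude is smaller by a factor of cos(latitude).
    """
    return 2 * math.pi * EARTH_RADIUS_KM * math.cos(math.radians(lat_deg))

# At 60 degrees latitude the circle is only half the equator's length,
# so "travelling west around the world" there covers half the distance.
```

At the equator the path west is a great circle of roughly 40,000 km; at 60° latitude the same westward circuit is about half that length.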

In astronomy, several comets are named after discoverers named West; the most famous is Comet West.


"The West" also often refers to Western countries. When used in this sense, it could mean anything from NATO, Europe and North America with or without Japan to all of Judeo-Christian civilisation. See also: Occident

"West" can also refer to the American West, a portion of the United States, especially during the end of the 19th century. It is also called the "Old West" or "Wild West" (see American Old West). See also: Western movie

In pre-modern China, the West indicated the land of the Buddha and the place from which the sacred Buddhist scriptures came (cf. Journey to the West).

In Mumbai, "West" refers to the part of a suburb lying to the west of the railway line. (See Mumbai suburban railway)


West is also the surname of various people.

Last updated: 05-17-2005 23:36:46