West Coast of the United States


In general, the term "West Coast" is a nickname for the coastal states of the Western United States, comprising California, Oregon and Washington, and sometimes Alaska and Hawaii (see Pacific States). The West Coast is a portion of the West.

It has also come to be called "The Coast", especially by New Yorkers, or the "Left Coast", a pun based on its left-hand position on a map of the United States as well as its reputation for being more socially liberal, or left-wing, than the East Coast or the Midwest.

The term has also been adopted by rap music performers to refer to a particular school of artists, such as Tupac Shakur and Dr. Dre. The East Coast/West Coast rivalry led to violence and much rhetoric, which had largely subsided by the beginning of the 21st century.

See also: Geography of the Western United States, List of regions of the United States
