West Coast

noun
1. the western coast of the U.S., bordering the Pacific Ocean and comprising the coastal areas of California, Oregon, and Washington.
Related forms
West-Coast, adjective
Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2014.