west-coast

West Coast

noun
the western coast of the U.S., bordering the Pacific Ocean and comprising the coastal areas of California, Oregon, and Washington.

West-Coast, adjective
Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2014.