west

/west/

Definitions

1. noun

the direction toward the point of the horizon where the sun sets, opposite east; also, the region lying in this direction, especially the region of the United States west of the Mississippi River.

“The family moved out West for a more relaxed lifestyle.”

2. noun

the western part of something.

“The west of the building was closed due to renovations.”

3. adverb

in or towards the west.

“We drove west toward the setting sun.”

4. verb

to move or turn toward the west (archaic or literary).

“She wested through the mountains to get to the other side.”

Synonyms

  • occident
  • sunset
  • western

Antonyms

  • east
  • eastern