Definition of Western States in English:

Western States

noun

  • The more westerly states of the United States of America.

Origin

Late 18th century.