Definition of Wild West in US English:

Wild West

proper noun

  • The western US in a time of lawlessness in its early history. The Wild West was the last of a succession of frontiers formed as settlers moved gradually further west. The frontier was officially declared closed in 1890.


Wild West

/ˈˌwaɪl(d) ˈwɛst/