Definition of Wild West in English:

Wild West

proper noun

  • The western US during a lawless period in its early history. The Wild West was the last in a succession of frontiers formed as settlers moved gradually farther west. The frontier was officially declared closed in 1890.

Pronunciation:

Wild West

/ˌwīl(d) ˈwest/