Definition of Wild West in English:

Wild West

proper noun

  • The western regions of the US in the 19th century, when they were lawless frontier districts. The Wild West was the last of a succession of frontiers formed as settlers moved gradually further west.
