Definition of Wild West in English:
The western US during the period of lawlessness in its early history. The Wild West was the last of a succession of frontiers formed as settlers moved gradually farther west. The frontier was officially declared closed in 1890.
Wild West /ˌwīl(d) ˈwest/