Wild West
English
Proper noun
- The western United States during the 19th-century era of settlement, commonly believed to be lawless and unruly.
- (by extension) A place or situation in which disorderly behavior prevails, especially due to a lack of regulatory oversight or an inadequate legal system.
  - The CEO commented that the Russian business environment of the 1990s was the Wild West.