
Wild West

English

Proper noun

Wild West

  1. The western United States during the 19th-century era of settlement, commonly believed to be lawless and unruly.
  2. (by extension) A place or situation in which disorderly behavior prevails, especially due to a lack of regulatory oversight or an inadequate legal system.
    The CEO commented that the Russian business environment of the 1990s was the Wild West.
