ˌWild ˈWest, the — the western US, where many European settlers moved during the 19th century to establish new farms and new cities. In films it is often shown as a place where cowboys and Indians (=Native Americans) fight each other, and where cowboys use guns rather than the law to settle arguments. A situation where there are no laws or controls is sometimes described as being ‘like the Wild West’.
[TahlilGaran] Dictionary of Contemporary English