| Definition |
| --- |
| the Wild West BrE /ðə ˌwaɪld ˈwest/ NAmE /ðə ˌwaɪld ˈwest/ noun [sing.] the western states of the US during the years when the first Europeans were settling there, used especially when you are referring to the fact that there was not much respect for the law there 荒野西部,西大荒(开拓时期,尤指尚无法制的美国西部) |