I've been playing a lot of games lately and watching shows that feature the "Wild West" era that emerged in the Western United States after the Civil War. I know from history classes that it was partly romanticized, but to what extent? Did this "Wild West" actually have some basis in truth (e.g., cowboys, bank-robbing gangs, etc.), or was it almost all just a romanticization after the fact? If it was mostly romanticized, where did this popular idea of the "Wild West" come from?
I have written in the past about the origins and popularization of the myth of the Frontier through Buffalo Bill's Wild West show. You will find that most of it wasn't romanticized after the fact, but was actually romanticized while it was still happening! I am happy to answer any specific follow-up questions you might have.