"Dances With Wolves" and "Open Range" are great movies. What other films out there best depict the real wild west?
Unforgiven comes to mind, as does There Will Be Blood, which doesn't really depict the wild west per se, but rather its downfall.