Books that detail the rise of neoliberalism in the US?

by blackluck64

Not sure how to ask this question without sounding overtly political, but I'm looking for a book that details the rise of neoliberalism in the US and the ascendancy of corporate power in US politics. Goliath: The 100-Year War Between Monopoly Power and Democracy comes close, but from reading book reviews it sounds like the author slips more into advocacy than detailing history, and I'm looking more for the latter.

(I don't think this necessarily violates the rules of this sub since it's not just about current politics, but apologies in advance if it does. Thanks.)

gent2012

If you want to focus specifically on neoliberalism in the United States, there are a few good options:

  • Judith Stein, Pivotal Decade: How the United States Traded Factories for Finance in the Seventies

  • Kim Phillips-Fein, Fear City: New York's Fiscal Crisis and the Rise of Austerity Politics

  • Angus Burgin, The Great Persuasion: Reinventing Free Markets since the Depression

  • There's also a newer book that has so far received positive reviews: Binyamin Appelbaum, The Economists' Hour: False Prophets, Free Markets, and the Fracture of Society

I'd also suggest looking at neoliberalism within a global context. For that, the following books are a good place to start:

  • Quinn Slobodian, Globalists: The End of Empire and the Birth of Neoliberalism

  • Daniel Stedman Jones, Masters of the Universe: Hayek, Friedman, and the Birth of Neoliberal Politics

  • Janek Wasserman, The Marginal Revolutionaries: How Austrian Economists Fought the War of Ideas

Hopefully one (or all) of these books piques your interest!