While the United States' expansion across North America and later the Pacific was certainly imperialistic, I was wondering why the US did not establish itself as a global empire in a fashion resembling France or Great Britain, rather than controlling its neighbors through semi-puppet governments.
This is not at all my field of expertise, but since no one else has answered, I'll do my best with the limited knowledge I remember from American history classes.
America has an incredibly isolationist history. After a bloody revolution and watching decades of continent-embroiling wars in Europe, we very much created an identity of staying out of foreign affairs. What did we need to get involved in the rest of the world for? America was huge. We had all the land in the world. We were an agrarian nation with very little footprint on the world's trade network. (This will obviously change towards the middle and late 19th century.)
With ample room to expand, abundant natural resources, and a steady flow of immigrants from Europe bringing tales of the hardships of foreign wars, America developed a strict isolationist policy of wanting next to nothing to do with the rest of the world. Even getting involved in WW1 decades later was a hard sell.
This started to change as the nation developed more of a vested interest in trade, and we pretty much forced the start of the Spanish-American War in an eager attempt to flex our young military muscles, but the country in general was still highly isolationist. Acquiring the Philippines as a colony was incredibly unpopular at home. Why should we, a nation born of a revolution against an imperial government, become an empire ourselves? When the Philippines rose up in armed revolution for their sovereignty, we crushed it with force in kind. You can imagine why such behavior did not sit well with the American people.
I recall some senator exclaiming after the event, "You wanted an empire, we got an empire, and now we're acting like an empire."
In the 20th century, though, all bets are off, and controlling smaller countries has become pretty much a national pastime. The history of American isolationism is very interesting, as it was a core part of our national identity for over a century. Hopefully someone more knowledgeable can add more, but I didn't want this to go totally unanswered.