Why, in the early 1900s, was much of Europe colonizing and expanding its empires, while the United States wasn't?

by chuckyjc05

Obviously the British and the French had extensive colonies throughout the world, but even Germany and Italy, which were less than 100 years old, had fairly large colonies extending into Africa and Asia.

Why does it seem like the United States didn't have an interest in expanding this way? Or did they?

bug-hunter

The US took Puerto Rico and the Philippines from Spain, was still settling the West, overthrew governments throughout Central America (the Banana Wars), and essentially set up Cuba's government to be friendly to American business.

So they were in on it as well.

Oliebonk

When the traditional colonial powers divided or fought over areas of interest, the US either did not exist yet, was still expanding within North America, or was at war over territory with Native Americans, itself, the British, the French, or Mexico.

By the early 1900s there wasn't much left to divide. Only the colonies of the crumbling Spanish Empire were within reach. After winning the war against Spain, the US took over parts of its colonial possessions.

Germany organized the Berlin Conference of 1884-1885, where Africa was divided among the European powers. For Chancellor Bismarck that was a victory for the young German state, because Germany took the position of honest broker between quarreling European powers. The Italians participated in the conference as well, so both countries were just in time to take their share of colonies. The Portuguese, Spanish, Dutch, French and British had already established their spheres of influence in the centuries before.

Edit: The Americans had an interest in opposing the colonial status quo, because there was nothing left for the US to take. I do not think they were against having colonies as such. After WW2 the US actively discouraged European nations from trying to regain control of their lost colonies.