When did the USA start being seen by Europeans as an exceptionally rich country?

by EverVigilant

I'm thinking of those WWI German propaganda posters that depict the USA as being full of money. At what point in American history did America become "rich?"

intangible-tangerine

I would date it to the so-called Gilded Age, roughly the 1870s to the 1900s, when the US economy shifted from being primarily agricultural to being based on industry and manufacturing.

The acquisition of huge tracts of land suitable for farming in the decades leading up to the Civil War, combined with innovations in farming technology, meant that the US could produce enough cheap food to support the large populations of the major urban centres that supplied workers to fixed locations such as factories and mines.

There was also a drive toward reconstruction and development after the Civil War, such as the building of new cross-country railroad networks, which created internal demand for high productivity and so stimulated the US economy even at times when foreign investment was weak.