What was the American reception to the formation of the German Empire?

by NonFanatic

I have checked to see if there have been any similar posts, or even any notable articles from a Google search, but couldn't really find anything. I know that at the time the U.S. was in the middle of Reconstruction in the South, and that Brazil, while a large landholder, didn't have nearly as large a population. My question is: what would their thoughts have been on another huge European power, considering the implications of the reach its neighbors had at the time?

nilhaus

German-American relations are a complicated subject that changed rapidly from the 1860s through the 1890s. Prior to the unification of Germany, the German states and the United States enjoyed generally friendly relations. Bismarck, the Prime Minister of Prussia and later the face of Germany to most Americans, viewed the United States favorably, and the sentiment was generally reciprocated.

Bismarck personally sympathized with the Confederacy during the American Civil War, but believed the Union would win and be justified in its victory. In 1878, after the formation of Imperial Germany, Bismarck remarked in a conversation with President Grant that he felt the German wars of unification were analogous to the American Civil War. Grant did not entirely agree, putting more emphasis on the destruction of slavery in the American Civil War. Bismarck's personal sympathy with the Confederacy, along with rumors of monarchism in the South during the war, led to a slight souring of American-German relations. For the record, Bismarck thought a monarchy in America would have been counterproductive, but this did not always make it into the press.

During the Franco-Prussian War itself, American and German attitudes remained friendly. The United States sent three generals, including Sherman, to assist with ceasefire negotiations on behalf of the German states, though the missions were largely unsuccessful.

After the war, relations between the two countries slowly soured due to competition. America and Germany were both late to the overseas imperialism game and butted heads over Pacific holdings, in particular Samoa and during the Spanish-American War. A series of tariff wars from the 1890s through the 1910s led to rather negative relations between America and Germany.

Edit: I just noticed you mentioned Brazil, and you may have been broader in your definition of 'American' than I interpreted. My reply only deals with the United States.

Sources:

Blackbourn, David, and Geoff Eley. The Peculiarities of German History. Oxford University Press, 1984.

Young, John Russell. Around the World with General Grant. New York: The American News Company, 1879.

Wehler, Hans-Ulrich. The German Empire 1871-1918. Berg Publishers, 1985.