At what time did the public learn about death camps in WWII?

by CameronCraig88

I can't seem to find a concise answer on this. I was taught (the Florida education system isn't the best) that soldiers and the public didn't know about the death camps until they were liberated. I watched the movie Judgment at Nuremberg, and one of its main plot points is the widow of a Nazi general trying to convince one of the judges that the German people did not know. This matches what I was taught in school.

But last night I watched The Great Dictator from 1940, and in that movie there are newspapers that flash on screen giving updates on the war, and it seems very clear people knew what was happening to Jewish communities. If most of the death camps weren't liberated until 1945, did the world know about them for five years before liberation?

Somewhat related question:

I've seen the question of whether WWII would have had the same outcome if America hadn't gotten involved, but what I want to know is whether the tide of battle had already turned by the time America entered the war. I was taught that the Axis's biggest mistake was the attack on Pearl Harbor, which brought America into the war, but from my reading of history as I got older, I get the impression America wasn't the push that got the war over the hill.

I was taught that the assault on Normandy was the biggest factor in securing victory in the war, but if I'm not mistaken, the Red Army launched an even greater assault on the Eastern Front that was much more impactful.

Can someone clear all this up for me? I keep finding conflicting answers.

mikedash

There is always more to say, but one section of our FAQ offers a number of links to earlier threads that address your question. You might like to review some of these while you wait for fresh responses to this query:

Were people aware of the Holocaust during the war?