Did the soldiers who fought in WW1 really believe that the war would end wars?

by Notthezodiackiller69

I've heard this phrase a lot - it's quoted in the song 'The Green Fields of France', for example - and as far as I know it was a contemporary phrase.

Did people of the time believe this to be literally true? And if so, why did they believe that?

Holy_Shit_HeckHounds

Here are two brief answers to similar questions: "The war to end all wars", written by u/scrap_iron_flotilla, which focuses largely on officers, and "Why did people think that World War 1 would end all wars?", written by u/DuxBelisarius, which links to related ideas and briefly discusses H.G. Wells's contributions to the phrase.