When did Universities become seen as being liberal places?

by J_Mnemonic

At what point did universities become seen as places where 'left-wing' or liberal attitudes were welcomed and embraced?

TyroneFreeman

I'd say it's been the case since universities were first established that they became places from which to challenge the orthodoxy. Being exposed to new and alien ideas, in some cases superior to those of the environment in which one lived, would make academics seek to propagate those ideas throughout their surroundings.

One of the best examples of this was the development of modern civil law. After Rome fell, each area's laws devolved into an arcane mixture of Germanic and Roman law. Universities, especially that of Bologna, came into contact with the Corpus Iuris Civilis and reintroduced actual Roman law into Western Europe.

Another, more recent example would be the revolutionary upheavals of the nineteenth century, with particular emphasis on Germany, as that's what I know best. The Kingdom of Hanover initially granted a constitution which was later revoked; seven professors at the University of Göttingen, including the Brothers Grimm, protested and were dismissed. Later on, during the Revolutions of 1848, Pan-German nationalism and democracy found their greatest support in universities.

Arluza

A question in a similar vein: is there a tradition in most cultures of the younger crowd (college-aged, so call it 18-30 or so) being more progressive and liberal?

sarcasticorange

Can you clarify your use of the terms "left-wing" and "liberal"? These terms have different meanings (or even no meaning at all) depending on the time frame and the political culture of the country in question.