Not sure if this is the right sub for this, I apologize if I’m in the wrong place. I’m trying to find what a typical medical student’s class load looked like, as in what each year might have studied. For example: when did they begin to do labs and watch in an operating theater — right away in freshman year, or later?
I’ll take basically any other information I can get my hands on as well: classroom settings, dormitories, student life, teaching styles, etc. I’m specifically researching Canadian schools, but honestly American is fine as well. Europe was so far ahead that it won’t really be too helpful, though!
So far, Google hasn’t been too helpful. UNMC has a very short page about its curriculum in 1900, not particularly full of information. I’ve been able to locate a few photos of what a campus looked like in Toronto and read the entire history of the University of Toronto.
Thanks in advance if you’re able to help!
EDIT: I am happy for information presented in any media type - online articles, books, documentaries, fictional movies, anything! The Knick TV show is the closest I’ve found so far haha
Interesting questions!
It’s quite likely that the state of Canadian medical education in 1900 looked a lot like that in America: there were some elite medical schools, associated with universities, that sought a more rigorous curriculum and credentialing standards, and then, kind of, well, everyone else — a category with enormous variation in terms of entry requirements and curriculum.
In America, for example, there were very fancy, very white, very well-known medical schools at major universities — Yale, Johns Hopkins, Michigan, Harvard, UPenn — that drew on innovations in French and German medicine and were deeply connected to medical professional organizations like the American Medical Association. Their goal was to professionalize medicine and establish it as a scholarly discipline, with requirements for matriculation and graduation, its own scholarly journals, and its own academic departments within universities. At these schools, the medical curriculum was four years (nine-month terms), students had to have a year or two of college before entry (limiting the race and class backgrounds of incoming medical students), and all instruction was designed around work in laboratories and clinics. The goal was also to have faculty who were full-time academic researchers and teachers, not physicians who taught on the side [1].
Outside this circle were a host of smaller schools, many for-profit and some not, that trained far fewer students, had far fewer laboratory and clinical facilities (some for-profit schools had none at all), and by the turn of the century were struggling to keep up with the university medical schools and the professionalizing reforms they instituted. These smaller, struggling schools tended to be in rural areas and/or were black medical schools, and often trained working-class men as doctors — their demise further cemented medical education in the hands of elite, white institutions [1, 2].
But based on your question it sounds like you’re more interested in university medical schools. So, what was life like there, for a typical student at a place like Yale, or Hopkins? (My examples are American because American sources are what I can get my hands on right now, but I strongly suspect the University of Toronto and McGill medical schools would have been set on similar lines, and provincial medical colleges would be trying to catch up where they could.)
Anyways: at a 1901 address to graduates of Yale Medical School, William Welch—the first dean of Johns Hopkins Medical School, which was established in 1893—gives us an overview of the history of Yale Medical School and a couple choice hints: “The inadequacy of the system of didactic lectures for the training of medical students was nowhere in this country earlier recognized than here. In 1855 the course was supplemented by daily recitations… until they in combination with laboratory practice became, at least as early as 1867, a distinctive, and certainly a valuable, feature of the school. In 1879 the Yale medical department placed itself in the front rank, as regards to its standards, with only a few companions at that time, by introducing a stated matriculation examination and a three years’ graded course, lengthened in 1896 to four years. Clinical instruction and the recitation and laboratory plan of teaching, continued to be the basis of the course” [3].
He goes on to note that in 1893, Yale built a new laboratory building, and in 1901, was in the process of building a new “clinical building”— perhaps a clinic that students could work in.
So a Yale student in 1901 would have to take an exam to enter. Before this, he would have taken some college courses, though not enough to constitute an undergraduate degree. (Elsewhere in this address Welch also mentions the importance of “chemistry, physics, and general biology,” and approvingly cites the “Sheffield Scientific School, which in 1870 offered well-planned courses in these branches of science, announced as intended especially for the preliminary training of prospective medical students” [3].) Pre-med requirements were becoming more common by 1900, and in 1912, the AMA issued new guidance that medical schools that did not require pre-med courses would receive lesser rankings from the AMA.
Once he got to Yale, he would be there for four years. (“He” because Yale Medical School didn’t accept women until 1916.) Lectures were supplemented by “recitations” (I’m actually not 100% sure what this means — my guess is small-group sessions with faculty) and laboratory work, in a spanking-new laboratory where he could immediately learn science by, you know, doing it. Experiential, scientific learning was the cutting edge of science in 1900, and the elite faculty of US medical schools contended that medical training should be an exercise in learning how to apply the scientific method rather than regurgitating facts from lectures [4]. His instructors were prominent doctors and medical researchers — the ideal of full-time medical faculty, who did nothing but teach and research, was still pretty far off for American medicine, and his professors might well have had private practices of their own [5].
Welch doesn’t say, but it’s likely that the lectures and laboratory work would be undertaken together: you would directly apply what you learned in that day’s/week’s lecture to your laboratory work. JAMA, the journal of the American Medical Association, called for instruction in “anatomy, physiology, physiologic chemistry, bacteriology, pathology, and pharmacology,” all supplemented with “well-equipped laboratories” [5]. But less well-resourced schools, without expensive laboratories and university backing, would likely have just based the lectures around whatever physicians were on hand to lecture and what they were able to teach. The depth, rigor, and standardization of medical education very much depended on the resources of the school in question.
And when did he get to go into an operating theater? Good question. Throughout his training, the medical student might get to hang out in an outpatient clinic to learn the basics, and tour the wards a few mornings a week to practice diagnosing patients (and perhaps attend and observe during surgeries) — but until the 1920s, hospitals were leery of letting students do anything more than that. If your university owned its own hospital, like Johns Hopkins or the University of Michigan, lucky you [1]. In 1908, the AMA’s own Council on Medical Education recommended that students begin doing clinical rotations in hospitals at the end of their second year (if the curriculum was four years, which many were), but again, it really was not until the 1920s that the “teaching hospital” emerged in full force across the US & Canada.
(Actually, I did a little more research on this and came across an interesting article from 1958, which references “extramural preceptorships” as a common part of pre-WW1 medical education— where “fourth-year students are sent to observe the work of a physician for a period of weeks or less commonly of months. The student usually becomes a member of the practitioner’s family during this period and is virtually his shadow during his working and leisure hours.” The point of the preceptorship was to help prepare students to enter general practice, though of course a poor match between doctor and student could entirely derail it [6].)
Sources:
[1] Kenneth M. Ludmerer, Time to Heal: American Medical Education from the Turn of the Century to the Era of Managed Care. Oxford, 1999.
[2] Lynn Miller & Richard Weiss, “Revisiting Black Medical School Extinctions in the Flexner Era,” Journal of the History of Medicine and Allied Sciences, vol. 67, no. 2 (April 2012), pp. 217-243.
[3] “The Relation of Yale to Medicine,” Science, 29 Nov 1901, vol. 14, no. 361, pp. 825-840.
[4] Andrew H. Beck, “The Flexner Report and the Standardization of American Medicine,” JAMA, 5 May 2004, 291(17):2139-40.
[5] Gerald E. Markowitz & David Rosner, “Doctors in Crisis: A Study of the Use of Medical Education Reform to Establish Modern Professional Elitism in Medicine,” American Quarterly, vol. 25, no. 1 (1973), pp. 83-107.
[6] Mason Trowbridge, Jr., “Extramural Preceptorships — A Return to the Pre-Flexner Era of Medical Education?” New England Journal of Medicine, 1958, vol. 258, pp. 691-695.
edited for clarity & style, clarified one point