What did we use in smoke detectors before americium?

by the-antiredditor

Most smoke detectors nowadays contain americium (though some recent ones use the photoelectric effect instead), a synthetic element that does not exist naturally on Earth, not even in trace amounts (unlike plutonium, which does exist in trace amounts). The first use of nuclear power came sometime in 1945, meaning americium could only have been discovered and synthesized after 1945. If that is the case, what did we use in smoke detectors before 1945? Did we just not have smoke detectors before then? Are smoke detectors a recent invention?
(Note: I posted this in this subreddit because it's more of a history question than a science question)

restricteddata

An americium smoke detector is what is known as an ionization smoke detector, and works by ionizing the air and checking whether it behaves the way non-smoky air does. Non-smoky air has a predictable rate of ionization, and the ionization can be detected as an electric current. Smoke causes the ionization of the air to decrease, resulting in a drop in current.
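To make that current-drop logic concrete, here is a minimal sketch in Python. The baseline current, threshold, and names are illustrative values chosen for the example, not figures from any actual detector:

```python
# Minimal sketch of the ionization-detector logic described above.
# All numbers here are illustrative, not taken from a real device.

BASELINE_CURRENT_PA = 100.0   # steady-state ion current in clean air (picoamps, made up)
ALARM_THRESHOLD = 0.8         # alarm if current falls below 80% of baseline

def smoke_detected(measured_current_pa: float) -> bool:
    """Return True if the chamber current has dropped enough to suggest smoke."""
    return measured_current_pa < ALARM_THRESHOLD * BASELINE_CURRENT_PA

print(smoke_detected(98.0))  # False: clean air, current near baseline
print(smoke_detected(65.0))  # True: smoke particles capture ions, current drops
```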

Ionizing radiation is, as the name implies, something that can ionize things. The first ionizing-radiation smoke detector patent that I could find was filed in 1942, though it was not granted until 1946. I was curious if it had been made secret during the war — [quite a few private patents were kept secret](https://alexwellerstein.com/publications/wellerstein_insidepatentoffice(bas).pdf) if they touched on issues potentially relevant to the Manhattan Project — but looking over my files on which patents were secret, neither that patent nor its author comes up, so the delay may just be because the Patent Office was not running at exactly full capacity during the war. This particular patent is cited as prior art in many later smoke detector patents.

Radiation is not the only way to ionize a gas. An earlier patent from the author of the previous one talks about using flame ionization for fire detection. You can also ionize gas with a cathode ray tube. My sense is that all of these other methods are more difficult and potentially cumbersome than radioactivity (using a flame to detect a fire seems kind of ironic to me). But using radioactivity requires you to have a reliable source of radiation of sufficient intensity, which prior to the invention of nuclear reactors was non-trivially difficult. Radium, to take just one example of a natural radioactive source intense enough to imagine using in such a way, was extraordinarily expensive: over the early 20th century its price ranged from $100,000 per gram to about $40,000 per gram.

Industrial-sized nuclear reactors, however, can be used to produce large quantities of radioactive isotopes pretty cheaply — or at least cheaply in the sense that their original purpose (producing plutonium and polonium for nuclear weapons) was already heavily subsidized. And so this is where americium comes into the picture: it fits a nice isotopic niche for what you'd want in a smoke detector (a half-life long enough to keep it active but not decay too much over the course of human lifetimes, intense enough to easily ionize air, easy enough to produce if you already have reactors and plutonium, lots of alphas and not too many gammas, etc.), and by the 1960s the US government had enough plutonium on hand that putting some more of it in a reactor and bombarding it with neutrons to produce americium was not a huge operation. So in the mid-1960s the US government, via the Atomic Energy Commission, licensed companies to use americium in smoke detectors, having offered Am-241 for sale in 1962 at $1,500 per gram. Each gram of Am-241 is enough for 5,000 detectors, so the americium accounts for only about 30 cents of the cost of each detector. Very reasonable.
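As a quick sanity check on that per-detector figure, the cost follows directly from the quoted price and yield (a back-of-the-envelope calculation using only the numbers given above):

```python
# Per-detector cost of Am-241, using the figures quoted above.
price_per_gram_usd = 1500        # AEC sale price, 1962
detectors_per_gram = 5000        # detectors supplied by one gram

cost_per_detector = price_per_gram_usd / detectors_per_gram
print(f"${cost_per_detector:.2f} per detector")  # prints: $0.30 per detector
```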

Americium manufacture itself was declassified in 1962, and the then-Chairman of the Atomic Energy Commission, Glenn Seaborg, was awarded a patent on it (one he had filed in 1946, but which had been kept secret until then). The patent was assigned to the US government, however, so Seaborg didn't personally profit from it, other than a one-time award of $100,000 he got from the US government in 1955 in recognition of the value of his plutonium patents (also assigned to the government), and the residual income he got from being on the boards of companies, consulting, and so on. But it probably didn't hurt that he took a lot of pride in americium, and that pride probably helped "grease the wheels" for making it available for commercial use, which started soon after.

But even then, home smoke detectors were not really a common thing in the US until the 1970s. Over the course of the mid-1970s, home smoke detectors got small, cheap, and common (the ones from the 1960s were large and expensive), with sales moving from the tens of thousands to the many millions. Around 1976, many cities started mandating them through building and safety codes. Interestingly, at the time — in a period when skepticism towards the US government and fear of radioactivity were both very high — they were pretty controversial, with people like Ralph Nader and his Health Research Group arguing that only photoelectric detectors should be mandated (they argued that the Am-241 could contaminate people and the environment in the event of a fire or other dramatic event).

Anyway, the above is pieced together from some other research I've done on americium-241 and its discoverer over the years, as well as some searches for the term "smoke detector" in ProQuest's Historical Newspapers Archive (which turned up the sales trends and controversies). To my knowledge there is no scholarly history of the smoke detector, but it looks like something that could make for a tidy little paper should someone want to do it (I will happily abstain).