How were complex calculations made before computers?

by Santi182

I'm curious to know how accountants/mathematicians/scientists organized data and made calculations before computers were commonly used. Were there any special charts, books, or techniques to manage data?

Basically, what was the predecessor to excel?

nlcund

For complex calculations, it was common to have a reference book with tables of pre-calculated logarithms, trigonometric functions, square roots, and commonly-used integrals, to save the labor of manually calculating them.
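
The workflow those books supported can be sketched in a few lines (the "table" here is generated with a math library rather than copied from a printed volume, and the four-figure rounding mimics a typical pocket table): to multiply, you looked up two logarithms, added them, and took the antilog.

```python
import math

# A miniature four-figure log table, standing in for a printed volume
# of pre-calculated values (keys 1.00 to 9.99 in steps of 0.01).
log_table = {n / 100: round(math.log10(n / 100), 4) for n in range(100, 1000)}

def table_multiply(a, b):
    # Look up log(a) and log(b), add, then take the antilog.
    # (Real users also tracked the integer "characteristic" by hand
    # to handle numbers outside the 1-10 range.)
    total = log_table[a] + log_table[b]
    return round(10 ** total, 2)

print(table_multiply(2.17, 3.45))  # ≈ 7.49 (true product: 7.4865)
```

Division worked the same way with a subtraction, which is why these tables saved so much labor: they reduced every multiplication to one addition and three lookups.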

As far as data management, that's a broad area, but many techniques that we think of as computer-based were actually developed to improve the efficiency of manual processes, when the cost of a poor algorithm was particularly acute. Knuth is fairly good about tracing the origins of the algorithms he documents; I remember one even originated to sort railroad cars.

jeffbell

Generally, the answer was pen and paper.

For applications like accounting, systems such as double entry bookkeeping developed in the 1300s in Italy. Early ledgers were drawn up with a straight edge to make the labeled columns. Printed forms for this are still sold today: pre-ruled ledger forms.

When these are bound so that you can see two pages at once they are called "spreadsheets". Later electronic spreadsheets copied the layouts and organization.

Even Big Data operations such as the census were completed by hand, using [printed forms](http://freepages.genealogy.rootsweb.ancestry.com/~mcging/gif/1850census.gif).

Other applications relied on numerical methods worked by hand: the Newton-Raphson method, for instance, predates computers by centuries and is still used today in fields such as semiconductor design.
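
The Newton-Raphson method iterates x → x − f(x)/f′(x), following the tangent line toward a root. A minimal sketch (function and starting point chosen for illustration), here extracting √2 as the positive root of x² − 2:

```python
def newton_raphson(f, df, x, tol=1e-12, max_iter=50):
    """Follow the tangent line repeatedly: x -> x - f(x)/f'(x)."""
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:  # stop when the correction is negligible
            break
    return x

# Square root of 2 as the positive root of f(x) = x^2 - 2.
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, x=1.0)
print(root)  # converges to 1.4142135623... in a handful of iterations
```

The quadratic convergence (roughly doubling the correct digits each step) is what made it practical by hand: a few iterations with a pencil already give useful accuracy.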

In other words, the algorithms and spreadsheets have been around for centuries. They're just faster now.

kyalo40

This comes from The Difference Engine by Doron Swade.

As noted previously, the term 'computer' originally referred to a person who calculated tables of results, laboriously repeating the same calculation over and over with different inputs. The problem was errors. To quote: (p. 13) 'In a random selection of 40 volumes [Dionysius Lardner] found no fewer than 3,000 errors acknowledged in the errata sheets. Some of the correction sheets themselves contained errors. Lardner ridiculed the need for errata of errata, and in the case of the Nautical Almanac - a standard volume of astronomical tables - he trumpeted the absurdity of errata of errata of errata.' It was standard practice in this era to give a calculation to two 'computers' and if they agreed, hope they had not both made the same mistake. Babbage dedicated his life to finding a mechanical method for performing these calculations, inventing the first computer (The Difference Engine of the title).
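
The Difference Engine mechanised the "method of differences": for a polynomial, some order of finite difference is constant, so after a one-time setup by hand, an entire table follows from additions alone. A rough sketch of that arithmetic (in software rather than brass, and simplified to a quadratic):

```python
def difference_engine(seed, count):
    """seed = [f(0), Δf(0), Δ²f(0)]: the starting value and its finite
    differences, worked out once by hand. Every later table entry then
    needs only additions, which is all the Engine could do."""
    row = list(seed)
    table = []
    for _ in range(count):
        table.append(row[0])
        # Add each difference into the value above it in the column.
        for i in range(len(row) - 1):
            row[i] += row[i + 1]
    return table

# f(x) = x^2 + x + 1: f(0) = 1, Δf(0) = f(1) - f(0) = 2, Δ²f = 2 (constant).
print(difference_engine([1, 2, 2], 6))  # [1, 3, 7, 13, 21, 31]
```

Removing multiplication from the loop removed the main source of human error, which is exactly the failure mode Lardner was ridiculing.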

SkyNTP

For engineering, slide rules are a classic, but they only served as general-purpose calculators. Nomograms function similarly, but each is specific to a particular equation and set of units.

ctesibius

There were many threads which converged to modern methods of calculation. Let's start with Excel, since that's what you asked about. Excel is a descendant of Visicalc, the first computer spreadsheet. However, the name "spreadsheet" originally applied to paper, or more commonly blackboards. Picture a blackboard ten yards wide, divided into columns, with the same calculation applied to each member of a column by hand. As with modern spreadsheets, the individual calculations were usually not complex; the aim was simply to visualise large amounts of data, usually commercial.

Another immediate predecessor to the computer was the programmable calculator, which was at the top of the tree for about 15 years. In fact, current PC CPUs are linear descendants of the original Intel 4004 chip, which was intended for a Busicom calculator. Calculators have been technically stagnant for at least 15 years, and were obsolescent long before that, but in their day they were impressive beasties. The larger ones were desktop models rather than the more familiar pocket calculators. This was partly due to the size of the electronics and the power consumption, but even after those problems had been solved, in some cases a desktop version of a calculator coexisted with a pocket version, as it was more convenient for many hours of work. These calculators were fully programmable, but used low-level languages more similar to assembler than to FORTRAN or BASIC. There was usually some means of storing software and data, e.g. magnetic cards, and printers were available.

There's a distinction between these two threads. Spreadsheets were about "data processing" - handling data in large volume, but with simple arithmetic. The calculators were usually used for smaller volumes of data but with more complex calculation, perhaps for scientific or engineering purposes. While it was not a rigid distinction, the two tended to be treated separately until the microcomputer age.

On the DP side, the computer age started with the LEO (Lyons Electronic Office), the first commercial computer. This was made to handle logistics for a chain of British tea shops. We founded an empire to ensure tea supply, so founding a computer industry was only a logical continuation.

However the input and output of a LEO would have been familiar to earlier generations: punched cards and paper tape. These came from two different traditions. Paper tape came from telecoms, specifically telex machines - automated telegraphy, and in some ways a predecessor of the Internet. The tapes would be used to prepare a message and destination codes off line before it was sent at full line speed (usually 50 bits per second).

Punched cards were more interesting, in that they were used for pre-computer DP, using machines supplied by companies such as International Business Machines (IBM, formerly the Tabulating Machine Company). In this era data such as a census record would be encoded on a punched card. One hole might represent male/female. Seven holes would be sufficient to encode a person's age, and so on. Separate machines would be set to perform operations such as counting all the employed females of voting age. These were not computers: operations were extremely simple such as accept/reject a card, count cards, sum numbers. Different machines might be used for each step of an algorithm. There was no program: the process the operators used to move cards between machines was the closest equivalent. Here we see the characteristic data processing aim - little calculation, huge volumes of data.
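
That workflow can be caricatured in a few lines (the field layout here is invented for illustration, not the actual Hollerith column assignments): each card is a fixed pattern of hole positions, and each machine pass performs one simple selection or count.

```python
# A caricature of tabulating-machine data processing. Hypothetical layout:
# one bit for sex, seven bits for age (0-127), one bit for employment.
def punch_card(female, age, employed):
    return (female << 8) | (age << 1) | employed

def is_female(card):   return (card >> 8) & 1
def age_of(card):      return (card >> 1) & 0x7F
def is_employed(card): return card & 1

cards = [punch_card(1, 34, 1), punch_card(0, 40, 1),
         punch_card(1, 17, 0), punch_card(1, 52, 1)]

# One "machine pass": count employed females of voting age.
count = sum(1 for c in cards
            if is_female(c) and is_employed(c) and age_of(c) >= 18)
print(count)  # 2
```

Note that the "program" lives entirely outside the machine: each selection rule is a separate pass, and sequencing the passes was the operators' job, just as the post describes.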

Moving back to the calculation side: punched card systems were occasionally perverted into use for this. Feynman had a story about the use of such equipment for "Monte Carlo" calculations for early nuclear bomb work, where the same calculation was repeated thousands of times for varying input data. However this was unusual.
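
The Monte Carlo idea itself is easy to sketch: repeat one simple calculation for many random inputs and aggregate the results. The toy example below estimates π (nothing like the weapons calculations Feynman described, but the same repetition pattern that made punched-card equipment attractive):

```python
import random

def estimate_pi(samples, seed=0):
    """Throw random points at the unit square; the fraction landing
    inside the quarter circle of radius 1 approaches pi/4."""
    rng = random.Random(seed)  # seeded for reproducibility
    hits = sum(1 for _ in range(samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4 * hits / samples

print(estimate_pi(100_000))  # ≈ 3.14
```

Each "sample" is trivially simple; the value comes purely from volume, which is why the method suited batch machinery (and, later, computers) so well.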

More commonly, calculation divided into two types: real time and non real time. A real-time calculation would be something like aiming the guns of a battleship to engage a moving target fifteen miles away, bearing in mind complications such as Coriolis forces. These would usually be performed with what we would now call an analogue computer. An analogue computer contains elements which behave in a way which is physically similar (analogous) to the system you are trying to model. A good example is the MONIAC computer, which simulated the British economy by flowing water through pipes and reservoirs. Analogue computers are generally either built to perform only one task (e.g. MONIAC), or perform only very simple operations (e.g. a slide rule). They varied very widely in their form. One favourite of mine is the planimeter, a device for calculating the area of something on a map by tracing around it.
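
The planimeter has a neat discrete counterpart: Green's theorem turns an area integral into an integral around the boundary, which for an outline traced point-to-point becomes the shoelace formula. A minimal sketch:

```python
def shoelace_area(points):
    """Area of a polygon from its boundary alone (Green's theorem),
    the same principle a mechanical planimeter exploits while the
    operator traces the outline."""
    n = len(points)
    twice_area = sum(points[i][0] * points[(i + 1) % n][1]
                     - points[(i + 1) % n][0] * points[i][1]
                     for i in range(n))
    return abs(twice_area) / 2

# A 3 x 2 rectangle traced corner to corner.
print(shoelace_area([(0, 0), (3, 0), (3, 2), (0, 2)]))  # 6.0
```

The mechanical device accumulates the same boundary integral continuously on a wheel-and-dial mechanism as you trace; no interior measurement is ever needed.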

Non real time calculations were simpler. An example would be calculating the mass of fuel that an aircraft needed to reach a destination. These could be done with pre-computed tables, or with a nomogram - a sort of paper-based analogue computer where most of the calculation had been done before printing the diagram. In both cases, the calculations required would be extensive, but as they would be the same for every user, they could be done once and then summarised in the table or nomogram. The calculations would be done by hand or with mechanical calculators, using large numbers of human staff.

Trust_No_Won

Not sure if I'm really qualified to answer this, but I always understood electronic computers to be a replacement for people who performed the job of computing. For example, artillerists relied on range tables to provide them with information about how to hit a target at a certain distance.

Books were compiled of logarithm tables, trigonometric functions, and even things like life expectancies and death rates. So, yes, there were specialized books available for people to refer to for this information, and there were people required to write them out and perform all these manual calculations to arrive at the specific solutions.

madhadron

There are two separate issues here: the laying out of data for working with it and archiving it, and doing calculations on that data.

Laying out data has developed to make errors easier to catch and less likely to commit. Someone has already mentioned double-entry bookkeeping (from the 14th century), but even laying lists out in a table is a development. I don't know when this first appeared. Inventories in Linear B, or in the Xinjiang desert along the Silk Road, weren't laid out so. It's an enormous waste of space on your medium. Napier published the first logarithm tables in 1614, which set the format more or less for numerical tables to the present day. The height of this kind of technique was reached in the 1970s in John Tukey's 'Exploratory Data Analysis', which includes some truly remarkable techniques for compactly representing large amounts of data for easy archiving and examination. Note that Tukey is also one of the pioneers of data analysis on computers, and also invented many of the concepts used today such as the OLAP cube.
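
One of the best-known devices from Tukey's book, the stem-and-leaf display, packs a whole distribution into a few lines of text while remaining readable as raw data. A minimal sketch (with the digit-splitting convention simplified to tens/units):

```python
from collections import defaultdict

def stem_and_leaf(data):
    """Tukey-style stem-and-leaf display: tens digit as the stem,
    units digits as the leaves, so the layout doubles as a histogram
    without losing the underlying values."""
    stems = defaultdict(list)
    for x in sorted(data):
        stems[x // 10].append(x % 10)
    return "\n".join(f"{stem:2d} | {''.join(map(str, leaves))}"
                     for stem, leaves in sorted(stems.items()))

print(stem_and_leaf([12, 15, 21, 24, 24, 27, 31, 33, 38, 45]))
#  1 | 25
#  2 | 1447
#  3 | 138
#  4 | 5
```

Turn the page sideways and the leaf rows are a bar chart; read it straight on and every original value is still there - compact representation and archiving at once, which was exactly the point.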

For calculation, many things you take for granted today were invented to make this faster. Arabic numerals, for example, are an incredible step forward from Roman numerals, which were themselves a step forward from the Greek representation using sequential letters of the alphabet. Newton and Leibniz invented the differential and integral calculus to automate certain forms of proof that they were carrying out again and again when working on mechanics. I've already mentioned Napier's logarithm tables.

There are still professors in physics and math departments who are old enough that they memorized the log tables and interpolation methods to be able to do multiplication and division at high speeds in their heads. In some ways, this was a better way of working. I spent a few months when I was a student doing all my work in my head to three decimal places. It quickly becomes fairly reflexive, and you cannot get distracted by failures in whatever device you're using. It's also tiring, though, which is why we all use calculators.

Up until the Second World War, there had been centuries of employing people to sit in a room and calculate the entries for mathematical tables. And if you go into a physics department, you'll find those tables are still being used. Dover puts out a reprint of Abramowitz and Stegun, which is a necessary reference for a theoretical physicist, since it is the most comprehensive set of tables for special functions (Bessel functions, Hankel functions, Gegenbauer polynomials, etc.) that you can get your hands on.

Someone else has mentioned slide rules. Slide rules came in a huge variety of forms and for lots of specialized purposes. There are still specialized slide rules for navigation purposes in use today, and midwives and obstetricians use another specialized form for calculating conception and due dates. There was a fairly standard slide rule most engineers used as their day to day tool, which had the logarithmic scales to quickly do multiplication and division, and usually a set of extra slides for sines, cosines, and other trigonometric functions.

I'm afraid these are just a few details to whet your appetite. You've asked about a really enormous expanse of technology, much of which still exists and is still in use.