Prior to the advent of personal computers and office software, how were complex finance, accounting, marketing, and data visualization tasks handled in corporations?

by MoreCheezThanDoritos

Did people create P&L statements by hand on graph paper? How were charts and other data visualizations created and presented? How were financial models done? Was "big data" even a thing?

RenaissanceSnowblizz

With difficulty, yes and no, and it depends.

Big Data is a trendy buzzword for a field that has existed for a long time. My professor of Information Systems would grumble continuously about people suddenly "inventing" what she had done for decades. Then again, you shouldn't look down on your field suddenly getting sexy again, a sentiment I expressed in my research group several times over the years.

Basically, databases, their maintenance, and gaining insight from them (aka data mining) are old concepts within the field, and have existed since the "mainframe days". What the modern advent of "Big Data" really represents is the culmination of decades of innovation and development in several fields. It comes about as a convergence of the ability to store and access much more data than previously (storage became cheaper and faster), the ability to collect data (people willingly give up loads of personal information, amongst other things), and the ability to enter all that data (through automated entry, often by the users themselves) into databases. Raw computing power and memory also matter, in that you can process more data concurrently. This in turn leads to tools and systems able to process said data better, in what we used to call data mining. Basically, Big Data becomes a thing as soon as the technology, the amount of data available, and the tools to analyse it all reach a tipping point. Before "big data" you had medium data and small data, though no one thought to call it that.

Now, in a sense, "big data" existed from the beginning: mainframe computers stored what were, in their day, massive amounts of information (compared to a human). I would recommend watching the movie Hidden Figures (because it's bloody excellent), but mostly because it shows this transition very well. A room full of dozens of (human) computers could be replaced by a machine that works much, much faster. So that is how you handled complex financial calculations: by hand. In fact, we were still taught the fundamentals and forced to take an entrance exam requiring hand calculation of financial equations to get into business school (at the turn of the millennium). I am eternally grateful to have never had to perform derivative calculations on financial equations by hand (or at all, screw you national economics!) ever since that exam.

Yes, you would indeed have had to do graphs by hand in the time before computers, and even partly in the time with computers, because mainframe computing has the quirk that you have to share your computing time, and you didn't necessarily have access to the software required anyway. So maybe you got a printout where the data had been calculated for you by the database system, but then it was your job to draw it up for some boss to look at. There was a time when MS PowerPoint was Ms. Power pointing at a piece of paper (ok, in fairness, most likely a Mr., but I think you see what I am going for there). I'm probably not too far off base if I say that calculations were by necessity kept simpler. But we should also remember that societies worked at a much slower pace too. There's a hilarious scene in a 1980s Swedish comedy where a big-shot financier is listening, on a Walkman, to a tape of the stock quotes his secretary has read in for him. Imagine a world where this made sense, when today you need a dedicated terminal plugged into the stock exchange's main system with privileged access to be a real player. You had less data, and you needed less data because, well, you didn't have it. Companies made plans and budgets for a year (and multiple-year plans), not a couple of months at a time, and the idea of working only towards the next quarterly report would have made little sense.

When does this change? Well, many have argued it was VisiCalc, released in 1979: the first spreadsheet program for a personal computer. This program in and of itself was a reason to buy a microcomputer (i.e. a personal computer on the desktop, as contrasted with a "terminal" that simply accessed a mainframe computer), and it prompted IBM to create a personal computer of its own, the IBM PC, to avoid losing its mainframe market. This put tremendous power into the hands of the actual end-users of data, where before data would pass through many hands before reaching a decision-maker. In many cases it also let them bypass the company's main computer centre, as they could buy the equipment out of their own equipment allocation. Needless to say, the centralized computer centres of corporations were not pleased with muggles having computer access, especially since they ended up having to support it, ushering in a decades-long cycle of centralisation and decentralisation of computing resources and computer support in corporations. With personal computers now invading most white-collar offices, you get an explosion of software to support and automate all those financial calculations and analytics. We get better computers; software does more. We can collect even more data to analyse. The data has to be pushed onto larger centralised data-managing systems (oh hey, the mainframe is back!). We develop better tools to store, manage, and analyse data, and hey presto, someone figures out this is new and needs a name. Data existed, so this must be Big Data.
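The core idea that made VisiCalc revolutionary can be sketched in a few lines: cells hold either numbers or formulas, and changing one assumption means dependent results follow automatically, instead of someone redoing the arithmetic by hand. This is a toy illustration only, not how VisiCalc itself was implemented; all names here are invented for the example.

```python
# A toy sketch of the spreadsheet idea VisiCalc popularized:
# cells hold either literal values or formulas, and a changed
# input is reflected in every result that depends on it.
class Sheet:
    def __init__(self):
        self.values = {}     # cell name -> literal number
        self.formulas = {}   # cell name -> function of the sheet

    def set_value(self, cell, number):
        self.values[cell] = number

    def set_formula(self, cell, func):
        self.formulas[cell] = func

    def get(self, cell):
        # Formulas are evaluated on demand, so any change to an
        # input cell shows up the next time a result is read.
        if cell in self.formulas:
            return self.formulas[cell](self)
        return self.values[cell]

sheet = Sheet()
sheet.set_value("A1", 1000)   # revenue
sheet.set_value("A2", 600)    # costs
sheet.set_formula("A3", lambda s: s.get("A1") - s.get("A2"))  # profit

print(sheet.get("A3"))        # 400
sheet.set_value("A2", 700)    # change one assumption...
print(sheet.get("A3"))        # ...and the result follows: 300
```

That "what if I change this number?" loop is exactly what made the program worth the price of the whole machine to a financial analyst.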

I've attached some of the resources I used back in the day, though note that while my research group was basically doing big data, I was not, so my resources are more geared towards the general gist of the history of end-users using computers. It won't be a detailed citation list for the history of big data as such; I'd have to see what I have left of my colleagues' works, if any, to find that.

Regan, E. A. and O'Connor, B. N. (1994). End-user information systems: Perspectives for managers and information systems professionals. New York: Macmillan.

Power, D. J. (2004). A brief history of spreadsheets. Retrieved January 17, 2006, from http://dssresources.com/history/sshistory.html

Barker, S. (2007). End user computing and end user development: Exploring definitions for the 21st century. In M. Khosrow-Pour (Ed.), Managing worldwide operations and communications with information technology (pp. 249-252), Idea Group.

Benson, D. H. (1983). A field study of end user computing: Findings and issues. MIS Quarterly, 7(4), 35-45.

Powell, A. and Moore, J. E. (2002). The focus of research in end user computing: Where have we come since the 1980s? Journal of Organizational and End User Computing (JOEUC), 14(1), 3-22.

Halloran, J. P. (1993). Achieving world-class end-user computing: Making IT work and using IT effectively. Information Systems Management, 10(4), 7-12.

Henderson, J. C. and Treacy, M. E. (1986). Managing end-user computing for competitive advantage. Sloan Management Review, 27(2), 3-14.

g_a28

We have a great answer, but it seems to me that the question itself might be based on some incorrect assumptions...

Software didn't (and still doesn't) just come out of the blue. In order for it to appear, somebody must know how to do the task by hand, and things were done by hand before computers became widespread. There were devices to help with calculations (e.g. the slide rule, the abacus, etc.). People spent their entire lives making the statistical, trigonometric, and other tables that you can still find in textbooks. All by hand.
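To make the scale of that hand labour concrete: a table of common logarithms, of the kind that took human computers years to produce and check by hand, can be generated today in a few lines. A minimal illustration (the formatting and range are arbitrary choices for the example):

```python
import math

# Generate a small common-logarithm table, the kind once
# computed entirely by hand and printed in the back of textbooks.
print("x      log10(x)")
for i in range(10, 20):
    x = i / 10
    print(f"{x:.1f}    {math.log10(x):.4f}")
```

The point stands either way: the mathematics of logarithms long predates any machine that could tabulate them.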

The emergence of computers and software didn't create any new methods: in order for something to be programmed, it has to already exist! What computers really did was make some theoretically known things actually feasible. The idea of statistical learning (now known as 'machine learning') first came up well before the PC. Another thing 'fast' computers enabled was 'bigger' data: they simply provided the physical capability to process more. Later on, with the Internet, there was suddenly a lot of newly created data. The methods to handle it had mostly been there for quite some time; they just weren't feasible.

TL;DR: the Math always comes before the Software.

Yours truly,

Guest Engineer