



  When Computers Were Human


  David Alan Grier

  PRINCETON UNIVERSITY PRESS

  PRINCETON AND OXFORD

  Copyright © 2005 by Princeton University Press

  Published by Princeton University Press, 41 William Street, Princeton, New Jersey 08540

  In the United Kingdom: Princeton University Press, 3 Market Place, Woodstock, Oxfordshire OX20 1SY

  All Rights Reserved

  Third printing, and first paperback printing, 2007

  Paperback ISBN: 978-0-691-13382-9

  THE LIBRARY OF CONGRESS HAS CATALOGED THE CLOTH EDITION OF THIS BOOK AS FOLLOWS

  Grier, David Alan, 1955 Feb. 14–

  When computers were human / David Alan Grier.

  p. cm.

  Includes bibliographical references.

  ISBN 0-691-09157-9 (acid-free paper)

  1. Calculus—History. 2. Science—Mathematics—History. I. Title.

  QA303.2.G75 2005

  510′.92′2—dc22 2004022631

  British Library Cataloging-in-Publication Data is available

  This book has been composed in Sabon

  Printed on acid-free paper. ∞

  press.princeton.edu

  Printed in the United States of America

  10 9 8 7 6 5 4 3

  FOR JEAN

  Who took my people to be her people and my stories to be her own without realizing that she would have to accept a comet, the WPA, and the oft-told tale of a forgotten grandmother

  Contents

INTRODUCTION

A Grandmother’s Secret Life

PART I: Astronomy and the Division of Labor 1682–1880

CHAPTER ONE

The First Anticipated Return: Halley’s Comet 1758

CHAPTER TWO

The Children of Adam Smith

CHAPTER THREE

The Celestial Factory: Halley’s Comet 1835

CHAPTER FOUR

The American Prime Meridian

CHAPTER FIVE

A Carpet for the Computing Room

PART II: Mass Production and New Fields of Science 1880–1930

CHAPTER SIX

Looking Forward, Looking Backward: Machinery 1893

CHAPTER SEVEN

Darwin’s Cousins

CHAPTER EIGHT

Breaking from the Ellipse: Halley’s Comet 1910

CHAPTER NINE

Captains of Academe

CHAPTER TEN

War Production

CHAPTER ELEVEN

Fruits of the Conflict: Machinery 1922

PART III: Professional Computers and an Independent Discipline 1930–1964

CHAPTER TWELVE

The Best of Bad Times

CHAPTER THIRTEEN

Scientific Relief

CHAPTER FOURTEEN

Tools of the Trade: Machinery 1937

CHAPTER FIFTEEN

Professional Ambition

CHAPTER SIXTEEN

The Midtown New York Glide Bomb Club

CHAPTER SEVENTEEN

The Victor’s Share

CHAPTER EIGHTEEN

I Alone Am Left to Tell Thee

EPILOGUE

Final Passage: Halley’s Comet 1986

Acknowledgments

Appendix: Recurring Characters, Institutions, and Concepts

Notes

Research Notes and Bibliography

Index

Illustration Credits


  INTRODUCTION

  A Grandmother’s Secret Life

  After a while nothing matters … any more than these little things, that used to be necessary and important to forgotten people, and now have to be guessed at under a magnifying glass and labeled: “Use unknown.”

  Edith Wharton, The Age of Innocence (1920)

  IT BEGAN with a passing remark, a little comment, a few words not understood, a confession of a secret life. On a cold winter evening, now many years ago, I was sharing a dinner with my grandmother. I was home from graduate school, full of myself and confident of the future. We sat at a small table in her kitchen, eating foods that had been childhood favorites and talking about cousins and sisters and aunts and uncles. There was much to report: marriages and great-grandchildren, new homes and jobs. As we cleared the dishes, she became quiet for a moment, as if she were lost in thought, and then turned to me and said, “You know, I took calculus in college.”

  I’m certain that I responded to her, but I could not have said anything beyond “Oh really” or “How interesting” or some other empty phrase that allowed the conversation to drift toward another subject and lose the opportunity of the moment. In hindsight, her statement was every bit as strange and provocative as if she had said that she’d fought with the Loyalists in the Spanish Civil War or had spent her youth dealing baccarat at Monte Carlo. Yet, at that instant, I could not recognize that she had told me something unusual. I studied with many women who had taken calculus and believed they would have careers in the mathematical sciences like my intended career. I did not stop to consider that only a few women of my grandmother’s generation had even attended college and that fewer still had ever heard of calculus.

  My grandmother’s comment was temporarily ignored, but it was not lost. It came rushing back into my thoughts, some six or seven years later, as I was sitting in a mathematics seminar. Such events are often conducive to reflection, and this occasion promised plenty of opportunity to think about other subjects. The speaker, a wild-haired, ill-clad academic, was discussing a new mathematical theory with allegedly important applications that were far more abstract than the theory itself. As I was helping myself to tea and cookies, a staple of mathematical talks, I caught a remark from a senior professor. I had always admired this individual, for he had the ability to sleep during the boring parts of seminars and still catch enough of the material to ask deep and penetrating questions during the discussion period. This professor, who had recently retired, was describing his early days at the university during the Great Depression of the 1930s. Having just arrived in the United States from his native Poland and knowing only rudimentary English, he was assigned to teach the engineering calculus course. “This,” he stated with a flourish, “was the first time that calculus was required of engineering students at the university.”

  As I listened to his story, I heard my grandmother’s phrase from that night long before. “You know, I took calculus in college.” I did not know when she had attended college, but having heard my mother’s stories of the Depression, I was certain it would have been before 1930. As I settled into my chair, I started to ponder what my grandmother had said. For the next hour, I was lost in my own thoughts and oblivious to the talk, which proved to be the best seminar of the term. During the discussion period, I was asking myself the questions I should have raised at that dinner years before: Where had my grandmother attended college? What courses had she taken? What had she hoped to learn from calculus?

By then, it was too late to ask these questions. My grandmother was gone, and no one knew much about her early life. My mother believed that my grandmother had studied to be an accountant or an actuary. My uncle thought that my grandmother had taken some bookkeeping classes. Our family genealogist, a distant cousin who seemed to know everything about our relations, expressed her opinion that my grandmother’s family had been too poor to send her to college. Still bothered by that one phrase, I decided to see what I could learn. My grandmother had been raised in Ann Arbor, the home of the University of Michigan. So one day, I called the college registrar and asked if she had a transcript for my grandmother. I tried to use a tone of voice to suggest that it was the most natural thing in the world for a grandson to review his grandmother’s college grades, rather than the other way around. With surprisingly little hesitation, the registrar agreed to my request and left the phone. In a few minutes, she returned and said, “I have her records here.”

  Catching my breath, I asked, “When did she graduate?”

  “Nineteen twenty-one,” the registrar responded.

  “What was her major?” was my next question.

After a moment of shuffling paper, she replied, “Mathematics.”

Three weeks later, I was sitting at a long library table with a little gray box that contained the university’s record of my grandmother’s life. As I worked through her transcript and the course record books, I was surprised but pleased to see that she had taken a rigorous program of study. In all, she had taken about two-thirds of the mathematics courses that I had taken as an undergraduate, and she had studied with several well-known mathematicians of the 1920s. The professors’ record books were particularly intriguing, for they contained little notes that hinted at the activity and turmoil outside the classroom. One mentioned the male students who had left for the First World War; another recorded that he had devoted part of the term to analyzing ballistics problems; a third mentioned that two students had died in the influenza epidemic.1

1. Calculus class, University of Michigan, 1921. The author’s grandmother is the rightmost woman.

  Perhaps the most surprising revelation was the fact that my grandmother was not the only female mathematics student. Of the twelve students who had taken a mathematics degree in 1921, six of them, including my grandmother, were women. The University of Michigan was more progressive than the Ivy League schools, but its liberalism had limits. About a quarter of the university student body was female, but the school provided no dormitory for women and barred them from the student union building, as it was attached to a men’s residence hall. University officials also discouraged women from studying medicine, business, engineering, physics, biology, and chemistry. For women with scientific interests, the mathematics department was about the only division of the school that welcomed them. Much of this welcome was provided by a single professor, James W. Glover (1868–1941), who served as the advisor to my grandmother and most of her female peers.2

Glover was an applied mathematician, an expert in the mathematics of finance, insurance, and governance. He had been employed as an actuary for Michigan’s Teacher Retirement Fund, had held the presidency of Andrew Carnegie’s Teachers Insurance and Annuity Association, and, in the early years of the century, had served as a member of the Progressive Party’s “brain trust,” the circle of academic advisors to the party leader, Robert La Follette.3 Within the University of Michigan, Glover was an advocate for women’s education, though he was at least partly motivated by a desire to increase enrollments in mathematics courses. He welcomed women to his classes, encouraged them to study in the department lounge, prepared them for graduate study, and helped them search for jobs. He pushed the women to look beyond the traditional role of schoolteacher and consider careers in business and government. At a time when clerical jobs were still dominated by men, Glover helped his female students find positions as assistant actuaries and human computers, the workers who undertook difficult calculations in the days before electronic computers. At the end of his career, he recorded that he had advised nearly fifty women and that only “one-third have married and have retired from active business life.”4

  Of the six women who graduated in 1921, only one, my grandmother, never worked outside the home. The remaining five had mathematical careers that lasted into the 1950s. One was a human computer for the United States Army and prepared ballistics trajectories. A second did calculations for the Chemical Rubber Company, a publisher that sold handbooks to engineers and scientists. Another compiled health statistics for the state of Michigan. The fourth worked for the United States Bureau of Labor Statistics and eventually became the assistant director of a field office in Baton Rouge. The last female mathematics major of 1921 became an actuary, moved to New York City, and operated her own business.5

  Though my grandmother’s hidden mathematical career held a special emotional appeal to me, it was the story of the other five women that captured my interest. What kind of world did they inhabit? What were their aspirations? What did they do each day? At the ends of their careers, what had they accomplished? Rather than restrict my scope to the five women who had known my grandmother or even the women mathematics graduates of the University of Michigan, I decided to look at the history of scientific computers, the workers who had done calculations for scientific research.

Scientific computation is not mathematics, though it is closely related to mathematical practice. One eighteenth-century computer remarked that calculation required nothing more than “persevering industry and attention, which are not precisely the qualifications a mathematician is most anxious to be thought to possess.”6 It might be best described as “blue-collar science,” the hard work of processing data or deriving predictions from scientific theories. “Mental labor” was the term used by the English mathematician Charles Babbage (1791–1871).7 The origins of scientific calculation can be found in some of the earliest records of human history: the clay tablets of Sumeria, the astronomical records of ancient shepherds who watched over their flocks by night, the land surveys of early China, the knotted cords of the Inca.8 Its traditions were developed by astronomers and engineers and statisticians. It is kept alive, in a sophisticated form, by those graduate students and laboratory assistants who use electronic calculators and computer spreadsheets to prepare numbers for senior researchers.

  Though many human computers toiled alone, the most influential worked in organized groups, which were sometimes called computing offices or computing laboratories. These groups form some of the earliest examples of a phenomenon known informally as “big science,” the combination of labor, capital, and machinery that undertakes the large problems of scientific research.9 Many commentators identify the start of large-scale scientific research with the coordinated military research of the Second World War or the government-sponsored laboratories of the Cold War, but the roots of these projects can be traced to the computing offices of the eighteenth and nineteenth centuries.10

It is possible to begin the story of organized computing long before the eighteenth century by starting with the great heavenly Almagest, the charts of the planets created by Claudius Ptolemy (85–165) in classical Egypt. As the connection between the ancient world and its modern counterpart is sometimes tenuous, we will begin our story just a few years before the opening of the eighteenth century with two events: the invention of calculus and the start of the Industrial Revolution. Both events are difficult to date exactly, but that is of little concern to this narrative. Identifying the inventors of specific ideas is less important than understanding how these ideas developed within the scientific community. Calculus gave scientists new ways of analyzing motion. Most historians of mathematics have concluded that it was invented independently by Isaac Newton (1642–1727) and Gottfried Wilhelm Leibniz (1646–1716) in the 1680s. It was initially used in astronomy, but it also opened new fields for scientific research. The Industrial Revolution, the economic and social change that was driven by the factory system and the invention of large machinery, created new techniques of management, developed public journalism as a means of disseminating ideas, and produced the modern factory.11 Most scholars place the start of the Industrial Revolution at the end of the eighteenth century, but this change was deeply influenced by the events of Newton’s time. “It is enough to record that by 1700 the foundations of modern technology have been laid,”12 concluded historian Donald Cardwell.

  By starting with the invention of calculus, we will overlook several important computational projects, including the Arithmetica Logarithmica by Henry Briggs (1561–1630), the ballistic trajectories of Galileo Galilei (1564–1642), and the planetary computations in the Rudolphine Tables by Johannes Kepler (1571–1630). Each of these projects contributed to the development of science and mathematics. Briggs gave science one of its most important computing tools, the logarithm table. Galileo and Kepler laid the foundation for calculus. However, none of these projects is an example of organized computation, as we define it. None of these scientists employed a staff of computers. Instead, they did the computations themselves with the occasional assistance of a student or friend.

The story of organized scientific computation shares three themes with the history of labor and the history of factories: the division of labor, the idea of mass production, and the development of professional managers. All of these themes emerge in the first organized computing groups of the eighteenth century and reappear in new forms as the story develops. All three were identified by Charles Babbage in the 1820s, when he was considering problems of computation. These themes are tightly intertwined, as mass production clearly depends upon the division of labor, and the appearance of skilled managers can be seen as a specific example of divided and specialized labor. However, this book separates these ideas and treats them individually in an attempt to clarify and illuminate the different forces that shaped computation.