Human Computers

How Computers Used to Be Humans

If you ask an average person when they think the word “computer” entered our language, they would probably guess the 20th century. But long before “computer” meant a glowing screen on your desk, it referred to a person who did calculations. The Oxford English Dictionary records the earliest use in 1613, in Richard Braithwait’s book The Yong Mans Gleanings, where he described a man as a “good Computer.”

Back in the 17th century, the rise of astronomy, navigation, and ballistics demanded ever more accurate tables of numbers. Ships needed precise star charts, gunners needed firing tables, and engineers needed logarithmic tables. Teams of human computers were hired to calculate, check, and re-check the figures by hand, and so the job of “computer” was born.

By the 19th century, whole institutions were organized around this work. The British Nautical Almanac Office employed dozens of clerks to churn out astronomical tables. At Harvard College Observatory, women were hired in large numbers to examine photographic plates of the night sky, painstakingly classifying stars by brightness and spectrum. These “computers” built the raw data sets that underpinned scientific revolutions in astronomy and physics.

The reliance on human computers peaked during the world wars, when armies of mostly female clerks calculated ballistics trajectories, weather predictions, and cryptographic tables. Only in the mid-20th century did the word begin to slip away from them, transferred first to electromechanical and then to electronic machines, forever tying “computer” to a machine.

Craving more? Check out the source behind this Brain Snack!
