What is a Computer?

In the age of artificial intelligence, we often see humans and computers set in opposition to one another, as in headlines asking whether computers will soon replace humans in some job category, or even develop superpowers that enable them to destroy humanity.

In fact, however, the first computers were humans. The word “computer” is first recorded in English in the 1640s, meaning “one who calculates, a reckoner, one whose occupation is to make arithmetical calculations.” In the twentieth century, human computers, writes Claire L. Evans, “prepared ballistics trajectories for the United States Army, cracked Nazi codes at Bletchley Park, crunched astronomical data at Harvard, and assisted numerical studies of nuclear fission on the Manhattan Project. Despite the diversity of their work, they had one thing in common. They were women” (Broad Band: The Untold Story of the Women Who Made the Internet, Penguin, 2018).

The book (and later the film) Hidden Figures tells the story of the very human computers who worked in the U.S. space program in its early years, focusing on the lives of four Black women who performed essential mathematical calculations by hand: Dorothy Vaughan, Mary Jackson, Katherine Johnson, and Christine Darden.

Nowadays, the word “computer” is more likely to call to mind a desktop, laptop, mobile, or other device that can be programmed to perform its own calculations. The first computer of this type, the ENIAC (Electronic Numerical Integrator and Computer), was completed in 1946. Its first programmers were six women: Betty Holberton, Jean Bartik, Ruth Teitelbaum, Kathleen Antonelli, Marlyn Meltzer, and Frances Spence. In the years since, with the development of the silicon-based integrated circuit and rapid advances in miniaturization (advances often associated with a concept known as Moore’s Law), computers of the non-human type have been packing more and more computational power into smaller and smaller devices.

But it would be a mistake to associate even non-human computers with any one particular architecture, or even to assume that programmable computers must involve silicon and electricity. The Analytical Engine of nineteenth-century inventor Charles Babbage was entirely mechanical (though it was never actually built), and the instructions written for it by Babbage’s collaborator Ada Lovelace, daughter of the British Romantic poet Lord Byron, are sometimes regarded as the earliest examples of computer programs, or algorithms. Perhaps the most famous computer of all, the Turing machine, isn’t a physical machine at all but a thought experiment.

What a computer does is, in the end, more important than the components it’s made of. Computers are information processors. They take in information (input), store it in some form of memory, process that information according to some set of instructions (a program), and produce results (output).
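To make that cycle concrete, here is a brief Python sketch, not drawn from the text itself: the averaging task and all of the names in it are purely illustrative, chosen only to make the four stages (input, memory, processing, output) visible in a few lines.

```python
# A minimal sketch of the input -> memory -> program -> output cycle.
# The task (averaging a list of numbers) is just an illustrative choice.

def average(numbers):
    """The 'program': a fixed set of instructions applied to stored data."""
    return sum(numbers) / len(numbers)

# Input: information taken in from the outside world.
raw_input = "12 7 19 4"

# Memory: the information stored in some internal representation.
stored_values = [int(token) for token in raw_input.split()]

# Processing: the instructions transform the stored information.
result = average(stored_values)

# Output: the result is produced for the user.
print(f"Average of {stored_values} is {result}")
```

The same four stages would be present whatever the task or the hardware, which is the point of defining a computer by what it does rather than by what it is built from.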

It’s worth noting, in this connection, that the field typically known in the U.S. as “computer science” is elsewhere commonly called “informatics,” a label that usefully points to what computers do rather than summoning a mental image of any particular type of machine for doing it.