How ‘Automation’ Made America Work Harder

Computers were supposed to reduce office labor. They accomplished the opposite.

Many ordinary people have encountered this paradox: they are underemployed or unemployed but also, counterintuitively, working harder than ever. In the U.S., this phenomenon has historically accompanied public discussions of “automation,” the ostensible replacement of human labor with machine action. Mid-twentieth-century managers in the automobile and computer industries invoked “automation” to play on the technological enthusiasm of the era. But “automation” was less a concrete technological advance than an ideological invention, one that has never benefited workers. I use scare quotes around “automation” as a reminder that its substance was always ideological, not technical. Indeed, from its earliest days, “automation” has meant the mechanized squeezing of workers, not their replacement.

Which is why, ever since the end of World War II, what employers have called “automation” has continued to make life harder, and more thankless, for Americans. Employers have used new tools billed as “automation” to degrade, intensify, and speed up human labor. They have used new machines to obscure the continuing presence of valuable human labor (consider “automated” cash registers where consumers scan and bag their own groceries—a job stores used to pay employees to do—or “automated” answering services where callers themselves do the job of switchboard operators). “All Automation has meant to us is unemployment and overwork,” reported one autoworker in the 1950s; another noted that “automation has not reduced the drudgery of labor…to the production worker it means a return to sweatshop conditions, increased speedup and gearing the man to the machine, instead of the machine to the man.”

There is no better example of the threat and promise of “automation” than the introduction of the beloved electronic digital computer. The first programmable electronic digital computers were invented during the Second World War to break Nazi codes and to perform the enormous calculations required to build an atomic bomb. Well into the early 1950s, computers remained associated mostly with high-level research and cutting-edge engineering. So, at first, it was by no means obvious how a company might use an electronic digital computer to make money. Computers seemed to offer little to businessmen, who were more interested in padding profits than in decrypting enemy ciphers.

It was left to management theorists, hoping to build up and profit from the budding computer industry, to create a market where none yet existed. First among them was John Diebold, whose 1952 book Automation not only made “automation” a household term, but also introduced the notion that the electronic digital computer could “handle” information—a task that until then had been the province of human clerical workers. “Our clerical procedures,” wrote Diebold, “have been designed largely in terms of human limitations.” The computer, he told a new generation of office employers, would allow the office to escape those human limits by processing paperwork faster and more reliably.