Inflation. We all know it when we see it. Prices rise, gas stations have lines. Graphs point upward. Eyebrows lift. Eyeballs roll. From Mary Tyler Moore gently tossing a package of meat into her shopping cart in the opening credits of her 1970s sitcom to a recent cartoon asking if there’s a vaccine for “sticker shock,” inflation would seem to be the most visible of macroeconomic phenomena.
Economist Milton Friedman repeatedly counseled: “Inflation is always and everywhere a monetary phenomenon.” Friedman, sometimes called the most influential economist of the 20th century, argued that prices automatically rose whenever too much money chased too few goods. His best-selling books, television series and role in Ronald Reagan’s White House made his “monetarism” economic common sense for much of the past four decades. Today, when economists like Lawrence H. Summers warn that the American economy is in danger of “overheating,” they repeat those same assumptions.
Yet, historians know money and inflation don’t quite work that way. The history of inflation isn’t skyrocketing prices inevitably caused by the same mistakes. It is, rather, a history of changing words, changing numbers — and most important, the people who change them. Time and time again, it’s a story of pundits making ahistoric claims to promote their own policy agendas and of changing priorities in what gets measured and how.
The word “inflation” only began to refer to money and economics in the mid-1800s. For centuries before that, it solely meant the action or condition of being filled with air. (Balloons were inflated. Charles Darwin, on his youthful expedition to South America, described the “inflation” of a puffer fish.)
Moreover, when “inflation” entered economic usage in the 1860s and 1870s, it meant increasing the money supply — what today might be called “economic stimulus.” By issuing greenbacks, for example, the Lincoln administration had, according to its critics, “inflated” American currency. It had also, of course, helped to finance and win the Civil War. Yes, prices rose, but the word “inflation” wasn’t meant to describe that rise; prices climbed because military necessity drove up demand.
In the early 20th century, however, economist Irving Fisher’s “equation of exchange” (MV = PT) established what appeared to be a necessary relation between the money supply and the price level. The charismatic Fisher’s fame, burnished by his public-health campaigning and ties to the eugenics movement, fueled buy-in for the idea. He was the first celebrity economist, a man whose pronouncements were quoted far and wide. If he said inflating the currency would automatically lead to rising prices, few would challenge him. Yet Fisher was far from infallible: He failed to predict the 1929 stock market crash.
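Spelled out, the equation says that the money supply (M) times the velocity with which money circulates (V) equals the price level (P) times the volume of transactions (T). A minimal sketch of the monetarist reading, with invented numbers and the strong assumption that velocity and transactions hold still:

% Fisher's equation of exchange, rearranged for the price level:
\[
MV = PT \qquad\Longrightarrow\qquad P = \frac{MV}{T}
\]
% Hypothetical illustration (all numbers invented): holding V = 4 and T = 200 fixed,
% M = 100 gives P = (100)(4)/200 = 2, while doubling M to 200 doubles P to 4.

Everything turns on those held-fixed terms: if velocity or the volume of transactions moves too, more money need not mean proportionally higher prices.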
Widespread acceptance of Fisher’s apparent discovery, however, did not mean inflation was a bad thing — far from it. In the global depression of the 1930s, some politicians and pundits saw inflation as potentially beneficial. “Inflation enacted by House, 307-86; President expected to sign Farm Relief Bill before Sunday” read one Washington Post headline for May 4, 1933.
Indeed, support for inflation came from senators of both parties — Bronson Cutting (R-N.M.) and Elmer Thomas (D-Okla.) — as well as industrialists, agricultural economists and the wildly popular isolationist and antisemitic radio priest, Charles Coughlin. Crucially, this unlikely coalition shared a vision of the United States as a land of producers; from the point of view of farmers, ranchers, manufacturers or the mining industry, deflation and depression were far graver concerns than rising prices.
Only when Americans started thinking of themselves solely as consumers — a shift begun in the aftermath of World War II — did “inflation” become everybody’s enemy. This outlook hardened amid the soaring prices of the 1970s, which came at the same time as a stagnant economy. Rising joblessness made higher prices an increasing hardship for Americans in a way they had not been in a period of nearly full employment. Since the 1980s, as manufacturing has moved overseas and ours has become overwhelmingly a service economy, the idea of Americans as consumers has only become more axiomatic. Inflation is therefore now seen as ominous and detrimental to the national interest.
Yet, the longer history reminds us that inflation, deflation and price stability actually all produce both winners and losers — something Americans have forgotten. Inflation, for instance, tends to benefit debtors over creditors, producers over consumers. Today, if you own your home outright, have a fixed-rate mortgage or are in the business of flipping real estate, you are probably counting on rising house prices, even if you dread “inflation.”
There is another problem with our obsessive fear that inflation equals the economic cataclysm of the 1970s: calculating economic statistics is a science full of choices, and economic experts disagree about which choices to make.
Since 1992, annual inflation as measured by the Bureau of Labor Statistics’ consumer price index (CPI) has been low and steady: never over 4 percent and often under 2 percent. “Core” inflation — which omits prices for groceries and gasoline, because they are especially sensitive to external shocks (droughts, a pipeline held hostage by hackers, etc.) — has been even lower.
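To make those choices concrete, here is a minimal sketch in Python of how a simple price index might be assembled. The categories, weights and prices are invented for illustration and do not reflect the BLS’s actual basket or methodology:

```python
# Illustrative sketch: how weighting and exclusion choices change a price index.
# All categories, prices and weights are hypothetical, not BLS data.

# (category, expenditure weight, price last year, price this year)
basket = [
    ("rent",      0.35, 1500.0, 1545.0),  # +3.0%
    ("groceries", 0.15,  400.0,  432.0),  # +8.0%
    ("gasoline",  0.05,  100.0,  115.0),  # +15.0%
    ("health",    0.20,  800.0,  832.0),  # +4.0%
    ("other",     0.25,  600.0,  612.0),  # +2.0%
]

def inflation(items):
    """Weighted average of category price changes (a simple Laspeyres-style index)."""
    total_weight = sum(w for _, w, _, _ in items)
    return sum(w * (new / old - 1.0) for _, w, old, new in items) / total_weight

headline = inflation(basket)
core = inflation([row for row in basket if row[0] not in ("groceries", "gasoline")])

print(f"headline inflation: {headline:.1%}")  # ~4.3%
print(f"core inflation:     {core:.1%}")      # ~2.9%
```

Dropping the two volatile categories cuts the measured rate by more than a percentage point, and every such decision (what to include, how to weight it) is exactly the kind of contested choice described below.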
Yet how this figure gets calculated is hotly contested. Consumer advocates fear it understates inflation by excluding housing prices (though not rent) and hiding enormous increases in the cost of health care, schools, prescriptions and higher education — all expenditures crucial for a population that hopes not just to survive but to thrive.
Conversely, however, deficit hawks and proponents of small government argue the CPI does the reverse. Federal spending on many programs, including Social Security and military pensions, is indexed to inflation (that is, it rises and falls based on the reported rate), as is eligibility for Pell Grants and SNAP benefits. If measured inflation rises, so must the expenditure on those programs. Helping people in need might thereby result in a larger national deficit and growing national debt.
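As a stylized illustration of what indexing means (the benefit amount is invented, and actual cost-of-living adjustments follow more detailed statutory formulas), a payment simply scales with the reported rate:

% Hypothetical cost-of-living adjustment; all figures invented for illustration.
\[
\text{new benefit} = \text{old benefit} \times (1 + \text{reported inflation}), \qquad
\$1{,}000 \times (1 + 0.043) = \$1{,}043.
\]

Multiplied across tens of millions of beneficiaries, even a fraction of a percentage point in the reported rate moves billions of dollars, which helps explain why the measurement itself became a political battleground.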
In 1995, immediately after Republicans swept the midterm elections, Federal Reserve Chairman Alan Greenspan testified before Congress that he suspected the CPI was calculated in such a way that inflation was exaggerated by 1 to 1.5 percent per year. A “Blue Ribbon Commission,” chaired by the Hoover Institution’s Michael Boskin, concurred with this assessment. Concluding that the CPI had already led to significant overpayment by Social Security, the commission warned that, left unchecked, the “upward bias” in the CPI would add $1.07 trillion to the national debt over the next dozen years.
But as fellow economist Wynne Godley observed at the time, Boskin’s report went significantly beyond its formal charge. The CPI was devised to inform labor disputes during the World Wars (just how much more should shipyard workers be paid to keep pace with the cost of living?), but it was now being blamed for increasing national debt, the level of which depended far more on spending choices made by policymakers.
All of which is a reminder that while spotting inflation seems simple, and treating it as a giant flashing warning sign for the economy seems axiomatic, neither is necessarily true. Inflation can be good for some Americans.
Moreover, even when economists warn of risks of rising inflation, their ideological predilections inform how they interpret statistics. A measurement like the CPI does not simply exist. It is produced. Who takes the measurements and what they are told to measure therefore matter a great deal. Having some way of calculating shared costs or measuring general inflation is still important — otherwise, we’re left with individual impressions and unrelated anecdotes. But getting policy right depends on understanding where numbers come from and on asking who wields them and to what ends.