The Equifax hack revealed how vulnerable most of us are to the agencies that compile our credit records. Having “good credit” has been important to the way people judge each other since the dawn of commerce. But, as scholar Josh Lauer explains, the advent of credit reporting agencies in the early twentieth century both reflected and facilitated a sea change in who makes those judgments and how they do it.
Up until the middle of the nineteenth century, Lauer writes, credit was personal. People got loans and ran up tabs based on interpersonal relationships and reputation within a community. As one wag noted in 1833, a debtor “is a man of note—of promissory note; he fills the speculation of many minds; men conjecture about him, wonder and conjecture whether he will pay.”
But growing companies and markets made credit a less personal matter, creating demand for a more objective source of information on a potential borrower’s creditworthiness.
In the 1840s, the first credit reporting firms created a beta version of the modern credit history report, though only for companies, not individuals. When retailers like butchers and department stores got a look at this credit-monitoring model, they saw its potential value for the tabs they let customers keep.