
Do We Know What History Students Learn?

It's not enough to say that they pick up critical thinking skills. It's time to offer evidence.
[Figure: Line graph of history BAs granted, peaking in the 1960s and declining in the 2010s. Credit: Benjamin M. Schmidt/IPEDS & HEGIS datasets]

“What are you going to do with that -- teach?” Uttered with disdain, it’s a question history majors have been asked many times. Clio’s defenders have a response. The head of the American Historical Association says that the study of history creates critical thinkers who can “sift through substantial amounts of information, organize it, and make sense of it.” A university president asserts that the liberal arts endow students with the “features of the enlightened citizen” who possesses “informed convictions … and the capacity for courageous debate on the real issues.” Historians pride themselves on the evidence for their claims. So, what’s the evidence?

Not much, actually. Historians aren’t great at tracking what students learn. Sometimes they even resent being asked. Recently, however, a winner of the Bancroft Prize, one of history’s most distinguished awards, washed the profession’s dirty laundry in public. The article’s title: “Five Reasons History Professors Suck at Assessment.”

Anne Hyde described what happened when accreditors asked her colleagues to document what students learned. They paid little heed to the requests -- that is, until Colorado College’s history department flunked its review. Committed teachers all, her colleagues “had never conducted assessment in any conscious way beyond reporting departmental enrollment numbers and student grade point averages.”

In many college history departments, this is routine. To address the issue of assessment, the American Historical Association in 2011 embarked on a multiyear initiative to define what students should “be able to do at the end of the major.” Eight years, dozens of meetings, and hundreds of disposable cups later, the Tuning Project produced a set of ambitious targets for student learning. But when it came to assessing those goals, the project left a big question mark.

That is one of the reasons we were convinced of the need to create new assessments. With support from the Library of Congress, we devised short tasks in which history students interpreted sources from the library’s collection and wrote a few sentences justifying their responses. For example, one assessment, “The First Thanksgiving,” presented students with an early-20th-century painting and asked whether its image of lace-aproned Pilgrim women serving turkey to bare-chested Indians would help historians reconstruct what may have transpired in 1621 at the supposed feast between the Wampanoag and English settlers.

In the March issue of the Journal of American History, we describe what happened when we gave our assessments to students at two large state universities. On one campus, we quizzed mostly first-year students satisfying a distribution requirement. All but two of 57 ignored the 300-year time gap between the Thanksgiving painting and the event it depicts. Instead, they judged the painting on whether it matched their preconceptions, or simply took its contents at face value -- an answer we dubbed the “picture’s worth a thousand words” response.