You’ll have the boys a-lining up single file
If you just wear your smile.
Cass Elliot, “When I Just Wear My Smile” (1969)
As mask mandates eased across the United States, many women bemoaned the inevitable return of one of the more insidious banalities of misogyny: men telling them to smile. COVID-19 masking had offered a kind of consciousness-raising; the absence of any requirement to smile in public laid bare women’s habitual, constant emotional labor. One woman told a reporter for the Daily Beast, “Best thing about the masks is that men can’t tell me to smile when I’m out in public.” Another said she planned to continue wearing masks despite changes to the rules in her community, because “it’s just so nice and freeing to be able to decide whether to smile or not, just based on how I feel personally.”
These women’s comments were reminiscent of remarks made by Women’s Liberation activist Shulamith Firestone, who explained in her foundational 1970 book The Dialectic of Sex: “My ‘dream’ action for the women’s liberation movement: a smile boycott, at which declaration all women would instantly abandon their ‘pleasing’ smiles, henceforth smiling only when something pleased them.” Firestone’s use of the term “pleasing” remains machete-sharp, slicing through both sides of the compulsory smile interaction. A woman is “pleasing” to look at because she is smiling, and she is “pleasing” the man because he expects her to. At base, Firestone argues, the woman’s smile “indicates acquiescence of the victim to her own oppression.” And, if a man doesn’t get it—on the subway, at work, in the cereal aisle at the grocery, in class, at a club, walking down the street—he demands it. “You should smile more.” “Come on, lady, smile!” “Lighten up!” “You have Resting Bitch Face.” “Why are you so angry?” “Your clients/coworkers/boss would find you more approachable if you smiled more.” “Smile, bitch!”
Fortunately, our popular culture is finally starting to rally behind the position that men must stop telling women to smile. At the same time, however, a prominent subfield of psychology known as Positive Psychology, which purports to be the science of the good life, continues to insist that people—and especially women—should smile.