Dear Dudes: Please Stop Telling Women To Smile
Imagine this: You're walking down the street, minding your own business, maybe listening to music, or pondering your next meal, or thinking about the messed up thing you heard about on the news this morning. You're making whatever kind of expression your face has naturally settled into when all of a sudden some rando comes out of nowhere (seriously, where did homeboy pop up from?) and tells you to “smile!” Maybe he takes it one step further and informs you that you'd be “prettier if you smiled” or suggests that “it's such a beautiful day, you should smile!” Or, perhaps he just keeps it simple and barks the word directly into your face. Whatever the specifics of his statement, his words drop into your path like a bomb. By the time you have a chance to acknowledge what's happened and muster a response, the moment and the person responsible for creating it have likely passed, and you're left with a pit in your stomach and a current of rage coursing through your bones.
If you're a woman, this scene probably isn't something you have to try too hard to picture. In fact, some version of it has probably happened to you within the past month or so. A version of it happened to me just today! Like, literally as I was writing this. Because even with #MeToo, and the Weinstein Effect, and increased awareness about street harassment, too many men still don't seem to understand just how inappropriate it is to tell a woman to smile.
So, just in case you're a guy who still doesn't get it, let me spell it out for you. Not only does telling a woman to smile discount women's emotions and imply that we are merely decorative objects meant to look happy and pretty at all times (sorry, we're not), but it's also a prime example of a man telling a woman what to do while having exactly zero authority to do so. Sorry, random dude on the street, but neither you nor anyone else, for that matter, holds any jurisdiction over my facial expression. There are a lot of things I have to deal with in this crazy world of ours—people elbowing me on my insane morning commute, my shady landlord, the difficulty of getting white wine stains out of silk—so if I want to scowl or snicker or walk around with my damn tongue hanging out, that's my call. Mine! Not yours.
Oh, and while we're on the subject, most women also don't love being told how “beautiful” they are by random men on street corners, because it makes us feel unsafe, and also because, like, can I just live my life without somebody feeling the need to comment on my appearance? Or would it be too much to allow me to go an afternoon without remembering that, because I'm a woman, how I look is something I will always be judged on, for better or worse?
Dudes who tell women to smile or call them beautiful often try to justify it by saying they're just “being friendly” and trying to “brighten someone's day.” But nobody really thinks that, right? No one actually thinks that instructing a stranger on what to do with their face is just a nice thing to do, do they? And if for some wild reason they do, how come no one ever tells men to smile? Seriously, can you picture a man coming up to another man and suggesting he should “try smiling”? I can't, mostly because I think any guy who did that would probably get punched. But, societal power dynamics being what they are, most women are unlikely to have such a physical reaction, because, like I said before, women fear being hurt, assaulted, or worse by strange men. And frankly, I don't think dudes are trying to brighten my day. I think they're trying to exert their toxic masculinity all over it.
But even beyond the sexist implications, telling someone to smile is just an uncool thing to do, in the deepest sense of the word. It's kind of like that scene in Office Space when the main character's obnoxiously cheerful co-worker says “sounds like someone has a case of the Mondays!” It's just all-around awful and under no circumstances is it going to improve anyone's day, ever. Got that? Never. Never!
So, dudes, even if you don't care about being a sexist pig (but may I suggest that it's high time you start?), don't tell people to smile, because it makes you sound not only like a dick, but also like a naive, annoyingly chipper yokel who needs to get a clue about the world. Because if you were paying attention to many of the things that are going on in it, you'd understand why we don't always want to smile. And, hey, if seeing us happy is so important to you, maybe consider supporting some of the causes that are important to us, like equal pay, reproductive rights, affordable childcare, maternity leave, and ending rape and assault culture. Now that might make us smile.