I’ve been having a problem lately with the word “establishment.” It’s a two-part problem, and one part of that problem is that I cannot seem to read anything about our current election cycle without getting run over by “the establishment.” The other part of the problem is the difference between what it means and how we actually use it.

To take the first part of the problem, I keep hearing about how Trump supporters are against the establishment, and how Bernie supporters are against the establishment, and about how no, actually Hillary is also against the establishment, and Cruz is most definitely against the establishment, and to be safe, let’s just say all political candidates are anti-establishment.

We’ll gloss right over the problem of who the establishment actually is for now and accept that it’s fashionable to be against it.


A Few Bad Apples

Reliably, whenever issues of sexism, racism, and prejudice appear, so too does the phrase “a few bad apples.” University professors are harassing their students, but universities and media hasten to remind us that they are just a few bad apples. Police officers are abusing the people they are supposed to protect and serve, but mostly when those people are black—still, it’s a few bad apples.

“A few bad apples” is in-group language. It’s what you say when you identify with the group in question, and you just can’t believe anything bad about that group because it would also mean something bad about yourself. It is, in essence, group-level denial: that person did something I can’t be associated with, so that must mean they don’t really represent my group.


Confusing Categories: GMOs

My home state of Vermont voted to label GMOs not so long ago. Unsurprisingly, Bernie Sanders also supports labeling GMOs. Perhaps more surprisingly, some large food manufacturers are starting to move in that direction as well, notably and recently Campbell Soup. For many progressives and liberals, GMO labeling represents a win in food advocacy.

But I try not to have sacred cows, and I try to question everything, and thus I find myself forced into a different, and locally controversial, position. It’s not that I’m against labeling GMOs exactly, because I really am not sure what I think about that aspect of the debate. But I get stuck on one sort of important observation: “GMO” doesn’t mean any one thing. Or at least, I don’t think it means something in the way we keep talking about it.

Let me explain: so far as I can tell, there are three major categories of thing on food labels.


The Democracy of Language

As someone who works with words for a living, I always feel a little bit traitorous when I talk about the fluidity of language. My undergraduate studies in linguistics left me with a dialectic view of how language works. The conventional view of language, though, has the same deep currents of judgment and correctness and power that lie simmering in the rest of any culture.

Should I accept those? Should I treat language as brittle and defend narrow meanings and usages from shattering change? Or should I treat language as malleable clay that can and must be sculpted to best convey any idea? My inclinations lie obviously with the latter, but I retain a fondness for the certain bedrock of the former, and an empathy for those drawn to that view. Thus I am always aware that when I advocate use of the singular “they,” for example, I am casting a vote that goes against the grain for some.

The democracy of language is my bridge. By treating language as a democratic exercise, I acknowledge both the importance of the consensus view and the option of disregarding it. If a majority of English speakers believe prepositions should never be used at the end of a sentence, I can acknowledge and respect their view as a convention. At the same time, I can look at the native grammar of English and realize that prepositions are perfectly fine things to end sentences with.

Likewise I can treat pedantry with some respect, and embrace it in the spirit of inquiry rather than the spirit of restriction it so often carries. To be corrected, to hear that consensus view, is no shame—it is just a broadening of knowledge. But by receiving it as one option among many, I can just as easily and comfortably set it aside.

Still, there are some moments when pedantry slips out of the realm of mere language and becomes a channel for cultural currents. In the hands of the righteous, pedantry can acquire the sort of disdainful viciousness only well-chosen words can really achieve.

I do advocate the use of the singular “they.” I think it is a natural, elegant, and gracious choice to give respect both to those who identify outside a gender binary, and, equally importantly, to acknowledge that gender can be incidental. I should not need to know your gender to hear your ideas, to hire you for a job, to convey a delightful anecdote about you to a friend.

And so, when I read a sweeping dismissal from a colleague in higher education of the use or relevance of new pronouns, the singular “they” along with them, I am annoyed. I would like to receive this pedantry as a suggestion, but I find that difficult in the face of arguments that “they are entitled to their own identity, but not to their own grammar” (and why not, exactly?), and that faculty don’t know “how to deal with this violation of basic subject-verb agreement” (no doubt the top of the educational priority list). This is followed by some hand-waving arguments about not kowtowing to “safety” and “comfort,” as if those ideas have any relationship to the inclusion of a singular and a plural in the same word.

Happily, this author didn’t provide their gender, nor do I need it to discuss their work. Gender does not matter to my opinion, which is that they are engaging in the worst sort of pedantry—that sort of pedantry that is stolid and unyielding and defensive, the sort of pedantry that is the usual province of grumpy old white men and stereotypical English teachers. So I will address such pedants directly, and this author in particular, and I will do it using what, if they had any historical view of language whatsoever, would be the bane of their existence: the pronoun “you.”

For you, Melvin the pedant, who does not know this: a piece of linguistic history. The pronoun “you,” like “they,” was once solely plural. The singular second-person pronouns were “thee” and “thou.” As levels of formality began to drop out of English, so, too, did the distinction between the singular and plural in the second person. You probably don’t realize that you are using a plural pronoun in the singular every time you address one of your students. And thus does the rest of your argument, which rests on the implicit idea that language is fixed, collapse under its own decaying weight.

Please, if you must be a pedant, be an educated pedant. Make arguments out of elegance, out of convention, out of inquiry and desire for precision. Do not make arguments out of the specious idea that your own set of language is the only set, or that it should be.

Because, for the rest of us, I maintain that language is a democratic exercise. And you are being out-voted.

The Apotheosis of Form

I like to think about words. I believe that thinking about the words we choose is a wonderful way of pushing the bounds of our thinking. I believe that choosing our words carefully and drilling down in the nuances of their meaning helps us understand both what we personally believe and how others’ thinking is subtly different. I believe that strongly enough that I’ve written a number of posts now about the importance of choosing your words carefully.

In the discussions I’ve had on this topic, though, another theme has emerged: that of treating our words as if they are the only things that matter. I was discussing this with a close friend recently and she brought up the idea of “liberal shibboleths,” which I think is a brilliantly simple way to explain this problem. A shibboleth, after all, is “the watchword of a party,” and often “some peculiarity in things of little importance.” And lest I seem to single out liberals for illiberal use of shibboleths: there are plenty of conservative shibboleths, libertarian shibboleths, progressive shibboleths, and so on.

My friend and I have both seen moments when a well-meaning person is rebuked by members of the in-group for use of the wrong words. Sometimes that rebuke is called for—there are, indeed, people who are offensive with intent, and those people should be called on their behavior. But what of the rest? If someone reaches out honestly to understand a thing they are not, it’s natural that they not know how to speak about it. Why do we treat them as if they should? These are people who have taken a step outside their comfort zone—they do not need us to critique their form, they need us to show them new ideas.

There is value in treating people with respect. There is respect in describing people with the words they choose and not the words we choose. There is respect in recognizing what is offensive, and why, and avoiding it. But there is also value, and respect, in presuming the best of intentions. Certainly when a prominent white man publicly speaks of women as girls, the inherent sexism of his statement is worth critique. But if that man had gone to some of his colleagues with an honest desire to learn and asked how he should handle situations with “girls” in his lab?

Someone who wants to learn is a rare and precious commodity. What would you teach in such a moment? Would you teach this man that he is making unwarranted assumptions about half the human race? Would you teach him that basic human decency should not be dependent on gender? Would you teach him about women’s experiences when men view them as erratic, emotional, unintelligible aliens, instead of as human beings?

Or would you take this moment, this rare open moment, to teach him only that he is using the wrong word?

The thing I did not mention before is that a shibboleth is not merely a password or a badge of membership—it is a tool of exclusion. We know, by the words they use, who agrees with us and who does not. If we are complacent and unwilling to engage our own ideas, if we prefer superficial discussion with no dissent, the shibboleths tell us who to echo and who to exile.

In my opinion, the way we engage with outsiders is the true test—of whether our groups are bent on real, deep discussion and self-improvement, or whether they are rigid places where ritual is king and doubt is forbidden. We, who profess to be open to multiple ideas; we, who profess to believe in human rights and human decency; we, who claim to value discourse and discussion: it is incumbent on us to pay more than lip service to these ideals.

We can choose our words carefully, and we should. But we can make those choices out of understanding rather than prescription, and when we speak to those who disagree we should not conflate the two. The form is what we see, but it cannot be what we teach—because form, without the ideals to inspire it, is dead.

A Poor Choice of Words


Have you ever made a complete ass of yourself and then had to apologize later? Ever found yourself rapidly backpedaling from something you said that, while ill-judged at the time, seems head-smackingly foolish in retrospect? Have you ever found yourself stammering out an apology for “my poor choice of words”?

Personally, I can’t recall doing this—but I would bet that I have. I would bet that most people have (excluding incredibly inoffensive people, and assholes who never apologize). It’s not surprising that this phrase might come into your head at a moment of tension when you are fumbling for a way to take back something you said; after all, we hear it all the time. But if you ever find yourself about to say this, you really, really shouldn’t.

Last Friday I wrote about apologizing by claiming “it wasn’t my intent,” which is valid in minor incidents where good intentions can be presumed, but is often used to justify wildly prejudiced things. “A poor choice of words” is a close cousin: an apologetic phrase that makes perfect sense when you have a slip of the tongue, but not if you just said a meaner version of what you meant to say all along.

One place this phrase crops up often is in apologies from organizations, politicians, media personalities, and other individuals in the public eye. Rush Limbaugh called Sandra Fluke a “slut” and a “prostitute,” then apologized for his “choice of words.” Alaska representative Don Young called migrant workers “wetbacks” and then apologized for his “poor choice of words.” Congressman Geoff Davis called President Obama a “boy” and then apologized for his “poor choice of words.” Dr. Ben Carson drew analogies between LGBTQ individuals and “bestiality” and then apologized, you guessed it, for his “choice of words.” Senator Harry Reid gave Obama the backhanded compliment that he “had no Negro dialect” and then apologized, as usual, for “such a poor choice of words.”

You might have noticed a theme in all these examples: specifically, that these are all people expressing absolutely horrible, prejudiced things and yet they seem to think it was how they said them that mattered. In this insane upside-down world, you can hold opinions that are sexist, racist, or many other kinds of horrendous, but all that matters is the words you use to express them. The sentiment, apparently, doesn’t matter.

I assume that you, the reader, are already ahead of me at this point and have realized, if you didn’t know it already, that apologizing for “a poor choice of words” is not, in fact, apologizing. Instead it is downgrading one’s offense from believing something terrible to making some kind of slip of the tongue. “Oops! I totally meant to say something else instead of ‘subhuman mongrel.’ My bad!” “So sorry, I didn’t mean to say you were ‘a slut,’ I just accidentally said it out loud because I thought being sexist was funny. JK you guys!”

You might have noticed another insidious theme here, and I want to make it explicit because I think it is very important. Apologizing for “a poor choice of words” is the same as saying your original sentiment was fine. You are basically saying the horrible thing you said is a valid, acceptable thing to say.

So, if you happen to be a school with a dress code, say, and it happens to advise girls that “we don’t want to be looking at ‘sausage rolls’” and tells those same girls that “you can’t put 10 pounds of mud in a five-pound sack,” you should know that it is in no way sufficient to apologize for “unfortunate word choices.”

Now I know horrible non-apologies are put out there all the time, but that doesn’t mean we have to condone them or perpetuate them. If you see a leader apologizing for their poor choice of words, call them on it. Twitter, Facebook, whatever—let their terrible apology writers know that we do not accept their apologizing for word choice instead of sentiment. If your friends apologize to you this way, you may want to be nicer, but gently make it clear what is and isn’t a real apology.

Because the phrase “a poor choice of words” is indeed a very poor choice of words.

It Wasn’t My Intent

Intention is both more and less important than we allow. It matters what I meant to say and do, because those reflect my experience of the events in question. But what I meant to say and do may have little relationship to your experience of the same events. And the events themselves are yet another truth.

I am not suggesting it is easy to navigate these murky waters. It’s tough to anticipate how someone will respond to what you say or do, and it’s tough to know ahead of time how it will be perceived. Maybe you tap a friend on the shoulder to say hello and they jump out of their skin—you mean to say hello, they experience it as being startled, and the objective act (tapping them on the shoulder) holds neither connotation. In this sort of circumstance, intent does matter, and the phrase “it wasn’t my intent” may actually be reassuring. The hurt is minor and results from an innocent misunderstanding.

But there is a different usage of this phrase, and one that takes it well outside allowable bounds. Where I more often see “it wasn’t my intent” cropping up is in apologies where it really has no business being. I am talking about circumstances where the hurt is large, or there is no misunderstanding, or the consequences are so significant that intent no longer matters. In these cases the phrase “it wasn’t my intent” and its cousins are the phrases we trot out to abdicate responsibility.

Tim Hunt – World Economic Forum

For example, Tim Hunt used the phrase “I certainly didn’t mean that” this past week when apologizing for sexist comments about women (he called them girls) being a problem in labs. He was worried that women would “fall in love with him” and “cry” and be “distracting,” so Tim thinks they should be in gender-segregated labs. And in his apology he says he “did mean the part about having trouble with girls,” so he seems to be trying to have it both ways. By saying he “didn’t mean” to offend anyone, he seems to be saying that the inherent sexism of his views doesn’t matter, because he didn’t intend it to be offensive. Happily, lots of women in science jumped in to tell Tim just how wrong he is.

Nevertheless, this is how I usually see phrases like “it wasn’t my intent” employed. Not to clear up some real misunderstanding of meaning, but rather as a verbal scalpel to separate someone’s offensive views from the consequences of expressing those views. When someone says something steeped in prejudice and then claims “it wasn’t my intent” to upset anyone, they are effectively saying that there is nothing wrong with their views, and the fault lies in your response.

At this point some people may be thinking “hey, wait a minute, maybe Tim Hunt didn’t mean to be sexist.” They are probably right. And they may be thinking of some time that they said something prejudiced themselves and didn’t realize until after the fact—I know I’ve done this. And that is true, and a good point.

And it doesn’t matter. There is no plausible deniability for those espousing sexism, or racism, or homophobia, or any other prejudicial viewpoint. The offensiveness of prejudiced views and the hurt they cause cannot be separated. This is why the phrase “it wasn’t my intent” is such an insidious bit of misdirection—its basic role is to suggest that when someone is prejudiced and offensive, whether they intended to be matters more than whether they were. It refuses to acknowledge the prejudice as the problem, and thus it reinforces, rather than diminishes, the original harm.

“It wasn’t my intent,” we say, “to give offense. But of course, we are decent people, so if you were bothered by our prejudices, we will happily apologize for the bother, even though the problem really lies with you. Sorry.”

“It wasn’t my intent” is the “I’m sorry your face keeps hitting my fist” of rhetorical apology.

Live in the Real World

Maybe you were just saying how we needed our government to be something less than corrupt, or how women need to be safe in our society, or how evidence and logic should trump nonsense and prevarication, or how arguing about scientific realities is preventing us from dealing with them. And the person you were speaking with replied, with a touch of condescension, a hint of derision, and a little eye roll: “You have to live in the real world.”

Now, it really doesn’t bother me where people choose to focus their time and efforts. If they believe in a just cause, and they can maintain the effort for that cause, more power to them. If they don’t have time for what I think is most important, that’s fine, too—people can and must choose where to put their time, and it cannot be everywhere. But this is a special kind of righteous dismissal, and it isn’t what it sounds like.

When someone says, “You have to live in the real world,” they think they are saying that your suggestions are implausible, or foolish, or unachievable. They think they are gently steering you away from wasting your poor misguided energy on something that you, poor naïve soul that you are, do not realize is worthless.

But that isn’t what they really mean. What they mean is: “My version of the real world, the version I have created in my head, accepts these things as givens. So shut up about them already.” And maybe they mean: “…because changing them is too hard.”

Mostly they mean: “Stop making me question my assumptions.”

Now, there are reasonable ways to disagree with people. If one person is talking about systemic inequalities that disadvantage the poor and how we need more support structures, and another person is suggesting that the poor are leeching off the government safety net like the parasites they are, there are clearly grounds for disagreement. There is also room for both sides to support their arguments, and to reconsider their own assumptions. And even though I find one of those views offensive, that’s the place I would rather be in a discussion. Debating issues with people when I disagree with them helps me learn how their views differ, how to support my own, and where I am wrong.

But if either person says to the other “You have to live in the real world,” that room for discussion is lost. Instead of being a point of discussion, the issue has become a point of judgment. That is the signal to me that the other party in the debate doesn’t care about finding the truth of an idea, only about preserving their own worldview.

What’s worse, this phrase doesn’t just come up with people I radically disagree with—in its most painful, useless, divisive form, it is coming from people I would like to have as allies. It can appear in even apparently friendly discussions, but its true meaning remains. “You have to live in the real world here. Wind power isn’t the answer; solar is the only workable choice.” And maybe: “The government is never going to be effective, so less of it will always be the better option. You have to live in the real world.” Or perhaps: “Of course men shouldn’t rape women, but you have to live in the real world—how you dress is going to matter.”

I can’t make people stop saying this. Neither can you. I can’t even avoid being frustrated every time I hear it. But we can keep the true meaning in mind. If we hear others saying it, we can translate and regroup—okay, what assumption are they guarding here? And, if we find ourselves thinking this about someone else, we can use it as a reminder to question our own assumptions.

Because, in the real world, people are going to say this, and think it, and not always know what they really mean. But in the real world, some of us believe in trying to build a better one.

The Mad Misnomer

The trope of the Mad Scientist pervades popular culture and popular awareness. One of the major archetypes, Victor Frankenstein, embodies the trope as an obsessed man reanimating the dead through perverse experimentation. In some cases, such as that of Doctor Jekyll, the mad scientist engages in well-intentioned but equally doomed self-experimentation. In still other cases—Lex Luthor, for example—the mad scientist creates fantastical devices that enable his madness. Superhero stories are rife with Mad Scientists as villains and heroes both, though even the heroes seem untethered and at risk from their own brilliance. Tony Stark builds Iron Man suits in his basement from his own inventions, yet those inventions are constantly being turned against him. Even the delightful Dr. Horrible fails his endeavors mainly through the failure of his own inventions. For the Mad Scientist, their brilliance is also their Achilles’ heel.

But are any of these really scientists? I rather agree with Sanjay Kulkacek, who said we ought to call them Mad Engineers:


Or perhaps just Mad Inventors. But key elements of science—awareness of bias, quantifying uncertainty, testing ideas slowly and methodically, cooperatively generating knowledge—seem totally absent from the archetype. Mad Scientists are people who work alone, fueled by their own brilliance, creating fantastical things by reorganizing bodies, technology, or both. Scientists pursue knowledge with uncertainty, but the Mad Scientist is recklessly sure of their own conclusions. Scientists work in groups, but the Mad Scientist works alone, cut off from society and their peers. Scientists employ method to reach understanding, but Mad Scientists achieve their goals through leaps of unreachable brilliance. Scientists are slow and careful, and Mad Scientists are capricious and haphazard.

The Mad Scientist can be an entertaining character, but I fear the associations with science drag the public perception of science in the wrong direction. And the one feature of the Mad Scientist that I most dislike is their isolation, because when you take away the “mad” part, you are left, not with science, but with another mistaken trope: the Lone Genius. Of course, real science is collaborative, but that’s not how we, the public, tend to think about it. We’d rather imagine a Nikola Tesla holed up in a mansion inventing a ray gun than a collaborative group of hundreds of people carefully planning, funding, building, and launching a mission to Pluto.

At the risk of being repetitive: collaboration in science is fundamental. Understanding that is the difference between two opposing views of science. The first is of a whimsical group of unintelligible geniuses who argue about whether eggs are good for you; the second is of a collective endeavor of humanity to achieve the best possible knowledge of the world.

In the first view, the public view, the view that underlies the majority of science news reporting, there is no way for the public to assess the truth of science, and thus it is a view that fundamentally mistrusts science. One person says one thing, another person says differently; how can we laypeople tell the difference? In this paradigm, we can’t even grasp the concept of scientific consensus, because every study on evolution or climate change is divorced from every other. Every scientist is just one step away from being mad.

In the second view, however, science is a body of knowledge. We can and do assess truth by consensus—when 98% of climate scientists agree that the earth is warming and humans are causing it, that means far more than any one study or argument by one dissenter. What’s more, we can look at the body of data to ask and answer questions about where the truth lies and what our best knowledge is at the present. That makes science—real science—accessible, but only if we know to look.

The Mad Scientist undercuts true science, and also represents our failure to understand it. In the trope, the scientists are the ones who lose touch with reality, but scientists in the real world are diligently and deeply engaging it. When it comes to understanding science, the ones who lose touch with reality are the rest of us.