"Shall," "Will," and Not Knowing It All

I like to think of myself as well-informed when it comes to usage customs and grammar rules (and "rules").  But my knowledge of usage is not exhaustive, and I learn more every day, usually by accident, necessity, or both.  I have no problem admitting this because, as the Laughing Saint, Philip Neri, taught, humility is healthy for us all.  It's a delicate balance to strike---being humble and being an expert.  Admitting that my knowledge isn't encyclopedic is one way to attempt such a balance, perhaps.

Not long ago, I overheard someone say that shall is only used with the first person.  I scratched my head, assumed this person didn't know what he was talking about, and went about my merry way.  That was foolish of me in many ways.  First, I assumed that I knew more or better without evidence.  That is, I didn't know this person's full background.  Turns out, he teaches linguistics, so there was at least some chance that he was right or that he at least had a reason to say something that, to me, sounded utterly wrong.  Second, I assumed that I had an exhaustive understanding of usage.  I do not.  No one does.  Third, in assuming as much, I was giving up any chance at learning (a) what the truth is and either (b) why the custom/rule is as he said it is or (c) why he'd say it is if it's not.  I was being lazy at best and self-righteous at worst.

So there I was, several weeks later, looking at someone else's use of shall and wondering, "Was that guy right?"  The question, once ignored, now nagged at me.  I picked up my grammar books.  No information there.  That told me that this must be a usage issue, a matter of custom and/or culture.  So I picked up my soon-to-be-erstwhile CMS 16th edition and turned to the usage section.  Nothing under shall.  Then I unshelved my Oxford American usage handbook, and lo and behold, clarity.

Here's the skinny on shall v. will, according to Bryan A. Garner: "Grammarians formerly relied on [a] paradigm, which now has little utility."  That paradigm is that when shall is used with the first person (i.e., I or we), it indicates futurity (i.e., that something will, indeed, happen in the future).

Example: I shall go to the store later today.  But he will not.

However(!!), when shall is used with the second (you) or third person (i.e., he, she, it, or they), it connotes a command or promise, an obligation.  It suddenly has what might be described in speech-act theory as illocutionary force---it does something in addition to meaning something.

Example: I will not agree to that contract, and if you wish to remain in business with me, you shall not, either.

Will indicates futurity for second and third person but not first.  When used with first person, it has the illocutionary force of indicating a promise or command.

The distinction is very fine, highly contextual, and therefore easily disregarded.  I'm not saying it's not a useful distinction; I'm saying I'm not surprised that people have stopped honoring it (did they ever?  I do wonder).

Garner includes this pertinent quip from "Professor Gustave Arlt of the University of California":

The artificial distinction between shall and will to designate futurity is a superstition that has neither a basis in historical grammar nor the sound sanction of universal usage.  It is a nineteenth-century affectation [that] certain grammarians have tried hard to establish and perpetuate. ... [T]hey have not succeeded.

Ouch.  So, does the distinction exist?  Sort of.  Am I surprised that I hadn't happened upon it?  No.  I wouldn't fault any of my linguistics or English professors for not teaching it.  Was the person whom I heard articulate the rule as easily dismissed as I thought?  Well, no.  He may only have had one side of a story that is increasingly not being told, but "wrong" is too strong.  In the end, having the distinction in mind is useful, even if I continue to use shall to connote promises or commands and will to indicate futurity in a sort of blanket way.  I won't go around correcting anyone who hangs on to this person-based paradigm.

Why It's Never Okay to Self-Plagiarize, Especially If You're a Scholar in the Humanities

When I was a writing program administrator, I dealt with plagiarism occasionally.  It's a fact of life, but it's rarely something to get wrapped around the axle about.  That said, I have something on my mind, and I'm not going to pull any punches with this post.  Even if you're not in the field of academia, keep reading.  I'm going to explain a different way of thinking about plagiarism than you're probably used to, and I'm going to give you some insight into how we should value the services that academics in the humanities provide you.

My Students Almost Never Plagiarized.  Here's Why.

There's probably as much moral outrage connected to plagiarism as there is confusion about what actually constitutes plagiarism.  That's not a stable combination.  But as a teacher, I rarely had students plagiarize in my classes.  This was for three reasons:

  1. I made them write drafts, sometimes in class, so no one could show up to class with a complete paper out of the thin, blue sky.  They'd forfeit a substantial portion of the paper grade if they skipped the drafting process.  We also talked explicitly about what plagiarism is, and they knew that part of the reason they were doing drafts was to help them avoid plagiarism.  There was no mystery to the process, because I wasn't trying to catch them or trick them.
  2. I had unique paper prompts that required writers to synthesize sources, address unusual topics, and/or incorporate their own experiences.  This is the number-one way that teachers can avoid cases of plagiarism.  Assignments that ask students to do something truly unique (like incorporating their personal experiences into their analysis or analyzing things that few other folks would think to analyze) are a good way to avoid getting paper-mill papers.
  3. I warned students in the first few days of class that I am a rhetorician with enough training in linguistics that I can analyze their rhetorical/linguistic/discursive fingerprints based on samples of their in-class writing and compare that analysis with a similar analysis of any paper they turn in that I think might be plagiarized.  Armed with forensic linguistics, I would tell them, I could bring charges of plagiarism that would be pretty hard to deny, even in the absence of a matching source, if the analysis indicated that plagiarism had, indeed, occurred.

But when I did catch students plagiarizing, it was usually because they were:

  1. Ignorant about the topic they were to write about.  These were the students who'd bailed on class or hadn't done the readings.  They were stealing other people's ideas (yeah, I said it: stealing) because they didn't have any of their own to put into words.
  2. Ignorant about what constitutes plagiarism.  Yes, sometimes you have students (especially from foreign cultures) who just don't understand how our culture defines plagiarism, no matter how much we discuss the basics of plagiarism in class.  They might not realize that it's not okay to take a sentence from a paper they'd written in high school and plop it into a new paper, or that they have to provide actual citations for all materials--directly quoted or otherwise--that didn't come from their own brains or aren't common knowledge.  My response to this was, typically: "Except for common-knowledge issues, if you can cite a source for any idea or words you're writing, then you must."
  3. Out of time.  Maybe they knew the material inside and out.  Maybe they knew what plagiarism is.  But maybe they put off writing the paper and just didn't have the time to write 6,000 words in the three hours (or whatever) before class, so they decided to lift someone else's ideas or words without proper attribution.  Yikes.  That's why plagiarism penalties exist.

All of that is understandable, if not always excusable.  We hold students to high standards, and it's our responsibility as teachers to teach them what those standards are so that they can live up to them.  We're also here to help them do that "living up to" part.

Here's the thing: We cannot do that if we, their teachers, are plagiarists.  We have to hold ourselves to the highest standard if we want to be taken seriously.

Who's Afraid of the Humanities?

Nary a month goes by but that there's an op-ed piece in a major newspaper or academic trade publication about how important the humanities are.  Ever wondered why that is?  It's not as if there are op-eds about how unnecessary the humanities are, right?  Well, it's true that after the boom times of the 1990s, university budget cuts struck humanities programs first.  These programs weren't flashy (no robots getting built by philosophy professors, even though their work makes AI possible), they didn't get big grant funds (no pharmaceuticals being created by cultural-studies experts, even though their work informs how we categorize disorders and diseases), and they appeared to be more expensive than they were worth (even though some courses, like first-year writing courses, are huge money-makers for universities, largely because they're so cheap to teach and because students are conscripted into them).  Right before I went on the job market for a tenure-track job, the economy crashed, and English and other humanities departments around the country dried up.  Suddenly, our scholarship wasn't as valuable as it had been; it wasn't worth the same level of investment in the form of professorships and departmental funding.  So it goes.  The humanities really are vulnerable to the money-focused forces that steer contemporary universities. 

The problem isn't that we're not actually valuable.  Nor is it that we're not inherently valuable--that is, valuable for our own sake.  Humanities scholarship is valuable, and humanities scholars have to be able to articulate the nature of that value in order to persuade others of it.  Torrential rainstorms of ink have been spilled in the effort to articulate that value, so I'm going to keep it brief here, but the best case I can make for the value of the humanities is this: Imagine that everything we know about human culture didn't get passed down to the next generations.  Imagine that in two generations we won't know anything about what we were doing at any point more than 200 years ago.  Imagine that we didn't understand anything about ourselves and how we got to where we are.  Sounds dangerous, right?  Not to mention wasteful.  That's what abandoning the humanities means.  We're worth investing time, effort, and, yes, money in.

Self-Plagiarism (Especially in the Humanities) Is Damn Ugly

The following scenario is hypothetical, okay?  But let's say that in the course of being the loving, diligent copyeditor of a book written by a group of smart, capable, insightful scholars in the humanities, I see a bit of code that indicates that a few words have come from an online source.  I used to see this code in my students' papers all the time when they'd copy and paste a quotation from whatever online source they were reading.  With proper attribution, this is not a problem.  In fact, with proper attribution, signal phrases, and full integration of whatever was copied into their own ideas and sentences, the inclusion of those outside words--whether they were copied and pasted or not--would constitute successful academic writing.  The code in and of itself wasn't the problem for my students.

So let's say I decide, "Well, I'd better double-check that these words don't need to be cited, since they seem to be copied and pasted."  Because the words aren't cited.  Why would they be copied and pasted, then, I might wonder?  Let's say that I then search the interweb for the words and find that, lo and behold, that exact phrase has already been published in an article on the same topic in a peer-reviewed academic journal that specializes in that very topic.  Gasp!  And not cited??  This is not okay!

What I've just described to you in this hypothetical situation is plagiarism.  The author of the chapter hasn't given attribution to the exact wording of a pretty distinctive phrase that comes from another source that, in all likelihood, the author came in contact with in the course of doing research for this chapter.  This is standard plagiarism, the kind an editor can query: "Does this sentence require attribution?  It comes from an outside source.  Please provide complete citation information."

But let's say I look at the byline for the article from which this phrase has been flat-out plagiarized, and I find that the article was written by the same person who's written the chapter that I'm currently copyediting.

Um, no.  No, no, no.  Say it ain't so.  This humanities scholar has self-plagiarized.  This person has just repeated themselves verbatim in a totally new work of "scholarship."  And, let's go to the worst-case scenario: let's say that this person is a rhetoric-and-composition scholar with a tenure-track position and has even written a textbook about academic writing.

Let's say that happened.  Just, like, hypothetically.

This is truly ugly.  It's hypocritical.  It's professional malpractice.  It's self-sabotage.  Any humanities scholar, especially someone who specializes in rhetoric and/or writing, has no excuse.  They can appeal to none of the three reasons why students might plagiarize.  Let me count the ways:

  1. Self-plagiarists in the humanities, especially writing-studies specialists, cannot claim ignorance of the subject matter they're writing about.  Clearly, as people who have published on this topic before, they should be able to think of new things to say about it.  If they can't, they should take several seats and let someone else who has something new and fresh to say have a chance.  But this is one of the many problems of the academy today: publish-or-perish leads to a glut of echo-chamber publications.  It leads to cliques of scholars publishing one another's scholarship once one of them lands an editorial position.  Perhaps self-plagiarizing humanities scholars think that no one is actually reading their work, at least not closely, and that they'll never get caught.  That's abject cynicism.
  2. They cannot claim to be ignorant about what constitutes plagiarism.  If you're a professional academic, you've encountered dozens of definitions of plagiarism.  It's your job to enforce plagiarism policies in your classes.  You can't say that you didn't realize that just repeating your own words and not providing a citation to that information is dishonest.  You can't say that you think there's no harm in trying to get ahead in the publish or perish game by cutting corners, by trying to seem as if you've got new, fresh ideas when in fact you're just repeating yourself.  This is why outsiders don't take the humanities seriously.  Things like this.  When we don't actually bring new, worthwhile knowledge to the table.  This is why.  This.
  3. They cannot claim to have run out of time.  Behind on that deadline?  Either ask for an extension or sit down and let someone else have a go.  I'm in the middle of an epic battle with myself about whether I'm ever going to get a chapter submitted for a certain edited collection.  But I'm not going to steal someone else's words or try to pass off words that I've already published somewhere else in order to have another publication line on my CV.  Neither would I steal just one sentence.  It's not going to save me that much time.  The minutes saved by not thinking of a new way to phrase that same idea won't buy me another student advising session, another email, another meeting, or enough time to prep a whole class.  So why bother?  It's just lazy and ugly, and it suggests that what we do is cheap and not worthy of building upon.  It suggests that even we know, deep down, that what we do is just the same thing over and over.  As long as we get the publication glory, right?

So, I'm not saying anything.  I'm just saying.  If you're a scholar in the humanities and you're thinking of self-plagiarizing, don't.  Wait to write until you actually have something new to contribute.  Give someone else a chance if the best you can do is repeat yourself.  If we're at this cosmic cocktail party together, then just remember that no one likes to chat up the person who just keeps saying the same thing over and over.  What's the point of listening to that?

How to Correct Someone's Usage, or: (Not) Making Usage Great Again

I used to date a philosopher.  He was (is still, I'm sure) brilliant.  I remember having a long, adversarial conversation with him about the use of "beg the question."  In case you don't know--and many people don't--"beg the question" is a technical term.  It's used to refer to a flaw in logic/argumentation in which the premise of your argument already assumes the truth of the very conclusion you're trying to establish.  If I say that G-d exists because G-d said "I AM," then I'm begging the question: it's already assumed in my assertion--that G-d said "I AM"--that G-d exists.  (This possible logical fallacy is not a problem for me as a Christian, because I haven't confused logic with faith, and I don't require that my faith be logically sound.  But that's a conversation for another day.)  The philosopher was making what philosophers would call a "strong" claim that anyone who misuses "beg the question" should be corrected lest the phrase lose its meaning because of (but not due to) misuse.  That is, if everyone uses it to mean "presents the question" or "requires you to wonder," then no one will know its (true) technical meaning!!  And how will we sleep at night??

The point I made in response to him was descriptive (though he took it as prescriptive): the phrase is already being misused, so don't get too hung up on correcting everyone, because that's a Sisyphean task. 

The philosopher was not amused.

What Is Usage?

Usage has to do with how we use language--from punctuation to turns of phrase--to communicate.  It's governed by convention, not divine law and not dictionaries.  It changes over time.  What you learned about the "right" (read: customary) way to say or write this or that can differ greatly from what someone who lives a few blocks, states, or continents away from you learned.  I'll never forget telling a flatmate of mine in London to stop talking about her "pants" because the Londoners in the room were getting uncomfortable thinking that she was talking about her underwear (she meant what they'd call "trousers").

Do you have a pet peeve about a phrase that gets commonly misused (or so you think)?  They're everywhere.  Some of my favorite examples:

  • "for all intensive purposes" should be "for all intents and purposes"
  • "flushed out" should be "fleshed out" ("I fleshed out the details")
  • "moment being" should, to my ears, be "time being" ("I'm home, at least for the time being")

Some pet peeves might turn out just to be regional variations that you didn't know about.  My hillbilly kinfolk say "you'ns" to indicate the second-person plural.  Think that's annoying?  Sorry, but it's not wrong, at least not according to certain regional dialects of English.  It's just not customary outside of that regional variation.  For all I know, "moment being" might be the same way.

When you think you've spotted a misuse of English, here's what you can't attribute it to:

  • stupidity
  • neglect
  • moral failure
  • an untrained mind
  • poor parenting
  • economic background
  • poor education.

The philosopher was convinced that he had it "right," and he did, in a technical sense.  But people who say "beg the question" in a non-technical sense probably don't have his extensive and excellent training in philosophy.  It's not because they're dumb or lazy or had parents who didn't discipline and/or love them sufficiently.  It's because they just don't know.  People rarely like getting usage wrong; we hang so much judgment on using "correct grammar" (a phrase most people misuse, muddling both what counts as "correct" and what "grammar" actually is--add it to my pet-peeve list), so it's unlikely that the misuser is doing so on purpose.  It's hard to judge someone for getting something wrong when they didn't know it was wrong in the first place.

Correcting Misusers

Oh, wait.  The subheading here and the title of this post kinda beg the question, don't they?  We're assuming that we should correct people who misuse language conventions!  I don't accept that assumption, actually, so let's approach this issue somewhat algorithmically.

How to determine whether you should correct someone's usage:

  • The most important question must be: do you know FOR A FACT that the phrase (or whatever) in question has been used in a way that does not adhere to current convention?  Could you point to a passage in a handbook, for example, that unequivocally proves that whatever you're about to lay down a correction for is, in fact, in need of correction?  If not, abandon your intention.  In this case, you do not possess the requisite knowledge, expertise, or validation to issue a correction. 
    • What can you do instead?  At best, you could ask a question: "Oh, that's interesting.  You said 'beg the question.'  I thought it was only used to refer to logical fallacies.  Have I gotten that wrong?"  Always, always assume the position of humility.  Do not ask, "Where did you learn to say it that way?" or "Were you aware that it's actually...?"  Your objective is to make, not alienate, friends, right?
  • Is the person you want to correct a loved one to whom you are not a parent?  If yes, then...
    • What can you do?  Don't correct them.  Why would you want to?  Just let them be.  That said, parents get the right to correct their children's everything: behavior, attitudes, use of salad forks, and language.  Parents, you still need to answer that first question in the affirmative if you're going to correct your kid's language use without appealing to a handbook or authoritative resource.  But if you think that your kid has misused a phrase or word, you can say, "I don't think that 'beg the question' means what you think it means.  Go get your English handbook [or tablet or dictionary, etc.], look it up, and come tell me what it says."  That way, you'll both learn things!  And you'll be modeling for kiddo that it's okay not to know things and to risk being wrong.  Takes a lot of strength, that.
  • Is the person you want to correct someone over whom you have some kind of managerial authority?  That is, you're his boss or you're her mentor or you're their teacher.  In that case...
    • What can you do?  Never, ever, ever correct that person in front of other people.  Again, the less-enlightened among us still judge others by how "correctly" they use conventions.  Be aware of that before you go shaming someone for misusing "flushed out."  That said, in large office settings, if you're the boss, you might be able to get away with a general email that says something like "I want to make sure we're using 'flush out' correctly.  Unless we're talking about plumbing, we ought to avoid it.  Let's make sure we're using 'flesh out' from here on to refer to adding details or looking at additional information.  That'll help us stay consistent across the whole office."  But in individual contexts, I would recommend adding a comment about a misuse as an afterthought to something else: "It really was a great first draft.  I'm glad we've spent the last 30 minutes discussing it.  By the way, before we talk about when we're having our next conversation, I noticed that you use 'flushed out' when I would have used 'fleshed out.'  I looked it up in my usage dictionary, and where you have 'flushed out,' it should be 'fleshed out.'  I wanted to make sure I mention it to you so that you can adjust this draft.  It's important to impress your readers, so I wanted to make sure that you've got the tightest prose possible."  Wordy?  Yes.  Tactful?  Mostly.  Better than red ink with no explanation for the correction or why it was important to make?  Totally.
  • Is the person a stranger to you?  Then stop.  You'll exhaust yourself trying to be everyone's real-time, flesh-and-blood copyeditor.  It's not your job to make usage great again.  Change comes to all things, and if "flesh out" becomes "flush out," what's the difference?  If "begging the question" has both a technical and a colloquial sense, the Earth will keep orbiting the sun without your correcting this hapless (mis)user.
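
If you'd like that decision tree in literal code, here it is as a toy sketch in Python.  It's nothing more than the list above, condensed; the function name and its inputs are my own invention, not anything official:

    def should_correct_usage(know_for_a_fact: bool, relationship: str) -> str:
        """Toy version of the decision tree above: what to do about a perceived misuse."""
        if not know_for_a_fact:
            # The first question: no handbook passage to point to, no standing to correct.
            return "Don't correct.  At most, ask a humble question."
        if relationship == "loved one":
            return "Don't correct.  Just let them be."
        if relationship == "your child":
            return "Send them to look it up; you'll both learn something."
        if relationship == "employee, mentee, or student":
            return "Correct privately and tactfully, never in front of others."
        # Strangers and everyone else: it's not your job to make usage great again.
        return "Don't correct."

    # Example: a direct report's draft says "flushed out" where "fleshed out" belongs.
    print(should_correct_usage(know_for_a_fact=True, relationship="employee, mentee, or student"))

Notice how many of the branches boil down to "don't."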

There are surely other scenarios I'm not thinking of, but the upshot is this: how should you correct someone's usage?  Generally, you shouldn't.  You should only intervene--and then, tactfully and empathetically--when the quality of your/your employee's/your student's/your child's work and/or reputation is at stake.  And that's if and only if you know for sure that the correction you're making is actually a correction and not, say, just an imposition of your own personal standards.

Life's short.  There are some battles worth fighting.  "Begging the question," in most (but not all) circumstances, isn't one of them.

Much Ado about Singular "They"

I promised myself that I'd only spend an hour on this post, because rapid rivers of ink have gushed forth from those smarter and more qualified than I to opine on the matter of the use of singular "they."  But a friend and colleague asked the other day whether it's right or wrong to use the singular "they," so let's have that conversation.

Neither right nor wrong

As with most usage issues in English, it's not as if there's a definitively right or wrong way to use singular "they."  I say that with my descriptivist hat on: I'm trying just to describe how English gets used, not lay down prescriptions about whether it should be used in this way or that.  The "should" approach is called the prescriptivist approach.  We'll get to that.  But the upshot is that you'll never hear me or anyone on the Laughing Saint Editorial crew say that singular "they" is right or wrong per se.  We're going to talk about whether it's appropriate once we get into the prescriptive side of things later on.

So, what are you going to learn in this post?  A little bit about what other experts say, a little about gender theory, and a little about yourself.

Please ignore Grammar Girl

I've got a post I'm saving up about why Grammar Girl isn't your friend (do you use WebMD instead of a doctor?  No.  So you shouldn't use Grammar Girl and assume you've gotten accurate grammar/usage advice.  I digress).  That said, we do need to start this conversation by looking at what experts (i.e., not Grammar Girl) have said about the use of singular "they," both descriptively and prescriptively.

I'm a rhetorician with a background in linguistics (my dissertation director was a nationally recognized linguist who helped create the Dictionary of American Regional English, and I have something like a master's worth of coursework in English grammar, including English-language history and functional, cognitive, and generative grammars), which means I think of language/linguistics as the foundation of rhetoric.  Rhetoric is, more or less, how we use communication--verbal and otherwise--to do things.  Note the importance there of the word "use."  "English language usage" refers to customs of use, not the rules (flexible and dynamic though they may be) of grammar, which is really about how words get put together to make sense (but whether they achieve some purpose, well, that's a question for rhetoric and usage and style rather than grammar).  Grammar is how the Lego blocks fit together; rhetoric is whether you've used your blocks to make a castle or a bridge and why you'd want to build one or the other.

So, before we can understand whether and how to use (!) singular "they," we have to understand its basis in language and the history of the language.  The fact is that singular "they" has been used in the English language since before "correct spelling" was a thing.  There's something like four centuries of time lag between the two, actually: singular "they" is at least as old as the 14th century, and spelling and other matters of language use were being codified in the 18th and 19th centuries.  Those are just the descriptive facts.  So the historical argument suggests that there's precedent both for using singular "they" and for not using it, once language standards started to be implemented.

Since history won't save you, how about the brain?  Here's a great analysis from the Bible of approachable scholarly linguistics news, Language Log, about how using the singular "they" has been shown to require increased processing time (meaning: it does seem to take a handful of milliseconds longer to understand what "they" or "their" refers to when it refers to a singular noun).  But we're talking about milliseconds, not a complete breakdown in semantics (how sentences make meaning).  So cognitive arguments aren't going to save you, either, because it would be patently risible to claim that understanding singular "they" creates such a significant cognitive burden that singular "they" should never be used.

What about semantics, though?  Can you argue that the use of singular "they" creates vagueness in sentences that can't be overcome?  Actually, I think this is the best argument against using it.  Note that the sentence "The student took their book to class" makes perfect sense to some folks.  To me, it causes at least momentary confusion: Wait, did this person intend to use "their"?  Are we suddenly talking about some other group of people?  Did I miss the change in subject or meaning?  Grammatically, we can change the rules (or change them back) such that "they" is officially alright to use in singular contexts, but just like I have to read some sentences a couple of times before I understand whether "read" is past or present tense, I might have to read a sentence that uses singular "they" a couple of times before I'm assured of whom we're talking about, and there might still be some ambiguity.  Furthermore, at the moment, we don't allow for "themself," which would be the singular reflexive form of the plural pronoun, which suggests to me that we're still not there, descriptively, when it comes to the singular use of the plural pronoun.  I could be asking for too much (we use "themselves" as the singular reflexive instead), but I think that when we get to "themself," we'll be fully in singular "they" semantics.  Right now, we're not.

On the ground

That said, when it comes to language use on the ground, some folks use singular "they" to promote a gender-neutral perspective.  In many ways, I support this, but that's not my only reason for using singular "they," though I don't use it all that often (see the semantic argument against it, above).  Still, I do use singular "they" not only in speech but occasionally in my Oratoria posts and elsewhere.  There are good reasons to use singular "they," just a few of which include:

  1. Respecting other people's wishes when it comes to the pronouns they prefer.  Don't be a self-righteous jerk: if someone asks that you use "she" or "they" or "he," just do your best to do that.  There's no reason to make a big deal about it.  If that person changes their (!) mind about it later (an argument I've heard against having to keep up with personal pronoun choices), what's it to you?  You can be forgiven slip-ups, in that case, but remember that most of the time, making that kind of change isn't something an individual takes lightly, so don't plan on being asked to adjust more than once per person.
  2. Accepting linguistic (and social) change.  It's not so much that words change as that our use of them, and our rules for them, change.  Happens all the time.  Don't imagine that you've got the form of English that G-d loves best.  Unless you've got some stone tablets lying about that you wanna tell us about, and those tablets are about grammar and usage, there's no reason to be upset about language change.  Unless you're trying to use language as a tool of oppression or control, that is.
  3. It fits the context you're writing or speaking for.  If I'm editing a blog post written for a website geared to people in their teens, I'm giving a pass to singular "they."  If I'm editing a book written by one of my clients in the business world, there shall be no singular "they" if I have anything to say about it.  The Chicago Manual of Style, for example, is unequivocal about its rejection of singular "they" (I have to believe it's because it occasionally creates ambiguities that can't be resolved semantically/grammatically).  So is The New Yorker.  As a rule, I allow singular "they" if and only if the following conditions hold (there's a little sketch of this checklist, in code, just after the list):
    1. It doesn't create ambiguities that can't be resolved easily by the reader
    2. The applicable style guide allows for it
    3. The audience is likely not to have a fit about it
    4. The writer wrote it (i.e., I won't go adding it where it isn't already).
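
As promised, here's that if-and-only-if checklist condensed into a toy bit of Python (the function and its parameter names are mine and purely illustrative):

    def allow_singular_they(creates_unresolvable_ambiguity: bool,
                            style_guide_allows_it: bool,
                            audience_will_tolerate_it: bool,
                            writer_wrote_it: bool) -> bool:
        """Allow singular 'they' if and only if all four conditions above hold."""
        return (not creates_unresolvable_ambiguity
                and style_guide_allows_it
                and audience_will_tolerate_it
                and writer_wrote_it)

    # Example: a blog post for teens whose writer already used singular "they."
    print(allow_singular_they(False, True, True, True))  # True

Four ands, no ors: miss any one condition, and singular "they" gets edited out.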

So, it's not as if this is a right-or-wrong issue.  It's really a matter of rhetoric: how do you plan to use singular "they," and why?  Will it work in your specific context?  Those are the salient questions.

How I Used to Teach the Which/That Comma Rule

For some reason, I find myself adding a lot of commas before the word "which" lately.  It's just a fluke; I don't think it's due to a moral failing of our educational system or a lack of personal fortitude on the part of the writers I'm working with.

But it does make me sad that I'm not still in the classroom teaching students my awesome method for remembering when and why, more or less, to use a comma before "which" and when to use "which" rather than "that" in the first place.  So I've decided to share my method here instead!  I hope that all you undergraduate writers and writing teachers will find it useful.  Remember: sharing is caring!

Don't Trust Your Gut

The conversation usually began like this: I'd ask my students how they knew when to use "that" instead of "which."  More often than not, they had no concrete idea why; they just used their intuition, if they were native speakers, and while trusting your gut may have been good enough for Stephen Colbert, it's not sufficient for command of the rules/common standards of US English.

Here's one of my favorite examples to use in class:

This spacesuit, which I wore yesterday, was made in 1965.  It is kept in the museum that I told you about last night.

For not-entirely-arbitrary reasons, in US English "which" is a non-restrictive relative pronoun in contexts like these, and "that" is a restrictive relative pronoun (as opposed to being a demonstrative, as in "Look at that spacesuit," but I digress).  Accepting that seemingly-but-trust-me-not-totally-arbitrary rule is step one.*

What Makes "Restrictive" Restrictive?

What, after all, is being restricted?  In short: the meaning of the word that the pronoun stands for.  In the first sentence, when we start the clause after the comma, we need to refer back to the noun we're describing, and it sounds clunky to say "This spacesuit, the spacesuit I wore yesterday, was made in 1965," so we use a relative pronoun to cut down on the wordiness (and yeah, you could drop "that" from the second sentence altogether, but you'd just be eliding the pronoun if you did, and I'm trying to explain the grammar to you, so play along with me here).  In the case of the first sentence, the use of "which" should indicate that the additional information about the object being described--the spacesuit--is not necessary for identifying the object.  That is, the fact that I wore the spacesuit yesterday does not restrict the meaning of "spacesuit" in this sentence to the object being described; it is merely a further detail, not a detail that distinguishes this spacesuit from any other.  The restricting/defining information in that sentence is probably (depending on context) the fact that it was made in 1965.

Now, what about that second sentence?  Well, the fact that I've used "that" should indicate that the additional description of the object being described--the museum--distinguishes or restricts the meaning.  Without that additional information--namely, the fact that I told you about the object (in this case, the museum) last night--you could be confused about which museum I'm referring to.  The fact that I had to add extra information to restrict or specify the meaning of "museum" means that I need to use the restrictive relative pronoun "that."  If I'd used "which" in the second sentence, it would suggest that we both already knew which museum we were talking about, and the fact that I told you about it last night would have been already-assumed or non-restrictive information.

These are just some basic examples.  Distinguishing restrictive from non-restrictive can get pretty tricky.  For example, restrictiveness can also be a property of other types of appositive phrases that aren't headed up by a relative pronoun, but that's a bridge to cross on another day and in another post...

If You Have to Trust Your Gut, Follow This Rule

What I used to tell my students was that if they could understand when to use "which" and when to use "that," they'd have better control of the language.  But, since the restrictive/non-restrictive principle gets tricky, I also gave them what I called the back-door rule.  If they couldn't figure it out but had a pretty good feeling in their guts that they should use "that" or "which" in any given case, they could at least remember when to put a comma before "which" by using this simple rule, which (hey, hey!) I drew on the board:
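
    restrictive (+)         comma (+)
                   \       /
                    \     /
                      \ /
                       X
                      / \
                    /     \
                   /       \
    non-restrictive (-)     no comma (-)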

This worked particularly well for the science-minded in the class.  It's a basic chiasmus, or crossing of opposites to achieve balance.  If you think of non-restrictive and no comma as being "negative" and their opposites--the use of a comma and the use of restriction--as being "positive," then you can easily remember that the positive always goes with the negative.  Restrictive "that" should have a negative--no comma.  And negative, non-restrictive "which" should always get a positive--a comma.  You might be relying on your gut to tell you whether to use "which" or "that," but at least your punctuation will be right, and most instructors grading papers (whether they're teaching history, physics, or English) who get bent out of shape about such things only care about whether the punctuation is correct because, frankly, they couldn't distinguish restrictive from non-restrictive if their lives depended on it.

So there you have it!  Here's hoping my neat little diagram is helpful!

*One important note: the rules about using "that" or "which" exclusively as restrictive or non-restrictive relative pronouns, respectively, don't apply so uniformly in UK English.  They're a bit more liberal with "which" as a restrictive relative pronoun out there!