Monday, March 14, 2011

Response to Robert Lane Greene

Why LOLCats ruined my English

Mar 09 2011 Published by melodye under From the Melodye Files


The talented writer and polyglot Robert Lane Greene has a short guest post in yesterday’s NY Times suggesting that (as Emily Anthes recapped on Twitter): “Perhaps we’re seeing more grammatical mistakes because literacy is on the rise.”

In the post, Greene scrutinizes the prescriptivist rallying cry that language is in a perpetual decline and must be enshrined (quick!) before it’s too late.  He trots out the usual counter-arguments: linguistic change is constant and inevitable; linguistic change is not necessarily bad (“when a good thing changes it can become another good thing”); and so on.

The most interesting claim Greene makes comes at the end of the post, when he notes that illiteracy rates have plummeted over the last century, to virtually zero.  However, as he is quick to point out:
Literacy is a continuum of skills. Basic education now reaches virtually all Americans.  But many among the poorest have the weakest skills in formal English.
This, he thinks, is to blame for the rise of misplaced apostrophes and teen-text speak.  It’s a truly interesting observation, but one I think he squanders in his conclusion.


Even though he continues to rail against prescriptivism (saying, for example, that it is “far from obvious” that language is declining), he makes frequent use of prescriptivist language in doing so (claiming that “more people are writing with poor grammar and mechanics”).  By using words like “poor” and “weakest,” he’s tacitly making the same value judgments that prescriptivists do.

Fair enough, I suppose.  But this seems like a lost opportunity to more deeply consider what prescriptivism is for (standardization) and what it’s fighting against (variation).

In fairness to Greene, this is a lost opportunity that may well have been due to space constraints.  So let’s just say I’m picking up where he left off:

Formal grammar is taught in schools because we are aiming both to conventionalize, and even ‘crystallize,’ our language according to certain norms, and to make it more uniformly patterned (this is why we are taught ‘rules’ that are supposed to apply broadly).  Education is one forcible means of (attempting to) root out non-standard ‘grammars’ (such as African American Vernacular English) and of homogenizing usage [1].  So Greene is right to point out that the weaker one’s educational background, the less likely one is to be thoroughly steeped in these norms.

However, there are a couple of additional points I think are well worth exploring.

Perhaps the most obvious is that school is but one way of imparting these standards; popular media (film, TV, radio, books, magazines, newspapers, and so on) is another.  The proliferation of media in the modern world is absolutely unprecedented, and this has consequences too.  As Joshua Foer points out, our reading habits have dramatically changed:
In his essay “First Steps Toward a History of Reading,” Robert Darnton describes a switch from “intensive” to “extensive” reading that occurred as printed books began to proliferate. Until relatively recently, people read “intensively,” Darnton says. “They had only a few books — the Bible, an almanac, a devotional work or two — and they read them over and over again, usually aloud and in groups, so that a narrow range of traditional literature became deeply impressed on their consciousness.” Today we read books “extensively,” often without sustained focus, and with rare exceptions we read each book only once. We value quantity of reading over quality of reading.
This is true not only of books, of course; it’s true of just about everything.  Now, more than ever, we have the unsettling power to choose: what we read, what we watch, what we listen to, what we consume, and so on.  Surprisingly, this can actually work strongly against conventionalization.  In one study I worked on at Stanford, we found that fiction and non-fiction readers’ sensitivity to various distributions of words sharply diverged.  To translate that into non-psych babble: we found that because fiction and non-fiction readers read differently, their representations of English become measurably different over time.
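As a rough illustration of what “measurably different” could mean – this is a toy sketch of my own in Python, not the actual method from the Stanford study – you could treat each reader’s ‘diet’ of text as a word-frequency distribution and then measure how far apart two such distributions are, say with Jensen–Shannon divergence:

# Toy sketch (not the study's actual method): how far apart are two readers'
# word-frequency "diets"?  0 means identical; larger means more divergent.
from collections import Counter
from math import log2

def word_distribution(text):
    # Turn raw word counts into a probability distribution over words.
    counts = Counter(text.lower().split())
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def js_divergence(p, q):
    # Jensen-Shannon divergence between two word distributions.
    vocab = set(p) | set(q)
    m = {w: 0.5 * (p.get(w, 0.0) + q.get(w, 0.0)) for w in vocab}
    def kl(a, b):
        return sum(a[w] * log2(a[w] / b[w]) for w in a if a[w] > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical mini-corpora standing in for what two readers are exposed to.
fiction = "she said quietly that the rain had come and the sea was dark"
nonfiction = "the results suggest that the effect was reliable across the conditions"

print(js_divergence(word_distribution(fiction), word_distribution(nonfiction)))

On real corpora you’d want much bigger samples and some smoothing, but the basic idea is the same: different reading habits yield quantifiably different statistics of exposure.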

If you think about it, that’s sort of incredible: we were sampling from the (really strikingly) homogeneous population of Stanford undergraduates – all well educated, all native English speakers – and we still found impressive variation.  So not only is language changing over time, it is – at this very moment in time – diversifying.

Of course, I’m over-simplifying here, because media pushes in both directions.  In one sense, it can act to disseminate conventions.  For example, if hordes of Americans are watching the same TV shows and reading the same books (and, according to one Zipfian analysis, they are), then they are all ‘drinking from the same well,’ so to speak; they are all tuning their representations to the loudest cultural signal.  This works against the development of the kind of strong regional variation seen in the UK, for example [2].  This is the top-down effect of media and other social institutions.
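To give a feel for what a ‘Zipfian’ pattern of consumption looks like – a hypothetical illustration of my own, not the analysis alluded to above – suppose attention to the k-th most popular title falls off roughly as 1/k.  Then a handful of titles soak up a huge share of everyone’s viewing:

# Hypothetical Zipfian audience: attention to the k-th ranked title ~ 1/k.
def zipf_shares(n_titles, exponent=1.0):
    weights = [1.0 / (rank ** exponent) for rank in range(1, n_titles + 1)]
    total = sum(weights)
    return [w / total for w in weights]

shares = zipf_shares(1000)
print(f"Top 10 of 1,000 titles capture {sum(shares[:10]):.0%} of total attention")
# -> roughly 39%: most people really are drinking from the same few wells.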

However, the bottom-up effect I was describing before is becoming increasingly powerful: First, we have choice, and now, more than ever, a means of exercising it.  Minor authors, musicians, artists and so on have existed since time immemorial, but with the Internet (and, more precisely, with Google), we have a filter that allows us to find them.  You don’t need to live in Austin to have heard of Voxtrot, and you don’t need to be British to have read A.A. Gill (not that he’s minor, but never mind).  Perhaps more importantly, the Internet age cultivates bottom-up phenomena, in which small trends rapidly turn global (the unconventional are rapidly conventionalized) and self-selecting communities (ranging from 4chan to, er, NAMBLA) forge and disperse their own norms, be they ethical, sexual, comedic, or linguistic.  These days, they say, you don’t have to go to San Francisco to be openly gay; you can go online.

Just think about how different this is from the days when the one book almost everyone owned and read was – the Bible.

So, to pull the strings together: I agree that part of what’s driving linguistic variation may be, as Greene argues, a lack of strong “top-down” constraints on variation.  Basic literacy has exploded, but not well-normed literacy, and that probably has a lot to do with the massive educational disparities that exist in this country.  On a societal scale, our education system is clearly failing to get everyone ‘up to standards’ [3].

On the other hand, many of the trends that prescriptivists are bent on quashing are surely bottom-up.  I know for a fact that my addiction to LOLCats has more or less ruined my grammar (these days I am frequently inclined to declaim, “I is going!” or to query, “You can haz it?” – formal ‘rules’ be damned).  Similarly, palling around with a Brit for the last couple of years has introduced such delightful phrases into my vocabulary as ‘fit,’ ‘shite,’ ‘nicked,’ ‘mate,’ and ‘lorry,’ and has prompted a regular (if curious) substitution of ‘what’ for ‘that.’  It’s also (uncontroversially) done wonders for my prosody.  My linguistic foibles – or, more properly, idiosyncrasies – are the result of individual choice: what I take to be funny and whom I choose to associate with (and whether I really feel like tacking the ‘m’ on the end of ‘who-’ to make it formally ‘correct’) [4].  Teen ‘text-speak’ is just more of the same.

In short, variation’s causal web is far more complex than simply ‘education.’

In broadening our picture of the forces at work in language change, we might also consider how English is being influenced from the outside.  According to one statistic, there are now something like three times as many non-native speakers of English as there are native speakers.  English is thus being reappropriated by foreign speakers, both on our shores (in the tides of immigrants that come to this country) and off it (in English creoles and pidgins, and in widespread lexical borrowing), and these reformulations are, in turn, shifting the normative space of what is acceptable.  Just think:
The largest English-speaking nation in the world, the United States, has only about 20 percent of the world’s English speakers. In Asia alone, an estimated 350 million people speak English, about the same as the combined English-speaking populations of Britain, the United States and Canada. 
Thus the English language no longer “belongs” to its native speakers but to the world, just as organized soccer, say, is an international sport that is no longer associated with its origins in Britain.
So should prescriptivists be worried?  Hard to say.  On the one hand, as English ‘diversifies’ as a language – as foreign speakers and text-emboldened teenagers remix it – we might expect language change to start speeding up, as more ‘errors’ and idiosyncrasies are introduced [6].  On the other hand, as the population of English speakers grows ever larger, it may be less likely for any given innovation to sweep across the entire language and take hold.  Thus, what is ‘standard’ may remain so, even with expanding pockets of variation.
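To make that second intuition concrete, here is a deliberately crude simulation – my own toy model, not anything Greene or the linguists cited here propose – in which a fixed handful of ‘innovators’ spread a new form through random one-on-one contact.  The bigger the speech community, the more contacts per speaker it takes before the innovation reaches even half the population:

# Toy, well-mixed contact model (my own simplification): a fixed seed of 10
# innovators spreads a new form by random pairwise contact.  Larger speech
# communities need more contacts per speaker before the form reaches 50%.
import random

def contacts_to_majority(population, seed=10, adopt_prob=0.5, rand=None):
    rand = rand or random.Random(42)
    adopters = set(range(seed))
    contacts = 0
    while len(adopters) < population / 2:
        a, b = rand.randrange(population), rand.randrange(population)
        contacts += 1
        if a in adopters and b not in adopters and rand.random() < adopt_prob:
            adopters.add(b)
    return contacts / population  # contacts per speaker

for n in (1_000, 10_000, 100_000):
    print(n, round(contacts_to_majority(n), 1))

Nothing deep here – real diffusion depends on networks, prestige, media and so on – but it gives a feel for why a larger, more fragmented pool of speakers can make any single innovation slower to ‘take.’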

Finally, we might ask what role prescriptivism plays – and should continue to play – in modern life.  In theory, there is real utility in imposing standards through education – these standards are meant to get everyone ‘on the same page’ and provide a form of cultural unity through language.  On the other hand, they (seemingly) legitimize discrimination against those populations whose English is non-standard (certain African American communities being a prime example).  By being taught black-and-white rules for “what is right and what is WRONG,” we learn to see language in value-laden terms; as adults, we think we can size up a person by their accent, the kinds and variety of words they employ, their conjugations, their idiomatic use, their slang, their spelling and so on [5].  In some sense these judgments aren’t wrong: our peculiar backgrounds (class, race, region, gender) and predilections are reflected in our language.  On the other hand, prescriptivism implies that there is a moral dimension to language use, and that we should stigmatize variation.  It’s hard to see the good in that.

Thanks to Mr. Greene for an all too brief post that prompted this outpouring.  We’re looking forward to your book, sir.

Brief Asides

[1]  Education doesn’t just do this for language, of course; education socializes children in many of the norms of the broader culture, including values, ethics, social behavior, and so on.

[2]  It is interesting to ask why this kind of variation appears so much stronger in the UK than in the United States.  If I could wager a guess: 1) This variation may have been historically entrenched in the UK, whereas it has not had the chance to become so in the US, which is relatively young.  2) Certain variations – in accent, say – may more strongly reflect class standing and cultural affiliation in the UK than they do in the States.  3) The UK media represent a broader cross-section of this variation in their films and broadcasts, whereas Hollywood does not.

[3]  Counterintuitively, standardized tests like the SAT may actively promote variation, because of the (relatively poor) way they test verbal skills.  To score well on the SAT, it is important to have a fairly broad vocabulary.  However, the verbal exam tests whether you know the ‘definition’ of a word, not whether you know how to use it.  For students with smaller vocabularies who are hoping to score well, this provides an easy path to top marks: memorization.  Every year in the United States, there are scores of diligent tenth graders out there with noses to the grindstone, haplessly memorizing the definitions of hundreds or even thousands of words via flash cards.  What this means, in practice, is that they are learning the meanings of words divorced from context, wrested from the usual company they keep.  Having taught many such ill-taught teens, I know that this tactic often results in highly idiosyncratic usage patterns – the kind of ‘overly flexible’ usage we expect from second language learners, not native speakers.

[4]  On this front, I am driven to distraction by the ‘unilateral’ copy-editing practices adopted by certain magazine editors, in which conventions trump nuance.  For instance, one article I published had all of the contractions stripped out of it.  In that piece, I had adopted an informal and jovial tone, and the contractions were in line with that.  Once the contractions were stripped out, it read rather awkwardly – “What’s more” was suddenly the haughty-sounding “What is more.”

[5]  When I was 17 and a budding prescriptivist, I used to scoff at anyone who dared say “on accident” instead of “by accident,” because of what a style book told me.  (Sigh – to be young and an impassioned idiot).  Now it depresses me to think that we judge people by the accident of their language.

[6]  Linguists often talk about change beginning when an ‘error’ slips past the radar of one speaker, and the speaker reproduces it, as if correct.  Arnold Zwicky, writing on the spread of “getter better,” notes:
The crucial fact that allows the error to spread as a new variant is that those who hear (or read) the original slips don’t know the status of the expression for those who produced it; for all they know, it’s just an idiom that they might not have noticed before.
If speakers have more distinct representations of the same language, as a result of strong bottom-up and weak top-down processes, it may be that errors like this will become more common.


Source: http://scientopia.org/blogs/childsplay/2011/03/09/why-lolcats-ruined-my-english/
