Were we still "waiting" for anything? The Urbana campus of the University of Illinois is officially censured by the AAUP and Prof. Salaita is off to Beirut for a year. The outcome of the lawsuit still lies before us, and a few more anger-bombs may detonate as additional FOIA'ed documents come forward, but it's been over for a while. Moodie and Manalansan gave us some bones to pick over: Is academic freedom inseparable from the BDS movement? Was the boycott prompted by forces inside or outside the university? These have been rich talking points on Facebook this past week.
I'm more interested in what can be salvaged from a fight we've already lost.
This line from Mad Men has been haunting me ever since a friend brought it to my attention. Peggy complains to Don that he never says thank you; Don shouts back: "That's what the money is for!"
An alternate title for what follows: Why Sara Goldrick-Rab's Professional Obligation Is to Future UW Students, Not UW, Which Should Be the Same Thing but Isn't.
In academia, the salary or per-course compensation, that is, the money, is for the teaching. That goes for everyone who gets paid to be faculty. And also for the things that go with the teaching (research, if you are salaried in a tenure-stream faculty appointment): the preparation, the ongoing engagement with the field, the creation of new knowledge. And also for the things that go into sustaining the department that sustains the teaching (more of these, generally, if you're noncontingent faculty): the service, the committee work, the administrative appointments folded into a faculty workload.
The academic equivalent of that uncompensated space where Peggy finds herself is growing. As higher ed ceases to be a public good and comes to be paid for by tuition rather than state appropriations, the expectation grows that public university faculty will not just teach, serve, and do research for the benefit of the citizens of their state. Excellence has become too expensive to speak for itself. Faculty must now also be willing to contribute to the PR effort that packages those activities as part of a "brand" that can be sold for top dollar to out-of-state and international students.
Unfortunately, state legislators, the upper echelons of academic administration, and the faculty in front of classrooms have very different ideas about what that "brand" consists of, and about what they are willing to work toward that isn't necessarily covered by their compensation. It's not exactly the "thank you" that Peggy is looking for, but it's in the same realm of ineffability. For people who aren't paid to teach, it's a certain kind of prestige, name recognition, newsworthiness. For those who are, it's those moments when, as a result of their work, something exists in the world that wasn't there before: a student's ability to think a complex thought through to its uncomfortable conclusions, an online classroom discussion forum where students spontaneously wield arguments born of evidence they've gained from the course readings, students who didn't think they could do X and who learn that they can. Or beyond the classroom: a new idea, a paper that gets people in the field talking, a book, a discovery. These two concepts of surplus value need not be in conflict in higher ed. But they are.
In the nonacademic world, most people will, in the course of a career, find themselves on both sides of the Mad Men conversation: perceived as demanding recognition before they deserve it, and withholding recognition that is warranted. In academia, the conversation about what gets paid for and why is increasingly in the hands of Don Drapers who, unlike the fictional character, are profoundly detached from the creative process, and of Peggys who have few options for selling their considerable talents to less complacent buyers.
The emissary from the world of business consulting was fielding questions from students in my course on career planning in the humanities. "Business consulting needs writers, right?" asked one student. The business consulting expert visibly blanched. "Well, not writers, exactly, but we do need..." and he went on to describe any number of key tasks--drafting surveys, communicating results to clients, producing "deliverables"--that most people would call writing.
It's no longer okay to call it that. Spend some time looking at job ads, position descriptions, and career counseling sites, and it becomes clear that no one writes anymore: they create content. Or they strategize content. Or sometimes they generate those "deliverables." Rarely do they "write." This is a useful thing to know if you're an old-school humanities major staring down the job market. Type "writing" or "editing" into a job search engine and you quickly reach a dead end. Type in "content"--or prepare to BS your skill set a bit and type in "user experience"--and many more opportunities emerge.
Anyone trained to closely assess language and tease out implications--which I suspect describes most readers of this blog--can probably spend the next 15 minutes contemplating the differences between "content" and "writing" and arrive at a conclusion about the corporatization of everything. "Content" aims to get monetized. "Writing" may or may not.
"Effective content" may or may not involve "good writing," depending on the strategy in play. Obfuscation, and a warm marinade of corporate terminology, has value in realms of existence that are well above my pay grade, but then, so can wallet-opening clarity and originality. But one dare not call it "writing" and conjure up the inevitable waste of time that goes into conveying an important idea with precision.
Gary Saul Morson of Northwestern University asks "Why College Students Are Avoiding the Study of Literature" in Commentary Magazine. His answer: because we (mostly college instructors, but also high school English teachers) teach it badly.
(Some might point out that students avoid the study of literature because they like to get their Gen Ed credit in ways that require as little writing as possible--and many literature classes involve a lot of writing. They'd also note the widespread perception that college-level study of literature dooms one to a life of penury and irrelevance. But Morson doesn't go there.)
No--we systematically ruin literature, according to Morson, in three distinct ways: (1) We load students up with critical vocabulary so that they can treat the study of literature as a kind of forensic autopsy that destroys the subject in the course of analyzing it. (2) We encourage students to judge the literature of the past by the moral standards of the present, which leaves little room for enjoying literary works on their own terms. (3) We treat literary works not as art but as documentary evidence--we encourage students to read literature as a tool for understanding particular historical moments, not as an aesthetic experience.
Guilty on all charges. Note that accomplishing both (2) and (3) at once could be considered a feat of intellectual incoherence and sleight-of-hand so prodigious as to warrant at least some degree of grudging respect. But yes, of course: every semester I fail my students in all these ways, sometimes in the teaching of a single text.