Gwendolyn A. Morgan
Day Late, Dollar Short is the latest in a string of books proposing to expose the decline in academic conditions and standards, together with its causes. In this case, contributors work from two common bases: first, an assumption of a "post-theory generation of scholars," i.e., those of us coming of academic age after the great theory wars of the 1960s and 1970s; and second, the public call for "accountability" (read "profitability") of postsecondary institutions, resulting in the new model of the "corporate university." The book deals mainly with the humanities, but certain generalities can be extended to all disciplines.
The post-theory generation comprises those academics who entered the job market from the late 1980s onward. They faced a graduate education in which the radical approaches of their predecessors had been formally institutionalized as required theory courses. Hence, those ideas had ceased to be radical and, indeed, were expected to become a standard part of our own scholarship. Moreover, the theory wars coincided with the greatest expansion of the university system seen in the twentieth century, and so the proponents of those theories became not only our teachers but also our hiring committees. When the job market shrank drastically in the following decades, new Ph.D.s had to board the theory bandwagon or expect unemployment.
Additionally, these essays assert, the post-theory generation has no new theory of its own, partly because of the Theorists' control over the profession and partly because of the extremist nature of the now-accepted approaches: after all, if one asserts the absence of meaning or shrinks the signified into non-existence, leaving only the signifier, how much further can one go? The best academics can do, they argue, is hyper-specialize or extend application of the established theory rather than develop or depart from the theory itself. The effects on intellectual freedom, given the tight job market, are obvious.
The second bogeyman of Day Late, Dollar Short is the corporate university model. For the last two decades, the general movement to make universities more self-supporting has had several major effects on academe. First, under the guise of increasing professorial "productivity," the profession saw increased class sizes and course loads. Second, in the light of ever fewer tenured positions, it saw demands for more publications, producing another reason to replicate the theories instituted by those who sat on the review boards of major trade journals and presses. This measure, in turn (according to more than one contributor), led to high volume, production-line publication and extremely short shelf lives of the results. Finally, various disciplines have had to prove their value in monetary terms. In other words, they must prove the immediate applicability of course content to Corporate America's interests. This means that courses producing obviously recognizable, concrete business skills, e.g., technical writing or composition courses, have overtaken those with less tangible benefits, such as literary study.
Within these confines, Herman's collection offers analyses of the problems from Marxist, feminist, deconstructionist, humanist/populist, popular culture, and new historicist perspectives. Here's a summary of basic complaints:
low funding of travel and library resources
threatened abolition of the tenure system and increased use of non-tenured adjuncts and graduate students as teachers
emphasis on the "utility" aspects of a discipline
the accompanying narrowing of scholarly pursuits and intellectual freedom
a "dumbing" down of curricula to retain poorly prepared students who perceive the university only as technical training for the job market and not as a pursuit of knowledge for its own sake.
To be sure, the complaints seem valid, since statistics (the number of tenured positions advertised, adjusted-dollar salaries, budgets, etc.) support them. Proposed solutions, when they are offered, range from unionization and striking to using the media to popularize what we do.
What struck me most about this collection, however, was not the rather predictable (the corporate university) or the innovative but tenuous (the post-theory generation) underpinnings of the book as a whole. Rather, it was the sameness of the complaints and the assumption that they are relatively recent. First, the very existence of a sub-genre of similar books argues emphatically that they are not new. Second, all arguments for a general decline in academic conditions and standards assume we have something to decline from, a time when the complaints did not exist. History does not bear this out.
Consider, for example, that most Day Late, Dollar Short contributors look back only as far as the 1960s and 1970s to find professorial utopia: in other words, to the very generation of scholars also supposedly responsible for our current intellectual bondage. Yet, at the same time, the rhetoric of the contributors suggests that the 1970s were the end of a long history of privilege and prestige. Barbara Reibling, for instance, asserts that the corporate university sees tenure as "the inherited privilege of an ancient regime ripe for the shaking" ("Contextualizing Contexts" 189), while Crystal Bartolovich asserts that the maintenance of the trappings of this "ancient regime," devoid of their original content, is what keeps academics from wholesale revolt against current conditions ("To Boldly Go Where No MLA Has Gone Before" 200). In other words, the theorists were academe's "last hurrah"; our decline began shortly thereafter. Other recent books look to the same era, such as Bill Readings's The University in Ruins (1996) and Aronowitz and DiFazio's The Jobless Future (1994). Still others, however, retreat further into the past. Satya Mohanty (Literary Theory and the Claims of History, 1994) finds the roots of academe's "decline" in the eighteenth century, while Robert Scholes sees intimations of it as far back as the Renaissance (The Rise and Fall of English, 1998).
Even recognition of the problems is not a recent phenomenon. In the 1930s, J.R.R. Tolkien lamented the decline in academic standards and freedom, while in the late nineteenth century, Charles Peirce commented that the universities of his day were debased, directed "only to glorify a foregone conclusion" (i.e., that they upheld the business and social status quo, thereby discouraging dissension and intellectual exploration), and generally under-funded. The question, then, is, except for the middle decades of the twentieth century, just when did this golden academic past exist? When did we not have the same complaints?
The answer, sadly, is that it never did. Having ruled out the sixteenth through the twentieth centuries, we might turn further back to the Middle Ages, when hooding ceremonies and the grandiose Latin phrases on our diplomas originated, and to which our continued glorification of such practices seems to point. Unfortunately, Alan Cobban's English University Life in the Middle Ages (1999, reviewed in Montana Professor, Spring 2000) proves that the same woes existed in the twelfth through the fifteenth centuries, i.e., since the very rise of the university system. Cobban finds infringements upon intellectual and, indeed, upon personal freedoms due to the lack of a reliable tenure system and pressures on professors and students from the external job market. In addition, he documents relatively lower salaries, exceedingly poor funding for libraries and other facilities, masters facing popularity contests among students to retain positions and offer courses, high student drop-out rates, and, you guessed it, poor student preparation for university entrance. Then, as today, the humanities also featured lower funding and salaries than the sciences or professions such as law, and the teaching of utility courses by graduate fellows was frequently a requirement of acceptance for advanced study.
All in all, the profession apparently has no past golden age from which it has declined. We have always been underpaid, overworked, and exploited, without adequate tools for research and teaching. We have always faced pressures for utility courses and student evaluations. Only in the post-World War II optimism and renewed prosperity that created a brief aberration in academic history can we find our utopia. For a single generation, universities expanded, and there were more positions than Ph.D.s, with accompanying benefits--financial and cerebral--to draw them. To think otherwise is the worst kind of historical revisionism.
This is not to say that we don't have valid complaints. Perhaps we do, but we have no more a historical right to demand better than do clerks or bartenders, waitresses or child-care workers, secretaries or nurses, or members of any other chronically underpaid and exploited profession. Imposing our vision of what should be on the past in order to justify our demands for the present seems neither morally nor intellectually honorable, and this is what books like Day Late, Dollar Short are wont to do. This is its weakness, and the weakness of all its ilk. We need to accept that we don't have an inheritance we are losing; we have a mission.