Hamilton Craig will be speaking on the panel at “Is Academia Dead?” this Thursday evening. RSVP here.
I recently found myself relying on a comforting habit from my adolescence that I thought I had outgrown: staying up all night high on drugs. Back in the day, “drugs” usually meant mixed amphetamine salts, Adderall, or a generic equivalent. On this occasion, all I was able to lay hands on was a legal concoction sold at the bodega. Whatever the secret recipe of this stuff was, it felt more or less like a solid dose of oxycodone with a weird speedy undertow. It’s marketed like a health drink, but apparently if you do a lot of it your hair falls out. I doubt it will be legal for long. Among the stresses that drove me to this embarrassing but mercifully contained act of regression was the imminent arrival of one of the most feared rituals in the life of the modern American PhD student: orals.
Oral exams are different in every program, but they always mean reading a quite large number of books and then answering questions about them, without notes, before a board of examiners. In my case, the number of books was 140, and they were selected to cover the fields of pre- and post-Civil War American history, modern European history, and the history of my particular topic, which is farmers’ protest movements. The questions are not known in advance, although most find ways of wheedling a general idea of what they will be out of the examiners. Even though most people pass, as I did, the orals reliably engender some level of hysterical anxiety in students as they approach. For us, who have usually spent years trying to prove our brilliance and mastery of our subjects while constantly feeling like frauds, this great test of competence can feel like the most important event in our lives.
It is a funny thing to be surrounded by people whose terrors and joys revolve around the successive milestones of the PhD process when the reality, regularly affirmed in the media, and sometimes by our peers and advisors, is that none of this matters anymore. Getting a humanities PhD is a futile endeavor. The share of undergraduates majoring in the humanities is at a historic low, declining between 2012 and 2022 from 13% to 8%. Schools around the country are cutting majors and laying off professors.
I shouldn’t wonder if the demand for tenure-track professors in the humanities is now lower than the market demand for rappers and models. The numbers make clear that most of us who make it through all the anxiety and strain of this process will find nothing waiting for us at the end, except maybe a few years adjuncting followed by a position at a mid-tier private high school, if not a coffee shop. So why do we do it? Why, even as the job market has cratered, has the number of humanities PhDs gone up every year?
I suppose I can begin by answering this question for myself. I had an undergraduate advisor, an intense and slightly scary expert on modern Calvinism, who spared no details about how dumb it would be to do what he did and get a PhD in history. “The job market was bad when I went,” he told me, “and now it’s gotten a lot worse.” I paused before answering, “Well, somebody’s going to get the job.” He smiled as if he knew this was exactly what I was going to say. By that point there was nothing else I could say.
The great thing that all people who follow this path understand is the joy of being asked what one thinks. If you attend a liberal arts college, you are asked continually for your thoughts on questions that have exercised the greatest philosophers, historians, and social scientists. Most undergraduates experience this as an annoyance. But for a few, it’s like a drug. These are the students who have always felt, rightly or wrongly, that they have something to add to the big conversations of humanity. When college is over, and they are faced with the prospect of a desk job where their bosses and colleagues won’t care about any of their big ideas, a kind of panic takes over. They yearn to escape into an environment where their ideas will continue to matter, where they will have some kind of official standing, however tenuous, as thinkers. If you are a person for whom this kind of recognition is necessary, no arguments about your career prospects can stop you from applying. This was how it was for me, and I suspect this is how it was for most of us.
But this is not the end of the story, because for those who make it to graduate school, the real nature of the bargain we have made only dawns after we begin. By climbing onto the bottom rung of the ladder of academia, we thought we were trading comfort and stability in order to be official, institutionally supported thinkers, people whose ideas matter. But the deal is even more exacting than this.
Academia is above all a social environment, and you are more or less explicitly given to understand from the beginning of your enrollment that your success will hinge on your ability to read the room. This is communicated in all sorts of ways. You might be told that your article will have a greater chance of publication if you subtly alter its argument to align with some currently trendy viewpoint. You might hear it said that some particular scholar’s comments or writings are objectionable not for being inaccurate but for being “tone deaf” or coming at the “wrong time.” You are constantly told to consider and anticipate the reactions and preferences of examiners, journal editors, and job search committees as you pursue your supposedly free and self-directed inquiries.
Jordan Castro captured this phenomenon well in his Compact article “The Unbearable Right-ness of Professor Speak,” in which he noted the compulsion of academics to ask “right?” at the end of every sentence, constantly scanning for consensus. I understand well where this tic begins: with the lessons taught from the first days of graduate school about the importance of being on the same page as those who can determine your future.
I don’t fault any of my peers or professors for the advice they gave me about the need to go along to get along. They are simply describing how the game is played. But the game has unfortunate consequences. Chief among these is what it does to the minds and spirits of those who play it, people who began playing out of an extraordinary willingness to take risks and to sacrifice practicality for intellectual adventure. This sense of adventure does not easily survive the graduate school process. We all begin by looking up to our professors, by longing to be part of their special elevated world. But as the importance of consensus is drilled into us, this desire to belong, to have institutional recognition, becomes an end unto itself, for which even the passion we began with is sacrificed.
This entails much more than just politically correct self-censorship. It means a diminishing of our horizons so that all we aspire to do is politely add to one scholar’s work or lend our support to another’s. Attacks are to be reserved for scholars who are either safely dead or totally ostracized: Witness the continued fervor for attacking the “Lost Cause” view of the Confederacy, which hasn’t had serious defenders since the mid-20th century. Very few are willing to expose themselves by attempting to generate sweeping, synthetic theories or found new schools of interpretation. It is very difficult to imagine the contemporary academy producing a scholar with the ambition of an E.P. Thompson or Eric Hobsbawm, let alone a Max Weber or Karl Marx.
It is possible to resist the culture imposed on you. When we teach, we can dissent in small ways from the regnant views within our specialties, passing on fragments of apostasy to our bright students. When we participate in seminars we can, with delicacy and disarming pleasantness, criticize a book the professor means for us to like. We can, by our own vulnerability, encourage our peers to let their guards down and air, among themselves, their hidden gripes against the approved views. And we can quietly connect with others, across fields and departments, who share our criticisms of the academic regime. In these connections will be found the lineaments of new schools and movements that can change the hearts of the universities.
What has always struck me as I move through grad school is how totally unnecessary it is to conform. By signing on to a graduate program in the humanities, you have already gambled away your future. Why on earth would you spend five, six, eight years reading the room and memorizing buzzwords, only to end up giving English lessons on Zoom? The only conceivable reason to pursue a PhD is to indulge your scholarly hubris to the utmost, whatever the consequences. This isn’t only a responsibility you have to yourself. It is a responsibility to society.
If demand for the humanities is cratering, mightn’t it be because our collective failure of nerve has made these fields so puny and uninspiring? The majority of people will always pursue the path most likely to keep their stomachs full. There is nothing wrong with this. But even the bold and brilliant will turn up their noses at English, Philosophy, Literature, and History when these studies are identified with caution and sanctimony rather than iconoclasm and courage. And the ultimate consequence of all this is that the part of humanity that reflects and considers will atrophy, leaving only the bottom line and the eternal, aimless shuffling of the herd.
Hamilton Craig is a doctoral student at CUNY researching farmers’ movements in the United States. His essays have appeared in Compact magazine, Countere magazine, and Front Porch Republic, as well as academic journals. His fiction has appeared in Expat and DFL lit. Follow him, and check out his previous articles in cracks in pomo.