How We Talk about Heroes with Feet of Clay

Earlier this week, near the end of class in my modern U.S. survey, an undergraduate student posed a provocative and timely question: Why do we only want to talk about the good things people from history did, and not the bad things? I think the wording was pretty close to that, though I don’t recall exactly.

In the context of the lesson, the student’s question was about public monuments, and specifically the colossal presidential faces of the Mount Rushmore National Memorial. (My student particularly mentioned George Washington’s slaveholding as an example of an inconvenient truth about a historical figure.) But the question also seemed to voice a complaint about the student’s experiences in K-12 education.

We were about to run out of class period, so we tabled this question for the next lesson.

I wanted to make sure we discussed this question properly because a lot of the students in this class are education majors. Whether or not they specialize in social studies, they’ll soon be dropped into a public maelstrom centered on this problem. And many of them will have to decide how they are going to teach children responsibly about flawed figures from America’s past.

Discussion backdrop with detail from a photograph by Sergio Olmos, via OPB

To set up the conversation at the beginning of the next class period, I looked up a story from two years ago.

In October 2020—on the weekend before the federal holiday that Oregon would later designate as Indigenous Peoples Day—some two hundred people in Portland participated in an “Indigenous Peoples Day of Rage.” By the end of that night, some of the protesters had pulled down statues of Theodore Roosevelt and Abraham Lincoln, defaced a mural of the Lewis and Clark expedition, and smashed windows at the Oregon Historical Society, accusing that organization of “honoring racist colonizer murderers.”

I focused on this story because—much more than the recent destruction of some other kinds of monuments—it presents us with legitimately challenging questions about public memory. (We’re toppling Lincoln now? Really?)

It also involves a specific atrocity I discussed in the last class period. Painted across the plinth of the Lincoln statue in Portland that night in 2020 were the words “Dakota 38”: a reference to Abraham Lincoln’s approval of the public mass execution of 38 prisoners after the U.S.-Dakota War in Minnesota in 1862—notoriously, the largest mass execution in U.S. history. The statue’s hand was also painted red, presumably to signify Lincoln’s guilt as the Dakotas’ murderer.

Basically, I didn’t want to make this conversation too easy. If we wanted to talk about hard truths, we should talk about hard truths, not easy ones. Thus, to begin our discussion in the following class period, I displayed a photograph taken that night in 2020 by the Portland journalist Sergio Olmos, and I briefly explained the story of the protest.

Then I posed two questions for the whole class. First, if you agree with the premise of your colleague’s question—that we usually want to talk about only the good, and not the bad—why do you think we’re like that? Second, how can we do better when we talk about our past?

The ensuing conversation lasted half an hour—a substantial portion of our total class time.

Out of an abundance of caution about protecting my students’ privacy, especially considering the political sensitivity of the discussion, I won’t go into the details of what they contributed. But I can tell you for sure that this issue has been on the minds of some of these students.

They are keenly aware that it’s a hot political topic. They understand that politics directly shapes what K-12 teachers can safely say at work about American history. And they already have strong opinions about this, opinions they have formulated with considerable care—in most cases, I’m quite sure, before arriving in my classroom.

Even though I’m being discreet about the contents of this class discussion, I’m writing about this because I think it’s important for American citizens who aren’t attending our colleges and universities to understand that these conversations are happening. It’s also important to understand that students are often coming to their own conclusions before they arrive in the college classroom.

And sometimes, correctly or not, they believe they’re reaching these conclusions in spite of the way they’ve been taught in primary and secondary schools, as much as because of it.

A New Career for History Majors

Many American humanities professors are worrying about evidence that students are avoiding us. Partly because of the extreme cost of a typical college education today, combined with the economic insecurity of our younger middle class, undergraduates are not only deciding not to major in humanities disciplines like history, but—even more worryingly—are also more likely than college graduates in other fields to regret their choice later if they do.

Into this darkness comes a sudden ray of hope: a brand-new career opportunity for history majors. And it’s lucrative!

I think we can expect a rebound in history enrollments once young people realize that studying history can prepare you very well to compete for a rewarding job as the king of England.

I discovered this recently when I heard a BBC radio report mention that the newly elevated Charles III studied archaeology at Trinity College, Cambridge, in the 1960s. Looking into this, I found that Charles had indeed studied archaeology and anthropology at first, but had switched to history for the latter part of his time at Trinity.

Now, I’m not exactly an expert on how subject examinations work at Cambridge, but as I understand it, this meant Charles’s baccalaureate degree was a B.A. in history when he earned it in 1970.

In any case, that’s how the New York Times reported it at the time. Describing “the first university degree to be earned by an heir to the British crown,” the Times noted the following:

The degree awarded, based on examination results, was an honors degree in history, Class 2, Division 2. That is about the average at Cambridge — ‘a good middle stream result,’ as one don put it.

There are three classes of honors degrees, awarded according to grades, and the second class in turn has two divisions. …

The Prince’s tutor, Dr. Denis Marrian, senior tutor at Trinity, was asked whether Charles had been in any trouble as an undergraduate. ‘Nothing went wrong,’ was the reply. ‘In fact, I think you’ll find I have more hair now than I did three years ago.’

According to a recent story in the Guardian, the British monarch’s main income last year (the “sovereign grant,” a sort of royal allowance from the government) amounted to £86.3 million, or $98.2 million. That’s revenue derived from official wealth totaling an estimated £17 billion or so. (The crown owns expensive parts of London, much of the sea floor, the swans, etc.) In addition, the late queen had a private fortune estimated this year at £370 million; presumably much of that has passed to Charles III. Being the monarch also means significantly expanded access to housing in today’s tight market.

Seen at sunset: Just one of the exciting benefits of a career in history
Photo by David Iliff, 2006 (CC BY-SA 3.0)

I think we can all agree that this means a bachelor’s degree in history, even with only average grades, can be an excellent investment for a student’s future economic security. I trust American colleges and universities won’t overlook this crucial opportunity to publicize the value of what we do.

To be fair, though, it’s a career with limited opportunities for promotion.

When Students Change Each Other’s Minds, It’s Called Friendship

This weekend, the political scientist Yascha Mounk posed a provocative question on Twitter: “What are the top things universities could do to encourage a culture of free debate and inquiry, not just in the classroom but also in dorms and dining halls?”

(Judging by the context, this question may have been prompted in part by a new “campus expression” report from Heterodox Academy. I discussed the problems with a previous edition of the same alarmist report in March 2021.)

Here’s the thing: Because my academic work centers on teaching first-year college students, I ponder issues like this a lot, and I believe Mounk is on the wrong trail.

The thing most people asking this question are actually probing for? It’s not debate. It’s friendship.

Setting aside the intellectual shifts that can happen just because of spending time around new kinds of people, when extracurricular life changes minds, it’s typically because students are forming real friendships, in which important conversations happen organically—not because of a “culture of free debate.”

All those people who wistfully remember (or wish they could remember) late nights in each other’s dorm rooms, talking excitedly about the problems of the world? The experience they’re describing is friendship.

When a conversation about something you’re reading or discussing in class spills out into the dining hall, the quad, or the apartment? Why, yes, I do believe you’ve been making friends.

The folks you aren’t afraid of offending when you say something unpopular that needs to be said? They’re either strangers you don’t expect to hang around anyway, acquaintances you’ve already given up on, or, crucially, friends who will trust you enough to listen to what you’re saying.

People who will, in the middle of a busy life, actually sit still while you carefully identify your premises and show why you think they lead to a controversial conclusion? They’re almost certainly people who care about you as a person.

And when you keep having the same argument with the same person over and over, not because you love degrading yourself but because you’re subtly shifting each other’s views over time? “As iron sharpens iron,” you’re honing the mind of your—what’s that word the proverb uses?—oh, that’s right—your friend.

The contemporary world is full of free debate and inquiry. We’re drowning in it. Public faith in democracy—and in the value of debate—is dying from it. When we’ve got the entire Internet at our disposal, a culture of free debate and inquiry is the least exceptional thing college can offer.

What intellectually curious people really want from college is friendship. The kind that can change the mind as well as heal the spirit.

This same weekend, the culture critic Touré made another observation on Twitter, one that I believe is directly related to the fears our intellectuals express about college students: “After 35 it’s easier to get a new spouse than it is to get a new close friend.”

Now, I’m not sure that’s literally true, but the anguish it expresses is recognizable.

And I suspect—though of course, I can’t prove—that when aging college graduates like Yascha Mounk, my fellow geriatric millennial, bemoan the supposed intolerance of today’s young people, it often has a lot to do with how increasingly elusive that kind of friendship seems to us.

A Conversation About Pedagogy

Today, Danny Anderson, who teaches English at Mount Aloysius College in central Pennsylvania, invited me onto his Sectarian Review podcast, a wide-ranging religious humanities program, to talk about an essay I wrote here last year, “The Conservatism of My Teaching: Seven Elements.”

Our conversation went in a lot of different directions and covered a number of controversial topics. I can only hope I treated the positions of all my colleagues fairly, given the constraints of the medium and my typical shortcomings as an extemporaneous speaker. We certainly didn’t exhaust any of the topics we discussed.

Later in the interview, I talked about two articles from the March 2022 issue of the Journal of American History: “Meet Me in the Classroom,” by Olga Koulisis, and “Historical Thinking and the Democratic Mind,” by Lindsay Stallones Marshall and John R. Gram. I tried to explain why I lean toward Koulisis’s position, but both of these essays are rewarding.

Leaving Extremism: What’s College Got to Do with It?

I posed a question here almost two years ago: Do humanities teachers know how to deradicalize their students? I was responding to reports about an alleged neo-Nazi terrorist who had received an expensive liberal arts education. (A New York magazine profile subsequently labeled him “the prep-school Nazi.” It also showed, however, that his involvement in the far right probably began only several years after he left college.)

I argued that the evidence for education’s effectiveness in combating extremism is, at best, mixed. We cannot assume education reliably prevents or reverses radicalization. However, this doesn’t mean education has no role to play in the deradicalization process. As I wrote a year ago, “People have to be given the tools to challenge and rebuild their own beliefs.” Thus, the question I was raising was really this: Do humanities teachers know what practices will give students those tools?

This month, I have been revisiting a 2018 book that shows, as a case study, why the answer is complicated. Deradicalization, this book suggests, simultaneously is and is not about education.

At the end of a year when American educators came under fierce attack for their efforts to fight racism, thinking clearly about this paradox seems more important than ever. So let’s talk about this book.


God’s Not Dead, Just Away at College

Briefly noting this item from March 2021: Contrary to conventional wisdom, having a college degree makes an American substantially more likely to say they belong to a church, synagogue, or mosque. That’s according to Gallup surveys that have tracked religious affiliation over time.

Edited chart provided by Gallup

The religious affiliation gap between college graduates and non-graduates, which didn’t exist two decades ago, appears to be widening rather than closing.

As I’ve tried to show here at Blue Book Diaries many times, the conventional wisdom about higher education’s ideological effects in the United States is profoundly broken. That is due in no small part to the work of cynical pundits and professional surrealists, many of whom were happy to receive the benefits of education at elite universities themselves.

The relationship between higher education and religious belief is complicated, but simplistic narratives about supposed religious hostility and atheism in college don’t capture the typical American student’s experience.

Religious Hostility in Academia: A View from 2005

A story in the New York Times this weekend sent me back through the archives of World Magazine, looking for a 2005 article that played an important role in my journey into academia.

The Times story—headlined “His Reasons for Opposing Trump Were Biblical. Now a Top Christian Editor Is Out”—describes how Marvin Olasky, a former University of Texas journalism professor who also played a role in shaping the early domestic agenda of George W. Bush, seems to have lost control of an evangelical Christian newsmagazine that he has edited for more than a quarter of a century.

The cover stories of the April 30, 2005, issue were profiles of Pope Benedict XVI and Senator Rick Santorum

For complicated reasons, what this story dredged up for me is a memory of a specific pair of interviews that World ran under a single headline, sixteen years ago.

The headline of that article, published on April 30, 2005, was “Uncongeniality Contest.” The subhead was “Two views of elite academia from Harvard Law School.” I remember it vividly from my days as a subscriber. Going back to re-read it now, I find the article substantially as I remember it.

At the time, I was in my junior year of college at an evangelical university, preparing to apply to Ph.D. programs to study history. I took the article as an attempt to frighten me. (Not me individually, of course, but people like me.) It was one of countless messages I’d seen over the years warning that American secular institutions of higher education were comprehensively hostile to people like me.

But this time, I looked closely at the evidence provided, and what I saw was patently absurd.


‘The Chair’ Is a Campus Novel

It seems as if basically every American professor on social media has been either watching The Chair or putting it off for another time. It’s the creation of the actor-writer Amanda Peet and Annie Julia Wyman, a 2017 Harvard Ph.D. recipient. Netflix released the series on Friday—just as the new academic year begins—after plying academics with screeners of the first episodes.

The trailer gives, I think, a reasonably accurate impression of the show. (The show is rated TV-MA, and the trailer features uncensored “adult” language.)

The series has provoked some strong and contradictory reactions. Some academics describe it as a kind of idealized fantasy of elite higher education; others—especially women—find it upsettingly realistic. And people discussing its portrayal of so-called cancel culture have drawn contradictory conclusions about its argument. (Among the surprisingly basic questions people have raised is whether the show counts as a work of satire.)

For my part, I see The Chair as a culturally up-to-date but familiar example of an established literary genre. With just six half-hour episodes and a story that could be considered complete, The Chair is a traditional campus novel in movie form. Versions of this story have been told over and over since at least the time of Mary McCarthy and Kingsley Amis.

The traditional campus novel is satirical, but the target of its satire is broad: It suggests that there is something absurd and enervating about academia itself. Spoofing specific vices isn’t really the point—although vices related to hierarchy and sex are usually abundant.

Structurally, The Chair fits this classic pattern perfectly, however contemporary its various storylines are. Partly for that reason—but also surely because the co-creator Annie Julia Wyman has recent experience in elite universities herself—this is an unusually realistic depiction of higher education for a work of film. It gets closer to the kind of accuracy we expect to see in literary novels than the (abysmal) level of accuracy we’re accustomed to seeing on television.

(The single silliest storyline, in which an IT worker hunts down a student posting cruel Rate My Professors reviews, is implausible for reasons that have nothing to do with academe.)

With that in mind, here is a very incomplete list of some things that I think The Chair gets right about elite university life, which I will not be elaborating upon:

  • The little-known fact that academia is a workplace
  • Over-powerful yet alienated older professors
  • Faculty man-children looking for admiration
  • Political tensions that aren’t about Republicans and Democrats
  • Students trying to find their voice (with mixed results)
  • The many faces of disrespect
  • The battle scars of older women
  • The pious earnestness with which academia betrays people of color
  • The role of double standards in notions of academic excellence
  • Pervasive feelings of insecurity
  • The odd mix of opulence and poverty
  • The “brilliance” bait-and-switch
  • The inconvenient truth that tenure is an instrument of early-career conformity
  • The complicated feelings tenure-track academics have about teaching

I wouldn’t say The Chair is great art. But it’s a very solid example of a respected genre.

The Typical U.S. College Professor Makes $3,556 Per Course

Late last month, the American Association of University Professors (AAUP) released its Annual Report on the Economic Status of the Profession for 2020–2021. It has been published in the AAUP’s Bulletin and is available to download on its website.

This report now represents the most authoritative and up-to-date information we have about the basic employment conditions of college faculty members in the United States.

We need to talk about this report because Americans have many misconceptions about the lives of college professors. These misconceptions are encouraged by cynical rhetoric from politicians and pundits seeking to undermine our work. They also come from popular movies and television shows that depict professors enjoying lavish salaries and palatial campus offices.

This is not how most professors live. (Publicity photo for the forthcoming Netflix series The Chair, starring Sandra Oh, with a release date of August 20.)

These misconceptions can even come from employment websites, which tend to publish fabricated information. Glassdoor, for example, claims the average American adjunct professor makes more than $50,000 per year, and ZipRecruiter puts the figure at $67,000. Such salaries would hardly be extravagant by middle-class standards in most cities. But in reality, a typical adjunct professor can expect to make only about half that much—with no benefits—if they can get full-time work at all.

Worse still, public misconceptions are not necessarily challenged by the behavior of tenured faculty members at elite research universities, who lead our professional associations and represent us on the public stage. So it’s important to highlight some of the data in this report.


First, the AAUP’s report shows that the typical American college professor today is an adjunct. In other words, part-time contingent faculty members (professors hired by the course and considered “part-time” workers no matter how many courses they teach) are the largest single class of college professor.

By how wide a margin? According to the AAUP report, drawing on data from 2019, 42.9% of American college professors are part-time contingent faculty members. That makes the adjunct workforce significantly larger than tenured professors (26.5% of the faculty) and tenure-track professors still seeking tenure (10.5%) combined. It is also more than twice the size of the full-time contingent faculty, such as “visiting” professors or “professors of practice” (20.0%).

It’s important to note that these figures do not include graduate student workers. We’re talking just about professors, not TAs.

Here’s another way to look at those figures: Across American higher education, adjuncts outnumber tenure-seeking junior professors four to one. That means adjunct professors, more than new professors who will one day have tenure, represent the future of the professoriate.

Even among the elite of American universities—doctorate-granting institutions that pride themselves on using the tenure system to protect the freedom of their researchers—adjuncts are nearly one third of the faculty, outnumbering tenure-seeking professors two to one.
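
If you want to double-check those comparisons, the arithmetic is simple enough to sketch in a few lines of Python. This is just a back-of-envelope verification of the national 2019 shares quoted above; the category labels and variable names are mine, not the AAUP’s.

```python
# Back-of-envelope check of the AAUP's 2019 national faculty shares quoted above.
shares = {
    "part-time contingent (adjunct)": 42.9,  # percent of all U.S. professors
    "tenured": 26.5,
    "tenure-track, not yet tenured": 10.5,
    "full-time contingent": 20.0,
}

adjunct = shares["part-time contingent (adjunct)"]

# Adjuncts vs. tenured and tenure-track faculty combined: 42.9% vs. 37.0%
tenure_line = shares["tenured"] + shares["tenure-track, not yet tenured"]
print(f"tenure-line faculty combined: {tenure_line:.1f}%")

# More than twice the full-time contingent share: 42.9 / 20.0, about 2.1x
print(f"adjuncts vs. full-time contingent: {adjunct / shares['full-time contingent']:.1f}x")

# Roughly four to one against tenure-seeking junior professors: 42.9 / 10.5, about 4.1
print(f"adjuncts vs. tenure-track: {adjunct / shares['tenure-track, not yet tenured']:.1f} to 1")
```

(The four shares sum to 99.9%; rounding accounts for the rest.)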

There is an important bright spot in these numbers, however. The AAUP report finds that the proportion of the faculty holding full-time contingent appointments—with benefits and better pay than adjuncts get—has been increasing over the last decade and a half.

In 2006, full-time contingent workers were 15.5% of the workforce; as of 2019, they are 20.0%, with steady growth in their relative numbers since 2009. Making inquiries on relevant campuses, the AAUP’s researchers “found that one reason for the shift is that some institutions are taking actions to improve the working conditions for contingent faculty members.” Hooray.


But for now, it’s important to recognize another key element of the AAUP report: professors’ compensation. For adjuncts, the news is unsurprisingly grim.

The data on adjunct pay are more limited than the data for other kinds of professors. But according to information from 360 American institutions in 2019-2020, the average pay of part-time faculty members is $3,556 per course.

[Edit: Let’s be clear—many adjuncts never see wages that are anywhere close to this national mean. Please consult the data Erin Bartram collected for the 2019-2020 academic year. Adjunct instructors across the U.S. and Canada volunteered to reveal their pay for more than 700 courses. Some made well under $2,000 per course.]

Furthermore, only 1.6% of colleges offer all their part-time professors medical benefits, and only 7.1% offer all their part-time professors any retirement benefits.

To underscore what this means for U.S. higher education in 2021:

The typical American college professor (i.e., an average member of the most numerous class of American professors) makes $3,556 per course with no healthcare or other benefits.

If you aren’t familiar with how colleges work behind the scenes, it may be difficult to guess what this means for adjuncts’ annual wages. In fact, adjuncts often have very unreliable employment—being hired and (unofficially) laid off unpredictably from semester to semester. Because of this, as well as other factors, accurate annual wage data still simply don’t exist for adjuncts.

But nationwide, most college professors would recognize teaching three or four courses per regular academic semester as a full-time workload. If we add two summer courses for the sake of a year-round number, that means the typical college professor would be lucky to make $35,560 per year, and often might expect to make more like $21,336—that is, during the years when they could cobble together full-time teaching work at different institutions.
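
For anyone who wants to see the multiplication behind those two figures, here it is in the same back-of-envelope spirit. The per-course rate is the AAUP average quoted above; the course loads are my reading of the scenarios in the previous paragraph (ten courses a year for the heavier load, six for the lighter one), so treat them as illustrative assumptions.

```python
# The multiplication behind the annual-wage estimates above.
PAY_PER_COURSE = 3_556  # AAUP average per-course pay for part-time faculty, 2019-2020

# Heavier scenario: 4 courses per semester x 2 semesters, plus 2 summer courses.
heavy_load = 4 * 2 + 2                      # 10 courses per year
print(f"${PAY_PER_COURSE * heavy_load:,}")  # $35,560 ("lucky to make")

# Lighter scenario (my assumption): 3 courses per semester, no summer work.
light_load = 3 * 2                          # 6 courses per year
print(f"${PAY_PER_COURSE * light_load:,}")  # $21,336 ("more like")
```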

Now, some adjuncts do work on a truly part-time basis, teaching a course here and there on the side while maintaining another full-time career that allows them such fripperies as, say, going to the dentist. That is what many college administrators use as a justification for the shabby way they treat their professors.

But the reality is that many adjuncts today depend exclusively or primarily on their income as college teachers. This is what they face. This is how the typical college professor is rewarded for their work as they keep American higher education going.

If you’re interested in the current state of U.S. higher education, there’s a lot more information where this came from—including salary information for tenure-track professors and (ahem) college presidents, among many other topics.

“How to Cure Colleges’ Adjunct Addiction”

Although I’m not quoted, I had the privilege of speaking with Holly Brewer this week as she worked on an important opinion essay for Washington Monthly. Here’s a taste of her argument:

This problem keeps getting worse, yet university administrators show little interest in addressing it, and sometimes deny it even is a problem. If anybody’s going to fix this, it will probably have to be the federal government. Subsidies to higher education total about $150 billion annually. To protect this investment, the government should set a floor for what universities must pay teachers, and a ceiling of perhaps one-third for the proportion of total teaching jobs that a university administrator may fill with adjuncts.

It’s appropriate that government should solve higher education’s gig-economy problem, because government (at the state level) helped create it by reducing its support for public universities.  In 2020, state governments supplied $8,600 per student, a 40 percent decrease in real dollars from 1994.

But the universities themselves bear plenty of fault too, with a costly proliferation of administrators who, paradoxically, are assigned the task of economizing. Between 2011-2012 and 2018-2019, administrative pay at American public universities increased by $3.7 billion. That represented, for each full-time student, a 24 percent increase in administrative salaries. At the University of Maryland, where I teach, former President Wallace Loh was last year paid $734,565 as an adviser.

Rather than bring these absurd administrative costs under control, administrators are going after the university’s core function by opting to hire the cheapest possible teachers. That’s adjuncts.

—Holly Brewer, “How to Cure Colleges’ Adjunct Addiction,” Washington Monthly (Aug. 4, 2021)

_______________

Image: McKeldin Mall at the University of Maryland, College Park; cropped photograph by Radhika Kshirsagar, 2013 (CC BY-SA 4.0)