As a squishy humanities-inflected social science guy who never shuts the hell up on this topic here, I agree 10,000%. But four caveats:
- the risks of being a first mover on tougher standards, either as an individual educator or institution, are immense while the possibility of payoff is remote;
- as long as there is an insistence that undergraduate education must be evaluated solely in terms of monetizable societal benefit, the above risk / payoff ratio becomes even more lopsided;
- everyone has to accept the possibility that it is their child who will decisively wash out of any such higher-standard program, dyslexia or other verbal/literacy difficulties notwithstanding;
- without succumbing to The Discourse on AI in education in either direction, I'm just going to say "embrace AI as a productivity-enhancing technology" and "significantly increase the rigor of undergraduate humanities education" are, at best, two goals that are hard to reconcile with each other.
I agree with you on the first mover problem. But in terms of “monetizable social benefit,” my point is that the current paradigm has already collapsed that for humanities programs.
I'm that cursed specter, the humanities adjunct we love to skewer, and I agree with both of your takes 10,000%.
I'm encouraged not to be tough in describing the class on the first day, so that students will stay enrolled through the week for funding. The non-selective U is pushing AI for unspecified purposes, the students who use AI are mostly international, somebody has a documented accommodation regarding slides/lecture notes, a device, visual examples, etc. Somebody is going to vanish (the biggest issue in CC, where I was FT), so attendance and participation have points attached. I'm hardly an example of the financial fruits of humanities study myself. I do plan to treat Canvas as a file cabinet and push the class further toward pen and paper notes and annotations, cold-calling Socratic seminar, and low-tech writing workshop, but each of those is a heavy lift in a course that shares a busy syllabus with other sections taught by other faculty.
I mean, as far as I'm concerned as an adjunct you're a hero. I know that doesn't really help, but for what it's worth! I've never had adjunct faculty in a department I've been in, but if and when I do I resolve to treat them like a human being, which seems to be difficult for some full-time faculty.
Good luck with all that - I really hope you can carve out some rigor and dignity amidst everything.
Thank you so much! I've taught in other, more stable, contexts, but we made family choices (getting TF out of the Deep South with two young/pre-teen girls) related to an opportunity for my husband. It looked like I was on track for a NTT spot, but the boulders of state and NSF cuts (many of my students have practical majors related to research, local industry, or allied health) and AI have fallen on that one. Here's to a good semester despite it all.
The first mover problem is easy to solve. Just have the most prestigious university break the mold. Looking at you, Harvard and Yale. They make humanities degrees harder and therefore more valuable to get and the rest will follow in their wake.
Princeton literally tried exactly this for ten years during the period that it was consistently ranked #1 by US News and World Report and abandoned it as an unsuccessful experiment in 2014.
That's unfortunate. I would note (as one person quoted pointed out) that there's a difference between shifting the curve on grades and making a point of being more rigorous. You can do the first without the second, though the second is much harder to pull off.
I was a TA at Princeton back in the late 1970s and indeed we had a lot more freedom to inform students, via their grades, when they just weren't cutting it.
I expect that the downside risk of poor grades was more marginal relative to the signalling value of the degree-granting institution in the 1970s, although that’s mostly a guess. The harder post-undergraduate institutions scrutinize grades as a legible proxy for desirable outcomes (law / med school admissions, jobs, whatever) the more you run into incentive distortion.
It is a bad equilibrium. Faculty should be able to force students to take pop quizzes and do oral exams/presentations of their written materials. That is the only way to get around AI slop.
Totally agree - but "test anxiety" discourse is strong, and I, at least, have to offer alternatives upon alternatives as an accessibility measure.
I don't want to sound insensitive, but I can't emphasize enough how, from an educator's perspective, accessibility requirements have made every direction I turn in trying to come up with AI-avoiding assessment into a dead end. To say nothing of the "just let them use AI, who cares about writing anyway" discourse.
Anxiety about public speaking is one of those things that is an entirely coddling-generated problem. I don't mean public speaking in front of a crowd of a thousand people -- I love that kind of shit, but I'm a weirdo, and 'stage fright' is a totally normal problem.
Being unable to present to a group of like 25 people is not stage fright; that's a severe and debilitating anxiety that will haunt you for the rest of your life, and it's completely insane we let kids walk out of college having given maybe three or four public presentations, two of which were probably with a group that gave most of the speaking to the most comfortable group member. And it really can be overcome. I have seen it happen with debate kids when I was a coach. Really shy and awkward people come in as freshmen, and leave as non-shy (but probably still awkward...) people two or three years later.
Hell, we get students shitting their pants when we have them present to just two lecturers as part of a mock interview assignment we do. It's bad. But also amusing, when it's the bro-y lads who were disruptive and useless and then suddenly cat's got their tongue when they have to stand and deliver.
"...we get students shitting their pants when...."
It's a pity that was not an option back when I was in school, because I could have *excelled* at that. I could have *cleaned up* (I would have had to, afterwards). I mean, when it comes to shitting my pants I take a back seat to no one.
Hell, my HIGH SCHOOL required all students to take a one-semester public speaking class, and we gave four presentations over the course of it. Mine were on how to fold a paper airplane, Lake Superior, Jefferson Davis, and why a 5 cent/gallon county gas tax was a good idea.
It's funny, because I know a professor who had what would now be called very bad public speaking anxiety, as in threw up before and after every class she taught for a semester or two...and then she got over it. Never liked it, but did it well and professionally for the next 30 years. Law school and cold calling did something similar for me.
This is what is so frustrating to me about some of the discussion around this topic. I have a family member with both OCD and anxiety, the latter likely stemming from the OCD. The successful treatment is to face it head-on, preferably under the guidance of a therapist who knows what they are doing. Alas, many therapists have no idea how to treat OCD and PTSD, and many of the strategies they use make the problems worse, but that is a column for another day.
I agree, which is why I think the ADA and its expanded uses are a big part of the problem here.
And I don't know how to fix that. I believe students who need wheelchairs and properly digitized texts on their text readers should have those things. But I don't know how we provide those things without also allowing the rampant overuse of "anxiety" diagnoses and time-and-a-half accommodations.
Like anything it turns out you can teach people how to deal with anxiety. I feel that it is a failure in our educational systems that they don't.
It used to be that people "organically" learned to deal with anxiety for the most part except for a "tail" of some people. So why is it that people aren't learning to deal with anxiety?
There's so much I think schools should teach but don't:
* How to self-regulate and manage your anxiety
* How to fall (roll on your shoulders)
* How to diagnose your baby's ear infection (do it before your doctor's office closes)
Yeah, accessibility requirements are also a massive black hole of time and effort for instructors. Instead of being able to spend our scarce time developing and improving activities, assessments, materials, etc, half of it gets sucked into accessibility requirements which most of us managed just fine without 15 years ago. It also vastly adds to the administrative bloat of the university as webmasters have to spend half their time on website accessibility remediation. All of this just to avoid lawsuits/comply with settlements under ADA.
EDIT: I appreciate Nathan's point which is that we shouldn't throw the baby out with the bathwater. Everyone who has replied to him seems to have got my intended message, which is that there's a balance to be struck. The pendulum has swung far beyond giving Deaf, blind, and physically disabled students what they need to learn. It's part of a broader phenomenon of court-mandated accommodations that far exceed the scope and the population ADA was originally designed for, and that beyond the higher ed context, are devouring K-12 school district finances and making it impossible for K-12 teachers to do their jobs.
Remember, there are never actual tradeoffs between things we want, so anytime someone opposes something you think is good, the best response is snark or moral disapproval.
I think the above comment is referring to two different things:
1. Learning disability accommodations (extended time on tests, etc.), which have exploded and are now abused at scale as an aid/security blanket by large fractions of the population
2. The consequences of the ADA rulings about captioning, etc., which, well beyond captioning, in practice have generated lots of confusing bureaucratic constraints about accessibility notices that feel more like GDPR cookie banners than real accessibility for the deaf.
I think the "we did fine before" comment was about (1) not (2), let alone actual accommodations for the deaf.
Agree. We are currently in the process of making our website meet the new federal accessibility requirements for institutions that receive public funds. The new requirements go into effect next year. We are not legally obligated to do this, but many of the institutions we serve are. There is much debate among the people these fixes are intended to serve as to the best way to do this. For instance, alt-text. Many sight-impaired people have told us, "Enough with the excessive alt-text!!" At this point, we are focused on meeting the requirements and will worry about the rest later.
We do want to serve all people in the best way possible, but I do sometimes wonder about the tradeoff between resources used vs how people are served. Another example relates to digitized historical documents. It takes a while to create extensive metadata (obviously, this serves all kinds of people by improving searches). Some researchers have told us that they would rather have more material online with less description than fewer items well-described. The idea is that researchers can always contact the holding institution to get more information if items seem potentially useful. And blind researchers have told us that if items are up with little description, they can find a sighted person to describe them, whereas if items are not put online at all due to lack of description, there is no way they will find them.
More to the point, you can almost always tell the students who really need accommodations from those who don't. The former are on top of things, try not to be a burden, and compensate in other ways; we are always happy to work with them. The latter are usually demanding, asking for exceptions even beyond their accommodations, and sometimes fail anyway.
Andy Hickner's comment specifically says that they should not be required to be met. Based on what happened prior to the ADA (and even after that), that means that those needs will not in fact be met. Hell, they're barely met today when they are required.
You're reading him unreasonably charitably, and inserting a Shirley exception.
Part of the reality is that we're letting in disabled kids who shouldn't go to college. I've had three students in the past four years who very evidently had the sort of autism that lowers one's intelligence. These kids weren't math whiz savants either, quite the opposite. Not talking about a no name school here.
I teach and have zero problems with accommodations. My problem is that the disability resource center makes the profs provide OCR copies of everything assigned to read, which is incredibly time consuming. So for me, it's about inadequate resources for accommodation.
My problem is not so much that the accommodations come with a lot of admin. It’s that I can barely teach anything - like, assigning reading and expecting them to do it is Kafkaesque.
Can I ask what the time-consuming part of OCRing is? I'm a data scientist and my (likely naive) impression is that OCR is essentially a solved problem with off-the-shelf open source models now. I OCRed 5 million pages of permitting documents for a couple hundred dollars a few years ago; I'd bet it can be done for 1/10th of that cost today. Is there a software usability barrier?
Do you actually have to make accommodations that preclude use of quizzes? Absolutely wild if so.
I just... don't get it? The whole point of a quiz is to assess knowledge. If a student knows information, they can put it down on a sheet of paper and pass. How is it even possible to assess knowledge if quizzes are not allowed?
Explaining it in detail would be tedious, but basically I can’t give anything that’s a one-shot, time-limited assessment.
But yeah - that’s an excellent question! And one that I pose quite often, though it falls on deaf ears. The answer is that we’re often *not* assessing knowledge, not really.
That in-person exams are the only way to test for merit and knowledge is something the Tang dynasty discovered in the 600s. I don't think the dynamic has changed much.
It seems to me that the solution is to do more online lectures, and reserve class time for both proctored test-taking (not necessarily "pop" quizzes) and real-time essay writing. The reverse of how we've always done things in the past.
100% agreed. if you can't pay attention to a lecture, you can't pay attention to an hour long business meeting. if you attend a large meeting as a junior employee you probably have very little relevance to the meeting itself and not a ton of reason to even be there, but your seniors drag you along because they want you to pay attention and understand the business from a wider perspective. the people who successfully do this are the ones who are able to contribute meaningfully and impress the people they need to impress.
There necessarily has to be an in-class component and an out-of-class component because you can't do everything in 3-4 hours per week. The question is simply which activities are better to do when.
Also, as I recall, holding lectures in person is no guarantee that students will pay attention.
Fuck, online goddamn corporate group meetings (with vendors and the like) as well. The junior EY consultants I see on a grid project EY was hired onto in my area fucking zone out and phone it in with AI summaries of the call's workflows (AI summaries they don't actually fucking proof well, and didn't grok because they zoned out), and we have to beat on their heads.... (figuratively).
I agree - real-time essays, or even essays written across multiple class periods, would be good. There are internet blockers you can use to force students to use only the sources you want them to use.
I’m thinking it has to be more like science classes where the lab component requires four hours in the room for 1 unit of credit - we need four hours in the writing lab for 1 unit of credit, but now they don’t need to spend any time actually doing homework at home.
The biggest difficulty with this is that it takes more minutes per student of faculty time to do this. Even in a class of 30 this will be very difficult.
Universities should have test labs with proctors and blue books and rotate writing class exams through them. It shouldn't fall on faculty, there's an easy economy of scale there
I think it actually makes sense for those rooms to have computers too, as long as the computer just has a simple word processing program, and perhaps the instructor can set some folder of documents that students have available when it’s their turn at the computer. But yes, the economy of scale is a big thing - and it also allows students to schedule at different times if they have need of extra time, or need to schedule around work or whatever.
This is basically what Thompson-Prometric testing centers are. They administer exams in large, quiet, proctored rooms. The exams themselves all seem to be computer-based, which I assume drastically simplifies things. The computer automatically tracks time, closes the test at end time, and allows the taker access to materials allowed on the exam.
I took the patent bar at one. Many of the other people there with me were taking some kind of accounting exam. It really worked smoothly.
Like you point out, one advantage would be flexible scheduling. A computer can automatically assemble a test made of questions selected from a large question bank.
Fwiw this is how law students have taken exams for a long time. Norms shifted a bit during Covid from my understanding but the traditional way for law classes to grade is one in-person final exam, worth 100% of the grade, taken on a locked-down computer. You can even use your own laptop because vendors make software for this purpose.
Exactly. We just need a lot more of them, since all the time that students used to spend doing writing assignments at home will need to be in these spaces.
A closely related issue was pointed out years ago in the comments of DeanDad's blog.
In math, and the math-heavy fields, you will routinely be demonstrably wrong: it's absolutely part of the everyday experience to be wrong, correct the error, and go on.
This is much rarer in the humanities other than foreign language, in practice. "Could be better" is routine, but "flatly wrong" is not an everyday thing.
“This is much rarer in the humanities other than... 'flatly wrong' is not an everyday thing...”
But this is an area where Philosophy might have an advantage, because in that discipline “flatly wrong” is an everyday thing. And that’s just for the teachers, much less the students.
This is very true. I deplore that a lot of humanities and social science education became about having "correct" critical views on the world, but at the very least it was always trying to emphasize that interpretation and argumentation are core to intellectual development. Clearly this often swerved into some bad places, but it was always there, somehow.
The problem is that teaching cognitive processes is, necessarily, hard and full of trial and error and subtle course corrections in thinking. It's slow, frustrating and hard to scale up. It's by nature inefficient and fragile, especially in the face of AI.
And I think the further corollary of this is that you need to grade a lot more work to give the same amount of feedback on this, and that just isn’t sustainable for faculty.
I used to tell my English students, "There ARE wrong answers in English class. The study of English is not just saying whatever you want and chalking it up to 'symbolism.'" And then, of course, I'd teach them how to analyze texts well.
But I have to admit that I think the discipline as a whole needs more hard-assery in its scholarship and teaching. Because we do produce work that includes far too much trendy BS.
One can make the exact opposite argument, though. For most science questions you are either right or wrong. This simple binary is far less advanced than the Bayesian need to balance on the head of a gray-scale pin. I think, however, that forcing students to be never willing to settle on a yes/no answer, in humanities, probably requires more teacher effort and possibly a better teacher, overall.
I think top undergraduate programs (Ivies, etc) need to lead the way on this. I received a wonderful humanities education at a middle of the road private liberal arts college a few years ago. I no doubt benefitted from aggressive grade inflation that helped me get into elite law schools. But in my summer internship and full time job applications before law school I found that my college meant nothing to most hiring managers (good or bad), but I at least had a high gpa.
The elite undergraduate programs are the only ones that can initially raise the standards without throwing their students under the employment bus post-grad. A 3.0 Princeton English major will still have an easier time finding work than a 4.0 XYZ College graduate. And, assuming that the elite undergrads are the ones more likely to fill the ranks of academia all through the prestige scale, they can then filter this practice through academia over ensuing decades.
I went to Reed College, a known holdout on grade-inflation for 40+ years. For a liberal arts college it is heavy on both STEM (only nuclear reactor run primarily by undergrads…) and the humanities. It’s a tough school and back in my day, it had an insanely high dropout rate given the quality of students.
The professors get away with grade inflation because you don’t get grades on tests or papers…only points and detailed critique. (They write down your grades at the end of the semester and you have to go to the registrar and ask for them!)
I think I would have done better professionally had I gone to a more “normal” school. It does not have the name recognition of a Harvard or even say, Purdue or Penn State and unless you go directly to grad school, the cost-benefit of the college is terrible. With PhD’s being overproduced and professorships outsourced to adjuncts the deal is worse than ever.
They send a letter with students’ transcripts and I was able to get into a competitive graduate program. I do wonder about elite professional programs, however. I can’t imagine med schools being happy about a 3.5 GPA, even though only a handful of students have ever graduated with a 4.0.
A clarification about grades, you don't get them if you are a C or above, so you definitely are informed if you are D and failing. I didn't see my grades until I graduated and asked for a transcript. It wasn't a surprise that they were A's and B's, but it was a surprise as to which in a particular class. (I got B's in some classes that I thought I did better at, while some A's where I had thought it would have been a B.)
Another point about Reed, you are required to write and present a thesis. This includes a defense before a group of professors.
YMMV (your mileage may vary), but I have gotten recognition for having a Reed degree, and I think graduate programs kind of know that given that ¾ of Reedies go on to a graduate program (not me).
In conclusion, don't short yourself: a Reed education taught me how to learn, which is vastly more important than any specific knowledge (although I got that too).
The same thing happened at my alma mater. The grade deflation policy lasted about 15 years, and the school rescinded it in 2019 after years of protest that grads were getting hurt in graduate school admissions due to the policy. I have no idea if the latter is true. I know the school noted the policy on transcripts, etc.
I don't think grade inflation helped you. I do graduate admissions and the signal from grades is so weak that I hardly consider them. Oddly enough, as a historian, the thing that gets my attention is not straight 'A's but the transcript with some 'A-'s in STEM subjects as a signal of intelligence.
I recently listened to an interesting interview with the admissions deans at HLS and YLS, who indicated that GPAs have gotten so inflated across the board that they now use course selection and recs more than the raw number. Of course, what works for HYS likely does not work for more aggressively number-maxxing schools like Wash U.
Princeton is known for having harder grades and the top law schools have fewer Princeton grads than Harvard or Yale grads as a result. You can’t fix this problem without changing law school admissions and other major post-grad routes that emphasize raw GPA. And that could be hard if you have lots of hiring managers who don’t care about the name of the school but only look at raw GPA.
I guess an alternative would be external assessment where academics from other universities assess the quality of work submitted for grades and work to ensure that an A or an A- means the same thing across different universities and subjects.
In the UK, just about every paper submitted will be assessed by at least one external examiner as well as by the professor who submitted it - and most senior lecturers and readers (the mid-ranking academics - associate and assistant professors in US speak) will end up assessing at least as much work from other universities as their own, often more.
Much as I like to imagine the external examiner from BU going into Harvard and changing A-s to C+s, I doubt they'd have the prestige, authority, or backing to actually do it; more likely, the generous marking standards of the Ivies will get pushed elsewhere.
What would be fun would be the accreditors saying that a university that gives its highest GPA to more than 5% of the graduating class must introduce a new higher GPA, so you'd be able to get a 4.1 or a 4.2 or whatever. Even if students really are doing better quality work than in the past and therefore more of them deserve a 4.0 (which is one possible cause of grade inflation and is what most of the universities claim to be the case), the purpose of the grade is to distinguish between the students, so giving 40% of them a 4.0 doesn't achieve that, and they'd need to go up to 5.0 or 5.1 to make enough useful distinctions.
At that point just put more emphasis on standardized tests…
It’s not really fair to rank students compared to others at the same school because schools vary so much in selectivity. Like it is just objectively harder to be in the top 5% at Princeton than a non-selective community college. So on that basis Ivy League grade inflation is understandable and they could reasonably argue that getting an A there isn’t any harder than at community college even if a higher percent of their students are getting As. It’s a hard problem.
That’s why I think Princeton should be handing out Ss (like a videogame meta tier list) and giving those a 5 as a grade point to average when calculating a GPA.
"In the UK, just about every paper submitted will be assessed by at least one external examiner as well as by the professor who submitted it - and most senior lecturers and readers (the mid-ranking academics - associate and assistant professors in US speak) will end up assessing at least as much work from other universities as their own, often more"
I wouldn't say "just about every," or even many - external examiners tend to look at module-level statistics for outliers and then might look at specific work if there's an issue or to see what feedback generally looks like. Moderation is usually internal (and that's also usually just a subset of assessments, like 10%).
But yes, it's interesting that peer review of day-to-day operations is institutionalized in the UK and it isn't in the US.
The law school arms race is nuts these days. Hope you were smart enough to determine whether your undergrad awards A+ grades before attending, otherwise good luck!
I graduated from Princeton in 1982. A few years back I had to get my transcript as part of the paperwork for a fellowship at Chapman University in Orange County. (I already had the fellowship. This was just routine stuff.) I was shocked at how many Bs there were. Nowadays it would be considered lousy. Back then it was top 10% of the class.
But why would they do this? How does it materially benefit the graduates of those schools to have stricter grading in the humanities? It's not as if a Princeton undergrad is suffering at the moment because employers aren't convinced they're smart enough due to grade inflation.
Also, I don't see how elite programs leading the way is actually a key to solving the problem. Not only are Princeton humanities students not suffering on the job market right now, but they are also probably still getting good educations and learning a lot, grade inflation or no grade inflation. I would say the programs that need to lead the way are big flagship state universities, which are the types of schools that have small rigorous philosophy departments that stand out from other humanities departments. (Some "directional state" type schools don't even have philosophy departments at all.)
Just to reflect on my own experience as an undergrad at Rutgers, a state school with a very rigorous philosophy department: I was sort of stunned at how much more strictly graded the Philosophy 101 course I took there was than any other humanities class I took. Since I wanted to apply to grad school in history (which I did, and I'm now finishing a PhD at Columbia) and knew that GPA was important for that, I simply didn't take more philosophy classes. I got like an A- in that class, so it wasn't like I did all that poorly, but it took significantly more work to get that than it took to get A's in most of my other classes.
Don’t fall for the trap of perma-adjuncting. Econ history is still a place for tenure in Europe. Be prepared to exit academia if you are given stable employment options, because you are more than how unis treat new PhDs.
Already planning my way out of academia! Have a number of friends that have shifted into consulting/other business paths, so I'll likely end up following in their footsteps.
Taking undergrad classes at Rutgers in philosophy must be like accidentally stumbling into a boss room in a video game. I'm not sure a lot of people would expect them to have one of the best philosophy departments in the world.
That's an interesting example, given that Rutgers has a merely decent overall reputation (apparently it's #41 in US News for national universities, which is honestly higher than I would have thought), but has one of the strongest philosophy programs in the world.
Would an admissions person in a non-philosophy humanities PhD program know the second part and take it into account if you'd taken more philosophy? I would kind of hope that they would, but maybe that's not a given.
The admissions people are generally professors in the department that houses the PhD program you're applying to, so while they may be aware of the reputation of the philosophy program at Rutgers, it's not really a safe bet. If you were planning to apply to Philosophy PhD programs on the other hand, then sure. I'd also say that my experience of undergraduate history courses here at Columbia doesn't give me the impression that they're graded much if any more strictly than the ones I took at Rutgers—my sense is the difference in rigor here has less to do with the quality of the philosophy program at Rutgers and more to do with differing disciplinary standards in general.
I think they would have to just do it for the love of the game, basically. There is certainly no material incentive to do it.
But my understanding is that at the turn of the century Princeton was a pretty unserious place for academics. It was mostly a finishing school for the elite and not a place of deep intellectual rigor. And then they hired Woodrow Wilson as president and he turned it into a place that students came to learn.
Maybe it is naive, maybe my understanding of that history is incorrect. But hopefully for the good of the nation and humanity people will bring about a new age of academic integrity and rigor in the academy.
It's been a while since I was in school so my first-hand knowledge is a bit dated, but it seems to me that a lot of humanities academics no longer have a clear sense of what game they're playing or why.
To me, other than the commercially valuable spinoff skills in analysis and communication, it's about gaining a better understanding of humanity: our good points and bad points, strengths and foibles, why the same tragic and stupid mistakes keep happening over and over, at the collective level more so than at the individual level. Like how a therapist can help an individual gain more insight into their own behavior and why they keep falling into the same patterns, that's what studying the humanities can be for humanity as a whole. But a lot of academics seem to approach their role more as scolding cleric than as understanding and patient therapist.
I guess I just think that such a thing will only happen if it doesn't actively cut against material incentives, especially in more than one or two institutions.
I think you should look beyond "material incentives" for explanations. Though they are real and important, it seems there are other things going on here. I would note that some of the material results -- lower number of students, less revenue coming in, significant backlash and reductions from public funding sources -- are pointing in a different direction.
the first mover problem is definitely real for professors at mid-level state schools and so on, but i really feel that institutions dramatically underrate the appeal of a school being hard. it's absolutely true that once a class begins, the students in the class would rather get good grades than bad grades. but a lot of smart kids want to go to places that have a reputation for being difficult! when i was undergrad, i had the option of taking normal calculus, which was a massive auditorium and all the grades were based on computer homework and exams in a computer lab. there was also Honors calculus, which was a small section of 20 students being taught by a professor who had a 1/5 on RateMyProfessor with like a hundred reviews all talking about how he was mean and made everything too difficult and assigned too much homework and so on. I took the honors section, and 19 other smart kids took the honors section, and we actually learned to understand calculus instead of learning how to algorithmically solve calculus homework problems. and at the end of the day, it barely made a difference on our transcript (just a little H tag next to the class), but we all enjoyed the sense of intellectual superiority the class afforded us.
anyway, all this is to say that i think a school that actually advertised itself as doing a Return to Rigor where you're going to take hard classes that dumb and lazy people can't complete would actually make the program *more* attractive to the kinds of kids you want taking your classes anyway.
this does not do a lot to resolve the actual 'departments are funded by the tuition of kids that shouldn't be there' problem though, so i expect it's best reserved for programs that are already relatively well-funded by things other than tuition.
"anyway, all this is to say that i think a school that actually advertised itself as doing a Return to Rigor where you're going to take hard classes that dumb and lazy people can't complete would actually make the program *more* attractive to the kinds of kids you want taking your classes anyway."
Maybe - but my point is that this is one hell of a roll of the dice, even with solid financial support independent of tuition.
"a lot of smart kids want to go to places that have a reputation for being difficult"
This certainly was no small part of the appeal of MIT back in my day, even if they no longer had the freshman orientation line: "Look at the person on your left. Look at the person on your right. One in three of you will not graduate from the Institute."
Since all these trios of new students looking right and left overlap with each other (you're in three of them), doesn't this mean that basically no one will graduate?
Granted, I was not a STEM major so I'm probably screwing up the analysis here.
The line they had when I was doing college applications was "If you're the smartest person in the room, you're in the wrong room." I always liked that philosophy.
Sounds good, until one person looks around and then gets up and leaves. A moment passes and another person looks around, gets up and leaves. And then . . . oh well, I think we see where this is heading.
The tradition in Oxbridge and the old-fashioned Ivies was that you could/can teach hard stuff while letting people who don’t want to work that hard skate through with a “gentleman’s C”, or “gentleman’s pass” in England. So even easier than making people fail is to simply make grades real again, which solves at least part of the problem; it makes GPA filtering for law school work a bit better, at least.
The funny thing is that, at least where I am, we don’t have a problem with grade inflation at the top; 1st class (A-equivalent) degrees are genuinely rare. We actually give out plenty of gentleman’s / gentlelady’s 2:2s (Cs, basically). It’s just that no small part of them *really* should have flunked out.
Beyond the first mover problem, this is a competitive strategy. Yale and Princeton can make their humanities classes hard and know they’ll attract the best students, but if Ithaca College makes a history degree harder then rapidly rising in the rankings is their only hope to fill seats. If every college does so, a lot of history departments will simply shutter. Many of these programs only exist because they could launder the implied intelligence and industriousness of a college degree.
Right - this is essentially what I mean by first mover problem, but this is a good illustration.
Though I wouldn't necessarily say that if the Ivies decide to take the gloves off with respect to rigor, they'll instantly raise the standard of their students. Maybe, but I think expectations have been locked in to the point where it would be a rough ride even for them.
I thought your “first mover problem” was closer to “if a random school raises standards, they might lose a lot of students in the short term and not reap the reputational reward for years”, whereas this is making a quite different point: only a few colleges can improve their reputations this way at all, because you run out of smart and driven students. Regardless, both of these are strong headwinds facing this strategy.
I'm not sure I agree on the last point. For two reasons, one relating to the humanities student's role as an AI user and one relating to their role as an employee of AI users:
1) it could be the case that a more rigorous humanities education gives the student skills that make them a better user of AI. AIs are, after all, as I understand it, trained on important works of literature, among other things. Possibly the well-trained student could phrase questions to the AI more precisely or more evocatively and therefore be a better "manager" of their AI "workers."
2) In a world where many workers are using AI to enhance productivity, those workers will still be employing other humans to do things for them. Those things just may be different than what humans employ other humans to do today, because some of what people are hired to do today will instead be done by AI. But it could be that rigorous study of the humanities will prepare students to be valuable employees to other AI-using humans in this AI-rich world.
I don't think this is wrong, necessarily, but you're just restating my assertion, which is that you have to come up with a way to convince students that rigorous study of the humanities *without* AI assistance in the first instance is *foundational* to understanding AI later down the road.
I have no idea how to do it at other schools now, but my alma mater specifically really resisted grade inflation. At least as of 15 years ago.
Their reputation (and a pamphlet that went out with every transcript) allowed graduates to get into grad programs with lower than average GPAs for entrance. I kept an outside scholarship even though I fell below the required GPA to maintain it.
The humanities classes were still considered less challenging than the hard sciences, but they weren’t easy.
RE: "First Mover", I disagree. Maybe it's a problem to be the 4th or 5th mover, but being the first one to stand up and loudly declare, "OUR Humanities majors are smarter than the ones everywhere else because our classes are way harder" gets you a lot of pub and a lot of smart humanities majors with chips on their shoulders.
But they won't be smarter, unless the school is an Ivy. Schools already filter so aggressively on general intelligence that a middle-ranked school can't possibly hope to produce better thinkers than a top school reliably, no matter what curriculum changes they make.
Professor in a humanities discipline at a research university here. I love the idea in principle, but here is the practical problem.
My Dean has made it clear that what matters to him when making decisions (especially decisions about whether to replace professors when they leave or retire, what kinds of financial support for department activities etc.) is “metrics”. And he has said openly that the single metric that matters most in this context is class sizes - how many students the department teaches.
So if we make our courses so hard that our enrollments collapse by half, we will lose massively. So our strong incentive is to keep enrollments high, which is best done by not making it unfeasibly hard to get a good grade.
This does mean that our major looks less prestigious, for the reason Matt explained, so we attract fewer majors and probably lower quality students doing those majors. But the Dean doesn’t care about that, provided the overall undergraduate numbers hold up via our elective courses.
The way I handle this personally is by still setting whole books, lots and lots of reading, and making it clear that I want people to do it - while at the same time setting papers, exams etc. that students can do well on even if they haven’t done all the reading.
And yes, I’m aware of a certain inconsistency in this, but it’s the best I can think of, given the pressures on me.
I am a professor who happens to have a lot of knowledge about the role of women in Meiji Japan! I agree that professors themselves want to have high standards and give real grades. The problems are 1) the incentives are a mess. The administration wants high retention rates, athletics wants passable classes, everyone wants more students in seats. 2) There’s actually a lot of pressure to care for students that can run counter to evaluating them. Part of this is the accommodations - so many accommodations that they make it hard to have any kind of standard assignments. And part of it is the idea of setting students up for success by paying more attention to “navigating the university” or “knowing where to seek help.” Most of this work seems to end up with the humanities professors, who are perceived as more caring - and more of us are women, which can’t be a coincidence. In fact, the “rigorous” disciplines are always the more male coded ones - even philosophy fits this model. 3) as others have said, no department can raise standards unilaterally without losing students and getting lines cut etc. So the whole problem seems unsolvable at the moment.
+ 1000 on your point 2. The base assumption is something like “for students, attending class and being assessed is traumatic, so we have to mitigate that somehow.”
And I totally agree about the gendered expectations. It’s partially a discipline thing, as you say.
Athletes need to have rigorous courses too. Why do most Ivy League colleges have mediocre athletic programs? Because they made the right choice about athletics ironically 🤨
As a former philosophy professor (thanks for the kind words Matt) I agree with this analysis. Admin directs us to continually get more majors, while also telling us not to inflate grades. At the same time, a huge share of classes are taught by contingent faculty, for whom student evaluations matter a lot, and student evaluations are largely functions of how easy the grading is. I think a lot of professors would sincerely love to raise the bar, but there just isn't enough security for departments and employees to do this.
I think student evaluations should be abolished - it’s pretty well documented at this point that they’re discriminatory against people who don’t fit “classic professor” archetypes.
Unfortunately in the UK we have the National Student Survey that is a statutory requirement, but at least those are program-level.
I don't think they should be abolished - I've often gotten useful constructive feedback from them which has made me rethink aspects of my courses. But they should certainly play no part in hiring or promotions etc.: they should be for the professor's eyes only.
Yes, this has been my experience exactly as someone in the college English classroom. And when you're not tenured or in a reasonably secure lectureship, you're on the job market every year, for which game you have to produce student-evaluation sets with scores in order to even make it to the preliminary interview stage. I hate everything about this system, but the only people who can change it are the small percentage of tenured professors who sit on search and tenure committees. The vast majority of college teachers aren't among that number.
If it's any help, this isn't universal. I'm a tenured full professor, I've sat on more search committees than I can now count (and have chaired a large proportion of them). I have never once been on a search committee that asked applicants to produce their student evaluations, and if anyone had ever proposed it, I would have argued strongly against it, precisely because of the well-documented problems with them.
This isn't to doubt your experience - I am sure that it's true that some search committees ask for them. But I simply want to reassure people that isn't some kind of unquestioned norm.
You're right, it's not a requirement of every search, but eval sets are now required often enough in my field that every candidate applying broadly will have to produce them.
I actually do appreciate committees caring about teaching. I'm just not convinced that student evals are the best way to adjudicate teaching.
Agreed - but the only way out of it is for society (“society”) to agree to foot more of the bill or accept that university is going to be an “elite,” as in selective, enterprise. You (not you, “one”) can’t expect universities to operate competitively in a mass enrollment environment *and* demand rigor across the board. I think you can probably pick two from that list, but not all three.
Yes to all this. I wonder how much of this has to do with the fact that adolescence/childhood has been extended in our culture and adulthood has been delayed. I think a lot of undergrads just aren’t ready for college — not in terms of raw intellectual ability but the discipline, agency, and self-reflection required to really engage with subjects. This goes doubly for the humanities. Because the path through STEM subjects is more linear and structured, a smart, motivated student has a clear ladder to climb: put your nose to the grindstone and just keep moving. The choose-your-own-adventure nature of the humanities and social sciences (at least today) allows students to get kinda lost in the weeds, unless they have the maturity to make serious decisions about how to shape their own course of study. And many don’t. I sure didn’t.
I did well in college but honestly I coasted, the same way I did with high school; if I’d gone 5 or even 10 years later I would have gotten so much more out of it.
I'm curious to hear how you deal with AI - I would definitely assign "lots and lots of reading," but I have no mechanism, really, to ensure that they do it on their own.
The problems you describe about enrollment are even more acute in the UK, where the financial model is basically entirely tuition-based. There's modest state support, but tuition is the name of the game - and American-style endowments don't really exist outside of Oxbridge (and even those endowments are largely from investments, not alumni contributions).
The way I personally deal with AI (we don’t have a department policy on this) is by increasing the amount of in-class testing (exams, quizzes); and for the part dependent on papers, I set papers with complex prompts requiring close attention to aspects of the text: my experience when experimenting with AI is that they perform less well on questions like this.
It’s an imperfect solution - I am sure students still use AI. But I am moderately confident that no one can get an A in my class while having AI do most of the work.
My solution would be to make the degree-granting institutions liable for defaulted/forgiven student loans. Some version of IDR should be the only loan repayment plan, with a max payment cap of 120 payments, at which point the school becomes responsible for remaining balances. We have to create an incentive system that protects the value of degrees.
I must be very slow, but I can’t see how your solution would stop my Dean from assessing departments by enrollments, or would change the effects of that. Can you explain the connection?
Because it would mean the school would start losing money by enrolling students whose degrees turn out not to be valuable. The reason enrollment is the only metric is because the school has no skin in student outcomes, more students = more money. I would make the institution financially invested in the value of the degrees they convey.
Off-hand, that seems like it would lead to discouraging public service degrees. Teaching lower income and rural schools, mental and community health, and anything that has a high burn-out rate. My guess is that this would increase inequality while decreasing upwards mobility.
I feel like this would lead to a school cutting lower-earning departments altogether, no?
Like if the decision is doing hard work of changing curricula and shifting culture vs cutting the Gordian knot, I could see many if not most institutions choosing the latter.
A premise I was arguing from is that humanities are important to study, and that there should be a robust and rigorous course for students who are interested in the humanities to study.
I guess if that isn't your position, having institutions cut their humanities department isn't an issue. But that does not seem to be a desirable outcome in my opinion.
I doubt this would provide any reason to stop universities allocating money to departments by the number of students enrolled in those departments. It is already the case, at least at my university, that a high proportion of students major in departments whose degrees are perceived as financially lucrative for reasons that have nothing to do with the intellectual rigor or otherwise of the degree - above all economics and business and engineering. This would simply provide an additional reason for a university to ensure that as many students enrolled in those courses as possible, and as few as possible in - well, anything else. Even philosophy, which (as Matt observed) scores well above average for the humanities in financial outcomes, scores well below those fields.
Matt's original post assumed that the humanities were valuable for reasons that have nothing to do with being financially lucrative (sample quote: "I believe deeply in the value of studying literature and history and philosophy and big ideas"). If you do not share that assumption, then naturally a "solution" like yours makes sense. But if you do not share that assumption, then your response isn't very well addressed to either Matt's post or my response (which accepted the assumption but pointed out the problems with Matt's solution given the internal dynamics of financial allocations WITHIN universities).
Maybe your university should require more classical humanities gen-ed classes, so total enrollment in humanities classes stays the same or increases even if enrollment in upper level humanities classes decreases.
Maybe - but that is something that would have to come from the top, and the current tendency is in the opposite direction: the Deans are concerned that there are too many course requirements on students before they get to their majors. And while the policy you propose would benefit my department, I do rather think they have a point.
I don't know how it works at your university, but at mine, even freshmen, unless they are undeclared, do take 1000-level courses for their declared major (along with plenty of gen-eds). They don't wait until their second or third year to begin taking their major classes.
I suspect part of the problem here is that while in some sense it’s true that colleges can “choose” to “arbitrarily” make undergrad classes as easy or hard as they like, there is a hell of a lot more wiggle room in the humanities than STEM. A school handing out cheap As to CS majors who can’t write decent code, Bio majors who bomb the MCAT, and economists who can’t do math is apt to slam into reality fairly quickly. There’s much less immediate external feedback if that English grad taking an office job never really learned to analyze Ulysses that well. And schools sort of NEED lower rigor departments to warehouse the kids who only got in as legacy admits or because they’re good at a sport, which creates pressure to relax standards in the areas where it’s easiest to get away with.
Humanities/qualitative social sciences are layered in this respect. Students that go on to do doctoral work are absolutely rigorously evaluated. I would NEVER recommend a student to a program if I thought they were no good, because I would personally know the person who was going to supervise them and i/I'd feel like a shit; ii/they'd never trust a letter I wrote again (and rightly so). [of course, it doesn't mean that the letters aren't written in an absurdly overblown language that requires a bit of hermeneutical knowledge to read through the lines and know that if the student isn't 'brilliant' it means they're a moron]
Presumably you've read it, but if you haven't, oh you must read "Dear Committee Members" by Julie Schumacher. It's composed entirely of letters written by a jaundiced, cynical English professor of a lower tier liberal arts college. His letters of recommendation are a treasure. E.g.,
"This letter recommends Melanie deRueda for admission to the law school on the well-heeled side of this campus. I've known Ms. deRueda for eleven minutes, ten of which were spent in a fruitless attempt to explain to her that I write letters of recommendation only for students who have signed up for and completed one of my classes . . . "
Yeah this is what honors courses/sections were for when I was at Michigan State 20-plus years ago. The ones I took were designed to be significantly more challenging than the non-honors courses. It seemed to work pretty well.
Maybe even more importantly, adding a few extra math questions to the test makes it a lot harder to take, but only adds a few seconds of grading time (maybe even zero more grading time if it’s scantron) while making a humanities class that much harder requires adding another essay assignment, which requires nearly 10 minutes per student of additional faculty grading time.
You can make the questions harder without adding more grading time, especially on exams. But then you face your department chair when students write you bad evals and possibly don't sign up for your classes. The eval and butts-in-seats pressures in today's higher-ed environment are something no individual faculty member can solve. And, to be honest, I'm not sure anyone else can, either, at any but the most selective institutions.
That goes both ways. It is easy to objectively make a STEM class more difficult, because evaluations can be 100% questions with right/wrong answers. But how does one make an assessment in a literature course more difficult in an objective manner? There is always some subjectivity in the grading, and it is particularly difficult to honestly make gradation at the top end.
There’s a lot of low-hanging fruit out there if the professor is willing to actually put in the effort. As a social sciences major/degree-holder, I had numerous classes where your grade was entirely based on a single paper or a single exam at the end of the semester/quarter. There was little incentive to actually attend class or participate in any meaningful way. Adding a basic participation component like needing to be prepared to answer cold-called questions about assigned reading (ie Socratic method), or even just simply taking attendance would go a long way.
Once you grab all the low hanging fruit the issue probably gets more complex. But from personal experience there’s a heck of a lot of that fruit to be picked up first.
For CS, at least, it’s a common complaint that schools do pass out good grades to students who can’t code. Like… not “to our standards” but at all. It’s a reason fizzbuzz tests became so common.
Is it also possible that it's harder, and demands more energy, for humanities professors to explain to their students why their answers are wrong than it is in STEM? And maybe sometimes they themselves aren't so sure of what the "rigorous" answer should be about the meaning of "Middlemarch"?
I think you're mixing two things here. More time-consuming, perhaps. But that's not the same as not knowing what a good answer is. There is a common misconception that confuses "more than one possible answer" with "no way to tell a good answer". The two are radically different, and any well-constructed humanities assignment will never fall into the latter category. Often what we test is method, and that is fairly objective - i.e. did you use the required/allowed kinds of evidence to make an argument using the relevant methods? And that's not to mention the many, many low-hanging fruits, i.e. the endless kinds of FACTUAL questions we ask: did you identify the passage correctly? Did you translate this sentence correctly? Is your metrical scansion correct? These are as objective as basic arithmetic.
And that's well and good for those areas of humanities instruction. But less so for evaluating and grading essays along the lines of "discuss the place of women in 19th century British society as exemplified by Dorothea Brooke in 'Middlemarch.'"
Not sure if I'd assign such an essay, but to humor you, this is eminently gradable. First we will assign a percentage for factual accuracy, i.e. are you referring to correct and relevant parts of the text? Are you aware of the historical realities of women in 19th-cent. British society?
Second, we'll think about the complexity of the argument - e.g. different kinds of women in 19th-cent. British society, which itself evolved over time; Dorothea as a woman of her own time vs. Eliot's time, which is quite different (a fact ostensibly pointed out in the novel, if memory serves!).
We will also assign grades for the craft of writing itself - is the essay structure good and effective? Do you transition between paragraphs?
And of course a few points for correct technical citations, bibliography, proofreading, etc.
In other words - we use a rubric. The rubric itself is somewhat arbitrary (is structure going to be 20% or 30% of the grade?), but this arbitrariness gives consistency across the board. So yes, of course there is some subjectivity on the micro level (am I giving this essay 18/20 or 19/20 on structure?), but this is not a problem *in practice*. In fact it allows consistent and transparent grading that easily differentiates the A from the C students AND allows you to clearly explain to the students why they got the grade they did and what they should improve for next time.
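The weighted-rubric idea above is just arithmetic, and can be sketched in a few lines. This is a minimal illustration only - the category names and weights are hypothetical assumptions, not any department's actual policy:

```python
# Minimal sketch of rubric-based essay grading as described above.
# Category names and weights are illustrative assumptions, not a real policy.

RUBRIC = {
    "factual_accuracy": 0.30,
    "argument_complexity": 0.30,
    "structure_and_writing": 0.30,
    "citations_and_proofreading": 0.10,
}

def essay_grade(scores):
    """scores: dict mapping each rubric category to a score out of 100."""
    assert set(scores) == set(RUBRIC), "score every category exactly once"
    return sum(RUBRIC[cat] * scores[cat] for cat in RUBRIC)

# An essay strong on facts but weak on structure still gets one clear,
# explainable number - and the per-category scores tell the student why.
grade = essay_grade({
    "factual_accuracy": 90,
    "argument_complexity": 80,
    "structure_and_writing": 60,
    "citations_and_proofreading": 100,
})
print(grade)  # 79.0
```

The point of the sketch: the per-category breakdown, not the final number, is what makes the grade transparent and contestable.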
Re warehouse majors, we had anthropology for the jocks and sociology for the blacks; not sure why one group gravitated to one dept rather than the other
I want to promote this point from C-man, because it shouldn't be buried three layers in:
"... accessibility requirements have made every direction I turn in trying to come up with AI-avoiding assessment into a dead end...."
If that's right, then a big part of the story of the dumbing down of academia has nothing to do with the teachers: it's the administrators.
If there is a whole administrative industry devoted to "accessibility" and "accommodations", where this means making testing and assignments easier, then the teachers don't have much power to change things.
You cannot demand that everyone write their exam in long-hand, because Johnny has a special note from his doctor saying he has to use a keyboard. You cannot give a 1-hour exam in class, because Billy has a special note saying that he gets double the time that everyone else gets. You cannot do oral exams, because Freddie has a note saying that this constitutes an undue infliction of social anxiety. And so on.
If this is what's happening (and I may have the details wrong), then teachers have lost control of the curriculum. Especially if the accommodations industry can claim the authority (in the US) of federal law, via various Titles. (And presumably something similar applies in the UK.)
If the coddling is federally mandated, then there's little that a given faculty member can do.
As much as I love to blame administrators, it's like 90% the fault of ADA and the courts. Any school receiving federal student aid has already been sued and had to settle an ADA case, or has proactively adopted the policies for fear of being sued. The settlements mandate all these insane requirements and accommodations.
Yeah, my point was not to blame those individuals. So let me restate it: not administrators, but the administrative system that they operate. We can agree that it is due to a kind of expansive bloat starting with the ADA -- a really good thing in many ways! -- and the consequent gaming of the "accommodations" dodge by suburban kids who want to get any fractional advantage on their peers.
This is one of those situations where we look for ways to send the pendulum back in the other direction. But it'll be a long, long time. How many of the younger generations are basing their entire identities on their diagnoses? There may be some unknown, catastrophic environmental factor underlying an upsurge in neurological difficulties.
But at this point diagnoses might be encouraging expectations of accommodations, rather than support for personal responsibility for their own successes, not to mention enjoyment, in life.
Yup. It is this and more. My big public university is introducing totally batshit requirements in an attempt to deal with new federal and state disability policies. Like: readings must be clear and understandable when skimmed.
I joke that we need a DeSantis Turing test of administrators and their policies. If we described what these folks were doing in the DeSantis/Florida case, would left leaning folks say, “Good lord, another example of the right’s war on higher education.” Many decisions these days fail the DeSantis Turing test.
From the US perspective, the ADA does not apply to athletic participation. It should not apply to higher education, either (beyond things like handicap access).
"...It should not apply to higher education, either (beyond things like handicap access)...."
So, I don't know how this works, either from a legal standpoint or from a higher ed standpoint. But from what C-man is saying (and maybe this is just the UK), it sounds like the problem is that *everything* becomes a "handicap access" issue. Needing more time is a handicap. Being unable to write with pen and paper is a handicap. And so on.
So, that's one possible route for the spread. I feel like I read something about this in the NYT a while back, i.e. a large percentage of kids, even in high school, getting "diagnosed" with some condition that allows them to claim "handicap access."
So, even apart from "everyone gets an A", there is the problem of "everyone gets an accommodation diagnosis that lets them claim a handicap."
Gaming the system to medicalize low innate abilities or mutable personality defects. It is crazy how much this has spread. It’s like how Oxy and the “crisis of chronic pain” was propagated.
I would not be surprised if some of the addiction dynamics are replicated, at least as a matter of psychological addiction. I.e., you get a diagnosis of anxiety in HS. Then you get extra time, or special environments. Then if you get thrown into a tougher environment, that makes you anxious, because you haven't had to cope. So now you're more anxious. So you verify your diagnosis. And so on.
The tricky part is that "handicap access" covers most accommodations, even for very classic definitions of "handicap." The blind student probably needs a keyboard and text-to-voice to do corrections.
My favorite disability is the "can't make it to class or have difficulty waking up for class" disability that asks the professor to modify the attendance policy. It's insane.
I should also specify that if you have a medical condition that makes attendance difficult, I'll accommodate you. The problem is students who have mental health issues that make attendance difficult because of issues like sleep. For example, I had a student with fibromyalgia who had an attendance accommodation. I gladly gave them an exemption, but they only missed twice.
I should also say that students get three free absences, so it's not like the Paper Chase over here.
This was years ago, but I read an article that noted that school districts with a disproportionate number of accommodations tilted towards the wealthy, with about 25% of the student body in Greenwich, CT public schools having accommodations. I wonder if this is still the case.
I suspect that’s true, but in elementary education access and attitude are playing a huge role here.
I’ve seen people instruct their kids to refuse accommodations out of concern about being labeled far more often than I’ve seen 504 fishing. Maybe this happens more with older students.
> Needing more time is a handicap. Being unable to write with pen and paper is a handicap.
These are pretty different things though. There are people out there with physical disabilities that make using a pen and paper difficult, but they fully understand the material and should be given an equal opportunity to demonstrate that. (and, if anything, pen and paper is the antiquated thing nowadays)
Getting extra time, on the other hand, demonstrates less skill at the actual thing that's ostensibly being tested. If you can't do the assignment in time, then you can't earn the grade or pass the class.
We should be clear about what things are reasonable and what aren't.
I don't know if the UK Equality Act (roughly the equivalent) is more or less encompassing than ADA. But as DT says below, everything, including "write by hand," becomes an "equality" issue.
Ok, where does the "dumb" computer come from? Who buys it? Where does it live when people aren't using it? Who decides which programs are dumb enough? (Just imagining the committee meetings about this fills me with dread.)
I agree that having a dumb computer might be a good idea, but implementation would be highly non-trivial. An "in principle" solution is not sufficient, you actually have to be able to do it.
I, personally, as an instructor, cannot easily do this. It would involve at a minimum talking with IT and the administration. I very much do not want to do either of those things. If many people wanted the same thing it would involve more physical infrastructure --- more actual computers, and more space to house the computers. I am not asserting that it is impossible, and I am not asserting that it is technically challenging. I am saying that the implementation is nonetheless highly non-trivial.
On the other hand, if I want to force my students to write in blue books, I can buy 30 blue books, carry them with me on exam day, and distribute them for the exam.
At many universities, the disability center has a testing room where students who need extra time can come in and do the test on a computer with no forbidden software on it. (I assume that’s how they have it set up - I’ve never actually seen the testing site.)
I think a lot of things would be easier if we just had everyone take all their tests in that sort of environment, and separate the proctoring and scheduling of exams from the instructional staff for the class itself.
If we expand the amount of space available, we could even have students write all their essays in that room.
I'd go one further: you must be able to pay for the exam/certification and the instructional services separately. The thing that gets you whatever cert you are after is the proctored, highly regular (maybe multi day?) exam. The teaching is a service offered if you need it - sort of how the relationship between law school and the bar exam used to work.
I think that’s how German universities worked in the 19th century. Some biographies I’ve read of 19th century academics suggest that they used to collect tips from the students at the end of class.
This kind of sounds like AP exams? I started college with sophomore status partly because i passed so many AP exams - some of which I didn't even take a class for because it wasn't offered at my school (I think world history was one).
That would be prohibitively expensive. Our testing center is already oversubscribed with the relatively small fraction of students who have accommodations.
I cannot emphasize this point enough - this is specific to my institution, of course, but any assessment has to come with alternatives that can basically be completed even if the student has never set foot in my classroom.
People should just be given ample time on tests for content. Very little is dependent on speed IRL, at least the kind of speed required to quickly take tests.
I also would like to second that from a "TA in STEM at a prestigious US university perspective". It definitely isn't a problem only in humanities, and seeing people who never showed up get A's, while people who, sure, weren't extremely talented, but showed up and asked questions get B's, was not something I loved about the job.
For intro STEM classes, I would expect the students who get it as an easy A to never show up (that’s certainly how I was as a student, because it was easier to just learn from the book on my own time). It’s difficult for students like me when they have to make the transition to talking to people about the material.
That’s fine. Most STEM classes don’t take attendance because they figure if you don’t know it you’ll fail the hard exam. I only went to those classes if the teacher was outstanding, because most of the time I learn best from reading the textbook and doing problems on my own time.
Having been a TA, though, what would grind my gears was having to give As to students I knew were *cheating* while giving Bs to students who merely worked hard.
I’m mildly skeptical about this. Most of these problems can be solved by software (my law school used examplify) that takes over your computer and prevents you from doing anything but the exam. If you make students use this, and provide extra time for the students who need it, what additional ADA requirements are there to meet?
My experience in similar situations says that the person sitting in front of the computer might be a friend who took the class the previous year, if you can't force people to come in person.
Over my years proctoring logic exams at Texas A&M, some fraction of students always tried to show me their student ID when turning in the test, which suggests that some other professors were using this as a way to prevent that.
My last degree was online, and showing your ID was an integral part of the online proctoring experience. Which, by the way, runs a spectrum of “cheatability,” the more onerous programs being the more difficult to cheat.
I'd be really surprised if you can't force people to come in person. I've spent 8 years in higher education to date, and I can't think of a single take-home exam I completed during that time (I'm sure I have taken some, but it is not at all common). Maybe there are a few students here and there who qualify for exceptions based on disabilities, but I don't think the risk of heightened cheating among that tiny minority poses a serious threat to the general rigor of the exam.
Just to clarify, I specifically referred only to students with accommodations in my previous comment (because that's what dt's original comment was about).
This is orthogonal to Matt’s point, but I will just say as a law firm partner formerly in charge of hiring, it is VERY difficult to find people who can actually write well. (As many on this comment board know, writing well is all you really need to be a good lawyer, or at least the most important thing.)
I was an English major at an Ivy League school who graduated 20 years ago (after trying and failing at a hard science major). I would not say it was easy. I probably had to read 20+ books a semester in addition to doing a foreign language. More importantly, we wrote essays every week. If the essays weren’t clear and persuasive, we’d get bad grades.
I don’t pretend to know what academics are like there now, but I do see the outputs (and these are people who go to top law schools also). Not great.
Writing was one of my weaker graded subjects early on, and I found out later that it's because they made me overwhelmingly write on fiction. I get that that's what most kids find more interesting, but it just bored me. I got much more interested and thus better at it when I homed in on nonfiction, including subjects like what's discussed here, and also law--I even ended up working at a law firm for several stints.
As an aside, I'm also curious how many philosophy majors go into law. Seems like a natural path to me, but certainly not for Matt....
"... one of my weaker graded subjects early on...."
Dude, of course you're going to have trouble writing: you're a city of trees.
I mean, I love trees -- we all love trees -- but they're not known for their literary output. Who can expect fiction from a deciduous downtown, a suburb of saplings, a coniferous conurbation?
Law school is definitely the most common trajectory for philosophy majors. And the people who want law school but don’t major in philosophy still usually take the logic classes, because the LSAT has some relevant sections (which I probably should look at some time, given how often I teach logic).
They removed the logic games, so of the things you might get out of a symbolic logic course, it's basically just diagramming conditional statements at this point.
Some basic stuff was helpful for the LSAT, but beyond the first few weeks of intro to logic, the class got much more advanced than the test. The LSAT LG section was hard mostly because of the time pressure, not the difficulty of the logic involved. The idea that truly learning logic was helpful for the LSAT was a useful fiction for philosophy departments. And now they got rid of logic games entirely due to ADA issues.
As someone who was a philosophy minor, I think the two have a fairly well-established link. I know one philosophy instructor who heavily recruited interns from the law school for his larger classes because they generally understood the material better.
Is this not already the case everywhere? I went to Random State U and every student had to take 2 dedicated writing courses as part of their university gen eds.
I passed out of all college writing classes by getting a 5 on the AP Literature exam. To some extent that means I already "passed" college writing but I don't think I worked as hard in my 12th grade class as college writing students do.
What were your gen ed courses? I did not attend a University anyone would be impressed by, but the University required the following number of classes (3 credits each) regardless of major: 2 composition, 1 government, 1 history, 2 humanities, 2 sociology, 1 math, 1 science. Among the above, one class had to also carry an "international" designation and another had to have a "Diversity" designation.
You didn't need to take a special gen ed version of all these though. As an engineering major we started at calculus for math, where the typical humanities major would take algebra or something.
My engineering school required a couple of writing classes - one general/introductory, and one that was linked to your major, so typically lab/project reports or research summaries.
Writing is one of the most important skills to learn, but it’s one of the most labor-intensive to teach. Grading writing is pretty much the worst part of the job being an academic in the humanities.
It’s also the part of the teaching that is most directly threatened by AI.
Can confirm - in our org, HR instituted a bunch of fairness protocols (panel interviews, etc), some of which militate against careful assessment of candidate capability. I can't ask for a writing sample and our ref checks are run by a third party whose reports are pretty vague. It makes the interview itself pretty high stakes and I've gotten better at probing the candidate's cognitive and reasoning skills as a proxy.
Yea. This is a real problem. we are still able to get writing samples which is probably the most important part of the evaluation process, at least to me.
Yeah I still have mild PTSD from an English class I took at my state college 20 years back: 10 week quarter, 10 books (some 1000+ pages), 10 harshly graded essays, and you had to have something intelligent to say in class every week (on top of all my other STEM weed-out classes). To say I barely scraped by is an understatement…
I hear you. I can usually tell when a young associate/intern has actual writing talent, separate and apart from the things that just require experience or subject matter knowledge. It’s a rare thing.
One thing I always debate with myself is, to what degree is effective writing innate talent vs learned skill. Obviously practice and experience will improve anything. But when it comes to being an effective writer, how far can practice take you before you run into a wall?
I don’t have a good answer, and that’s one of the things that makes evaluating new attorneys difficult. As mentioned, I think one can tell when a recruit has innate talent regardless of how unrefined or inexperienced they are. But what if they don’t have that, yet are good at other aspects of the job? You kind of have to guess about how far you can bring them along through mentoring and practice.
Communication in a broad sense seems like it'll be a key differentiator for people moving forward in the age of AI. It's something I'm actively trying to get better at, both in a written and a verbal context.
I think it's actually a "interns don't know how to productively use AI" problem. Vanilla ChatGPT isn't the tool for that job, and I'm not sure even Deep Research is, but a RAG solution akin to NotebookLM that runs on a local server and that doesn't share data to the cloud would probably be worth trying. The human value-add would be knowing which case law is applicable, and being able to capably proof the output.
Fair. I think in 10 years time it'll be the norm though. Basically all you need is an open source, locally-installed NotebookLM alternative to deal with the data privacy concerns, and stuff like LM Studio are already 75% of the way there.
It's important to note that this "dumbing down" also occurred during the growth of adjunct professors who need those good student evals to secure their next contract.
I was an adjunct comp sci prof for a decade. And yes, it ended when I had the audacity to fail unqualified students who literally cheated on their exams. Literally all it took was one angry parent writing a letter.
I know this was already a losing battle when I was in undergrad, but I can scarcely imagine anything more shameful and infantilizing than one of my parents contacting my college (or heaven forfend, my fucking job) for any reason besides illness/injury but *especially* to demand something I don't deserve. I know plenty of students are yoked by financial support and would *also* prefer their helicopter parents alter course, but for the ones who accept it happily I wish nothing but ego-destroying embarrassment when they someday have a moment of clarity.
The whole “lose your scholarship if you don’t have a 3.0 by semester 3 or 5” thing sucks. If we’re going to be doing this “college is a time to figure out what you want to do” thing, and I’m not sure we should be (high school should be like undergrad, in my view), then you shouldn’t be penalized for trying something it turns out you suck at and failing. My next door neighbor growing up lost her lottery scholarship because of these requirements. She had started out in architecture and it turns out she didn’t have a knack for it. Switched to nursing and did well and is successful. One should perhaps be penalized for just phoning in college, but not for the trial and error we sell the experience as.
I think that may have accelerated the process, but there are studies showing grade inflation has been going on since early 1960s, with the biggest single jump being in the late 1960s (almost certainly to try to help male students avoid being drafted during the Vietnam War).
Yes - and this is part of why this is such a difficult problem to address. Higher education, at least in the US, is so highly leveraged on contingent and precarious labor that there's little incentive to be an outlier in terms of demanding higher standards of students.
this probably doesn't get anywhere close to passing an ideological turing test, but...
doesn't the ethos in current humanities departments that "rigor/objectivity/etc are vestiges of cis-heteronormative capitalistic patriarchal white supremacy" kinda make these changes practically impossible?
I bet over 99% of administrators and faculty in the humanities, for example, support affirmative action, which is trading off rigor in exchange for other considerations like diversity and helping rectify injustice.
By most objective measures, students enrolling in institutions had better qualifications compared to pre-affirmative action. In my experience as a computer science professor from 2001-2019, courses were generally more rigorous. Do you have any evidence for this claim?
The fact that student quality has generally trended upwards over time does not imply that this is causally related to affirmative action, clearly.
The evidence I have for my claim is that affirmative action literally means lowering standards to achieve other aims. Like, that's the whole point, that's the evidence.
I'm not even saying affirmative action is bad! It very well may be a net positive thing in society. My point was that given that the humanities are especially attuned to things like societal disparities, there would be internal resistance if their rigor increased so much that their classrooms started to look the same demographically as physics or CS classes.
1. AA is designed to help racial minorities, class-based affirmative action only does this indirectly
2. Class is a bigger impediment to success than race. Malia Obama is not more oppressed than a white kid from Pikeville, KY.
3. Current AA approaches are about the phenotypical pie chart -- so you end up with a lot more children of rich Nigerian immigrants than American descendants of slaves in elite institutions.
4. Doing racial AA is kind of in explicit tension with the 14th amendment in a way that class-based affirmative action is not
So my feelings on it are mixed. I support it, yes, but it almost certainly would not go far enough. So idk.
You can equate affirmative action with poor rigor, in your head, if you want. But there are endless forms of affirmative action which do not require a trade off in rigor. Take for instance, crossing off names on resumes before reading them. That would increase rigor and be affirmative action. One might even say that your anti-AA stance lacks in rigor. :)
Pretty much everyone who talks about affirmative action in a US university context is talking about having lower admissions standards for members of underrepresented groups (blacks and hispanics being the important ones in the US). The admitted black kids end up with a substantially lower average SAT score / GPA than the admitted white kids.
The opponents of AA in the US are the ones who want universities to do race-blind admissions--crossing off the names, as well as race and gender, from the applications and judging them entirely on what's left, like test scores, grades, advanced classes taken, etc.
Which may or may not be a bad idea but is not a reduction in rigor. I was saying that crossing off names on resumes, where black names have been shown to have adverse effects on odds of hiring, would be a form of AA (which also would not decrease rigor).
You made the absurd assertion that affirmative action is, "trading off rigor in exchange for other considerations like diversity and helping rectify injustice." I see that you walked this back a bit later in the thread. There are obviously possible ways to take affirmative action without degrading rigor. I mentioned one (resumes). Several people mentioned admissions, later in the thread. I agree 100% that AA has gone too far. I was making a narrow objection to your claim that AA is impossible without sacrificing rigor.
The acknowledgment that there are values beyond rigor, that you are willing to trade off against rigor, is extremely different from a claim that rigor is itself a bad thing.
It's possible that it does reduce rigor but, as you say, maybe it helps rectify injustice. I recall the argument by Derek Bok and William Bowen back in 1998 in "The Shape of the River" that minorities admitted via affirmative action actually did quite well in college. (This was an incredibly quantitative study, one should note.)
Affirmative action only accounts for a small percentage of students—most top universities are under 20% Black+Hispanic, and even if you assume all of them got in through affirmative action (which they didn’t), the vast majority of the class is still non-affirmative action.
If you really wanted to have the maximum rigor with the smartest possible students, the much bigger lever to pull is international students—you would just give financial aid and need-blind admissions to everyone so some top students going to Tsinghua, IITs, etc. come to your school instead. That would also increase diversity ironically. And most liberal humanities professors would probably support this—they aren’t the obstacle to it.
One of my daughter's roommates went to Illinois Institute of Technology as an undergrad. I was surprised to learn that it has the highest percentage of international students of any school in the US. The roommate's theory is that it is because its initials are IIT.
1. Yes -- a truly meritocratic university system would have a lot of kids from the Tsinghuas and IITs of the world. That's undeniable.
2. This is also the *real* reason for affirmative action, if you take the semi-cynical view that universities are not agents of social justice but rather self-interested corporations -- a university that looked like IIT in America would be less attractive to prospective students/donors than one that had (at least superficial) diversity. It's the same argument for allowing legacy admissions -- rich kids' parents paying for nice amenities makes the school more attractive overall, even if this means things are less "fair."
3. I'm not sure what progressive humanities professors would think of this. Opposition to purely merit-based high schools like Stuyvesant in NYC comes from the left, because they're uncomfortable with a demographic pie chart that is so heavily skewed against Black and Latino students, even if most of the students are poor children of immigrants.
I believe most colleges and universities in the US are not particularly competitive--pretty-much anyone who remotely belongs in college and applies will get in. The top private and state colleges are selective, sometimes extremely so--even with truly excellent grades and test scores, it's hard to get into Harvard! That's the place where affirmative action, legacy admits, and sometimes athletics (at big schools with financially important basketball/football programs especially) have a direct impact.
To the extent that it did/does I can tell you it is already shifting, at least where I am at. "Return to rigor" discourse post-COVID is hot now. I think AI has played some role in this. The big distinction though is between intellectual rigor and procedural rigor. Increasing the assignment count or complexity of requirements for completion is procedural rigor, and not helpful (and beyond that is the kind of rigor that AI most easily circumvents and that compels students to shortcut), whereas increasing the expectations for student participation or quality of arguments is intellectual rigor, and preferable. Jamiella Brooks and Julie McGurk have done a bunch of work on describing what this would look like.
So now I have a name for why online classes suck. They always have a million little assignments, I assume to make it hard for people to hire others to take their online classes for them. But it’s so obnoxious. Give me 3 tests and a final, please! But please don’t take attendance
Call me a dirtbag centrist, but I think your comment here and the comments of your detractors are both wrong.
*Administrators* absolutely believe this. All of the non-faculty professional staff at a university have the most insane wacky progressive beliefs you've ever heard in your life.
But honestly, faculty mostly don't, even the radical leftist ones believe it's important for students to learn things and for classes to be hard. You have to be a real gadfly to resist the pressure that the professional staff brings to bear, though.
As a professional staff member, allow me to speculate as to why this is the case:
I work as an instructional support staff member, and what you describe is somewhat true. I think much of it can be explained by the fact that, at least where I am, staff like me are seemingly discouraged from participating in the actual class environment. We make a lot of informed suggestions based on what the current research and discourse on pedagogy suggests, run some voluntary workshops, assist with building course sites, etc., but we're rarely invited to actually sit in on a course we're helping with or even get updated info on how things are going, and sometimes I get the feeling some instructors are downright hostile to the idea.

Because of that we have to fill in gaps based on hearsay and rumor that floats around third-hand. A lot of my job is a kind of "reading the room," because some instructors will try to shop around instead of just telling me what they think. So despite the fact that my job is to help make the teaching at my R1 university as good as it can be, I have spent very little time in an actual classroom since COVID. That's not a good situation for either of us.

My institution doesn't even have very robust tuition support, so I couldn't spend time in a classroom even as a legit student without incurring significant cost to myself. I've begun looking at adjunct positions, teaching a class or two, just so I can have some minimal experience, despite not having much interest in becoming a full-time instructor.
If most instructors find the professional staff's views to be wildly out of bounds, we wouldn't really know it, because nobody will tell us whether the situation is of the "I already know this stuff and it's old news/out of date/unworkable" kind or the "this is weird and new and I don't think it makes any sense" variety. Personally, I don't care which one it is, as I have little personal stake in the matter, but I kind of need to know so I know where to productively go from there. My sense is there is a heavy amount of defensiveness about whichever one it is from academics who are used to being confident in their specialty domain, and are working in an environment where admitting an intellectual blindspot is heavily discouraged. Sometimes I just have no idea how familiar instructors are or are not with some of this stuff, and when I ask outright I get weird looks like I just asked either a forbidden question or one that was so obvious asking it made me look silly (both?).
I could be wrong about that, but if I am it's because I am ultimately at least right about the fact that very few people will just be up-front and honest about whether I'm welcome in their actual classroom or not so I can get a real sense for how they teach, or whether I'm just supposed to be a glorified intern putting widgets in an LMS.
I've worked at other smaller places where this dynamic was less so and instructors loved having me sit in. Maybe it's a problem with "elite" academia, idk.
I'm an instructional technologist/designer. Sometimes grad students do that stuff, but we are fortunate to have a large edtech team to let grad students focus more on content/class instruction.
Why would an edtech person have input into how a class is taught?
Granted I think technology is a big chunk of the problem and is almost never a solution. Online problem sets in math and science need to die a violent death. Students are supposed to use pencil and paper and professors are supposed to write on a board, not use a CMS except to post grades.
I would think the best teachers mostly would just be using their class notes from when they took the same class when they were in college.
I mean, there are a few faculty who believe this. There was a brief-lived movement towards "minimum grades" that basically said that if you show up consistently, you get a B no matter what, and then you can go up from there. Thankfully this seems to have been marginal.
But yeah, in my experience people not in the classroom on a regular basis develop some pretty wild beliefs about what teaching involves.
I mean I had a friend that went to New College, which didn’t do grades in favor of lengthy evaluations, but I don’t think it was any less rigorous. It made their test scores more important for medical/law/grad school, though
the "rigor is white supremacy" ethos definitely hit its apex a half decade ago but most leftwing institutions do seem to subscribe to the notion that disparate outcomes are indicative of discrimination. So institutions like these aren't going to optimize solely on rigor if that means outcomes remain disparate
This is largely (but, alas, not completely)* a straw man. The dirty secret is that 95% of those who espouse these views are hypocrites who secretly still believe in rigor and *when left to their own devices* usually act accordingly. The bigger problem is precisely that which MY talks about: it is NOT because of ideology but because of practicality (enrollments, collective action problem in an environment of grade inflation etc.). Some however feel better about themselves by *pretending* that this sad state of affairs which everyone laments has some ideological virtue behind it.
*My only qualification to the assertion above is that hypocrisy can have a way of running ahead of you, so that sometimes people seem to almost accidentally put their money where their mouth is, e.g., one person mentions one of those silly platitudes in a dept. meeting simply to virtue signal, and another feels they need or should up the ante, saner voices are too cowardly to intervene, and before you know it the department has adopted an official policy with serious academic implications that nobody individually would have tolerated in their own classes - e.g., Princeton's absurd decision about language requirements (which, contrary to MY's assumption has little to do with enrollments to my knowledge, and also is a little more nuanced and less imbecilic than you'd think, but is still basically moronic).
To the extent that anyone is making the "rigor is white supremacy, etc." argument, it's uniquely in elite institutions. Everyone else literally can't afford to increase rigor, because students will go somewhere else - either a different course, program or university - where they can GPT their way to a degree.
In my field of English, "rigor" is coded as white, male, and privileged. Teachers, unfortunately, think of rigor and the Canon in the same breath. It's a terrible attitude. And it tries to dole out racial justice in terms of grades.
There's a big split, mostly along generational lines (with the exception of a few slightly younger curmudgeons like me), between faculty who espouse that point of view and faculty who roll their eyes at it.
I'm not sure why you'd treat this as a problem localized to college-level humanities when it's equally applicable to basically all of K-12. "Low standards have devalued [the high school diploma]" is at the very core of a huge number of modern social issues, from student loans to education polarization, and it's all the same basic argument. The main value of any education program is signaling, and if it's not meaningfully filtering for ability or knowledge or hard work, the signaling value is zero.
"The main value of any education program is signaling"
No it's not. The value of education is knowledge transfer, and building of skills/critical thinking ability.
Actually knowing how to do math is a skill. And a very valuable one. Knowing how to read, analyze and then write effectively is also a very valuable skill.
More job specific for me, knowledge of accounting and being able to apply it is a skill. Which is why I ask accounting questions to everybody that interviews.
I don't care about your accounting degree if you can't answer my accounting questions. As many people who have interviewed with me have learned.
I think you and Dave Coffin are using the word "value" differently.
At the society level, the "value" of education is mostly in improving human capital in the population. Knowledge transfer and thinking skills are a huge part of improving human capital.
At the individual level, the "value" of education includes improved human capital but also signaling "I am intelligent and conscientious, and I know things about a specific topic" to employers.
Skills have little value if you can't get in the door. Most would-be job candidates never get an interview where knowledge might become relevant. It's not like the knowledge you learn in college is some big secret; I can read a bunch of books for free. People pay for college for the credential and the connections, because that's what gets you the interview to begin with.
Pretty much agreed. I think it's tragic that the US has not come up with some testing-based alternative to a bachelor's degree for demonstrating skills and knowledge in particular fields.
For example, sitting for the CPA exam in most (maybe all) states requires a bachelor's degree with a certain amount of coursework in accounting. Why not just let people take tests indicating they know the material covered in this coursework, and sit for the CPA exam if they pass said tests?
I studied electrical engineering in undergrad and I think that about 95% of the knowledge/skills covered in the degree plan could be demonstrated via written exams. The rest could probably be demonstrated in a few weeks of labs and projects.
Knowing that academics are pretty prevalent in this comment section, I have a question for you: Could you fix this in your own classes?
If you are teaching a 200-level humanities class, could you make your assignments more difficult and enforce a grading system that results in, say, 20% A's, 40% B's, 20% C's and 20% D's or F's? Or would your administrators not allow you to do this?
I'm just curious how this lowering of standards and rigor can be disrupted. Must it be at the institution level, or can professors choose a different path? What is tenure for, if not something like this?
I think "make your assignments more difficult" has become a very different enterprise in a genAI world, for one thing. Even if you somehow incorporate AI into the assignment workflow, designing an assignment that leads to some learning outcome *and* is rigorous is, to put it mildly, very, very, very difficult.
I think something MY doesn't really cover as well is that it's not just assignments as such - attendance is the other huge part of the puzzle. It's different everywhere, of course, but attendance in my university has cratered despite some tepid attempts to address it. When you only have 20% of your students in the classroom on any given day, there are a lot of steps to go before you can even begin to think about "increasing rigor."
I agree - though I also have to admit that I owe my job to students who basically only exist on paper.
But I think it's also important to note that it's pretty possible, at least at my institution, to get a degree without ever attending a class, or maybe showing up like 10% of the time across your whole degree. It sucks so hard and I hate it, but it also keeps me employed.
I wasn’t very concerned about attendance when I was in school. As long as I felt I understood the material and could do the assignments, I didn’t feel like I needed to sit in a classroom.
In re attendance: Long ago, I worked in the registrar's office at my undergrad school and one of my tasks was to catalogue about 70 years' worth of student handbooks. In the course of doing that I made an amazing discovery: Pre-WW2, if you missed a day of class without a *written* explanation for the absence, that was an automatic 10% reduction in your grade. However, if you had an unexcused absence the first school day immediately before or after a regularly scheduled holiday (e.g., the Friday of the Labor Day weekend), that was an automatic ***30%*** reduction in your grade.
I will admit that one thing that causes me great cognitive dissonance is that in college, I *loved* going to class. I understand that there are bad teachers and so on, but it's really hard for me to wrap my head around what has become the default posture, which is that college classes are an annoyance to be tolerated at best.
Depends on the subject. I ate my anthropology, foreign language, and of course studio art classes up. But a lot of my science teachers were pretty shitty because their reason for being there was research. I went to a few of those teachers’ classes who were really good, but there’s high variance in quality
Yes, that was definitely the most interesting part. I could read all this stuff on my own, but to have discussion about it and the addition of the professor's in-depth knowledge made it much more interesting.
20% is a lot lower than what I remember back in the day, but attendance rates have always been bad. And often they are actually worse in STEM classes, since (a) class participation is never part of the grade, (b) anything covered in the lecture that you need to know for the exam is also in the textbook, (c) the vast majority of the professors can't teach.
Yeah - I recall taking an undergraduate CS class during grad school because I was bored and aimless and toying with escaping into tech (hahahahahahahaha), and attendance was definitely not 100%. Of course, it was also harder to tell, because there were like 500 students in the class to begin with, so the room would feel full regardless.
I agree with the "can't teach" bit - I try to be an engaging teacher and was even shortlisted for a university-wide lecturer of the year award, but I can say that it still doesn't help with attendance, at least where I am.
As someone who studies mostly social sciences with some humanities on the side, it always shocks me to hear how few people in STEM majors actually attend classes. I have friends who have gotten As in CS classes they've only ever been to for the test, which, say what you want about non-STEM fields, is essentially impossible to do unless you already have expert-level knowledge about the coursework and somehow manage to only pick classes that don't have any sort of attendance mechanism. For all the kvetching about the humanities, pretty much all of the sheepskin effect extremists at my school are in STEM majors or business. My university is very pre-professional, so it could definitely be different elsewhere, but all the English majors I know actually care a lot about reading books.
For whatever it's worth, my engineering school experience does not at all reflect Mr. Bear's--I think it was a rare class that was under 80% attendance.
This is only half the argument as I see it. The other half is that you will fail the class if you do not show up, because you will not have the knowledge necessary to perform well on exams or even long-form essays. If I think CS classes should require students to attend, it's because I think it would prove that the professors aren't entirely obsolete. Say what you want about the humanities, but the history professors I have had have all provided very strong analysis and meta-analysis of the material that has been useful in forming my own interpretation of events. I don't get the same sense of that happening for CS.
Couldn't you say that if a student misses more than X number of classes, their grade will suffer? I'm trying to tease out how much of this is just inertia and how much is within the control of the tenured professors?
In law school I believe the rule was if you missed 5 classes you would not be allowed to take the final which guaranteed failing. They meant it too. There was a guy in my crim class first semester 1L who had missed that many and the professor told him to leave when he arrived for the exam.
I think I remember some requirements on this in undergrad for history but don't recall the specifics or witnessing any enforcement.
My recollection is that attendance rule was actually an ABA accreditation requirement for law schools (thereby getting around the collective action problem), but I don't know offhand whether it still is.
It depends - on your institution's policies or your program policies.
I - at a UK institution - do not have this option (there is also no such thing as tenure here, so that layer of protection is not available). When I was a TA at the University of Washington, we were also not allowed to grade on attendance (though you could grade "participation" and so on as a proxy). That's only n=2; in some places I'm sure you can require attendance. When I guest lecture in a Masters program in France, we always pass around the fiche de présence that they have to hand-sign.
Accessibility discourse is particularly strong here in the UK, and it emphasizes students' other work and caring responsibilities. Which is fine, but it often implies "so if you require physical presence in your class, you're an elitist a[rse]hole," which makes it hard or impossible to insist on attendance. Not sure what this looks like in the US.
So I finished an undergrad degree last year in PPE in the UK, and after the first few weeks on my first year stopped going to seminars or lectures because they were useless and boring.
For my philosophy and politics seminars, no one did the reading and when they did do the reading said completely insane things about the reading (I got "Rishi Sunak is a race traitor" in my third year political philosophy class.)
Economics was different because it's a problem-set subject, and the problem sets weren't hard enough for seminars to be helpful.
It's completely understandable to feel pretty disheartened by this as someone in a teaching capacity, but this was why, as a student, I stopped going. In short, I could learn everything more quickly with a textbook than by going to seminars and I find them to be actively unpleasant social experiences.
For context, I went to Warwick (for non-brits reading this, uni ranked somewhere between 5th and 10th in the UK. Think Brown or Cornell in terms of student quality, but definitely STEM biased.)
I don't deny that there are crappy teachers out there. I teach political geography and I try to emphasize "primary sources" over academic articles - and for the students that do show up they seem to like it. We looked at the Project 2025 hiring questionnaire when talking about populism and illiberalism, for instance. Or we compare the Online Safety Bill and EU Digital Services Act's enforcement mechanisms as examples of epistemic bordering (I don't use the word "epistemic"). Or I pair historic and contemporary texts from public intellectuals so we can see how old ideas about e.g. nationalism are still quite present today.
I don't fault students for not showing up if the content is objectively bad. But it's also a vicious cycle. And in my experience, at least, even being good at teaching doesn't necessarily help.
(Speaking of the "race traitor" comment - if anything, where I am my students are disengaged enough that it actually makes it easier in some ways to teach about controversial topics, because they tend to not really have an opinion at all)
Yeah, I don't even think that the PhD students teaching the seminars were bad at teaching. I'd guess that part of the problem is that you can get good grades by not doing the reading throughout the year and then really working hard in the last, like, third of the academic year.
I'd guess I'm also pretty unusual in that I'm pretty obsessed with social science (and now work in public policy doing social science) and I had lots of friends I could and did talk to about philosophy and social science.
"Economics was different because it's a problem-set subject, and the problem sets weren't hard enough for seminars to be helpful."
I double majored in Econ and CS (at an engineering-heavy school). This was 40 years ago, and the difficulty of the econ problem sets versus the CS/engineering problem sets was night and day. Even relatively basic math stuff like prob. and stats would slow the pace to a crawl in the econ classes, with profs apologizing for the "extremely challenging" problem sets.
But I was under the impression that econ as a profession has become much more mathematically rigorous and challenging over the past 30-40 years and now comes close to rivaling the math difficulty of an engineering or CS degree.
It definitely is the case, but it really only starts at grad school; econ masters and PhD programs have roughly the same mathematical difficulty as CS or engineering.
I took the most mathematically difficult econ courses my uni offered (it helped that they were taught jointly with the maths department, I think), and they were definitely the best courses I took at university. I learned a lot, particularly from the really pure microeconomic theory courses. They made me a real economist in the sense that, because of those courses, I could read most microeconomics papers (but basically not any macro papers - the maths is considerably harder and not taught, I think, anywhere to econ undergrads in the UK except maybe at LSE).
They definitely weren't as hard as 2nd- or 3rd-year physics or maths courses though - people do general relativity in third-year physics, and maths students take topology in their second year and measure theory in their third!
The people I knew doing maths were definitely the people working hardest and the people who felt most burnt out by their subject at the end of their degree. This may actually be a warning against making courses too hard. Most of the people I knew doing economics were more excited about economics after they finished their degrees, but the opposite was true for the people I knew doing maths.
So, if you want to go to graduate school for economics, you shouldn't major in economics in undergrad. This of course is only knowable to people who are lucky enough to have good mentors or already know people in the field.
As someone who recently left grad school: I think it’s fine to make classes hard enough that attendance is effectively required. But I don’t like the idea of requiring attendance for its own sake.
I recently graduated from a joint JD/MPP program. The JD classes required attendance. The MPP classes did not.
I found that the lack of required attendance allowed me to use my time much more efficiently. By the end of every semester, I was swamped with papers, projects, and studying needed for final exams. The best way for me to use my time was often not to attend class, but to instead spend additional time studying the nuances of topics I found challenging, drilling practice problems, and going the extra mile on research papers and other large projects. Nowadays classes are recorded, so you can still watch them later if you miss - and speed them up, skip the stuff you understand well, etc.
This is fine for motivated students - but in general, I'm not dealing with motivated students.
There's also a tragedy of the commons issue here. With the levels of attendance I have, I can't teach effectively because I can't predict who will show up in class from week to week. It's effectively a different group of students each week, which means that building instructional continuity becomes very difficult. Having only 5 students in class when you're meant to have 25 means that in-class activities - which we're encouraged to do in order to "engage" students - become impossible or much less effective.
I get what you're saying, but I think it's also important to recognize that classrooms don't function identically regardless of how many students are in them.
The engagement problem is definitely real. I’m surprised there’s not a fairly reliable core of students who are there most weeks.
But re the motivated vs non-motivated students: It seems to me that if a student isn’t motivated to come to class or learn the material on their own, then their performance on exams/papers will reflect that, so grading based on attendance still wouldn’t be necessary. I guess AI does complicate this tho. Maybe the solution is in-person exams with a software like examplify that prevents internet access?
I *can* show up, I just sometimes *choose* not to (when I have the option) because it's not the best use of my time. Efficiency is also part of being a lawyer!
Yeah, I agree with this. I’m fine with attendance counting for these classes because it’s such a huge part of what the class is. (But I would distinguish true discussion classes from, eg, law school doctrinal classes that only involve some discussion)
At my college, NC State, professors were required to take attendance for 100 and 200 level classes to ensure underclassmen actually went to class. Typically the rule was something like if you miss more than X number of classes your final grade will be bumped down one letter grade.
Yes, I can set my own attendance policy. And I do have a rigorous attendance policy because I teach film classes. And if you can't attend a film class, you have bigger problems than my class.
Yeah, you can definitely do this and most of the classes I've taken have had a mechanism like that. STEM classes are pretty much the only classes I've seen where you can only come into class to take a test and still get an A. The most generous social science or humanities class I've taken would give you a B, and that would be assuming you already had encyclopedic knowledge of the coursework. More likely, you'd be somewhere in the C- range due to missing out on discussions.
And to think that 20 years after the end of my formal education, I still occasionally wake up in a cold sweat because I've had a nightmare about having missed class.
If you're doing it unilaterally it's at the very least a collective-action problem in just the same way that it is at the institutional level—if your course counts the same as your colleague's course but you grade more harshly, why would anyone take your course? And of course there is always administrative pressure not to fail students.
Earlier this year, I was an adjunct co-teaching a required grad school course. There was one alternative to this course and it was known to be less work/an easier good grade. That other course apparently always fills up first.
Our course, which the “real” professor has taught for years, was really good (thanks to him) and extremely relevant to both current events and the foundation of the profession. Yet few of the students seemed to want to be there, and their work (which I graded) reflected that.
I'm a business school professor. I can assign grades more or less freely*. The curve you outline is very close to what I aim for in a 200-level undergraduate class (though perhaps with closer to 10% D/F and more As and Cs).
But my impression is that, compared to the humanities, our grading is quite a bit more objective. Like... I give tests where there is a problem with a "right" answer, so if the student gets it wrong, even if I give partial credit, they don't really have any grounds on which to complain. They still complain, of course, but I can shut it down quite easily by saying basically: you're wrong, sorry. In many humanities assignments, grading is more subjective. This isn't a bad thing, but it is much more of a judgement by the professor, which then leads to endless grade grubbing, and worse, bad-faith accusations of bias. It is much, much easier to give most students an A than to have to endlessly defend yourself.
As far as I can tell, the assumption that students expect to negotiate any grade they don't like is a relatively recent change in norms. I went to a liberal arts college as an undergrad and I cannot *imagine* telling one of my humanities professors that actually, my poorly argued essay deserved an A because I put in so much effort, and they must not care about students because anything less than an A is going to ruin my life. It would have just been unfathomable and I assume I would have been summarily dismissed from their office. I don't know. Perhaps other students were doing this and it was just me being naive. But these days every C student has a form letter ready to go (I hope this email finds you well!) that is on auto send whenever the gradebook is populated with something less than an A.
*With the caveat that faculty who were giving more than 50% As got an email from the chair last semester saying basically: stop doing that.
I'm sure I'm too late to the game for anyone to read this, but there are clear reasons why it is difficult to adopt a policy that would lead to only 20% As. For reference, I work at an R1 school, in a highly ranked STEM department. The short answer is collective action - at all levels of the administrative chain.
One issue is that professors compete with each other for enrollment. If I taught an elective course with this grading policy, nobody would take it, so that is a complete non-starter.
Even when I teach mandatory courses, students almost always have a choice of instructor. If I adopted this policy, and the instructor teaching a parallel section of the same course did not, then no students would sign up for my section until the others were full. And the students forced to sign up for mine would be *very* disgruntled, and would leave terrible reviews.
It is true that once you get tenure, you can survive getting poor student evaluations, but it for sure hurts you.
I do try to nudge things in the direction John indicates, and work to get the university to disincentivize instructors from assigning high grades to get strong student evaluations. But the action space is much more limited than you may think, even for people who are very secure in their positions. You would need to convince the very top leadership (president and provost) that this is where they want to go, and then get them to forcefully enact policies to get there. That would be a very hard sell, for obvious reasons.
About a decade ago, I knew someone who taught at a small private college (which happened to be our alma mater). Most of the students failed her first test. She was told in no uncertain terms that wasn’t acceptable and she had to make most of the class pass. There was no suggestion the test was too hard; they just wanted everyone to pass so they kept paying tuition. Aside from maybe a few Ivy League schools, I think it’s that way most places.
"I have tenure, this is my grading rubric, those students didn't pass, go F yourself"
I'm actually asking a serious question: Why doesn't tenure provide the space for professors to act as they think best? Or does it only protect things like political speeches and controversial theories?
Tenure does not protect you from administrative abuse, specious and unfounded investigations, and retaliatory firing. It just means they have to waste time on generating a pretext for firing you or coerce you into quitting by substantially degrading your work and work environment.
I could be wrong, but part of what might be happening is that most actual grading is done largely by graduate students and/or non-tenure-track faculty, who have lots of incentives to produce satisfied customers. My first semester as a graduate teaching assistant at Harvard, I graded papers the way my papers were graded as an undergrad at MIT. I got eviscerated in course evals. Now, I don't give Bs anymore, and my course evals are glowing.
Even if a faculty member is willing to stick up for you and enforce the lower grades, they can't stop students from shredding you, and students are smart enough to make it sound like their complaints are more sophisticated than "my TF gave me bad grades."
I can definitely assign grades however I like (I’m psychology, not humanities though). I usually aim for a B- average but I’m definitely one of the “harsher” graders in my department. I’m willing to deal with students’ griping about grades and how much work my classes are and I know my chair/admin will have my back if any student or parent complains.
The main barrier I see in my younger colleagues is that giving poor grades feels (to them) like they are being “mean”. And today’s students definitely interpret grades as a judgement on their inherent self-worth. For example, I’ve had many students in my office say things like “I’m not sure why you don’t like me.” Or “you must think I’m terrible.” Just because they earned a C on a paper! It’s a problem.
There are three separate, if related, issues here: admin policies, grade inflation and rigor. I'll address all three:
1. Admin: This varies a lot, from my experience. I was blessed to teach at institutions where admin gave faculty 100% backup. I could assign whatever grade I wanted to any student and had the final say period. In cases of cheating, too, I found the relevant bureaucrats investigating complaints to have my back.
2. Grade inflation- The problem isn't bureaucrats, and even not so much evals (though those undoubtedly play a part). It's that I care about my students. It would not be fair for me to have a class with a median of 70% in an environment where the median is an A-. It would be punishing the students who chose my class by artificially hurting their GPA, potentially harming their employment prospects.
In other words, grade inflation is a collective action problem that must have a top-down solution.
3. Rigor - to an extent you can demand high standards even with grade inflation. This can be achieved by tricking the students a little: grading early exams toughly to weed out those unable or unwilling to do the work, and to scare the rest into doing it properly. Then, at the end, use some technique to get the median up. This is my usual M.O. I believe I end up having tougher courses than most in terms of the work required and the academic rigor, but in terms of grades I usually manage to get the median to be close to the norm (though sometimes students really don't do the work and the grades do end up much lower - I'm fine with that). That is the best balance I believe I can strike within my own individual power. Real change will require university-level reform.
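The "get the median up" step at the end is just arithmetic. As a minimal sketch (function name, numbers, and the simple additive-curve approach are all invented here for illustration, not anyone's actual policy):

```python
from statistics import median

def curve_to_target(scores, target_median=88.0, cap=100.0):
    """Shift raw scores so the class median lands near target_median.

    Uses a simple additive curve: every score moves up by the same
    amount, capped at the maximum possible score. Scores are never
    lowered (shift is clamped at zero).
    """
    shift = max(0.0, target_median - median(scores))
    return [min(cap, s + shift) for s in scores]

raw = [55, 62, 70, 71, 74, 80, 91]   # tough early-exam grading
curved = curve_to_target(raw)
print(median(curved))  # prints 88.0
```

An additive shift preserves the rank ordering produced by the tough grading, which is the point: the rigor (and the sorting) happens on the exams, while the final letter-grade distribution is nudged back toward the campus norm.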
P.S.
I've been lucky in the institutions I taught in, which are also among the most elite in the country. Most aren't so lucky and admins in many cases do NOT have faculty's back which of course makes everything orders of magnitude worse.
P.P.S.
A separate issue is that one of the most rigorous and beneficial exercises you could give is a research paper. I stopped giving those to undergrads except in very small seminars because AI made it impossible. Any class where I cannot orally examine each and every student at length is a class where all grades are based on pen-and-paper closed-book class-proctored exams (+ participation in discussion, where applicable and feasible). This absolutely harms the kind of education students get compared to what they would have gotten in 2019, although less so than had I pretended it's still 2019 or even 2022.
If your program loses enough students, faculty lines get cut, and tenure can't protect you from that. Some schools are cutting entire departments, philosophy often being one (perhaps because the kind of rigor philosophy expects has led to low enrollment).
I’m sure I would be allowed to. But if I wanted to assign enough writing to be able to tell apart the top 20% from the rest reliably (and give the rest more practice improving their writing), then I would have to spend 2/3 of my time reading student essays, and there’s no way you’re convincing me to do that.
I mean, it shouldn’t be curved like that. If everyone does excellent work everyone should get excellent grades. Doesn’t usually happen, but in my experience (which is just as a TA) we kind of assigned the grades where the distribution separated and we did give Ds and Fs.
In my institution, you could do it, and some professors definitely grade harder (though not anywhere near that hard). If you kept your enrollment numbers up, there would be no issue (as long as you announced the policy at the start of class, that is), or if you're a superstar researcher. If you gave Fs, that would be a lot of work, as those would all be bureaucratically monitored and you'd have to justify every single one, but Ds are fine.
One of the issues is that one of the most common high-paying career routes for humanities majors is law school and law schools do care more about your raw GPA than how hard your classes were because they are ranked based on incoming students’ average GPA. That creates a massive incentive for grade inflation and disincentive for students to take classes that grade harder. Ideally you would be able to make the classes harder but everyone still gets a good grade in the end.
This is becoming less true. Advisory Opinions just had a podcast with the Deans of Admissions for Harvard and Yale Law schools. They discuss this issue, as well as the expanding use of accommodations. It is worth a listen.
Yes, this seems like a massively underrated factor. Law school admissions is mostly an IQ test (yay! Merit!) plus a mind-numbingly dumb unweighted GPA component
This is basically my point above, I think. People who seriously studied for the LSAT probably found the logic games the easiest to consistently nail. For people who are taking the test without much studying, LG is pretty easy to brick, which some interpret as "hard."
It's even worse now. I graduated summa from my university a decade+ ago and was above the 75th percentile for GPA basically everywhere, and now my GPA is below the 25th percentile at a lot of places. I honestly don't understand how some schools have top quartiles with GPAs > 4.0.
I usually don't begin sentences with "As someone with an English PhD who's taught college" because it's insufferable to do so. But since that insufferable clause is true, I'll say I endorse every sentence of this post.
The problem with the admirable solution Matt suggests is one I discovered in my first semester as a grad-student teacher. I had gone to a college where my best English professors had very high standards for classroom discussion, assigned tons of reading, and didn't hesitate to assign low grades to low-quality work. I tried to apply the same standards once I started grad school and was the one doing the teaching and grading. The result was that I received terrible scores on my teaching evaluations and got hauled into the office of the grad-student teaching coordinator and lectured about my allegedly terrible teaching. I struggled to improve my teaching (item: it was fine) for another semester until I encountered a fellow grad student who'd been similarly lectured and who told me she just grade-inflated her way to higher teaching-eval scores.
My mentor at my next job told me that "Grade-inflate till tenure" is the standard advice to junior professors.
These jobs were both LESS pressured than most in that they weren't adjuncting gigs in which teachers are hired on a per-semester basis, which is the norm in the humanities at most institutions of higher ed.
These problems will only be solved if humanities departments at universities with graduate schools stop over-producing PhDs relative to the demand for full-time professors. Which is never going to happen, both because it would dent the egos of R1 professors not to have graduate students and because the politics of humanities professors render them far more likely to complain about "neoliberalism" and right-wing politicians than to read a chart or realize that supply and demand are real phenomena that exist and shape even the humanities job market.
That's not insufferable since it's relevant to your comment; what's insufferable is when people stick their academic credentials next to their username.
Tl;dr: I endorse Matt's diagnosis and prescription, but I believe the humanities is going to keep driving in its current direction until it goes right over the demographic cliff that's just a few years down the road.
The fundamental topic missing in this essay is Gen Ed. Humanities departments support their faculty (and PhD program) size by teaching general education courses, and making those courses, which are the main exposure most students have to the humanities, hard would simply send those students to other departments and collapse the size of the faculty and graduate student population.
The other thing that this misses is that for most students, STEM vs humanities is not the choice. Both of those are hard and for a minority of students. Instead most people are majoring in business, criminal justice, communications, education, nursing, etc.
Also, for a big gen ed class designed to get butts in seats, you don’t have enough teaching labor to grade the amount of writing it would take to make the class more rigorous. (I suppose you could make it artificially rigorous by asking students to memorize facts from readings, but that isn’t what anyone wants students to get out of the class.)
Yes, the entire premise of the piece is a bit off. The primary purpose of the humanities is not to develop people with a high degree of expertise in the humanities. It is to deliver elements of a liberal education to all students. In my day, the science departments offered "10-series" intro classes to non-majors (the textbook for Physics 10 was "Physics Without Math"). I suppose humanities departments could do the same thing, but I don't know how practical that would be.
Obviously I don't know where you teach, but at IU majoring in English or History is not as hard as in physics or CS but much harder than apparel marketing or kinesiology.
Another important reason is the structure of the curriculum. Most STEM disciplines have a strict prerequisite structure which means later courses can assume a lot. Most humanities disciplines don't organize things this way, for a variety of reasons, but that means there's a broader range of students in many classes.
First off, I agree with all points in the article. But as a science professor at a PhD-granting school, Matt is missing one big change that has happened since he (and I) went to college: the rise of undergraduate business degrees. Where once students would choose between history or economics, they are now increasingly opting into management or logistics degrees. They are not moving more into STEM (trust me). The customers (the parents!) are laser-focused on ROI for degrees at all but the most selective universities, and business colleges fit the bill nicely.
Left unsaid in this piece is the impact of AI on the humanities, which I have to figure is going to be significantly negative from the standpoint of learning marketable skills. That is, if in the humanities you learn how to read difficult texts, to write intelligently and persuasively, how is that differentiated from what the next GPT is going to be able to do, and is it therefore already devalued? However, if institutions took the obvious anti-AI-enabled-cheating step of implementing oral exams for the humanities, then I could see the marketable value from someone who could discuss difficult subjects on their feet.
I mean, that's more or less true of every major, though. Pretty much all STEM majors are teaching about what is valuable now versus what will be valuable later. Right now, it seems like a substantial proportion of what CS majors take is just straight-up useless, and there's not much keeping the same from being true in other fields once the technology improves.
I guess what you say is true of computer science, but this is not the case for fields where hands-on work is required, such as bioengineering or chemistry. Until we have humanoid robots to do AI’s bidding, people will be necessary to actually build and test devices, materials, chemicals, organisms and so forth.
My first college English professor was Dr. Ray. With an infamous nickname of “Death Ray”. He graded extremely hard (at first), but was always extremely clear in how and where we needed to improve.
Most important class I ever took - because it made me stretch my abilities. Learning I could even do that was foundational.
I had a similar experience (in the 20th century) with a freshman year course called Methods of Thinking. It was fundamentally about structuring an argument; I can’t recall the whole syllabus but we started with a lot of Plato’s (Socratic) dialogues. It was a lot of work but it was by far the best class I’ve ever taken (and I’ve taken *a lot* of classes). I use what I learned in that class pretty much every day.
As a squishy humanities-inflected social science guy who never shuts the hell up on this topic here, I agree 10,000%. But four caveats:
- the risks of being a first mover on tougher standards, either as an individual educator or institution, are immense while the possibility of payoff is remote;
- as long as there is an insistence that undergraduate education must be evaluated solely in terms of monetizable societal benefit, the above risk / payoff ratio becomes even more lopsided;
- everyone has to accept the possibility that it is their child who will decisively wash out of any such higher-standard program, dyslexia or other verbal/literacy difficulties notwithstanding;
- without succumbing to The Discourse on AI in education in either direction, I'm just going to say "embrace AI as a productivity-enhancing technology" and "significantly increase the rigor of undergraduate humanities education" are, at best, two goals that are hard to reconcile with each other.
I agree with you on the first mover problem. But in terms of “monetizable social benefit” my point is that the current paradigm has already collapsed that for humanities programs
Sure - which just means that the amount of kayfabe around desperately avoiding rigor is even more insurmountable, I think.
I'm that cursed specter, the humanities adjunct we love to skewer, and I agree with both of your takes 10,000%.
I'm encouraged not to be tough in describing the class on the first day, so that students will stay enrolled through the week for funding. The non-selective U is pushing AI for unspecified purposes, the students who use AI are mostly international, somebody has a documented accommodation regarding slides/lecture notes, a device, visual examples, etc. Somebody is going to vanish (the biggest issue in CC, where I was FT), so attendance and participation have points attached. I'm hardly an example of the financial fruits of humanities study myself. I do plan to treat Canvas as a file cabinet and push the class further toward pen and paper notes and annotations, cold-calling Socratic seminar, and low-tech writing workshop, but each of those is a heavy lift in a course that shares a busy syllabus with other sections taught by other faculty.
I mean, as far as I'm concerned as an adjunct you're a hero. I know that doesn't really help, but for what it's worth! I've never had adjunct faculty in a department I've been in, but if and when I do I resolve to treat them like a human being, which seems to be difficult for some full-time faculty.
Good luck with all that - I really hope you can carve out some rigor and dignity amidst everything.
Thank you so much! I've taught in other, more stable contexts, but we made family choices (GTFOing of the Deep South with two young/pre-teen girls) related to an opportunity for my husband. It looked like I was on track for a NTT spot, but the boulders of state and NSF cuts (many of my students have practical majors related to research, local industry, or allied health) and AI have fallen on that one. Here's to a good semester despite it all.
The fallout of the cuts goes a long way. I try to be optimistic about our ability to get out of this, but most days I am not successful.
The first mover problem is easy to solve. Just have the most prestigious university break the mold. Looking at you, Harvard and Yale. They make humanities degrees harder and therefore more valuable to get and the rest will follow in their wake.
Princeton literally tried exactly this for ten years during the period that it was consistently ranked #1 by US News and World Report and abandoned it as an unsuccessful experiment in 2014.
https://www.thecrimson.com/article/2014/10/9/princeton-grade-deflation-reversal/
https://statmodeling.stat.columbia.edu/2014/11/23/princeton-abandons-grade-deflation-plan/
That's unfortunate. I would note (as one person quoted pointed out) that there's a difference between shifting the curve on grades and making a point of being more rigorous. You can do the first without the second, though the second is much harder to pull off.
I was a TA at Princeton back in the late 1970s and indeed we had a lot more freedom to inform students via their grades and they just weren't cutting it.
I expect that the downside risk of poor grades was more marginal relative to the signalling value of the degree-granting institution in the 1970s, although that’s mostly a guess. The harder post-undergraduate institutions scrutinize grades as a legible proxy for desirable outcomes (law / med school admissions, jobs, whatever) the more you run into incentive distortion.
They already have a template for it too, as the University of Chicago is known for exactly this kind of rigor (or at least, it has been historically).
It is a bad equilibrium. Faculty should be able to force students to take pop quizzes and do oral exams/presentations of their written materials. That is the only way to get around AI slop.
Totally agree - but "test anxiety" discourse is strong, and I, at least, have to offer alternatives upon alternatives as an accessibility measure.
I don't want to sound insensitive, but I can't emphasize enough how, from an educator's perspective, accessibility requirements have made every direction I turn in trying to come up with AI-avoiding assessment into a dead end. To say nothing of the "just let them use AI, who cares about writing anyway" discourse.
Accommodating anxiety only makes anxiety worse.
Got test anxiety? Then take more fucking tests until you fret less.
Anxious about public speaking, put them on stage time after time until they learn to fake it.
Anxious over deadlines, yeet their phone into the garbage.
Anxiety about public speaking is one of those things that is an entirely coddling-generated problem. I don't mean public speaking in front of a crowd of a thousand people -- I love that kind of shit, but I'm a weirdo, and 'stage fright' is a totally normal problem.
Presenting to a group of like 25 people is not stage fright, that's a severe and debilitating anxiety that will haunt you for the rest of your life and it's completely insane we let kids walk out of college having given maybe three or four public presentations, two of which were probably with a group that gave most of the speaking to the most comfortable group member. And it really can be overcome, I have seen it happen with debate kids when I was a coach. Really shy and awkward people come in as freshmen, and leave as non-shy (but probably still awkward...) people two or three years later.
Hell, we get students shitting their pants when we have them present to just two lecturers as part of a mock interview assignment we do. It's bad. But also amusing, when it's the bro-y lads who were disruptive and useless and then suddenly cat's got their tongue when they have to stand and deliver.
"...we get students shitting their pants when...."
It's a pity that was not an option back when I was in school, because I could have *excelled* at that. I could have *cleaned up* (I would have had to, afterwards). I mean, when it comes to shitting my pants I take a back seat to no one.
My CC required a Speech class.
5 individual presentations
Hell, my HIGH SCHOOL required all students to take a one-semester public speaking class, and we gave four presentations over the course of it. Mine were on how to fold a paper airplane, Lake Superior, Jefferson Davis, and why a 5 cent/gallon county gas tax was a good idea.
It's funny, because I know a professor who had what would now be called very bad public speaking anxiety, as in threw up before and after every class she taught for a semester or two... and then she got over it. Never liked it, but did it well and professionally for the next 30 years. Law school and cold calling did something similar for me.
"Yeet" is an inherently funny word.
It’s funny that it is a relatively new word for tossing something.
“Look up” dictionaries lag behind. The meaning given is “exclamation, a phoneme used to express excitement”.
This is what is so frustrating to me about some of the discussion around this topic. I have a family member with both OCD and anxiety, the latter likely stemming from the OCD. The successful treatment is to face it head-on, preferably under the guidance of a therapist who knows what they are doing. Alas, many therapists have no idea how to treat OCD and PTSD, and many of the strategies they use make the problems worse, but that is a column for another day.
I agree, which is why I think the ADA and its expanded uses are a big part of the problem here.
And I don't know how to fix that. I believe students who need wheelchairs and properly digitized texts on their text readers should have those things. But I don't know how we provide those things without also allowing the rampant overuse of "anxiety" diagnoses and time-and-a-half accommodations.
Like anything it turns out you can teach people how to deal with anxiety. I feel that it is a failure in our educational systems that they don't.
It used to be that people "organically" learned to deal with anxiety for the most part except for a "tail" of some people. So why is it that people aren't learning to deal with anxiety?
There's so much I think schools should teach but don't:
* How to self-regulate and manage your anxiety
* How to fall (roll on your shoulders)
* How to diagnose your baby's ear infection (do it before your doctor's office closes)
* How to manage your finances
There is so much to learn!
Yeah, accessibility requirements are also a massive black hole of time and effort for instructors. Instead of being able to spend our scarce time developing and improving activities, assessments, materials, etc, half of it gets sucked into accessibility requirements which most of us managed just fine without 15 years ago. It also vastly adds to the administrative bloat of the university as webmasters have to spend half their time on website accessibility remediation. All of this just to avoid lawsuits/comply with settlements under ADA.
EDIT: I appreciate Nathan's point, which is that we shouldn't throw the baby out with the bathwater. Everyone who has replied to him seems to have got my intended message, which is that there's a balance to be struck. The pendulum has swung far beyond giving Deaf, blind, and physically disabled students what they need to learn. It's part of a broader phenomenon of court-mandated accommodations that far exceed the scope and the population the ADA was originally designed for, and that, beyond the higher ed context, are devouring K-12 school district finances and making it impossible for K-12 teachers to do their jobs.
Real bummer that those Deaf kids can get an education at your school, eh?
*You* may have managed just fine without them, but only by excluding a lot of people.
Remember, there are never actual tradeoffs between things we want, so anytime someone opposes something you think is good, the best response is snark or moral disapproval.
I think the above comment is referring to two different things:
1. Learning disability accommodations (extended time on tests, etc.), which have exploded and are now abused at scale as an aid/security blanket by large fractions of the population
2. The consequences of the ADA rulings about captioning, etc., which, well beyond captioning, in practice have generated lots of confusing bureaucratic constraints about accessibility notices that feel more like GDPR cookie banners than real accessibility for the deaf.
I think the "we did fine before" comment was about (1) not (2), let alone actual accommodations for the deaf.
Agree. We are currently in the process of making our website meet the new federal accessibility requirements for institutions that receive public funds. The new requirements go into effect next year. We are not legally obligated to do this, but many of the institutions we serve are. There is much debate among the people these fixes are intended to serve as to the best way to do this. For instance, alt-text. Many sight-impaired people have told us, "Enough with the excessive alt-text!!" At this point, we are focused on meeting the requirements and will worry about the rest later.
We do want to serve all people in the best way possible, but I do sometimes wonder about the tradeoff between resources used vs how people are served. Another example relates to digitized historical documents. It takes a while to create extensive metadata (obviously, this serves all kinds of people by improving searches). Some researchers have told us that they would rather have more material online with less description than fewer items well-described. The idea is that researchers can always contact the holding institution to get more information if items seem potentially useful. And blind researchers have told us that if items are up with little description, they can find a sighted person to describe them, whereas if items are not put online at all due to lack of description, there is no way they will find them.
A great piece from last year about this: https://www.chronicle.com/article/do-colleges-provide-too-many-disability-accommodations
This is a massive strawman - no one is saying that genuine accommodation needs shouldn't be met.
More to the point, you can almost always tell the students who really need accommodations from those who don't. The former are on top of things, try not to be a burden, and compensate in other ways; we are always happy to work with them. The latter are usually demanding, asking for exceptions even beyond their accommodations, and sometimes fail anyway.
Andy Hickner's comment specifically says that they should not be required to be met. Based on what happened prior to the ADA (and even after that), that means that those needs will not in fact be met. Hell, they're barely met today when they are required.
You're reading him unreasonably charitably, and inserting a Shirley exception.
Part of the reality is that we're letting in disabled kids who shouldn't go to college. I've had three students in the past four years who very evidently had the sort of autism that lowers one's intelligence. These kids weren't math whiz savants either, quite the opposite. Not talking about a no name school here.
I teach and have zero problems with accommodations. My problem is that the disability resource center makes the profs provide OCR copies of everything assigned to read, which is incredibly time consuming. So for me, it's about inadequate resources for accommodation.
My problem is not so much that the accommodations come with a lot of admin. It’s that I can barely teach anything - like, assigning reading and expecting them to do it is Kafkaesque.
Can I ask what the time consuming part of OCRing is? I'm a data scientist and my (likely naive) impression is that OCR is essentially a solved problem with off-the-shelf open source models now. I OCRed 5 million pages of permitting documents for a couple hundred dollars a few years ago; I'd bet it can be done for 1/10th of that cost today. Is there a software usability barrier?
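For what it's worth, the recognition step really is mostly solved; the grind tends to be the plumbing around it. A minimal sketch of a batch workflow, assuming the `pdftotext` (poppler-utils) and `ocrmypdf` command-line tools are installed; the 25-character threshold and the `readings/` folder are just illustrative:

```python
import subprocess
from pathlib import Path

def has_text_layer(extracted: str, min_chars: int = 25) -> bool:
    """Heuristic: a purely scanned page yields little or no extractable text."""
    return len(extracted.strip()) >= min_chars

def ocr_if_needed(pdf: Path) -> bool:
    """Return True if the file was sent to OCR, False if it already had a text layer."""
    # Dump whatever text layer already exists in the first three pages.
    probe = subprocess.run(
        ["pdftotext", "-l", "3", str(pdf), "-"],
        capture_output=True, text=True,
    )
    if has_text_layer(probe.stdout):
        return False  # already searchable; nothing to do
    # ocrmypdf writes a new copy with a searchable text layer added.
    subprocess.run(
        ["ocrmypdf", str(pdf), str(pdf.with_suffix(".ocr.pdf"))],
        check=True,
    )
    return True

# Usage (hypothetical folder):
#   for pdf in sorted(Path("readings").glob("*.pdf")):
#       ocr_if_needed(pdf)
```

So my guess is the time sink isn't the OCR model at all: it's collecting the files, figuring out which ones actually need OCR, and getting the output back into the course shell, none of which the off-the-shelf tools do for you.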
I can also confirm this from personal experience
This is 100% true. And under discussed.
Do you actually have to make accommodations that preclude use of quizzes? Absolutely wild if so.
I just... don't get it? The whole point of a quiz is to assess knowledge. If a student knows information, they can put it down on a sheet of paper and pass. How is it even possible to assess knowledge if quizzes are not allowed?
Explaining it in detail would be tedious, but basically I can’t give anything that’s a one-shot, time-limited assessment.
But yeah - that’s an excellent question! And one that I pose quite often, though it falls on deaf ears. The answer is that we’re often *not* assessing knowledge, not really.
In person exams being the only way to test for merit and knowledge, is something the Tang dynasty discovered in the 600s. I don't think the dynamic has changed much.
Yeah- I learned about this from the Judge Dee books!
It seems to me that the solution is to do more online lectures, and reserve class time for both proctored test-taking (not necessarily "pop" quizzes) and real-time essay writing. The reverse of how we've always done things in the past.
Online lectures seems like a recipe for students not paying attention.
Practicing paying attention to an online meeting/lecture/briefing is a genuine white collar job skill these days...
100% agreed. if you can't pay attention to a lecture, you can't pay attention to an hour long business meeting. if you attend a large meeting as a junior employee you probably have very little relevance to the meeting itself and not a ton of reason to even be there, but your seniors drag you along because they want you to pay attention and understand the business from a wider perspective. the people who successfully do this are the ones who are able to contribute meaningfully and impress the people they need to impress.
I'll be using this as an example in class --thanks!
Apple would like a word with you. Please.
https://www.youtube.com/watch?v=BK8bnkcT0Ng
Then they won't do well on the test.
There necessarily has to be an in-class component and an out-of-class component because you can't do everything in 3-4 hours per week. The question is simply which activities are better to do when.
Also, as I recall, holding lectures in person is no guarantee that students will pay attention.
Fuck, it's online goddamn corporate group meetings (with vendors, say) as well. The junior EY consultants on a grid project in my zone fucking zone out and phone it in with AI summaries of the call's workflows - summaries they don't actually proof well and didn't grok because they zoned out - and we have to beat on their heads (figuratively).
Isn't just school.
I agree - real-time essays, or even essays written across multiple class periods, would be good. There are internet blockers you can use to force students to use only the sources you want them to use.
I’m thinking it has to be more like science classes where the lab component requires four hours in the room for 1 unit of credit - we need four hours in the writing lab for 1 unit of credit, but now they don’t need to spend any time actually doing homework at home.
The biggest difficulty with this is that it takes more minutes per student of faculty time to do this. Even in a class of 30 this will be very difficult.
Universities should have test labs with proctors and blue books and rotate writing class exams through them. It shouldn't fall on faculty, there's an easy economy of scale there
I think it actually makes sense for those rooms to have computers too, as long as the computer just has a simple word processing program, and perhaps the instructor can set some folder of documents that students have available when it’s their turn at the computer. But yes, the economy of scale is a big thing - and it also allows students to schedule at different times if they have need of extra time, or need to schedule around work or whatever.
This is basically what Thomson Prometric testing centers are. They administer exams in large, quiet, proctored rooms. The exams themselves all seem to be computer-based, which I assume drastically simplifies things. The computer automatically tracks time, closes the test at end time, and allows the taker access to materials allowed on the exam.
I took the patent bar at one. Many of the other people there with me were taking some kind of accounting exam. It really worked smoothly.
Like you point out, one advantage would be flexible scheduling. A computer can automatically assemble a test made of questions selected from a large question bank.
Fwiw this is how law students have taken exams for a long time. Norms shifted a bit during Covid from my understanding but the traditional way for law classes to grade is one in-person final exam, worth 100% of the grade, taken on a locked-down computer. You can even use your own laptop because vendors make software for this purpose.
Universities could have test labs with computers not connected to the internet. You don't even need blue books. Just someone monitoring the students.
Exactly. We just need a lot more of them, since all the time that students used to spend doing writing assignments at home will need to be in these spaces.
A closely related issue was pointed out years ago in the comments of DeanDad's blog.
In math, and the math-heavy fields, you will routinely be demonstrably wrong: it's absolutely part of the everyday experience to be wrong, correct the error, and go on.
This is much rarer in the humanities other than foreign language, in practice. "Could be better" is routine, but "flatly wrong" is not an everyday thing.
"This is much rarer in the humanities other than… 'flatly wrong' is not an everyday thing…"
But this is an area where Philosophy might have an advantage, because in that discipline “flatly wrong” is an everyday thing. And that’s just for the teachers, much less the students.
This is very true. I deplore that a lot of humanities and social science education became about having "correct" critical views on the world, but at the very least it was always trying to emphasize that interpretation and argumentation are core to intellectual development. Clearly this often swerved into some bad places, but it was always there, somehow.
The problem is that teaching cognitive processes is, necessarily, hard and full of trial and error and subtle course corrections in thinking. It's slow, frustrating and hard to scale up. It's by nature inefficient and fragile, especially in the face of AI.
And I think the further corollary of this is that you need to grade a lot more work to give the same amount of feedback on this, and that just isn’t sustainable for faculty.
Not at 4 year research universities, but it’s routine at community and liberal arts colleges where the caliber of teaching is better
I used to tell my English students, "There ARE wrong answers in English class. The study of English is not just saying whatever you want and chalking it up to 'symbolism.'" And then, of course, I'd teach them how to analyze texts well.
But I have to admit that I think the discipline as a whole needs more hard-assery in its scholarship and teaching. Because we do produce work that includes far too much trendy BS.
One can make the exact opposite argument, though. For most science questions you are either right or wrong. This simple binary is far less advanced than the Bayesian need to balance on the head of a gray-scale pin. I think, however, that forcing students to be never willing to settle on a yes/no answer, in humanities, probably requires more teacher effort and possibly a better teacher, overall.
I think top undergraduate programs (Ivies, etc) need to lead the way on this. I received a wonderful humanities education at a middle of the road private liberal arts college a few years ago. I no doubt benefitted from aggressive grade inflation that helped me get into elite law schools. But in my summer internship and full time job applications before law school I found that my college meant nothing to most hiring managers (good or bad), but I at least had a high gpa.
The elite undergraduate programs are the only ones that can initially raise the standards without throwing their students under the employment bus post-grad. A 3.0 Princeton English major will still have an easier time finding work than a 4.0 XYZ College graduate. And, assuming that the elite undergrads are the ones more likely to fill the ranks of academia all through the prestige scale, they can then filter this practice through academia over ensuing decades.
The Princeton example is interesting because they did try to aggressively combat grade inflation and I believe they rolled it back after pushback.
I went to Reed College, a known holdout on grade-inflation for 40+ years. For a liberal arts college it is heavy on both STEM (only nuclear reactor run primarily by undergrads…) and the humanities. It’s a tough school and back in my day, it had an insanely high dropout rate given the quality of students.
The professors get away with avoiding grade inflation because you don't get grades on tests or papers - only points and detailed critique. (They record your grades at the end of the semester, and you have to go to the registrar and ask for them!)
I think I would have done better professionally had I gone to a more "normal" school. It does not have the name recognition of a Harvard or even, say, Purdue or Penn State, and unless you go directly to grad school, the cost-benefit of the college is terrible. With PhDs being overproduced and professorships outsourced to adjuncts, the deal is worse than ever.
They send a letter with students' transcripts, and I was able to get into a competitive graduate program. I do wonder about elite professional programs, however. I can't imagine med schools being happy about a 3.5 GPA, even though only a handful of students have ever graduated with a 4.0.
Me, too (went to Reed)!
A clarification about grades: you don't see them if you're at a C or above, so you definitely are informed if you're at a D or failing. I didn't see my grades until I graduated and asked for a transcript. It wasn't a surprise that they were A's and B's, but it was a surprise which was which in a particular class. (I got B's in some classes where I thought I did better, and A's in some where I expected a B.)
Another point about Reed, you are required to write and present a thesis. This includes a defense before a group of professors.
YMMV (your mileage may vary), but I have gotten recognition for having a Reed degree, and I think graduate programs kind of know that given that ¾ of Reedies go on to a graduate program (not me).
In conclusion, don't short yourself, a Reed education taught and encouraged me how to learn, which is vastly more important than any specific knowledge (although I got that too).
The same thing happened at my alma mater. The grade deflation policy lasted about 15 years, and the school rescinded it in 2019 after years of protest that grads were getting hurt in graduate school admissions due to the policy. I have no idea if the latter is true. I know the school noted the policy on transcripts, etc.
I don't think grade inflation helped you. I do graduate admissions, and the signal from grades is so weak that I hardly consider them. Oddly enough, as a historian, the thing that gets my attention is not straight 'A's but the transcript with some 'A-'s in STEM subjects, as a signal of intelligence.
This isn't true for a lot of grad programs like law though. A 4.0 in basket weaving is always better than a 3.6 in a triple major in hard STEM.
I recently listened to an interesting interview with the admissions deans at HLS and YLS, who indicated that GPAs have gotten so inflated across the board that they now use course selection and recs more than the raw number. Of course, what works for HYS likely does not work for more aggressively number-maxxing schools like Wash U.
That's because of US News and World Report rankings.
Yeah, but that's literally all that matters. Ask any of the lawyers here whether job placements correlate with rankings.
Princeton is known for having harder grades and the top law schools have fewer Princeton grads than Harvard or Yale grads as a result. You can’t fix this problem without changing law school admissions and other major post-grad routes that emphasize raw GPA. And that could be hard if you have lots of hiring managers who don’t care about the name of the school but only look at raw GPA.
I guess an alternative would be external assessment where academics from other universities assess the quality of work submitted for grades and work to ensure that an A or an A- means the same thing across different universities and subjects.
In the UK, just about every paper submitted will be assessed by at least one external examiner as well as by the professor who submitted it - and most senior lecturers and readers (the mid-ranking academics - associate and assistant professors in US speak) will end up assessing at least as much work from other universities as their own, often more.
Much as I like to imagine the external examiner from BU going into Harvard and changing A-s to C+s, I doubt they'd have the prestige, authority, or backing to actually do it; more likely, the generous marking standards of the Ivies will get pushed elsewhere.
What would be fun would be the accreditors saying that a university that gives its highest GPA to more than 5% of the graduating class must introduce a new, higher GPA, so you'd be able to get a 4.1 or a 4.2 or whatever. Even if students really are doing better-quality work than in the past and therefore more of them deserve a 4.0 (which is one possible cause of grade inflation and is what most of the universities claim to be the case), the purpose of the grade is to distinguish between the students. Giving 40% of them a 4.0 doesn't achieve that, and they'd need to go up to 5.0 or 5.1 to make enough useful distinctions.
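For what it's worth, that cap-raising rule is easy to sketch as a toy calculation. Everything here (the function name, the 5% threshold, the 0.1-point step) is just an illustrative assumption, not anyone's actual policy:

```python
# Toy sketch of the hypothetical accreditor rule described above:
# if more than 5% of a graduating class earns the current maximum GPA,
# the scale's ceiling rises by 0.1 so top students stay distinguishable.

def next_gpa_ceiling(gpas, ceiling=4.0, cutoff=0.05):
    """Return the GPA ceiling after applying the 5% rule once."""
    share_at_max = sum(1 for g in gpas if g >= ceiling) / len(gpas)
    if share_at_max > cutoff:
        return round(ceiling + 0.1, 1)
    return ceiling

# 40% of this class sits at the 4.0 cap, so the ceiling moves to 4.1.
class_gpas = [4.0] * 40 + [3.7] * 60
print(next_gpa_ceiling(class_gpas))  # 4.1
```

Applied repeatedly, a class where 40% hit the cap would keep pushing the ceiling upward, which is exactly the "go up to 5.0 or 5.1" dynamic the comment describes.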
At that point just put more emphasis on standardized tests…
It’s not really fair to rank students compared to others at the same school because schools vary so much in selectivity. Like it is just objectively harder to be in the top 5% at Princeton than a non-selective community college. So on that basis Ivy League grade inflation is understandable and they could reasonably argue that getting an A there isn’t any harder than at community college even if a higher percent of their students are getting As. It’s a hard problem.
That’s why I think Princeton should be handing out Ss (like a videogame meta tier list) and giving those a 5 as a grade point to average when calculating a GPA.
This feels like the logical pinnacle of grade inflation. You inflate grades so much you literally need to invent new letter grades
S tier is a wonderful concept that should indeed be used for much more.
"In the UK, just about every paper submitted will be assessed by at least one external examiner as well as by the professor who submitted it - and most senior lecturers and readers (the mid-ranking academics - associate and assistant professors in US speak) will end up assessing at least as much work from other universities as their own, often more"
I wouldn't say "just about every," or even many - external examiners tend to look at module-level statistics for outliers and then might look at specific work if there's an issue or to see what feedback generally looks like. Moderation is usually internal (and that's also usually just a subset of assessments, like 10%).
But yes, it's interesting that peer review of day-to-day operations is institutionalized in the UK and it isn't in the US.
The law school arms race is nuts these days. Hope you were smart enough to determine whether your undergrad awards A+ grades before attending, otherwise good luck!
This is true of the service academies as well I think? They kind of have a job mapped out though…
I graduated from Princeton in 1982. A few years back I had to get my transcript as part of the paperwork for a fellowship at Chapman University in Orange County. (I already had the fellowship. This was just routine stuff.) I was shocked at how many Bs there were. Nowadays it would be considered lousy. Back then it was top 10% of the class.
But why would they do this? How does it materially benefit the graduates of those schools to have stricter grading in the humanities? It's not as if a Princeton undergrad is suffering at the moment because employers aren't convinced they're smart enough due to grade inflation.
Also, I don't see how elite programs leading the way is actually a key to solving the problem. Not only are Princeton humanities students not suffering on the job market right now, but they are also probably still getting good educations and learning a lot, grade inflation or no grade inflation. I would say the programs that need to lead the way are big flagship state universities, which are the types of schools that have small rigorous philosophy departments that stand out from other humanities departments. (Some "directional state" type schools don't even have philosophy departments at all.)
Just to reflect on my own experience as an undergrad at Rutgers, a state school with a very rigorous philosophy department: I was sort of stunned at how much more strictly the Philosophy 101 course I took there was graded than any other humanities class I took. Given that I wanted to apply to grad school in history (which I did, and I'm now finishing a PhD at Columbia) and knew that GPA was important for that, I simply didn't take more philosophy classes. I got something like an A- in that class, so it wasn't as if I did all that poorly, but it took significantly more work to get that than it took to get A's in most of my other classes.
Don’t fall for the trap of perma-adjuncting. Econ history is still a place where tenure exists in Europe. Be prepared to exit academia if you aren’t given stable employment options, because you are more than how unis treat new PhDs.
Already planning my way out of academia! Have a number of friends that have shifted into consulting/other business paths, so I'll likely end up following in their footsteps.
Taking undergrad classes at Rutgers in philosophy must be like accidentally stumbling into a boss room in a video game. I'm not sure a lot of people would expect them to have one of the best philosophy departments in the world.
That's an interesting example, given that Rutgers has a merely decent overall reputation (apparently it's #41 in US News for national universities, which is honestly higher than I would have thought), but has one of the strongest philosophy programs in the world.
Would an admissions person in a non-philosophy humanities PhD program know the second part and take it into account if you'd taken more philosophy? I would kind of hope that they would, but maybe that's not a given.
The admissions people are generally professors in the department that houses the PhD program you're applying to, so while they may be aware of the reputation of the philosophy program at Rutgers, it's not really a safe bet. If you were planning to apply to Philosophy PhD programs on the other hand, then sure. I'd also say that my experience of undergraduate history courses here at Columbia doesn't give me the impression that they're graded much if any more strictly than the ones I took at Rutgers—my sense is the difference in rigor here has less to do with the quality of the philosophy program at Rutgers and more to do with differing disciplinary standards in general.
Definitely, I had UVA, Berkeley, etc in mind as well.
I think they would have to just do it for the love of the game, basically. There is certainly no material incentive to do it.
But my understanding is that at the turn of the century Princeton was a pretty unserious place for academics. It was mostly a finishing school for the elite and not a place of deep intellectual rigor. And then they hired Woodrow Wilson as president and he turned it into a place that students came to learn.
Maybe it is naive, maybe my understanding of that history is incorrect. But hopefully for the good of the nation and humanity people will bring about a new age of academic integrity and rigor in the academy.
"for the love of the game"
It's been a while since I was in school so my first-hand knowledge is a bit dated, but it seems to me that a lot of humanities academics no like anger have a clear sense of what game they're playing or why.
To me, other than the commercially valuable spinoff analytical and communication skills, it's about gaining a better understanding of humanity: our good points and bad points, strengths and foibles, why the same tragic and stupid mistakes keep happening over and over, at the collective level more so than at the individual level. Like how a therapist can help an individual gain more insight into their own behavior and why they keep falling into the same patterns, that's what studying the humanities can be for humanity as a whole. But a lot of academics seem to approach their role more as scolding cleric than as understanding and patient therapist.
"...academics no like anger have...."
typo for "no longer have"?
If so, awesome autocorrect.
Yes, correct. And thanks for the humanistic demonstration of looking through the surface chaos to see meaning....
It took me a moment to understand which century was turning in your comment. Woodrow Wilson: got it now.
I guess I just think that such a thing will only happen if it doesn't actively cut against material incentives, especially in more than one or two institutions.
I think you should look beyond "material incentives" for explanations. Though they are real and important, it seems there are other things going on here. I would note that some of the material results -- lower number of students, less revenue coming in, significant backlash and reductions from public funding sources -- are pointing in a different direction.
You could be right, hopefully not though. I don’t know how else we’ll turn back the tide of decades of grade inflation.
the first mover problem is definitely real for professors at mid-level state schools and so on, but i really feel that institutions dramatically underrate the appeal of a school being hard. it's absolutely true that once a class begins, the students in the class would rather get good grades than bad grades. but a lot of smart kids want to go to places that have a reputation for being difficult! when i was an undergrad, i had the option of taking normal calculus, which was a massive auditorium and all the grades were based on computer homework and exams in a computer lab. there was also Honors calculus, which was a small section of 20 students being taught by a professor who had a 1/5 on RateMyProfessor with like a hundred reviews all talking about how he was mean and made everything too difficult and assigned too much homework and so on. I took the honors section, and 19 other smart kids took the honors section, and we actually learned to understand calculus instead of learning how to algorithmically solve calculus homework problems. and at the end of the day, it barely made a difference on our transcript (just a little H tag next to the class), but we all enjoyed the sense of intellectual superiority the class afforded us.
anyway, all this is to say that i think a school that actually advertised itself as doing a Return to Rigor where you're going to take hard classes that dumb and lazy people can't complete would actually make the program *more* attractive to the kinds of kids you want taking your classes anyway.
this does not do a lot to resolve the actual 'departments are funded by the tuition of kids that shouldn't be there' problem though, so i expect it's best reserved for programs that are already relatively well-funded by things other than tuition.
"anyway, all this is to say that i think a school that actually advertised itself as doing a Return to Rigor where you're going to take hard classes that dumb and lazy people can't complete would actually make the program *more* attractive to the kinds of kids you want taking your classes anyway."
Maybe - but my point is that this is one hell of a roll of the dice, even with solid financial support independent of tuition.
"a lot of smart kids want to go to places that have a reputation for being difficult"
This certainly was no small part of the appeal of MIT back in my day, even if they no longer had the freshman orientation line: "Look at the person on your left. Look at the person on your right. One in three of you will not graduate from the Institute."
This is part of why I went to Georgia Tech, though given that I got in-state tuition, there was also a huge financial incentive. Also, MIT said no :(
Since all these trios of new students looking right and left overlap with each other (you're in three of them), doesn't this mean that basically no one will graduate?
Granted, I was not a STEM major so I'm probably screwing up the analysis here.
The line they had when I was doing college applications was "If you're the smartest person in the room, you're in the wrong room." I always liked that philosophy.
Sounds good, until one person looks around and then gets up and leaves. A moment passes and another person looks around, gets up and leaves. And then . . . oh well, I think we see where this is heading.
The tradition in Oxbridge and the old-fashioned Ivies was that you could/can teach hard stuff while letting people who don’t want to work that hard skate through with a “gentleman’s C”, or “gentleman’s pass” in England. So even easier than making people fail is to simply make grades real again, which solves at least part of the problem; it makes GPA filtering for law school work a bit better, at least.
The funny thing is that, at least where I am, we don’t have a problem with grade inflation at the top; 1st class (A-equivalent) degrees are genuinely rare. We actually give out plenty of gentleman’s / gentlelady’s 2:2s (Cs, basically). It’s just that no small part of them *really* should have flunked out.
Beyond the first mover problem, this is a competitive strategy. Yale and Princeton can make their humanities classes hard and know they’ll attract the best students, but if Ithaca College makes a history degree harder then rapidly rising in the rankings is their only hope to fill seats. If every college does so, a lot of history departments will simply shutter. Many of these programs only exist because they could launder the implied intelligence and industriousness of a college degree.
Right - this is essentially what I mean by first mover problem, but this is a good illustration.
Though I wouldn't necessarily say that if the Ivies decide to take the gloves off with respect to rigor, they'll instantly raise the standard of their students. Maybe, but I think expectations have been locked in to the point where it would be a rough ride even for them.
I thought your “first mover problem” was closer to “if a random school raises standards, they might lose a lot of students in the short term and not reap the reputational reward for years”, whereas this is making a quite different point: only a few colleges can improve their reputations this way at all, because you run out of smart and driven students. Regardless, both of these are strong headwinds facing this strategy.
I'm not sure I agree on the last point. For two reasons, one relating to the humanities student's role as an AI user and one relating to their role as an employee of AI users:
1) it could be the case that a more rigorous humanities education gives the student skills that make them a better user of AI. AIs are, after all, as I understand it, trained on important works of literature, among other things. Possibly the well-trained student could phrase questions to the AI more precisely or more evocatively and therefore be a better "manager" of their AI "workers."
2) In a world where many workers are using AI to enhance productivity, those workers will still be employing other humans to do things for them. Those things just may be different than what humans employ other humans to do today, because some of what people are hired to do today will instead be done by AI. But it could be that rigorous study of the humanities will prepare students to be valuable employees to other AI-using humans in this AI-rich world.
I don't think this is wrong, necessarily, but you're just restating my assertion, which is that you have to come up with a way to convince students that rigorous study of the humanities *without* AI assistance in the first instance is *foundational* to understanding AI later down the road.
I have no idea how to do it at other schools now, but my alma mater specifically really resisted grade inflation. At least as of 15 years ago.
Their reputation (and a pamphlet that went out with every transcript) allowed graduates to get into grad programs with lower than average GPAs for entrance. I kept an outside scholarship even though I fell below the required GPA to maintain it.
The humanities classes were still considered less challenging than the hard sciences, but they weren’t easy.
Good for them. Care to give us the name of this admirable institution? It must have been very hard to resist the national tsunami of grade inflation.
It was Reed College. To be fair, a weird place in a number of ways.
Ah. Yes, my daughter and I visited there for her college tour and she decided not to apply because, indeed, it was a weird place.
I was impressed by all the bound copies of senior theses they featured prominently in the room where parents and prospective students gathered.
I hope she enjoyed where she ended up! It's an interesting prospect, figuring out where you want to spend the first few years of adulthood.
University of Puget Sound in Tacoma. Which turned out to be a treasure and absolutely perfect for her. Highly recommend!
RE: "First Mover", I disagree. Maybe it's a problem to be the 4th or 5th mover, but being the first one to stand up and loudly declare, "OUR Humanities majors are smarter than the ones everywhere else because our classes are way harder" gets you a lot of pub and a lot of smart humanities majors with chips on their shoulders.
But they won't be smarter, unless the school is an Ivy. Schools already filter so aggressively on general intelligence that a middle-ranked school can't possibly hope to produce better thinkers than a top school reliably, no matter what curriculum changes they make.
I mean…maybe? But it’s one hell of a gamble.
Professor in a humanities discipline at a research university here. I love the idea in principle, but here is the practical problem.
My Dean has made it clear that what matters to him when making decisions (especially decisions about whether to replace professors when they leave or retire, what kinds of financial support for department activities etc.) is “metrics”. And he has said openly that the single metric that matters most in this context is class sizes - how many students the department teaches.
So if we make our courses so hard that our enrollments collapse by half, we will lose massively. So our strong incentive is to keep enrollments high, which is best done by not making it unfeasibly hard to get a good grade.
This does mean that our major looks less prestigious, for the reason Matt explained, so we attract fewer majors and probably lower quality students doing those majors. But the Dean doesn’t care about that, provided the overall undergraduate numbers hold up via our elective courses.
The way I handle this personally is by still setting whole books, lots and lots of reading, and making it clear that I want people to do it - while at the same time setting papers, exams etc. that students can do well on even if they haven’t done all the reading.
And yes, I’m aware of a certain inconsistency in this, but it’s the best I can think of, given the pressures on me.
Yes, this makes sense. The higher-level stakeholders need to be on board before anything can change.
I am a professor who happens to have a lot of knowledge about the role of women in Meiji Japan! I agree that professors themselves want to have high standards and give real grades. The problems are:

1) The incentives are a mess. The administration wants high retention rates, athletics wants passable classes, everyone wants more students in seats.

2) There’s actually a lot of pressure to care for students that can run counter to evaluating them. Part of this is the accommodations - so many accommodations that they make it hard to have any kind of standard assignments. And part of it is the idea of setting students up for success by paying more attention to “navigating the university” or “knowing where to seek help.” Most of this work seems to end up with the humanities professors, who are perceived as more caring - and more of us are women, which can’t be a coincidence. In fact, the “rigorous” disciplines are always the more male-coded ones - even philosophy fits this model.

3) As others have said, no department can raise standards unilaterally without losing students, getting lines cut, etc.

So the whole problem seems unsolvable at the moment.
+ 1000 on your point 2. The base assumption is something like “for students, attending class and being assessed is traumatic, so we have to mitigate that somehow.”
And I totally agree about the gendered expectations. It’s partially a discipline thing, as you say.
This (Point 2) really depresses me
Athletes need to have rigorous courses too. Why do most Ivy League colleges have mediocre athletic programs? Because they made the right choice about athletics ironically 🤨
Or because their incentives point towards excellence in athletics but mediocrity in academics
Well yes but the graduates from that school will be less employable
As a former philosophy professor (thanks for the kind words Matt) I agree with this analysis. Admin directs us to continually get more majors, while also telling us not to inflate grades. At the same time, a huge share of classes are taught by contingent faculty, for whom student evaluations matter a lot, and student evaluations are largely functions of how easy the grading is. I think a lot of professors would sincerely love to raise the bar, but there just isn't enough security for departments and employees to do this.
I think student evaluations should be abolished - it’s pretty well documented at this point that they’re discriminatory against people who don’t fit “classic professor” archetypes.
Unfortunately in the UK we have the National Student Survey that is a statutory requirement, but at least those are program-level.
I don't think they should be abolished - I've often gotten useful constructive feedback from them which has made me rethink aspects of my courses. But they should certainly play no part in hiring or promotions etc.: they should be for the professor's eyes only.
Yes, this has been my experience exactly as someone in the college English classroom. And when you're not tenured or in a reasonably secure lectureship, you're on the job market every year, for which game you have to produce student-evaluation sets with scores in order to even make it to the preliminary interview stage. I hate everything about this system, but the only people who can change it are the small percentage of tenured professors who sit on search and tenure committees. The vast majority of college teachers aren't among that number.
If it's any help, this isn't universal. I'm a tenured full professor, I've sat on more search committees than I can now count (and have chaired a large proportion of them). I have never once been on a search committee that asked applicants to produce their student evaluations, and if anyone had ever proposed it, I would have argued strongly against it, precisely because of the well-documented problems with them.
This isn't to doubt your experience - I am sure that it's true that some search committees ask for them. But I simply want to reassure people that isn't some kind of unquestioned norm.
You're right, it's not a requirement of every search, but eval sets are now required often enough in my field that every candidate applying broadly will have to produce them.
I actually do appreciate committees caring about teaching. I'm just not convinced that student evals are the best way to adjudicate teaching.
The whole “meat in seats” incentive system is disastrous.
Agreed - but the only way out of it is for society (“society”) to agree to foot more of the bill or accept that university is going to be an “elite,” as in selective, enterprise. You (not you, “one”) can’t expect universities to operate competitively in a mass enrollment environment *and* demand rigor across the board. I think you can probably pick two from that list, but not all three.
Yes to all this. I wonder how much of this has to do with the fact that adolescence/childhood has been extended in our culture and adulthood has been delayed. I think a lot of undergrads just aren’t ready for college — not in terms of raw intellectual ability but the discipline, agency, and self-reflection required to really engage with subjects. This goes doubly for humanities. Because the path through STEM subjects is more linear and structured, a smart, motivated student has a clear ladder to climb: put your nose to the grindstone and just keep moving. The choose-your-own-adventure nature of humanities and social sciences (at least today) allows students to get kinda lost in the weeds, unless they have the maturity to make serious decisions about how to shape their own course of study. And many don’t. I sure didn’t.
I did well in college, but honestly I coasted, the same way I did in high school; if I’d gone 5 or even 10 years later, I would have gotten so much more out of it.
College really is wasted on kids.
Here is a syllogism for you:
"College really is wasted on kids."
_Youth is wasted on kids._
Therefore, college = youth.
YUP.
I'm curious to hear how you deal with AI - I would definitely assign "lots and lots of reading," but I have no mechanism, really, to ensure that they do it on their own.
The problems you describe about enrollment are even more acute in the UK, where the financial model is basically entirely tuition-based. There's modest state support, but tuition is the name of the game - and American-style endowments don't really exist outside of Oxbridge (and even those endowments are largely from investments, not alumni contributions).
The way I personally deal with AI (we don’t have a department policy on this) is by increasing the amount of in-class testing (exams, quizzes); and for the part dependent on papers, I set papers with complex prompts requiring close attention to aspects of the text: my experience when experimenting with AI is that they perform less well on questions like this.
It’s an imperfect solution - I am sure students still use AI. But I am moderately confident that no one can get an A in my class while having AI do most of the work.
My solution would be to make the degree-granting institutions liable for defaulted/forgiven student loans. Some version of IDR should be the only loan repayment plan, with a max of 120 payments, at which point the school becomes responsible for remaining balances. We have to create an incentive system that protects the value of degrees.
I must be very slow, but I can’t see how your solution would stop my Dean from assessing departments by enrollments, or would change the effects of that. Can you explain the connection?
Because it would mean the school would start losing money by enrolling students whose degrees turn out not to be valuable. The reason enrollment is the only metric is because the school has no skin in student outcomes, more students = more money. I would make the institution financially invested in the value of the degrees they convey.
Off-hand, that seems like it would lead to discouraging public service degrees. Teaching lower income and rural schools, mental and community health, and anything that has a high burn-out rate. My guess is that this would increase inequality while decreasing upwards mobility.
Public service careers being wildly overcredentialed is a big problem.
Not in my field.
I feel like this would lead to a school cutting lower-earning departments altogether, no?
Like if the decision is doing hard work of changing curricula and shifting culture vs cutting the Gordian knot, I could see many if not most institutions choosing the latter.
Should certainly lead to fewer total degrees. That's a good thing.
A premise I was arguing from is that humanities are important to study, and that there should be a robust and rigorous course for students who are interested in the humanities to study.
I guess if that isn't your position, having institutions cut their humanities department isn't an issue. But that does not seem to be a desirable outcome in my opinion.
I doubt this would provide any reason to stop universities allocating money to departments by the number of students enrolled in those departments. It is already the case, at least at my university, that a high proportion of students major in departments whose degrees are perceived as financially lucrative, for reasons that have nothing to do with the intellectual rigor or otherwise of the degree - above all economics, business, and engineering. This would simply provide an additional reason for a university to ensure that as many students as possible enrolled in those courses, and as few as possible in - well, anything else. Even philosophy, which (as Matt observed) scores well above average for the humanities in financial outcomes, scores well below those fields.
Matt's original post assumed that the humanities were valuable for reasons that have nothing to do with being financially lucrative (sample quote: "I believe deeply in the value of studying literature and history and philosophy and big ideas"). If you do not share that assumption, then naturally a "solution" like yours makes sense. But in that case your response isn't very well addressed to either Matt's post or my response (which accepted the assumption but pointed out the problems with Matt's solution given the internal dynamics of financial allocations WITHIN universities).
I'd have to think about this further, but I like it so far.
Maybe your university should require more classical humanities gen-ed classes, so total enrollment in humanities classes stays the same or increases even if enrollment in upper level humanities classes decreases.
Maybe - but that is something that would have to come from the top, and the current tendency is in the opposite direction: the Deans are concerned that there are too many course requirements on students before they get to their majors. And while the policy you propose would benefit my department, I do rather think they have a point.
I don't know how it works at your university, but at mine, even freshmen, unless they are undeclared, do take 1000-level courses for their declared major (along with plenty of gen-eds). They don't wait until their second or third year to begin taking their major classes.
in my institution, it is identical
I suspect part of the problem here is that while in some sense it’s true that colleges can “choose” to “arbitrarily” make undergrad classes as easy or hard as they like, there is a hell of a lot more wiggle room in the humanities than STEM. A school handing out cheap As to CS majors who can’t write decent code, Bio majors who bomb the MCAT, and economists who can’t do math is apt to slam into reality fairly quickly. There’s much less immediate external feedback if that English grad taking an office job never really learned to analyze Ulysses that well. And schools sort of NEED lower rigor departments to warehouse the kids who only got in as legacy admits or because they’re good at a sport, which creates pressure to relax standards in the areas where it’s easiest to get away with.
We are warehousing waaaaaaay more than just legacy and athletic admits, at least at non-elite institutions.
Humanities/qualitative social sciences are layered in this respect. Students that go on to do doctoral work are absolutely rigorously evaluated. I would NEVER recommend a student to a program if I thought they were no good, because I would personally know the person who was going to supervise them and i/I'd feel like a shit; ii/they'd never trust a letter I wrote again (and rightly so). [of course, it doesn't mean that the letters aren't written in an absurdly overblown language that requires a bit of hermeneutical knowledge to read through the lines and know that if the student isn't 'brilliant' it means they're a moron]
Presumably you've read it, but if you haven't, oh you must read "Dear Committee Members" by Julie Schumacher. It's composed entirely of letters written by a jaundiced, cynical English professor of a lower tier liberal arts college. His letters of recommendation are a treasure. E.g.,
"This letter recommends Melanie deRueda for admission to the law school on the well-heeled side of this campus. I've known Ms. deRueda for eleven minutes, ten of which were spent in a fruitless attempt to explain to her that I write letters of recommendation only for students who have signed up for and completed one of my classes . . . "
https://julieschumacher.com/writing/novels/dear-committee-members/
Yeah - that and “Straight Man” are required reading for academics who have a sense of humor.
Yeah this is what honors courses/sections were for when I was at Michigan State 20-plus years ago. The ones I took were designed to be significantly more challenging than the non-honors courses. It seemed to work pretty well.
Maybe even more importantly, adding a few extra math questions to the test makes it a lot harder to take, but only adds a few seconds of grading time (maybe even zero more grading time if it’s scantron) while making a humanities class that much harder requires adding another essay assignment, which requires nearly 10 minutes per student of additional faculty grading time.
You can make the questions harder without adding more grading time, especially on exams. But then you face your department chair when students write you bad evals and possibly don't sign up for your classes. The eval and butts-in-seats pressures in today's higher-ed environment are something no individual faculty member can solve. And, to be honest, I'm not sure anyone else can, either, at any but the most selective institutions.
That goes both ways. It is easy to objectively make a STEM class more difficult, because evaluations can be 100% questions with right/wrong answers. But how does one make an assessment in a literature course more difficult in an objective manner? There is always some subjectivity in the grading, and it is particularly difficult to honestly make gradation at the top end.
There’s a lot of low-hanging fruit out there if the professor is willing to actually put in the effort. As a social sciences major/degree-holder, I had numerous classes where your grade was entirely based on a single paper or a single exam at the end of the semester/quarter. There was little incentive to actually attend class or participate in any meaningful way. Adding a basic participation component like needing to be prepared to answer cold-called questions about assigned reading (ie Socratic method), or even just simply taking attendance would go a long way.
Once you grab all the low hanging fruit the issue probably gets more complex. But from personal experience there’s a heck of a lot of that fruit to be picked up first.
I mean, I can’t speak for others, but I have stripped entire orchards bare, believe you me.
For CS, at least, it’s a common complaint that schools do pass out good grades to students who can’t code. Like… not “to our standards” but at all. It’s a reason fizzbuzz tests became so common.
If we're being brutally honest, the weakest students are being warehoused not in English but in even easier majors.
Is it also possible that it's harder, and demands more energy, for humanities professors to explain to their students why their answers are wrong than it is in STEM? And maybe sometimes they themselves aren't so sure of what the "rigorous" answer should be about the meaning of "Middlemarch"?
I think you're mixing two things here. More time consuming, perhaps. But that's not the same as not knowing what a good answer is. There is a common misconception that confuses "more than one possible answer" and "no way to tell a good answer". The two are radically different, and any well-constructed Humanities assignment will never fall into the latter category. Often what we test is method, and that is fairly objective - i.e. did you use the required/allowed kinds of evidence to make an argument using the relevant methods? And that's not to mention the many, many low-hanging fruits, i.e. the endless kinds of FACTUAL questions we ask: did you identify the passage correctly? Did you translate this sentence correctly? Is your metrical scansion correct? These are as objective as basic arithmetic.
And that's well and good for those areas of humanities instruction. But less so for evaluating and grading essays along the lines of "discuss the place of women in 19th century British society as exemplified by Dorothea Brooke in 'Middlemarch.'"
Not sure if I'd assign such an essay, but to humor you, this is eminently gradable. First, we'll assign a percentage for factual accuracy, i.e. are you referring to correct and relevant parts of the text? Are you aware of the historical realities of women in 19th cent. British society?
Second, we'll think about the complexity of the argument - e.g. different kinds of women in 19th cent. British society, which itself evolved over time, Dorothea as a woman of her own time vs. Eliot's time, which is quite different (a fact ostensibly pointed out in the novel if memory serves!)
We will also assign grades for the craft of writing itself - is the essay structure good and effective? Do you transition between paragraphs?
And of course a few points for correct citations, bibliography, proofreading, etc.
In other words - we use a rubric. The rubric itself is somewhat arbitrary (is structure going to be 20% or 30% of the grade), but this arbitrariness gives consistency across the board. So yes, of course there is some subjectivity on the micro level (am I giving this essay 18/20 or 19/20 on structure), but this is not a problem *in practice*. In fact, it allows consistent and transparent grading that easily differentiates the A from the C students AND allows you to clearly explain to the students why they got the grade they did and what they should improve for next time.
I hate it when people take my gotcha question and shame me with a thoughtful, totally convincing answer.
Re warehouse majors, we had anthropology for the jocks and sociology for the blacks; not sure why one group gravitated to one dept rather than the other
I want to promote this point from C-man, because it shouldn't be buried three layers in:
"... accessibility requirements have made every direction I turn in trying to come up with AI-avoiding assessment into a dead end...."
If that's right, then a big part of the story of the dumbing down of academia has nothing to do with the teachers: it's the administrators.
If there is a whole administrative industry devoted to "accessibility" and "accommodations", where this means making testing and assignments easier, then the teachers don't have much power to change things.
You cannot demand that everyone write their exam in long-hand, because Johnny has a special note from his doctor saying he has to use a keyboard. You cannot give a 1-hour exam in class, because Billy has a special note saying that he gets double the time that everyone else gets. You cannot do oral exams, because Freddie has a note saying that this constitutes an undue infliction of social anxiety. And so on.
If this is what's happening (and I may have the details wrong), then teachers have lost control of the curriculum. Especially if the accommodations industry can claim the authority (in the US) of federal law, via various Titles. (And presumably something similar applies in the UK.)
If the coddling is federally mandated, then there's little that a given faculty member can do.
As much as I love to blame administrators, it's like 90% the fault of ADA and the courts. Any school receiving federal student aid has already been sued and had to settle an ADA case, or has proactively adopted the policies for fear of being sued. The settlements mandate all these insane requirements and accommodations.
"...I love to blame administrators,..."
Yeah, my point was not to blame those individuals. So let me restate it: not administrators, but the administrative system that they operate. We can agree that it is due to a kind of expansive bloat starting with the ADA -- a really good thing in many ways! -- and the consequent gaming of the "accommodations" dodge by suburban kids who want to get any fractional advantage on their peers.
This is one of those situations where we look for ways to send the pendulum back in the other direction. But it'll be a long, long time. How many of the younger generations are basing their entire identities on their diagnoses? There may be some unknown, catastrophic environmental factor underlying an upsurge in neurological difficulties.
But at this point diagnoses might be encouraging expectations of accommodations, rather than support for personal responsibility for their own successes, not to mention enjoyment, in life.
Now that Trump has starved the federal education money flow, maybe taking federal funds won't be a thing anymore?
Yup. It is this and more. My big public university is introducing totally batshit requirements in an attempt to deal with new federal and state disability policies. Like: readings must be clear and understandable when skimmed.
I joke that we need a DeSantis Turing test of administrators and their policies. If we described what these folks were doing in the DeSantis/Florida case, would left leaning folks say, “Good lord, another example of the right’s war on higher education.” Many decisions these days fail the DeSantis Turing test.
“Like: readings must be clear and understandable when skimmed.”
What
what
what
what
Whaaaaaaaaaaa
There went P chem
America really is cooked
From the US perspective, the ADA does not apply to athletic participation. It should not apply to higher education, either (beyond things like handicap access).
"...It should not apply to higher education, either (beyond things like handicap access)...."
So, I don't know how this works, either from a legal standpoint or from a higher ed standpoint. But from what C-man is saying (and maybe this is just the UK), it sounds like the problem is that *everything* becomes a "handicap access" issue. Needing more time is a handicap. Being unable to write with pen and paper is a handicap. And so on.
So, that's one possible route for the spread. I feel like I read something about this in the NYT a while back, i.e. a large percentage of kids, even in high school, getting "diagnosed" with some condition that allows them to claim "handicap access."
So, even apart from "everyone gets an A", there is the problem of "everyone gets an accommodation diagnosis that lets them claim a handicap."
Gaming the system to medicalize low innate abilities or mutable personality defects. It is crazy how much this has spread. It’s like how Oxy and the “crisis of chronic pain” was propagated.
"...It’s like how Oxy and the...."
I would not be surprised if some of the addiction dynamics are replicated, at least as a matter of psychological addiction. I.e., you get a diagnosis of anxiety in HS. Then you get extra time, or special environments. Then if you get thrown into a tougher environment, that makes you anxious, because you haven't had to cope. So now you're more anxious. So you verify your diagnosis. And so on.
The tricky part is that "handicap access" covers most accommodations, even for very classic definitions of "handicap." The blind student probably needs a keyboard and text-to-speech to do corrections.
My favorite disability is the "can't make it to class or have difficulty waking up for class" disability that asks the professor to modify the attendance policy. It's insane.
We have:
“May be late”
“May leave class”
We’re also not allowed to have an attendance policy, period, which makes the above moot.
I should also specify that if you have a medical condition that makes attendance difficult, I'll accommodate you. The problem is students who have mental health issues that make attendance difficult because of issues like sleep. For example, I had a student with fibromyalgia who had an attendance accommodation. I gladly gave them an exemption, but they only missed twice.
I should also say that students get three free absences, so it's not like the Paper Chase over here.
This was years ago, but I read an article that noted that school districts with a disproportionate number of accommodations tilted towards the wealthy, with about 25% of the student body in Greenwich, CT public schools having accommodations. I wonder if this is still the case.
I suspect that's true, but in elementary education access and attitude are playing a huge role here.
I've seen people instruct their kids to refuse accommodations, for reasons about not wanting to be labeled, far more than 504 fishing. Maybe this happens more with older students.
> Needing more time is a handicap. Being unable to write with pen and paper is a handicap.
These are pretty different things though. There are people out there with physical disabilities that make using a pen and paper difficult, but they fully understand the material and should be given an equal opportunity to demonstrate that. (and, if anything, pen and paper is the antiquated thing nowadays)
Getting extra time, on the other hand, demonstrates less skill at the actual thing that's ostensibly being tested. If you can't do the assignment in time, then you can't earn the grade or pass the class.
We should be clear about what things are reasonable and what aren't.
Yeah - like, I’ve seen a “must use a computer” accommodation for arthritis, which is totally fair enough.
I don't know if the UK Equality Act (roughly the equivalent) is more or less encompassing than ADA. But as DT says below, everything, including "write by hand," becomes an "equality" issue.
This is fake news. You just give people a "dumb" computer with only the necessary programs enabled.
Ok, where does the "dumb" computer come from? Who buys it? Where does it live when people aren't using it? Who decides which programs are dumb enough? (Just imagining the committee meetings about this fills me with dread.)
I agree that having a dumb computer might be a good idea, but implementation would be highly non-trivial. An "in principle" solution is not sufficient, you actually have to be able to do it.
Sigh. A networked computer can have the administrator lock different programs.
This isn't a hypothetical. It's been done for years
I, personally, as an instructor, cannot easily do this. It would involve at a minimum talking with IT and the administration. I very much do not want to do either of those things. If many people wanted the same thing it would involve more physical infrastructure --- more actual computers, and more space to house the computers. I am not asserting that it is impossible, and I am not asserting that it is technically challenging. I am saying that the implementation is nonetheless highly non-trivial.
On the other hand, if I want to force my students to write in blue books, I can buy 30 blue books, carry them with me on exam day, and distribute them for the exam.
Bring back word processors, of course.
At many universities, the disability center has a testing room where students who need extra time can come in and do the test on a computer with no forbidden software on it. (I assume that’s how they have it set up - I’ve never actually seen the testing site.)
I think a lot of things would be easier if we just had everyone take all their tests in that sort of environment, and separate the proctoring and scheduling of exams from the instructional staff for the class itself.
If we expand the amount of space available, we could even have students write all their essays in that room.
I'd go one further: you must be able to pay for the exam/certification and the instructional services separately. The thing that gets you whatever cert you are after is the proctored, highly regular (maybe multi day?) exam. The teaching is a service offered if you need it - sort of how the relationship between law school and the bar exam used to work.
I think that’s how German universities worked in the 19th century. Some biographies I’ve read of 19th century academics suggest that they used to collect tips from the students at the end of class.
This kind of sounds like AP exams? I started college with sophomore status partly because i passed so many AP exams - some of which I didn't even take a class for because it wasn't offered at my school (I think world history was one).
That would be prohibitively expensive. Our testing center is already oversubscribed with the relatively small fraction of students who have accommodations.
I appreciate the shout-out!
I cannot emphasize this point enough - this is specific to my institution, of course, but any assessment has to come with alternatives that can basically be completed even if the student has never set foot in my classroom.
Faculty should be permitted to defenestrate students for asinine behavior.
"... should be permitted to defenestrate ...."
Okay, but what's the solution for windowless classrooms?
Oh ya, I forgot we were talking about humanities departments.
Anecdotally, my Electrical Engineering classes were largely in windowless labs or classrooms. The humanities hogged anything with an actual window.
I joke because the humanities departments got shitty buildings at my PhD institution.
I feel like this is the case everywhere. All the new/shiny buildings were for STEM/fields with actual grant funding.
It needs to start early, in grade school!
People should just be given ample time on tests for content. Very little is dependent on speed irl, at least the kind of speed required to quickly take tests
I also would like to second that from a "TA in STEM at a prestigious US university perspective". It definitely isn't a problem only in humanities, and seeing people who never showed up get A's, while people who, sure, weren't extremely talented, but showed up and asked questions get B's, was not something I loved about the job.
For intro STEM classes, I would expect the students who get it as an easy A to never show up (that's certainly how I was as a student, because it was easier to just learn from the book on my own time). It's difficult for students like me when they have to make the transition to learning by talking to people about the material.
That’s fine. Most STEM classes don’t take attendance because they figure if you don’t know it you’ll fail the hard exam. I only went to those classes if the teacher was outstanding, because most of the time I learn best from reading the textbook and doing problems on my own time.
Having been a TA, though, what would grind my gears was having to give As to students I knew were *cheating* while giving Bs to students who merely worked hard.
I’m mildly skeptical about this. Most of these problems can be solved by software (my law school used examplify) that takes over your computer and prevents you from doing anything but the exam. If you make students use this, and provide extra time for the students who need it, what additional ADA requirements are there to meet?
My experience in similar situations says that the person sitting in front of the computer might be a friend who took the class the previous year, if you can't force people to come in person.
Over my years proctoring logic exams at Texas A&M, some fraction of students always tried to show me their student ID when turning in the test, which suggests that some other professors were using this as a way to prevent that.
I had to do this in a few extremely large classes as an undergrad at A&M.
My last degree was online, and showing your ID was an integral part of the online proctoring experience. Which, by the way, runs a spectrum of “cheatability,” the more onerous programs being the more difficult to cheat.
I'd be really surprised if you can't force people to come in person. I've spent 8 years in higher education to date, and I can't think of a single take-home exam I completed during that time (I'm sure I have taken some, but it is not at all common). Maybe there are a few students here and there who qualify for exceptions based on disabilities, but I don't think threat of heightened cheating for that tiny minority poses a serious threat to the general rigor of the exam
Just to clarify, I specifically referred only to students with accommodations in my previous comment (because that's what dt's original comment was about).
I interpreted the OP as being about the effect of ADA requirements on the modal assessment method, but I could be wrong
Different proctor software has different standards for how intrusive and thus how effective it is.
This is orthogonal to Matt’s point, but I will just say as a law firm partner formerly in charge of hiring, it is VERY difficult to find people who can actually write well. (As many on this comment board know, writing well is all you really need to be a good lawyer, or at least the most important thing.)
I was an English major at an Ivy League school who graduated 20 years ago (after trying and failing at a hard science major). I would not say it was easy. I probably had to read 20+ books a semester in addition to doing a foreign language. More importantly, we were assigned essays every week. If the essays weren't clear and persuasive, we'd get bad grades.
I don’t pretend to know what academics are like there now, but I do see the outputs (and these are people who go to top law schools also). Not great.
Writing was one of my weaker graded subjects early on, and I found out later that it's because they made me overwhelmingly write on fiction. I get that that's what most kids find more interesting, but it just bored me. I got much more interested, and thus better at it, when I homed in on nonfiction, including subjects like what's discussed here, and also law - I even ended up working at a law firm for several stints.
As an aside, I'm also curious how many philosophy majors go into law. Seems like a natural path to me, but certainly not for Matt....
"... one of my weaker graded subjects early on...."
Dude, of course you're going to have trouble writing: you're a city of trees.
I mean, I love trees -- we all love trees -- but they're not known for their literary output. Who can expect fiction from a deciduous downtown, a suburb of saplings, a coniferous conurbation?
Naturally it got better when you branched out.
Not to mention the horrific slaughter of his comrades that attends the creation of paper on which fiction is printed.
At least I didn't end up feuding with myself to destruction, like a certain tale of the maples vs. the oaks...
Thanks for making my day with that comment.
Law school is definitely the most common trajectory for philosophy majors. And the people who want law school but don’t major in philosophy still usually take the logic classes, because the LSAT has some relevant sections (which I probably should look at some time, given how often I teach logic).
They removed the logic games, so of the things you might get out of a symbolic logic course, it's basically just diagramming conditional statements at this point.
They got rid of those sections, Kenny. Tracing Woodgrains has a long rant about why that's bad.
Some basic stuff was helpful for the LSAT, but beyond the first few weeks of intro to logic, the class got much more advanced than the test. The LSAT LG section was hard mostly because of the time pressure, not the difficulty of the logic involved. The idea that truly learning logic was helpful for the LSAT was a useful fiction for philosophy departments. And now they got rid of logic games entirely due to ADA issues.
As someone who was a philosophy minor, I think the two have a fairly well-established link. I know one philosophy instructor who heavily recruited interns from the law school for his larger classes because they generally understood the material better.
One of my writer friends told me that frequent exposure to students' bad writing will lower one's own writing ability.
More schools should make people take an actual writing class, not just hope people pick up writing by osmosis from other humanities classes.
Is this not already the case everywhere? I went to Random State U and every student had to take 2 dedicated writing courses as part of their university gen eds.
Another grad of Random State U reporting in, we had a three-class writing component requirement.
I passed out of all college writing classes by getting a 5 on the AP Literature exam. To some extent that means I already "passed" college writing but I don't think I worked as hard in my 12th grade class as college writing students do.
Most of these requirements are being stripped, and the rigor of what qualifies as writing-intensive has been degraded.
It was optional and not part of gen eds at my school.
What were your gen ed courses? I did not attend a University anyone would be impressed by, but the University required the following number of classes (3 credits each) regardless of major: 2 composition, 1 government, 1 history, 2 humanities, 2 sociology, 1 math, 1 science. Among the above, one class had to also carry an "international" designation and another had to have a "Diversity" designation.
You didn't need to take a special gen ed version of all these though. As an engineering major we started at calculus for math, where the typical humanities major would take algebra or something.
Tough, in the AI era. You'd have to force students to write by hand, in class.
My engineering school required a couple of writing classes - one general/introductory, and one that was linked to your major, so typically lab/project reports or research summaries.
I had to take 5 for a BS at UMiami
Writing is one of the most important skills to learn, but it’s one of the most labor-intensive to teach. Grading writing is pretty much the worst part of the job being an academic in the humanities.
It’s also the part of the teaching that is most directly threatened by AI.
The HORRORS of having to read and critique bad student papers. I think I’d rather have dental work done
I mean, sometimes you get very funny things to snicker about. But AI is taking away the pleasures of terrible student writing.
Can confirm - in our org, HR instituted a bunch of fairness protocols (panel interviews, etc), some of which militate against careful assessment of candidate capability. I can't ask for a writing sample and our ref checks are run by a third party whose reports are pretty vague. It makes the interview itself pretty high stakes, and I've gotten better at probing the candidate's cognitive and reasoning skills as a proxy.
Yea. This is a real problem. we are still able to get writing samples which is probably the most important part of the evaluation process, at least to me.
Yeah I still have mild PTSD from an English class I took at my state college 20 years back: 10 week quarter, 10 books (some 1000+ pages), 10 harshly graded essays, and you had to have something intelligent to say in class every week (on top of all my other STEM weed-out classes). To say I barely scraped by is an understatement…
I hear you. I can usually tell when a young associate/intern has actual writing talent, separate and apart from the things that just require experience or subject matter knowledge. It’s a rare thing.
One thing I always debate with myself is, to what degree is effective writing innate talent vs learned skill. Obviously practice and experience will improve anything. But when it comes to being an effective writer, how far can practice take you before you run into a wall?
I don’t have a good answer, and that’s one of the things that makes evaluating new attorneys difficult. As mentioned, I think one can tell when a recruit has innate talent regardless of how unrefined or inexperienced they are. But what if they don’t have that, yet are good at other aspects of the job? You kind of have to guess about how far you can bring them along through mentoring and practice.
Communication in a broad sense seems like it'll be a key differentiator for people moving forward in the age of AI. It's something I'm actively trying to get better at, both in a written and a verbal context.
The horror stories of Zoom interviews where the candidate reads AI-generated responses to questions are scary.
That's crazy. AI-speak is the new therapy-speak, as in the new thing making everyone sound like they're reading off the same flashcards.
Sounds like it was EASIER though (than the science you failed).
No question
“… how to summarize a fact-pattern….”
Isn’t this a task that AI is supposed to have mastered? Have you tried using AI for it?
(I say as someone with no experience in law or AI.)
I think it's actually an "interns don't know how to productively use AI" problem. Vanilla ChatGPT isn't the tool for that job, and I'm not sure even Deep Research is, but a RAG solution akin to NotebookLM that runs on a local server and that doesn't share data to the cloud would probably be worth trying. The human value-add would be knowing which case law is applicable, and being able to capably proof the output.
Fair. I think in 10 years' time it'll be the norm though. Basically all you need is an open source, locally-installed NotebookLM alternative to deal with the data privacy concerns, and stuff like LM Studio is already 75% of the way there.
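For what it's worth, the retrieval idea behind that kind of local tool fits in a few lines. This is a toy sketch only, assuming nothing about NotebookLM's or LM Studio's actual internals: naive term overlap stands in for real embedding search, and all the function names here are invented for illustration.

```python
def tokenize(text):
    # Crude word-level tokenizer: lowercase, strip trailing punctuation.
    return [w.strip(".,").lower() for w in text.split()]

def retrieve(query, documents, k=2):
    # Rank documents by how many query terms they share, keep the top k.
    # A real system would use embedding similarity instead of term overlap.
    q = set(tokenize(query))
    scored = [(len(q & set(tokenize(d))), d) for d in documents]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [d for score, d in scored[:k] if score > 0]

def build_prompt(query, documents):
    # The retrieved snippets are all the model ever sees, which is why
    # a local retrieval step keeps the rest of the data off the cloud.
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"
```

The point of the sketch is the shape of the pipeline, not the scoring: retrieval happens on the local corpus first, and only the handful of relevant passages travel to the model.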
It's important to note that this "dumbing down" also occurred during the growth of adjunct professors who need those good student evals to secure their next contract.
I was an adjunct comp sci prof for a decade. And yes, it ended when I had the audacity to fail unqualified students who literally cheated on their exams. Literally all it took was one angry parent writing a letter.
I know this was already a losing battle when I was in undergrad, but I can scarcely imagine anything more shameful and infantilizing than one of my parents contacting my college (or heaven forfend, my fucking job) for any reason besides illness/injury but *especially* to demand something I don't deserve. I know plenty of students are yoked by financial support and would *also* prefer their helicopter parents alter course, but for the ones who accept it happily I wish nothing but ego-destroying embarrassment when they someday have a moment of clarity.
The whole “lose your scholarship if you don’t have a 3.0 by semester 3 or 5” thing sucks. If we’re going to be doing this “college is a time to figure out what you want to do” thing, and I’m not sure we should be (high school should be like undergrad, in my view), then you shouldn’t be penalized for trying something it turns out you suck at and failing. My next door neighbor growing up lost her lottery scholarship because of these requirements. She had started out in architecture and it turns out she didn’t have a knack for it. Switched to nursing and did well and is successful. One should perhaps be penalized for just phoning in college, but not for the trial and error we sell the experience as.
I think that may have accelerated the process, but there are studies showing grade inflation has been going on since early 1960s, with the biggest single jump being in the late 1960s (almost certainly to try to help male students avoid being drafted during the Vietnam War).
Yup, both phenomena are real, one having led to the other by means of overproduction of humanities doctorates starting in the Vietnam era.
Yes - and this is part of why this is such a difficult problem to address. Higher education, at least in the US, is so highly leveraged on contingent and precarious labor that there's little incentive to be an outlier in terms of demanding higher standards of students.
this probably doesn't get anywhere close to passing an ideological turing test, but...
doesn't the ethos in current humanities departments that "rigor/objectivity/etc are vestiges of cis-heteronormative capitalistic patriarchal white supremacy" kinda make these changes practically impossible?
I am highly skeptical that such a view has anywhere close to a controlling attitude in the humanities.
I bet over 99% of administrators and faculty in the humanities, for example, support affirmative action, which is trading off rigor in exchange for other considerations like diversity and helping rectify injustice.
Supporting affirmative action is not akin to your first statement.
it does invariably mean reducing rigor though
By most objective measures, students enrolling in institutions had better qualifications compared to pre-affirmative action. In my experience as a computer science professor from 2001-2019, courses were generally more rigorous. Do you have any evidence for this claim?
The fact that student quality has generally trended upwards over time does not imply that this causally related to affirmative action, clearly.
The evidence I have for my claim is that affirmative action literally means lowering standards to achieve other aims. Like, that's the whole point, that's the evidence.
I'm not even saying affirmative action is bad! It very well may be a net positive thing in society. My point was that given that the humanities are especially attuned to things like societal disparities, there would be internal resistance if their rigor increased so much that their classrooms started to look the same demographically as physics or CS classes.
Just curious, but would you also oppose class-based affirmative action?
It's tough -- largely yes but somewhat no.
1. AA is designed to help racial minorities, class-based affirmative action only does this indirectly
2. Class is a bigger impediment to success than race. Malia Obama is not more oppressed than a white kid from Pikeville, KY.
3. Current AA approaches are about the phenotypical pie chart -- so you end up with a lot more children of rich Nigerian immigrants than American descendants of slaves in elite institutions.
4. Doing racial AA is kind of in explicit tension with the 14th amendment in a way that class-based affirmative action is not
So my feelings on it are mixed. I support it, yes, but it almost certainly would not go far enough. So idk.
You can equate affirmative action with poor rigor, in your head, if you want. But there are endless forms of affirmative action which do not require a trade off in rigor. Take for instance, crossing off names on resumes before reading them. That would increase rigor and be affirmative action. One might even say that your anti-AA stance lacks in rigor. :)
Pretty much everyone who talks about affirmative action in a US university context is talking about having lower admissions standards for members of underrepresented groups (blacks and hispanics being the important ones in the US). The admitted black kids end up with a substantially lower average SAT score / GPA than the admitted white kids.
The opponents of AA in the US are the ones who want universities to do race-blind admissions--crossing off the names, as well as race and gender, from the applications and judging them entirely on what's left, like test scores, grades, advanced classes taken, etc.
Which may or may not be a bad idea, but is not a reduction in rigor. I was saying that crossing off names on resumes, where black names have been shown to have adverse effects on odds of hiring, would be a form of AA (which also would not decrease rigor).
In what way is lowering admissions requirements based on race not a reduction in rigor?
I don't think any reasonable person thinks it's appropriate to cross names off of resumes before reading them.
I just don't think this happens at the frequency you do (unless we're talking about ethnically Asian names, that might happen in that case).
You made the absurd assertion that affirmative action is, "trading off rigor in exchange for other considerations like diversity and helping rectify injustice." I see that you walked this back a bit later in the thread. There are obviously possible ways to take affirmative action without degrading rigor. I mentioned one (resumes). Several people mentioned admissions, later in the thread. I agree 100% that AA has gone too far. I was making a narrow objection to your claim that AA is impossible without sacrificing rigor.
The acknowledgment that there are values beyond rigor, that you are willing to trade off against rigor, is extremely different from a claim that rigor is itself a bad thing.
It's possible that it does reduce rigor but, as you say, maybe it helps rectify injustice. I recall the argument by Derek Bok and William Bowen back in 1998 in "The Shape of the River" that minorities admitted via affirmative action actually did quite well in college. (This was an incredibly quantitative study, one should note.)
https://muse.jhu.edu/article/14822/summary
Affirmative action only accounts for a small percentage of students—most top universities are under 20% Black+Hispanic, and even if you assume all of them got in through affirmative action, which they didn't, the vast majority of the class is still non-affirmative action.
If you really wanted to have the maximum rigor with the smartest possible students, the much bigger lever to pull is international students—you would just give financial aid and need-blind admissions to everyone so some top students going to Tsinghua, IITs, etc. come to your school instead. That would also increase diversity ironically. And most liberal humanities professors would probably support this—they aren’t the obstacle to it.
One of my daughter's roommates went to Illinois Institute of Technology as an undergrad. I was surprised to learn that it has the highest percentage of international students of any school in the US. The roommate's theory is that it is because its initials are IIT.
Idaho State needs to rebrand as IIT immediately
1. Yes -- a truly meritocratic university system would have a lot of kids from the Tsinghuas and IIts of the world. That's undeniable.
2. This is also the *real* reason for affirmative action, if you take the semi-cynical view that universities are not agents of social justice but rather self-interested corporations -- a university that looked like IIT in America would be less attractive to prospective students/donors than one that had (at least superficial) diversity. It's the same argument for allowing legacy admissions -- rich kids' parents paying for nice amenities makes the school more attractive overall, even if this means things are less "fair."
3. I'm not sure what progressive humanities professors would think of this. Opposition to purely merit-based high schools like Stuyvesant in NYC comes from the left, because they're uncomfortable with a demographic pie chart that is so heavily skewed against Black and Latino students, even if most of the students are poor children of immigrants.
I believe most colleges and universities in the US are not particularly competitive--pretty-much anyone who remotely belongs in college and applies will get in. The top private and state colleges are selective, sometimes extremely so--even with truly excellent grades and test scores, it's hard to get into Harvard! That's the place where affirmative action, legacy admits, and sometimes athletics (at big schools with financially important basketball/football programs especially) have a direct impact.
To the extent that it did/does I can tell you it is already shifting, at least where I am at. "Return to rigor" discourse post-COVID is hot now. I think AI has played some role in this. The big distinction though is between intellectual rigor and procedural rigor. Increasing the assignment count or complexity of requirements for completion is procedural rigor, and not helpful (and beyond that is the kind of rigor that AI most easily circumvents and that compels students to shortcut), whereas increasing the expectations for student participation or quality of arguments is intellectual rigor, and preferable. Jamiella Brooks and Julie McGurk have done a bunch of work on describing what this would look like.
“Procedural rigor”
So now I have a name for why online classes suck. They always have a million little assignments, I assume to make it hard for people to hire others to take their online classes for them. But it’s so obnoxious. Give me 3 tests and a final, please! But please don’t take attendance
Yes, I believe it's what they used to call "busy work".
Call me a dirtbag centrist, but I think your comment here and the comments of your detractors are both wrong.
*Administrators* absolutely believe this. All of the non-faculty professional staff at a university have the most insane wacky progressive beliefs you've ever heard in your life.
But honestly, faculty mostly don't, even the radical leftist ones believe it's important for students to learn things and for classes to be hard. You have to be a real gadfly to resist the pressure that the professional staff brings to bear, though.
As a professional staff member, allow me to speculate as to why this is the case:
I work as an instructional support staff member, and what you describe is somewhat true. I think much of it can be explained by the fact that, at least where I am, staff like me are seemingly discouraged from participating in the actual class environment. We make a lot of informed suggestions based on what the current research and discourse on pedagogy suggests, run some voluntary workshops, assist with building course sites, etc., but we're rarely invited to actually sit in on a course we're helping with or even get updated info on how things are going, and sometimes I get the feeling some instructors are downright hostile to the idea. Because of that we have to fill in gaps based on hearsay and rumor that floats around third-hand. A lot of my job is a kind of "reading the room" because some instructors will try to shop around instead of just telling me what they think. So despite the fact that my job is to help make the teaching in my R1 university as good as it can be, I have spent very little time in an actual classroom since COVID. That's not a good situation for either of us. My institution doesn't even have very robust tuition support, so I couldn't spend time in a classroom even as a legit student without incurring significant cost to myself. I've begun looking at adjunct positions, teaching a class or two, just so I can have some minimal experience, despite not having much interest in becoming a full-time instructor.
If most instructors find the professional staff's views to be wildly out of bounds, we wouldn't really know it, because nobody will tell us whether the situation is of the "I already know this stuff and it's old news/out of date/unworkable" kind or the "this is weird and new and I don't think it makes any sense" variety. Personally, I don't care which one it is, as I have little personal stake in the matter, but I kind of need to know so I know where to productively go from there. My sense is there is a heavy amount of defensiveness about whichever one it is from academics who are used to being confident in their specialty domain, and are working in an environment where admitting an intellectual blindspot is heavily discouraged. Sometimes I just have no idea how familiar instructors are or are not with some of this stuff, and when I ask outright I get weird looks like I just asked either a forbidden question or one that was so obvious asking it made me look silly (both?).
I could be wrong about that, but if I am it's because I am ultimately at least right about the fact that very few people will just be up-front and honest about whether I'm welcome in their actual classroom or not so I can get a real sense for how they teach, or whether I'm just supposed to be a glorified intern putting widgets in an LMS.
I've worked at other smaller places where this dynamic was less so and instructors loved having me sit in. Maybe it's a problem with "elite" academia, idk.
> instructional support staff member
You mean tutor? Isn’t that usually grad students or work studies?
I'm an instructional technologist/designer. Sometimes grad students do that stuff, but we are fortunate to have a large edtech team to let grad students focus more on content/class instruction.
Why would an edtech person have input into how a class is taught?
Granted I think technology is a big chunk of the problem and is almost never a solution. Online problem sets in math and science need to die a violent death. Students are supposed to use pencil and paper and professors are supposed to write on a board, not use a CMS except to post grades.
I would think the best teachers mostly would just be using their class notes from when they took the same class when they were in college.
I mean, there are a few faculty who believe this. There was a brief-lived movement towards "minimum grades" that basically said that if you show up consistently, you get a B no matter what, and then you can go up from there. Thankfully this seems to have been marginal.
But yeah, in my experience people not in the classroom on a regular basis develop some pretty wild beliefs about what teaching involves.
I mean I had a friend that went to New College, which didn’t do grades in favor of lengthy evaluations, but I don’t think it was any less rigorous. It made their test scores more important for medical/law/grad school, though
Yeah - that sort of thing is fine, I think. Definitely different than "everyone gets at least a B."
What makes these changes impossible is that there are, currently, no financial incentives to increase rigor.
Any serious "rigor is white supremacy" discourse that actually has any sway over high-level administrative decisions met its Waterloo years ago.
the "rigor is white supremacy" ethos definitely hit its apex a half decade ago but most leftwing institutions do seem to subscribe to the notion that disparate outcomes are indicative of discrimination. So institutions like these aren't going to optimize solely on rigor if that means outcomes remain disparate
This is largely (but, alas, not completely)* a straw man. The dirty secret is that 95% of those who espouse these views are hypocrites who secretly still believe in rigor and *when left to their own devices* usually act accordingly. The bigger problem is precisely that which MY talks about: it is NOT because of ideology but because of practicality (enrollments, collective action problem in an environment of grade inflation etc.). Some however feel better about themselves by *pretending* that this sad state of affairs which everyone laments has some ideological virtue behind it.
*My only qualification to the assertion above is that hypocrisy can have a way of running ahead of you, so that sometimes people seem to almost accidentally put their money where their mouth is, e.g., one person mentions one of those silly platitudes in a dept. meeting simply to virtue signal, and another feels they need or should up the ante, saner voices are too cowardly to intervene, and before you know it the department has adopted an official policy with serious academic implications that nobody individually would have tolerated in their own classes - e.g., Princeton's absurd decision about language requirements (which, contrary to MY's assumption, has little to do with enrollments to my knowledge, and also is a little more nuanced and less imbecilic than you'd think, but is still basically moronic).
Hey, THPacis! Long time no see.
To the extent that anyone is making the "rigor is white supremacy, etc." argument, it's uniquely in elite institutions. Everyone else literally can't afford to increase rigor, because students will go somewhere else - either a different course, program or university - where they can GPT their way to a degree.
In my field of English, "rigor" is coded as white, male, and privileged. Teachers, unfortunately, think of rigor and the Canon in the same breath. It's a terrible attitude. And it tries to dole out racial justice in terms of grades.
There's a big split, mostly along generational lines (with the exception of a few slightly younger curmudgeons like me), between faculty who espouse that point of view and faculty who roll their eyes at it.
I'm not sure why you'd treat this as a problem localized to College level humanities when it's equally applicable to basically all of K-12. "Low standards have devalued [the high school diploma]" is at the very core of a huge amount of modern social issues from student loans to education polarization and it's all the same basic argument. The main value of any education program is signaling, and if it's not meaningfully filtering for ability or knowledge or hard work the signaling value is zero.
"The main value of any education program is signaling"
No it's not. The value of education is knowledge transfer, and building of skills/critical thinking ability.
Actually knowing how to do math is a skill. And a very valuable one. Knowing how to read, analyze and then write effectively is also a very valuable skill.
More job specific for me, knowledge of accounting and being able to apply it is a skill. Which is why I ask accounting questions to everybody that interviews.
I don't care about your accounting degree if you can't answer my accounting questions. As many people who have interviewed with me have learned.
I think you and Dave Coffin are using the word "value" differently.
At the society level, the "value" of education is mostly in improving human capital in the population. Knowledge transfer and thinking skills are a huge part of improving human capital.
At the individual level, the "value" of education includes improved human capital but also signaling "I am intelligent and conscientious, and I know things about a specific topic" to employers.
Skills have little value if you can't get in the door. Most would be job candidates never get an interview where knowledge might become relevant. It's not like the knowledge you learn in college is some big secret. I can read a bunch of books for free. People pay for college for the credential and the connections because that's what gets you the interview to begin with.
Pretty much agreed. I think it's tragic that the US has not come up with some testing-based alternative to a bachelor's degree for demonstrating skills and knowledge in particular fields.
For example, sitting for the CPA exam in most (maybe all) states requires a bachelor's degree with a certain amount of coursework in accounting. Why not just let people take tests indicating they know material covered in this coursework, and sit for the CPA exam if their pass said tests?
I studied electrical engineering in undergrad and I think that about 95% of the knowledge/skills covered in the degree plan could be demonstrated via written exams. The rest could probably be demonstrated in a few weeks of labs and projects.
The courses are about knowledge transfer. The diploma is about signaling how successfully the knowledge was transferred.
Knowing that academics are pretty prevalent in this comment section, I have a question for you: Could you fix this in your own classes?
If you are teaching 200-level humanities class, could you make your assignments more difficult and turn out a grading system that results in, say, 20% A's, 40% B's, 20% C's and 20% D's or F's? Or would your administrators not allow you to do this?
I'm just curious how this lowering of standards and rigor can be disrupted. Must it be at the institution level, or can professors choose a different path? What is tenure for, if not something like this?
I think "make your assignments more difficult" has become a very different enterprise in a genAI world, for one thing. Even if you somehow incorporate AI into the assignment workflow, designing an assignment that leads to some learning outcome *and* is rigorous is, to put it mildly, very, very, very difficult.
I think something MY doesn't really cover as well is that it's not just assignments as such - attendance is the other huge part of the puzzle. It's different everywhere, of course, but attendance in my university has cratered despite some tepid attempts to address it. When you only have 20% of your students in the classroom on any given day, there are a lot of steps to go before you can even begin to think about "increasing rigor."
They should just fucking flunk out. Why are they even taking on debt if they aren’t even attending classes.
I agree - though I also have to admit that I owe my job to students who basically only exist on paper.
But I think it's also important to note that it's pretty possible, at least at my institution, to get a degree without ever attending a class, or maybe showing up like 10% of the time across your whole degree. It sucks so hard and I hate it, but it also keeps me employed.
And now I remember why I bailed on academia.
I wasn’t very concerned about attendance when I was in school. As long as I felt I understood the material and could do the assignments, I didn’t feel like I needed to sit in a classroom.
Fair enough, but the 80% of my students who don’t show up at all or only sporadically do not understand the material nor can they do the assignments.
Because they get stamped with the credential, and the credential opens more doors. Look, I don't like it either, but it's not mysterious.
In re attendance: Long ago, I worked in the registrar's office at my undergrad school and one of my tasks was to catalogue about 70 years' worth of student handbooks. In the course of doing that I made an amazing discovery: Pre-WW2, if you missed a day of class without a *written* explanation for the absence, that was an automatic 10% reduction in your grade. However, if you had an unexcused absence the first school day immediately before or after a regularly scheduled holiday (e.g., the Friday of the Labor Day weekend), that was an automatic ***30%*** reduction in your grade.
Kick-ass.
I will admit that one thing that causes me great cognitive dissonance is that in college, I *loved* going to class. I understand that there are bad teachers and so on, but it's really hard for me to wrap my head around what has become the default posture, which is that college classes are an annoyance to be tolerated at best.
Depends on the subject. I ate my anthropology, foreign language, and of course studio art classes up. But a lot of my science teachers were pretty shitty because their reason for being there was research. I went to a few of those teachers’ classes who were really good, but there’s high variance in quality
Yes, that was definitely the most interesting part. I could read all this stuff on my own, but to have discussion about it and the addition of the professor's in-depth knowledge made it much more interesting.
Whoa, only 20% attendance? Are you exaggerating or is it really that bad?
I'm not - it's an average across a semester, so there were better and worse days. But it was really, really bad, and it broke me.
That would break me. I would want to flog people for that level of disrespect.
This is only, like, 25% of a joke, but there's a reason why I spend so much time in the SB comments section posting thinly-veiled cries for help.
"...why I spend so much time in the SB comments...."
We appreciate your excellent attendance record.
I try to show up regularly, but sometimes I have other work and caring responsibilities.
Most of us are probably procrastinating at work…
I return to a question I have asked several times on semi-related topics--do these kids ever expect to, like, hold a job???
Yes. They expect to get paid to order DoorDash and scroll TikTok.
I increasingly don’t know what expectations they have at all - partially because they’re often not there to communicate them.
That sucks to hear that you're being broken by a system that sounds broken.
Yikes.
20% is a lot lower than what I remember back in the day, but attendance rates have always been bad. And often they are actually worse in STEM classes, since (a) class participation is never part of the grade, (b) anything covered in the lecture that you need to know for the exam is also in the textbook, (c) the vast majority of the professors can't teach.
Yeah - I recall taking an undergraduate CS class during grad school because I was bored and aimless and toying with escaping into tech (hahahahahahahaha), and attendance was definitely not 100%. Of course, it was also harder to tell, because there were like 500 students in the class to begin with and so the room would feel full regardless.
I agree with the "can't teach" bit - I try to be an engaging teacher and was even shortlisted for a university-wide lecturer of the year award, but I can say that it still doesn't help with attendance, at least where I am.
As someone who studies mostly social sciences with some humanities on the side, it always shocks me to hear how few people in STEM majors actually attend classes. I have friends who have gotten As in CS classes they've only ever been to for the test, which, say what you want about non-STEM fields, is essentially impossible to do unless you already have expert-level knowledge about the coursework and somehow manage to only pick classes that don't have any sort of attendance mechanism. For all the kvetching about the humanities, pretty much all of the sheepskin effect extremists at my school are in STEM majors or business. My university is very pre-professional, so it could definitely be different elsewhere, but all the English majors I know actually care a lot about reading books.
For whatever it's worth, my engineering school experience does not at all reflect Mr. Bear's--I think it was a rare class that was under 80% attendance.
Same. My engineering classes were generally very well attended.
This is only half the argument as I see it. The other half is that you will fail the class if you do not show up because you will not have the knowledge necessary to perform well on exams or even long-form essays. If I think CS classes should require students to attend, it's because I think that it would prove that the professors aren't entirely obsolete. Say what you want about the humanities, but the history professors I have had have all provided very strong analysis and meta-analysis of the material that has been useful in forming my own interpretation of events. I don't get the same sense of that happening for CS.
Couldn't you say that if a student misses more than X number of classes, their grade will suffer? I'm trying to tease out how much of this is just inertia and how much is within the control of the tenured professors?
In law school I believe the rule was if you missed 5 classes you would not be allowed to take the final which guaranteed failing. They meant it too. There was a guy in my crim class first semester 1L who had missed that many and the professor told him to leave when he arrived for the exam.
I think I remember some requirements on this in undergrad for history but don't recall the specifics or witnessing any enforcement.
My recollection is that attendance rule was actually an ABA accreditation requirement for law schools (thereby getting around the collective action problem), but I don't know offhand whether it still is.
Yeah, it is.
Thank for the confirmation! (Wasn't sure if that might have been repealed during COVID or something.)
This is a bad rule. If the student doesn’t come to the class but can still pass the final exam, they should pass.
Wrong.
Why?
It depends - on your institution's policies or your program policies.
I - at a UK institution - do not have this option (there is also no such thing as tenure here, so that layer of protection is not available). When I was a TA at the University of Washington, we were also not allowed to grade on attendance (though you could grade "participation" and so on as a proxy). That's only n=2; in some places I'm sure you can require attendance. When I guest lecture in a Masters program in France, we always pass around the fiche de présence that they have to hand-sign.
Accessibility discourse is particularly strong here in the UK, and it emphasizes students' other work and caring responsibilities. Which is fine, but it often implies "so if you require physical presence in your class, you're an elitist a[rse]hole," which makes it hard or impossible to insist on attendance. Not sure what this looks like in the US.
So I finished an undergrad degree last year in PPE in the UK, and after the first few weeks on my first year stopped going to seminars or lectures because they were useless and boring.
For my philosophy and politics seminars, no one did the reading and when they did do the reading said completely insane things about the reading (I got "Rishi Sunak is a race traitor" in my third year political philosophy class.)
Economics was different because it's a problem set subject, and the problem sets weren't hard enough for seminars to be helpful.
It's completely understandable to feel pretty disheartened by this as someone in a teaching capacity, but this was why, as a student, I stopped going. In short, I could learn everything more quickly with a textbook than by going to seminars and I find them to be actively unpleasant social experiences.
For context, I went to Warwick (for non-brits reading this, uni ranked somewhere between 5th and 10th in the UK. Think Brown or Cornell in terms of student quality, but definitely STEM biased.)
"Rishi Sunak is a race traitor." Yikes.
I don't deny that there are crappy teachers out there. I teach political geography and I try to emphasize "primary sources" over academic articles - and for the students that do show up they seem to like it. We looked at the Project 2025 hiring questionnaire when talking about populism and illiberalism, for instance. Or we compare the Online Safety Bill and EU Digital Services Act's enforcement mechanisms as examples of epistemic bordering (I don't use the word "epistemic"). Or I pair historic and contemporary texts from public intellectuals so we can see how old ideas about e.g. nationalism are still quite present today.
I don't fault students for not showing up if the content is objectively bad. But it's also a vicious cycle. And in my experience, at least, even being good at teaching doesn't necessarily help.
(Speaking of the "race traitor" comment - if anything, where I am my students are disengaged enough that it actually makes it easier in some ways to teach about controversial topics, because they tend to not really have an opinion at all)
"...disengaged enough that it actually makes it easier in some ways to teach about controversial topics...."
https://healthyinfluence.com/wordpress/2016/03/17/panthering-the-wire-with-alzheimers/doonesbury-antichrist-jefferson/
Yeah, I don't even think that the PhD students teaching the seminars were bad at teaching. I'd guess that part of the problem is that you can get good grades by not doing the reading throughout the year and then really working hard in the last like third of the academic year.
I'd guess I'm also pretty unusual in that I'm pretty obsessed with social science (and now work in public policy doing social science) and I had lots of friends I could and did talk to about philosophy and social science.
"Economics was different because its a problem set subject, and the problem sets weren't hard enough for seminars to be helpful."
I double majored in Econ and CS (at an engineering-heavy school). This was 40 years ago, and the difficulty of the econ problem sets versus the CS/engineering problem sets was night and day. Even relatively basic math stuff like probability and statistics would slow the pace to a crawl in the econ classes, with profs apologizing for the "extremely challenging" problem sets.
But I was under the impression that econ as a profession has become much more mathematically rigorous and challenging over the past 30-40 years and now comes close to rivaling the math difficulty of an engineering or CS degree.
Is that not the case?
It definitely is the case, but it mostly only starts at grad school, and econ masters and PhD programs have roughly the same mathematical difficulty as CS or engineering.
I took the most mathematically difficult econ courses my uni offered (it helped that they were taught jointly with the maths department, I think) and they were definitely the best courses I took at university. I learned a lot, particularly from the really pure microeconomic theory courses. They made me a real economist in the sense that because of those courses I could read most microeconomics papers (but basically not any macro papers - the maths is considerably harder and not taught, I think, anywhere to econ undergrads in the UK except maybe at LSE).
They definitely weren't as hard as 2nd or 3rd year physics or maths courses though - people do general relativity in their third year of physics, and maths students take topology in their second year and measure theory in their third!
The people I knew doing maths were definitely the people working hardest and the people who felt most burnt out by their subject at the end of their degree. This may actually be some warning against making courses too hard. Most of the people I knew doing economics were more excited about economics after they finished their degrees, but the opposite was true for the people I knew doing maths.
So, if you want to go to graduate school for economics, you shouldn't major in economics in undergrad. This of course is only knowable to people who are lucky enough to have good mentors or already know people in the field.
As someone who recently left grad school: I think it’s fine to make classes hard enough that attendance is effectively required. But I don’t like the idea of requiring attendance for its own sake.
I recently graduated from a joint JD/MPP program. The JD classes required attendance. The MPP classes did not.
I found that the lack of required attendance allowed me to use my time much more efficiently. By the end of every semester, I was swamped with papers, projects, and studying needed for final exams. The best way for me to use my time was often not to attend class, but to instead spend additional time studying the nuances of topics I found challenging, drilling practice problems, and going the extra mile on research papers and other large projects. Nowadays classes are recorded, so you can still watch them later if you miss - and speed them up, skip the stuff you understand well, etc.
This is fine for motivated students - but in general, I'm not dealing with motivated students.
There's also a tragedy of the commons issue here. With the levels of attendance I have, I can't teach effectively because I can't predict who will show up in class from week to week. It's effectively a different group of students each week, which means that building instructional continuity becomes very difficult. Having only 5 students in class when you're meant to have 25 means that in-class activities - which we're encouraged to do in order to "engage" students - become impossible or much less effective.
I get what you're saying, but I think it's also important to recognize that classrooms don't function identically regardless of how many students are in them.
The engagement problem is definitely real. I’m surprised there’s not a fairly reliable core of students who are there most weeks.
But re the motivated vs non-motivated students: It seems to me that if a student isn’t motivated to come to class or learn the material on their own, then their performance on exams/papers will reflect that, so grading based on attendance still wouldn’t be necessary. I guess AI does complicate this tho. Maybe the solution is in-person exams with a software like examplify that prevents internet access?
If you cannot show up you cannot lawyer, no matter how well you know the material. Showing up is part of the training.
I *can* show up, I just sometimes *choose* not to (when I have the option) because it's not the best use of my time. Efficiency is also part of being a lawyer!
If you teach discussion classes, attendance is essential.
Yeah, I agree with this. I’m fine with attendance counting for these classes because it’s such a huge part of what the class is. (But I would distinguish true discussion classes from, eg, law school doctrinal classes that only involve some discussion)
At my college, NC State, professors were required to take attendance for 100 and 200 level classes to ensure underclassmen actually went to class. Typically the rule was something like if you miss more than X number of classes your final grade will be bumped down one letter grade.
Yes, I can set my own attendance policy. And I do have a rigorous attendance policy because I teach film classes. And if you can't attend a film class, you have bigger problems than my class.
Yeah, you can definitely do this and most of the classes I've taken have had a mechanism like that. STEM classes are pretty much the only classes I've seen where you can only come into class to take a test and still get an A. The most generous social science or humanities class I've taken would give you a B, and that would be assuming you already had encyclopedic knowledge of the coursework. More likely, you'd be somewhere in the C- range due to missing out on discussions.
I can’t say that I always did my homework or the reading, but I always went to class. That’s what I’m paying for!
And to think that 20 years after the end of my formal education, I still occasionally wake up in a cold sweat because I've had a nightmare about having missed class.
Right? I've literally had nightmares about missing a whole semester's worth of class; for some of our "students" that's just what they do.
If you're doing it unilaterally it's at the very least a collective-action problem in just the same way that it is at the institutional level—if your course counts the same as your colleague's course but you grade more harshly, why would anyone take your course? And of course there is always administrative pressure not to fail students.
Earlier this year, I was an adjunct co-teaching a required grad school course. There was one alternative to this course and it was known to be less work/an easier good grade. That other course apparently always fills up first.
Our course, which the “real” professor has taught for years, was really good (thanks to him) and extremely relevant to both current events and the foundation of the profession. Yet few of the students seemed to want to be there, and their work (which I graded) reflected that.
If someone doesn't want to be in a *grad school* course why are they in grad school to begin with?
The mystery of our age!
The credential.
Why is someone getting credentialed in a field that doesn't interest them?
Because a lot of people aren't interested in anything, but still want a well-paid job that involves working in an air conditioned office.
I'm a business school professor. I can assign grades more or less freely*. The curve you outline is very close to what I am for in a 200 level undergraduate class (though perhaps with closer to 10% D/F and more As and Cs).
But my impression is that compared to the humanities, our grading is quite a bit more objective. Like... I give tests where there is a problem with a "right" answer, so if the student gets it wrong, even if I give partial credit, they don't really have any grounds on which to complain. They still complain, of course, but I can shut it down quite easily by saying basically: you're wrong, sorry. In many humanities assignments, grading is more subjective. This isn't a bad thing, but it is much more of a judgement by the professor, which then leads to endless grade grubbing and, worse, bad-faith accusations of bias. It is much, much easier to give most students an A than to have to endlessly defend yourself.
As far as I can tell, the assumption that students expect to negotiate any grade they don't like is a relatively recent change in norms. I went to a liberal arts college as an undergrad and I cannot *imagine* telling one of my humanities professors that actually, my poorly argued essay deserved an A because I put in so much effort, and they must not care about students because anything less than an A is going to ruin my life. It would have just been unfathomable and I assume I would have been summarily dismissed from their office. I don't know. Perhaps other students were doing this and it was just me being naive. But these days every C student has a form letter ready to go (I hope this email finds you well!) that is on auto send whenever the gradebook is populated with something less than an A.
*With the caveat that faculty who were giving more than 50% As got an email from the chair last semester saying basically: stop doing that.
I'm sure I'm too late to the game for anyone to read this, but there are clear reasons why it is difficult to adopt a policy that would lead to only 20% As. For reference, I work at an R1 school, in a highly ranked STEM department. The short answer is collective action - at all levels of the administrative chain.
One issue is that professors compete with each other for enrollment. If I taught an elective course with this grading policy, nobody would take it, so that is a complete non-starter.
Even when I teach mandatory courses, students almost always have a choice of instructor. If I adopted this policy, and the instructor teaching a parallel section of the same course did not, then no students would sign up for my section until the other was full. And the students forced to sign up for mine would be *very* disgruntled, and would leave terrible reviews.
It is true that once you get tenure, you can survive getting poor student evaluations, but it for sure hurts you.
I do try to nudge things in the direction John indicates, and work to get the university to disincentivize instructors from assigning high grades to get strong student evaluations. But the action space is much more limited than you may think, even for people who are very secure in their positions. You would need to convince the very top leadership (president and provost) that this is where they want to go, and then get them to forcefully enact policies to get there. That would be a very hard sell, for obvious reasons.
About a decade ago, I knew someone who taught at a small private college (which happened to be our alma mater). Most of the students failed her first test. She was told in no uncertain terms that wasn't acceptable and she had to make most of the class pass. There was no suggestion the test was too hard; they just wanted everyone to pass so they kept paying tuition. Aside from maybe a few Ivy League schools, I think it's that way most places.
"I have tenure, this is my grading rubric, those students didn't pass, go F yourself"
I'm actually asking a serious question: Why doesn't tenure provide the space for professors to act as they think best? Or does it only protect things like political speeches and controversial theories?
Tenure does not protect you from administrative abuse, specious and unfounded investigations, and retaliatory firing. It just means they have to waste time on generating a pretext for firing you or coerce you into quitting by substantially degrading your work and work environment.
It also doesn't protect you from entire departments getting shut down due to low enrollment and/or a bad financial situation.
I could be wrong, but part of what might be happening is that most actual grading is done largely by graduate students and/or non-tenure-track faculty, who have lots of incentives to produce satisfied customers. My first semester as a graduate teaching assistant at Harvard, I graded papers the way my papers were graded as an undergrad at MIT. I got eviscerated in course evals. Now, I don't give Bs anymore, and my course evals are glowing.
Even if a faculty member is willing to stick up for you and enforce the lower grades, they can't stop students from shredding you, and students are smart enough to make it sound like their complaints are more sophisticated than "my TF gave me bad grades."
To chime in - tenure is just not very common these days. And tenure doesn't matter if your institution has to shut down due to lack of enrollment.
This person was an adjunct.
It does to some extent. But see my comment above.. harsh grading is making more work for yourself.
You may not be able to be fired from your position, but you can be fired from teaching a course.
I can definitely assign grades however I like (I’m psychology, not humanities though). I usually aim for a B- average but I’m definitely one of the “harsher” graders in my department. I’m willing to deal with students’ griping about grades and how much work my classes are and I know my chair/admin will have my back if any student or parent complains.
The main barrier I see in my younger colleagues is that giving poor grades feels (to them) like they are being “mean”. And today’s students definitely interpret grades as a judgement on their inherent self-worth. For example, I’ve had many students in my office say things like “I’m not sure why you don’t like me.” Or “you must think I’m terrible.” Just because they earned a C on a paper! It’s a problem.
There are three separate, if related, issues here: admin policies, grade inflation and rigor. I'll address all three:
1. Admin: This varies a lot, from my experience. I was blessed to teach at institutions where admin gave faculty 100% backup. I could assign whatever grade I wanted to any student and had the final say period. In cases of cheating, too, I found the relevant bureaucrats investigating complaints to have my back.
2. Grade inflation- The problem isn't bureaucrats, and even not so much evals (though those undoubtedly play a part). It's that I care about my students. It would not be fair for me to have a class with a median of 70% in an environment where the median is an A-. It would be punishing the students who chose my class by artificially hurting their GPA, potentially harming their employment prospects.
In other words, grade inflation is a collective action problem that must have a top-down solution.
3. Rigor - to an extent you can demand high standards even with grade inflation. This can be achieved by tricking the students a little: grading early exams toughly to weed out those unable or unwilling to do the work, and to scare the rest into doing it properly. Then, at the end, using some technique to get the median up. This is my usual M.O. I believe I end up having tougher courses than most in terms of the work required and the academic rigor, but in terms of grades I usually manage to get the median close to the norm (though sometimes students really don't do the work and the grades do end up much lower - I'm fine with that). That is the best balance I believe I can strike within my own individual power. Real change will require university-level reform.
P.S.
I've been lucky in the institutions I taught in, which are also among the most elite in the country. Most aren't so lucky and admins in many cases do NOT have faculty's back which of course makes everything orders of magnitude worse.
P.P.S
A separate issue is that one of the most rigorous and beneficial exercises you could give is a research paper. I stopped giving those to undergrads except in very small seminars because AI made it impossible. Any class where I cannot orally examine each and every student at length is a class where all grades are based on pen-and-paper closed-book class-proctored exams (+ participation in discussion, where applicable and feasible). This absolutely harms the kind of education students get compared to what they would have gotten in 2019, although less so than had I pretended it's still 2019 or even 2022.
If your program loses enough students, faculty lines get cut, and tenure can't protect you from that. Some schools are cutting entire departments, philosophy often being one (perhaps because the kind of rigor philosophy expects has led to low enrollment).
I’m sure I would be allowed to. But if I wanted to assign enough writing to be able to tell apart the top 20% from the rest reliably (and give the rest most practice improving their writing) then I would have to spend 2/3 of my time reading student essays, and there’s no way you’re convincing me to do that.
Grading is definitely the worst part of teaching
Grading papers is a form of soul murder.
I mean, it shouldn’t be curved like that. If everyone does excellent work everyone should get excellent grades. Doesn’t usually happen, but in my experience (which is just as a TA) we kind of assigned the grades where the distribution separated and we did give Ds and Fs.
In my institution, you could do it, and some professors definitely grade harder (though not anywhere near that hard). If you kept your enrollment numbers up, there would be no issue (as long as you announced the policy at the start of class, that is), or if you're a superstar researcher. If you gave Fs, that would be a lot of work, as those would all be bureaucratically monitored and you'd have to justify every single one, but Ds are fine.
Yes, we can increase the rigor of our individual courses. At least until our student numbers fall.
This wouldn't work. It's like squeezing a balloon; the students would know and take classes with other professors.
One of the issues is that one of the most common high-paying career routes for humanities majors is law school and law schools do care more about your raw GPA than how hard your classes were because they are ranked based on incoming students’ average GPA. That creates a massive incentive for grade inflation and disincentive for students to take classes that grade harder. Ideally you would be able to make the classes harder but everyone still gets a good grade in the end.
This is becoming less true. Advisory Opinions just had a podcast with the Deans of Admissions for Harvard and Yale Law schools. They discuss this issue, as well as the expanding use of accommodations. It is worth a listen.
https://thedispatch.com/podcast/advisoryopinions/getting-into-law-school-interview-miriam-ingber-and-kristi-jobson/
This isn't becoming less true at all based on 2 schools where the median GPA of incoming students is already a 3.95.
yep, I was going to mention that.
Yes, this seems like a massively underrated factor. Law school admissions is mostly an IQ test (yay! Merit!) plus a mind-numbingly dumb unweighted GPA component
The fix for this is that law school admissions should focus more on the LSAT and less on GPA.
The LSAT has recently dropped the logic-puzzle section, did you know that? I think that's bad.
What? That's really bad. Why?
Because people complained it was too hard.
It had to do with an ADA complaint from people who are blind, I think.
Did it really? That's ridiculous. There would be ways of providing similar challenges to blind people. You don't have to drop the whole section!
Oh man, they were the only questions I got wrong on my LSAT, and I hated them.
This is basically my point above, I think. People who seriously studied for the LSAT probably found the logic games the easiest to consistently nail. For people who are taking the test without much studying, LG is pretty easy to brick, which some interpret as "hard."
Law schools are already maximally focused on LSAT scores.
It's even worse now. I graduated summa from my university a decade-plus ago and was above the 75th percentile for GPA basically everywhere, and now my GPA is below the 25th percentile at a lot of places. I honestly don't understand how some schools have top quartiles with GPAs > 4.0.
I usually don't begin sentences with "As someone with an English PhD who's taught college" because it's insufferable to do so. But since that insufferable clause is true, I'll say I endorse every sentence of this post.
The problem with the admirable solution Matt suggests is one I discovered in my first semester as a grad-student teacher. I had gone to a college where my best English professors had very high standards for classroom discussion, assigned tons of reading, and didn't hesitate to assign low grades to low-quality work. I tried to apply the same standards once I started grad school and was the one doing the teaching and grading. The result was that I received terrible scores on my teaching evaluations and got hauled into the office of the grad-student teaching coordinator and lectured about my allegedly terrible teaching. I struggled to improve my teaching (item: it was fine) for another semester until I encountered a fellow grad student who'd been similarly lectured and who told me she just grade-inflated her way to higher teaching-eval scores.
My mentor at my next job told me that "Grade-inflate till tenure" is the standard advice to junior professors.
These jobs were both LESS pressured than most in that they weren't adjuncting gigs in which teachers are hired on a per-semester basis, which is the norm in the humanities at most institutions of higher ed.
These problems will only be solved if humanities departments at universities with graduate schools stop over-producing PhDs relative to the demand for full-time professors. Which is never going to happen, both because it would dent the egos of R1 professors not to have graduate students and because the politics of humanities professors render them far more likely to complain about "neoliberalism" and right-wing politicians than to read a chart or realize that supply and demand are real phenomena that exist and shape even the humanities job market.
That's not insufferable since it's relevant to your comment; what's insufferable is when people stick their academic credentials next to their username.
Tl:dr: I endorse Matt's diagnosis and prescription, but I believe the humanities is going to keep driving in its current direction until it goes right over the demographic cliff that's just a few years down the road.
As a Classics major, I spit out my coffee upon reading that schools are not requiring Greek and Latin to graduate with a Classics degree.
I have read that Oxford's classics degree no longer requires Homer, because he's "too hard."
Koine Greek only, in the NIV translation please
I think there are abridged versions? We read it in middle school 20 years ago but I don't think it was the full one
The fundamental topic missing in this essay is Gen Ed. Humanities departments support their faculty (and PhD program) size by teaching general education courses, and making those courses, which are the main exposure most students have to the humanities, hard would simply send those students to other departments and collapse the size of the faculty and graduate student population.
The other thing that this misses is that for most students, STEM vs humanities is not the choice. Both of those are hard and for a minority of students. Instead most people are majoring in business, criminal justice, communications, education, nursing, etc.
Also, for a big gen ed class designed to get butts in seats, you don’t have enough teaching labor to grade the amount of writing it would take to make the class more rigorous. (I suppose you could make it artificially rigorous by asking students to memorize facts from readings, but that isn’t what anyone wants students to get out of the class.)
Yes, the entire premise of the piece is a bit off. The primary purpose of the humanities is not to develop people with a high degree of expertise in the humanities. It is to deliver elements of a liberal education to all students. In my day, the science departments offered "10-series" intro classes to non-majors (the textbook for Physics 10 was "Physics Without Math"). I suppose humanities departments could do the same thing, but I don't know how practical that would be.
Where I teach the humanities and social sciences are not hard.
Obviously I don't know where you teach, but at IU majoring in English or History is not as hard as in physics or CS but much harder than apparel marketing or kinesiology.
Another important reason is the structure of the curriculum. Most STEM disciplines have a strict prerequisite structure which means later courses can assume a lot. Most humanities disciplines don't organize things this way, for a variety of reasons, but that means there's a broader range of students in many classes.
First off, I agree with all points in the article. But as a science professor at a school with PhD programs, I think Matt is missing one big change that has happened since he (and I) went to college. That is the rise of undergraduate business degrees. Where once students would choose between history or economics, they are now increasingly opting into management or logistics degrees. They are not moving more into STEM (trust me.) The customers (the parents!) are laser focused on ROI for degrees at all but the most selective universities, and business colleges fit the bill nicely.
Students get business degrees because Econ programs require them to take calculus.
And because English programs make them write essays.
Left unsaid in this piece is the impact of AI on the humanities, which I have to figure is going to be significantly negative from the standpoint of learning marketable skills. That is, if in the humanities you learn how to read difficult texts, to write intelligently and persuasively, how is that differentiated from what the next GPT is going to be able to do, and is it therefore already devalued? However, if institutions took the obvious anti-AI-enabled-cheating step of implementing oral exams for the humanities, then I could see the marketable value from someone who could discuss difficult subjects on their feet.
I mean, that's more or less true of every major, though. Pretty much all STEM majors are teaching about what is valuable now versus what will be valuable later. Right now, it seems like a substantial proportion of what CS majors take is just straight-up useless, and there's not much keeping the same from being true in other fields once the technology improves.
I guess what you say is true of computer science, but this is not the case for fields where hands-on work is required, such as bioengineering or chemistry. Until we have humanoid robots to do AI’s bidding, people will be necessary to actually build and test devices, materials, chemicals, organisms and so forth.
Labs were always really strict on attendance
My first college English professor was Dr. Ray. With an infamous nickname of “Death Ray”. He graded extremely hard (at first), but was always extremely clear in how and where we needed to improve.
Most important class I ever took - because it made me stretch my abilities. Learning I could even do that was foundational.
I had a similar experience (in the 20th century) with a freshman year course called Methods of Thinking. It was fundamentally about structuring an argument; I can’t recall the whole syllabus but we started with a lot of Plato’s (Socratic) dialogues. It was a lot of work but it was by far the best class I’ve ever taken (and I’ve taken *a lot* of classes). I use what I learned in that class pretty much every day.
“In the 20th century” you’re killing me.
Being an Old in today's world kills me every day! 😉
As a fellow Old I like to write it as “in the late nineteen hundreds…”
I had an amazing English professor who gave me a D for my first paper and I was ecstatic.