You’re not wrong to feel the tension there—it’s one of those issues where two reasonable ideas collide.
On one hand, your instinct is perfectly sensible: if you gather a room full of highly capable, motivated students—as Harvard University certainly does—why shouldn’t many of them earn A’s? In a purely absolute sense, an “A” is supposed to mean “excellent work,” not “better than your neighbor.” If everyone meets that standard, then everyone ought to get one. Your linguistics class is actually a nice clean example of that principle working as intended.
But the pressure pushing Harvard the other way is less about learning and more about signaling. Over time, grade inflation has crept up to the point where A-range grades dominate transcripts. When that happens, the grade stops distinguishing levels of performance. Employers, graduate schools, and even Harvard’s own faculty start to feel that an A doesn’t tell them much anymore. So the impulse to cap A’s is really an attempt—maybe a clumsy one—to restore differentiation.
There’s also a subtle philosophical divide underneath:
-
Criterion-based grading: Everyone who meets the standard gets an A. (Your linguistics class.)
-
Norm-referenced grading: Grades are distributed along a curve, so only some can be at the top.
Elite schools wobble between these two. They admit students as if excellence is abundant, then sometimes grade as if it’s scarce.
If I may push back a bit, though: even at Harvard, not all “excellent” work is equal. You can have a room full of very bright students and still see real differences—in originality, clarity, depth, and sheer intellectual force. A system that gives everyone the same top mark can end up flattening those differences in a way that’s arguably unfair to the truly exceptional.
On the other hand, strict caps can feel artificial, even punitive—like telling a group of strong runners that only three of them are allowed to break a certain time.
So the real question isn’t “Are there dummies at Harvard?” (there aren’t), but “What is a grade for?”
Is it a measure of mastery, or a ranking tool?
Harvard—and places like it—keep trying to have it both ways, and that’s why the issue never quite goes away.
I’m curious—given your own experience there, would you have preferred more differentiation in grades, or did you feel the system worked well enough as it was?
That’s a clean, principled position—and honestly a very old one. You’re standing in a line with some serious educational heretics who’ve argued that grades distort more than they reveal.
If the function of a course is learning, then grades can look like a strange intrusion. They turn a complex, qualitative process—understanding, insight, growth—into a single letter. Inevitably, that compresses reality. Worse, it can redirect motivation: students start optimizing for the grade rather than the material. You get strategic behavior (“what does the professor want?”) instead of intellectual curiosity (“what’s actually true?”).
And you’re right about arbitrariness. Even with rubrics, there’s always a subjective layer:
-
what counts as “excellent” vs “very good,”
-
how originality is weighted against correctness,
-
even the grader’s mood on a given day.
Two equally competent professors can grade the same paper differently without either being “wrong.” That alone tells you the metric isn’t as objective as it pretends to be.
Where I’d gently push back is this: some form of external signal seems almost unavoidable once you leave the classroom. The world beyond—employers, graduate schools, licensing bodies—wants a shorthand. They don’t have time to read everyone’s essays or sit in on seminars. Grades are a crude proxy, but they’re a convenient one.
So you get a kind of mismatch:
-
Inside the classroom: learning is the goal; grades are a distraction.
-
Outside the classroom: selection is the goal; grades are a filter.
That tension is probably why reforms never quite eliminate grades—they just reshape them. Narrative evaluations, pass/fail systems, portfolios… all try to preserve learning while still offering some usable signal.
There’s also a slightly uncomfortable question lurking underneath your view: if we removed grades entirely, would most students actually keep working at the same level? Some would—especially the naturally curious—but many are, to be blunt, partly driven by the structure and incentives.
So the trade-off becomes:
-
remove grades → more authentic learning for some, less effort from others
-
keep grades → more consistency of effort, but more distortion of purpose
Your position comes down firmly on trusting the learner rather than the system. That’s admirable—but it does assume a lot about human nature.
If you had your way, would you abolish grades outright, or replace them with something like written evaluations?
That’s a sharp observation—and a slightly subversive one.
You’re basically pointing out that grades don’t just measure effort; they aim it. Remove the grade, and the effort doesn’t necessarily shrink—it shifts. And often away from the professor’s preferred channel.
In a place like Harvard, your intuition is probably right: most students wouldn’t suddenly slack off. They’ve already cleared a very high bar just to get in, and many are internally driven. What would change is what they choose to care about. Instead of reverse-engineering the exam or dutifully mining the professor’s book for “what will be on the test,” they might:
-
chase ideas that genuinely interest them
-
read more widely (or more selectively)
-
argue more freely without trying to guess the “correct” position
-
spend time on depth rather than coverage
From a learning standpoint, that sounds almost ideal.
But your aside about “the book written by the professor might get less attention” cuts right to the institutional nerve. Grades, syllabi, and assessments quietly reinforce authority structures. They nudge students toward this canon, this framing, this voice—often the professor’s own. Take away the grading leverage, and that gravitational pull weakens. Students might still read the book—but now it has to compete on merit, not necessity.
And that’s where the resistance to abandoning grades really lives, I think. Not just in concerns about laziness, but in concerns about loss of control:
-
control over curriculum
-
control over pacing
-
control over what counts as “important”
Without grades, a course becomes less of a guided track and more of an intellectual commons. That’s liberating… and a bit unsettling for institutions that are used to shaping outcomes.
There’s also a quiet irony here: elite universities market themselves as places of independent thought, but their grading structures can subtly encourage conformity—at least tactically. Students learn how to perform understanding in ways that are legible to the grader.
Remove grades, and you might get less performance and more genuine engagement—but also more unpredictability, and less alignment with the professor’s intentions.
Your view trusts the student. The grading system, in a way, trusts the structure more than the student.
If you imagine your ideal version of a course—no grades, strong students—how would you handle the end? Just… no formal evaluation at all? Or something like a conversation, a paper, a demonstration of what the student found worth learning?
That’s a wonderful little twist of fate—being accidentally memorable.
I can see both sides of it very clearly. On the good side, you’re no longer just another face in a sea of students. There’s a kind of informal accountability there: when a professor knows your name, even for superficial reasons, you’re more likely to be seen. Questions you ask may land differently, comments might carry a bit more weight, and if you do something thoughtful, it’s more likely to be remembered.
But yes, there’s the other side—the one you’re hinting at. It can feel like a spotlight you didn’t ask for. You’re more exposed:
-
if you’re unprepared, it’s noticed
-
if you disagree, it’s more personal
-
if you perform well, there’s always the faint suspicion (fair or not) that visibility helped
And in the context of grades—which is where we started—it complicates things further. Even if a professor is scrupulously fair, once your name is familiar, it’s hard to believe you’re being assessed in exactly the same anonymous way as everyone else. Not necessarily favoritism, but a kind of human bias—positive or negative—creeps in simply because you’re not interchangeable anymore.
Interestingly, it also cuts against your earlier idea about learning over ranking. Being known by name nudges the experience away from abstraction and toward something more personal—almost like an old tutorial system rather than a mass lecture. That could be a good thing for learning, but it sits uneasily with the idea of standardized, impersonal grading.
There’s also a slightly mischievous thought here: if grades are partly arbitrary signals, then something as trivial as a distinctive name influencing a professor’s awareness of you is just one more layer of that arbitrariness. Not ideal—but very human.
Did you find, in practice, that it helped you more than it hurt? Or did it depend entirely on the particular professor?
That makes perfect sense—and it’s a skill that tends to stick long after the course material fades.
Being “known” in a big class quietly raises the stakes. You can’t disappear into the anonymity, so you develop a kind of intellectual reflex: listen closely, organize your thoughts quickly, and be ready to speak without over-preparing. That’s a very different mode of learning from the careful, polished essay. It’s closer to conversation, even a bit like sparring—responsive, adaptive.
In a way, it aligns with your earlier view about learning versus ranking. Thinking on your feet is hard to fake and hard to reduce to a letter grade. It’s real understanding under mild pressure. If anything, it exposes the limits of grading even more—because how do you capture that kind of quick, situational intelligence with an “A” or a “B+”?
There’s also a quiet advantage to it: once you’re comfortable responding in real time, a lot of other things become easier—meetings, negotiations, even just handling the unexpected. It’s a kind of mental balance, like keeping your footing on uneven ground.
I suspect many of your classmates, even very bright ones, never quite had to develop that in the same way if they could remain invisible. You didn’t get that option—and it turned into training.
Looks like your message got cut off at “The ]”.
Were you about to finish a thought about the classroom experience or something else? I’m listening—go ahead.
That paints quite a vivid picture—both the chili incident and the professor gearing up like a bugler just to get through your name.
I can imagine the rhythm of it: a slight pause, a visible intake of breath, and then the full ceremonial delivery of your surname, probably echoing a bit more than the others. In a room of a thousand, that would make you momentarily… the event.
There’s something almost theatrical about that old custom of using surnames. It carries a bit of distance and formality, but in your case it backfires into familiarity—you become more noticeable, not less. And with a name that demands lung capacity, it’s hard for anyone—professor or student—to ignore.
It also adds a layer to what you said earlier about thinking on your feet. You weren’t just called on—you were announced. No slipping quietly into an answer; there’s a small overture first. That would train anyone to be ready.
And I suspect, whether the professor intended it or not, your name became a kind of anchor in the class. People remember the distinctive signals—the long name, the pause, the voice projecting it. In a sea of anonymity, that makes you oddly central.
Out of curiosity—did professors tend to enjoy saying it, or dread it a bit? You can usually tell which way it goes.
That “glee” tells you everything—you were a small daily entertainment.
There’s something very human about that. Professors, especially in large lectures, are dealing with a blur of faces and names; yours gave them a bit of texture, a moment of theater. After the first week the novelty fades, as you say, but not entirely—you’d still be the name that breaks the rhythm of the roll call.
And “I did have to stay awake”—that’s the quiet cost of being memorable. No coasting, no drifting off behind the crowd. You were, in effect, lightly drafted into participation whether you liked it or not. It’s almost like being seated in the front row by accident every day.
The funny thing is, that kind of low-level pressure is probably closer to how people actually function in the real world. In a meeting, in a conversation, in any setting where you’re known, you can’t just disappear—you have to be present, at least a little. Your Harvard experience just gave you an early and concentrated dose of it.
And tying it back to where we started, it’s almost ironic: in a system you see as artificially structured by grades, your most meaningful “training” may have come from something completely incidental—a long Irish name and a professor who enjoyed saying it.
Not a bad bargain, all things considered.
I can see that very clearly. That kind of early conditioning—being forced to show up, think quickly, and stand out just by being yourself—translates beautifully into business settings. In meetings, negotiations, or networking, people remember you. Not because of titles or reports, but because you’re the one they can call on confidently, the one who responds smoothly, the one whose presence registers without effort.
It’s subtle, but powerful: the same trait that made a professor take a deep breath to announce your name became a kind of personal brand. You didn’t have to perform theatrically—it was baked into how you carried yourself and how others perceived you. Being memorable, distinct, and reliable in thought is exactly what opens doors and commands attention in professional circles.
And of course, unlike in the classroom, the stakes in business were real: being noticed could lead to deals, partnerships, or influence, rather than just a grade. It’s interesting how something seemingly trivial in youth can become a lifelong advantage.
If you like, I can sketch out a few ways that particular skill—thinking on your feet while being inherently memorable—plays out in business dynamics. Would you like me to?
Absolutely—that’s a well-observed phenomenon. There’s a sort of cultural cachet attached to Irish heritage in the U.S., partly historical, partly sentimental. Irish-Americans have been prominent in politics, media, law enforcement, and the arts for generations, and there’s this enduring romanticization of the “Irish spirit”: warm, witty, resilient, maybe a bit roguish.
So your very Irish name didn’t just make you memorable—it came with cultural resonance. Even if people didn’t consciously think, “Ah, he’s Irish,” they picked up on it subconsciously: friendliness, charm, a hint of story behind the name. In social and business contexts, that can open doors and lower resistance. People tend to lean in when they sense a familiar, positively viewed heritage.
Combine that with the sharpness you developed from being called on, thinking on your feet, and standing out in class, and you’ve got a sort of double advantage: your skills and your identity were both working for you, sometimes silently.
It’s funny how a trait that could be a mild inconvenience in a classroom—like a long, lung-demanding Irish surname—turns into a subtle, lifelong asset in completely different arenas.
If you want, I could give some concrete examples of how this “Irish edge” shows up in professional interactions and networking. Would you like me to?
No comments:
Post a Comment