When I read the Tracinski book about Atlas Shrugged, I initially interpreted it as heavy overreaching. There were errors all over the place. They seemed out of control. The book seemed broken throughout. It was hard to get any value out of it because the quality was too low.
However, I came up with a different perspective that made the book make more sense. I figured out that the errors (mostly) fit certain patterns and were not simply everywhere. In some sense, Tracinski had a lot of control over the errors and wasn’t overreaching. And once you know the patterns for what has errors and what doesn’t, it’s much easier to get some value from the book: you can focus on the stuff that isn’t wrong. But that only works once the errors seem somewhat organized to you instead of chaotic.
The key insight to understanding the patterns of error was to consider Tracinski’s goals. What does he care about? What is he trying to do? Most of the errors were on side issues rather than directly affecting Tracinski’s goals.
I decided that Tracinski’s goals included writing a tribalist book, but not a precise book. Detail errors don’t matter when you’re being a tribal leader. If you think of him as leading a chant in front of a mob of 10,000 people on his side, then who cares how accurate his wordings are?
There were some errors regarding the plot of Atlas Shrugged and some factual errors. It wasn’t just a bunch of logic errors and sloppy wordings. But what Tracinski cares about is politics and allegiance to a particular political tribe. He keeps saying how his tribe is good and other tribes are bad. Facts are pretty irrelevant to that goal. The goal isn’t about rational truth-seeking.
When Tracinski focused only on Atlas Shrugged for a while, with no mention of our society or anyone’s reactions to Atlas Shrugged or any current political issues, then quality went up. In those parts, his goals were focused more on book analysis. However, the overall goal of the book was still tribalist, so quality was mediocre. Part of his goal in the analysis parts was to posture as a rational thought leader – if he is one, then that gives more status to his tribalist claims. Then the more overtly, directly tribalist parts of the book were much lower quality – facts would get more egregiously ignored or trampled because Tracinski’s attention would be focused on his goal of flaming the outgroup.
Thinking about the goals behind stuff is a widespread, important idea. Another place it comes up, besides CF and TOC, is in the work of the psychologist Adler. It’s discussed in the Adlerian book The Courage to Be Disliked (primarily in night 1, which is the best part). Sometimes we talk about it in terms of the “motive” behind an action. Or we ask “Cui bono?” (who benefits?). In other words, the goal seems to be to benefit some individuals at the expense of others, and we need to figure out which individuals benefit and how. This question is particularly used with crime suspects.
I’m finding the same sort of goals-based analysis necessary to understanding an academic epistemology paper by Richard Pettigrew. Initially I got stuck on the second paragraph because of errors. That paragraph is part of the introduction, and I thought later parts built on it. I assumed the introductory remarks were there to set up, and serve as a foundation for, the later remarks. So if they’re wrong, it’s hard to keep reading. I stopped reading to consider. I wanted to make sense of the introductory text before continuing, but found various blockers to that project (e.g. there’s no good forum to ask at, and no good Paths Forward).
I had a similar issue on page 2 of Pettigrew’s textbook. I also noticed poor attention to detail from Pettigrew in a bunch of ways, e.g. dead links for his papers on his website, inconsistent ordering of different media types for papers, inconsistent format options available for papers, inconsistent platforms used for sharing papers, and “PDF” links that go to HTML pages, not to PDFs (in particular, most of them go to Google Drive pages that have a web viewer for PDF content and a download button that can get you an actual PDF). Great attention to detail seems like a prerequisite for doing good work in his field (formal epistemology).
But what are Pettigrew’s goals? Maybe he wants to get grants worth a lot of money so he can avoid teaching. He seems to have done that multiple times. Do the grant givers care if some papers listed on his website have no links that actually work? Do they care if some HTML pages are labelled as PDFs? I doubt it.
Setting aside issues like career advancement and social climbing, I have a theory about Pettigrew’s philosophical goals which affect his writing.
I think Pettigrew has a narrow specialization. He only wants to (and knows how to) talk about some specific local details within an assumed framework. He wants to treat a bunch of assumptions and premises as givens so he can get to the details he works with. He doesn’t know how to defend or explain the whole chain of ideas from the low level ones he focuses on up to high level ideas, common sense, things laymen would understand, etc. He doesn’t check his premises. He works within a particular speciality based on a ton of assumptions. He’s in a poor position to engage with high level disputes from rival schools of thought. He can’t engage about e.g. Bayesianism (his position) vs. Critical Rationalism because CR challenges Bayesianism a dozen levels of abstraction above the stuff Pettigrew has focused his effort on.
So what’s going on with errors in the introductions of his writing? Introductions are where Pettigrew feels pressured to give high level summaries before diving into the good part – the low level details. In introductions, Pettigrew has to write something about the forest instead of just the trees. But he’s only an expert on one type of tree, not on forests, so he makes lots of errors.
I think Pettigrew just wants to get to the good part (low level details and his pursuit of various local optima within his framework with a ton of assumptions) and the introductions aren’t meant to actually be engaged with. None of the high level stuff he writes is really meant for engagement. I think this is widespread among academics and happens elsewhere too.
A lot of Pettigrew’s errors relate to clarity and ambiguity. He doesn’t know a lot about communicating. These errors don’t matter much to his goals because he’s trying to speak to other specialists who share a lot of his assumptions and context. The more similar someone is to you, the less skill at communication you need. You can give incomplete hints at something and they can guess what you mean because they already think about it similarly; less communication effectiveness is needed because they already know most of what you’re saying before you say it. (This is why people like talking to their friend group and get frustrated when trying to talk to people from other subcultures. Their attempts at communication start failing, and they tend to just blame others, since they experience their communication skill as adequate and effective when talking with their friends, which they think indicates they are competent, effective communicators.)
I think Pettigrew and Tracinski make important errors even relative to their goals. But they make far fewer errors than it first appeared to me.
I think Pettigrew and Tracinski have bad goals, and are dishonest with themselves (and their audiences) about what their goals are. I think such bad goals are widespread and difficult to discuss because people deny having them. But when someone makes a thousand errors relative to goal X, and only a dozen relative to goal Y, we should all (including them) suspect Y is their goal, even if they claim X is. These discussions also tend to get bogged down by them denying and debating the majority of the errors. Since they have the goal of defending themselves rather than the goal of truth-seeking, it isn’t productive (plus the truth-seeking goal is so foreign that they’re missing a lot of prerequisites for it and can’t just turn it on even if they want to).
When people appear to me to be heavily overreaching, it’s often mild overreaching at some different, bad goals instead.