During COVID I went down something of a rabbit hole on the subject of wrongness, and wrote some essays into a OneDrive folder that I had established on a new PC. Then they disappeared for a couple of years (probably because of operator error) and I couldn’t find them. Fortuitously, they warped back into existence right around the time of the recent election, and I’ve been kicking through them.
In the meantime, of course, I happened to establish this Substack. So I will post them here when they’re decent. This will be the first of those. I don’t know when the others will be ready. Currently I’m heads down writing the second book in the Bomb Light series, and so it’s going to be sporadic.
If you’re wondering how wrongness can even be a topic, here are a couple of examples of widespread, systemic wrongness:
There are a lot of different religions. They can’t all be right. If you’re non-religious, you believe that all religious people are wrong. If you believe in a specific religion, you believe that all of the non-religious people, as well as all of the people who believe in a different religion from yours, are wrong.
We didn’t even know about germs until less than 200 years ago. Before then, everyone who had an opinion about infectious disease was wrong.
Or, as Charles Sanders Peirce—the subject of this post—put it: “Every work of science great enough to be well remembered for a few generations affords some exemplification of the defective state of the art of reasoning of the time when it was written.”
Many more examples of wrongness could be cited, but the point is that nearly everyone is wrong nearly all of the time, including people we love, trust, and respect.
A lot of discourse on social media is devoted to staking out positions that are right and flaming people who are wrong.
After watching that for a while it occurred to me that there’s a split even more fundamental than the one that gets all the attention, namely, left vs. right.
It lies between people who implicitly assume that it’s even possible to arrive at abstract positions that are sturdy enough to act upon, vs. those who are at peace with the fact that the universe is very complicated; that we don't have all of the information we need to make sense of it; that a lot of the information we do have is questionable; and that our intellectual powers are limited. That we are, in short, wrong a lot. A word for this is Fallibilism.
By default, Fallibilists tend to revert to middle-of-the-road positions in the political sphere.
(In a way, this ties into my previous post Idea Having Is Not Art. In that context, I was talking about idea havers in the creative sphere. But politics is full of idea havers too, and Fallibilists tend to take a dim view of political idea havers.)
Pictured at the top of this post is American philosopher Charles Sanders Peirce, whom I became aware of in mid-2021 when I participated in a Brookings Institution webcast with Anne Applebaum and Jonathan Rauch. The occasion was the launch of Jonathan’s book The Constitution of Knowledge, which talks in some detail about Peirce’s philosophy. As such it’s probably a better source of information than what you’re reading now. But here’s a quick introduction—just in time for Thanksgiving, when many Americans will no doubt find themselves in the company of loved ones who are Wrong.
A lot of Peirce's work is in realms of logic that most people are going to find pretty forbidding, though it did earn him respect from heavy hitters of later generations such as Bertrand Russell, John Dewey, and Karl Popper. Fortunately, everything we need for our purposes is contained in a single essay entitled "The Fixation of Belief," which was published in 1877. Though written in a typically Victorian style, it doesn't make use of specialist terminology and is quite approachable for modern readers.
Peirce’s The Fixation of Belief
Preliminaries
Peirce says, “Doubt is an uneasy and dissatisfied state from which we struggle to free ourselves and pass into the state of belief.”
Once we have done so we don’t want to pass back into the state of doubt, or to change what we believe.
“The irritation of doubt causes a struggle to attain a state of belief. I shall term this struggle inquiry….The sole object of inquiry is the settlement of opinion. We may fancy that this is not enough for us, and that we seek, not merely an opinion, but a true opinion. But put this fancy to the test, and it proves groundless; for as soon as a firm belief is reached we are entirely satisfied, whether the belief be true or false.”
Peirce then defines four distinct methods that are used by different people to decide what to believe.
The Method of Tenacity
“…the instinctive dislike of an undecided state of mind, exaggerated into a vague dread of doubt, makes men cling spasmodically to the views they already take.”
We've all known this person: one who can't be swayed by any evidence or logic to alter their position. It's difficult to make out what's going on in their heads, because it's so alien to how evidence-based people think.
My surmise is that they basically perceive differences of opinion as contests of will--struggles to establish dominance. Anyone who changes their mind is perceived as having been defeated and so undergoes a loss of face. This dynamic is intensified when the debate is happening in public. The more eyes are on you, the greater the humiliation attendant on changing your mind. Social media, of course, heightens this by putting every discussion in the public eye. Even if only a few people are following it in real time, there's no telling when it'll get re-linked and go viral to be seen by countless observers.
The Method of Authority
As Peirce explains, the Method of Authority arises directly from the Method of Tenacity. It's hard to improve on Peirce's own words: "The man who adopts [the Method of Tenacity] will find that other men think differently from him, and it will be apt to occur to him, in some saner moment, that their opinions are quite as good as his own, and this will shake his confidence in his belief. This conception, that another man's thought or sentiment may be equivalent to one's own, is a distinctly new step, and a highly important one. It arises from an impulse too strong in man to be suppressed, without danger of destroying the human species. Unless we make ourselves hermits, we shall necessarily influence each other's opinions; so that the problem becomes how to fix belief, not in the individual merely, but in the community."
And that is how we get to the Method of Authority. As the name implies, this boils down to believing what you're told to believe. For some people that authority might be a single charismatic leader. For others it might be an institution or a holy book. While it's associated with religions, it seems to work equally well with non-religious ideologies.
Peirce, after a fairly dispassionate explanation of this Method, lowers the boom: "For the mass of mankind, then, there is perhaps no better method than [the method of authority]. If it is their highest impulse to be intellectual slaves, then slaves they ought to remain...Cruelties always accompany this system; and when it is consistently carried out, they become atrocities of the most horrible kind in the eyes of any rational man."
The A Priori Method
In the same way as the Method of Authority was a natural outgrowth of the Method of Tenacity, the A Priori Method arises from the inherent shortcomings of the Method of Authority--at least in the minds of thoughtful and worldly people.
Let us say that you are a Method of Authority person. Unless you live in complete isolation from the rest of the world, you can't help but notice that the world abounds in authoritative persons and institutions telling people what to believe. These don't all agree with one another. They can't all be right.
If you're fanatically devoted to one authority, everything's easy: you just declare that all of the other authorities are wrong. This puts you firmly in the Method of Authority camp.
For some, though, it's natural to ask "can't we all just get along?" and try to reason things out. According to Peirce's taxonomy, this puts you in the A Priori camp (I don't quite understand his use of the term "a priori" here, but we have to attach labels to these concepts, and we might as well stick with Peirce's).
The specific examples he cites come from the history of astronomy, in which various ancient and medieval thinkers tried to work out the scientific basis for the movements of the planets across the night sky. If you watch the movements of Jupiter, Saturn, Mars, etc. for long enough and keep records, you can see patterns hinting at some underlying natural order. But if you don't know about gravity and other basic physics, then your mind is free to roam just about anywhere. Plato tried to explain the positions of the planets in terms of the length of plucked strings, drawing an analogy between their movements and musical harmony. Kepler, before he hit on a better answer, thought that the same phenomena could be explained using a geometric argument having to do with the sizes and shapes of various polyhedra. Both Plato and Kepler were wrong. But the way in which they developed their arguments seemed reasonable. They weren't simply appealing to authority. They were following a kind of thought process that, to learned and intelligent people of their day, seemed "agreeable to reason."
The tricky part, as Peirce points out, is that beliefs that seem "agreeable to reason" don't always agree with each other. Plato and Kepler were intellectual giants who thought deeply about these matters. Both came up with reasonable-sounding explanations of the planets' movements. Highly intelligent and well-informed people all around them nodded their heads sagely and said "that totally makes sense." But they were all wrong.
A more up-to-date example has to do with QAnon and other Internet-based conspiracy theories. Followers of these theories actually think that they are being reasonable. "Do the research" is their mantra. But the "research" that they're doing just consists of clicking on links taking them to bad information. This is what makes these conspiracies so viral and tenacious in our politics: their followers appear to be hunting down leads and connecting the dots in a way that superficially looks like a scientific, investigative process. They can thus claim that they're independent thinkers--not simply following the Method of Authority, not taking anyone's word for it. Though they'd never use this terminology, they are claiming to follow the A Priori Method.
Peirce states: “This method is far more intellectual and respectable from the point of view of reason than either of the others which we have noticed. But its failure has been the most manifest…the very essence of [the A Priori Method] is to think as one is inclined to think.”
This is a harsh but inescapable conclusion. It seems harsh because followers of the A Priori Method are making a sincere effort to get out from under the obvious faults of the Method of Tenacity and the Method of Authority. They are evaluating evidence and drawing logical-seeming conclusions. But in the end, you can't get Plato and Kepler to agree. In circles where everyone follows the A Priori Method, you get a lot of people--probably smarter and better educated than most, and proud of being so--agreeing with each other as to what sounds reasonable. But there is no baseline for determining whether their conclusions are valid. It's how we get Internet bubbles, which is to say, groups of like-minded people on social media all vigorously agreeing with each other, certain that they’re right.
Peirce concludes: "A different new method of settling opinions must be adopted, that shall not only produce an impulse to believe, but shall also decide what proposition it is which is to be believed."
The Method of Scientific Investigation
"A method should be found by which our beliefs may be determined by nothing human, but by some external permanency -- by something upon which our thinking has no effect…this is the only one of the four methods which presents any distinction of a right and a wrong way."
Alas, Peirce doesn’t devote much space in The Fixation of Belief to actually describing this last and best method. Instead he sort of kicks the can down the road to other papers in the same series. Those end up delving into the matter in a lot more detail than can be covered here. But the gist of it, I think, is willingness to change one’s opinion in the face of evidence and sound logical arguments. Exactly how this differs from the A Priori Method isn’t crystal clear, since followers of that method are likely to claim that they too are just being scientific and rational. I think that it boils down, in Peirce’s mind, to the degree of rigor that one applies to weighing evidence, combined with a willingness to tolerate “the irritation of doubt” when that is demanded. The overall doctrine of Fallibilism emerges thence.
Anyone who has ever sincerely believed in something, only to look back on it later in the full awareness that they were wrong, is on the path to being a Fallibilist; and when they encounter an idea haver who is sure they’re right, they see a more naive version of themselves.
This is probably introduction enough for anyone who is actually interested in this material. It’s easy to find Peirce’s work online, and he’s an engaging writer. So I’ll leave off with the following quote. Readers can decide for themselves how to apply it to whatever transpires around Thanksgiving dinner tomorrow:
“If liberty of speech is to be untrammeled from the grosser forms of constraint, then uniformity of opinion will be secured by a moral terrorism to which the respectability of society will give its thorough approval. Following the method of authority is the path of peace. Certain non-conformities are permitted; certain others (considered unsafe) are forbidden. These are different in different countries and in different ages; but, wherever you are, let it be known that you seriously hold a tabooed belief, and you may be perfectly sure of being treated with a cruelty less brutal but more refined than hunting you like a wolf.”