20 Comments
John Wittenbraker's avatar

I commend to you Gerd Gigerenzer’s (cognitive scientist) take on the “bias bias,” his response to the explosion of research on human biases in cognitive science and behavioral economics: “These biases have since attained the status of truisms. In contrast, I show that such a view of human nature is tainted by a ‘bias bias,’ the tendency to spot biases even when there are none.” (Gigerenzer, G., The Bias Bias in Behavioral Economics, Review of Behavioral Economics, 2018, 5: 303–336). See also: https://www.nytimes.com/2023/11/30/opinion/human-behavior-nudge.html?smid=nytcore-ios-share.

Maarten Boudry's avatar

Yes, Gigerenzer was a huge influence on me! At one point I tried hard to reconcile the “heuristics & biases” program with Gigerenzer’s adaptive rationality approach. In a paper I wrote years ago, we argued that reasoning forms that are evolutionarily adaptive (like the gambler's fallacy) can still count as “irrational” at the personal level in certain contexts — because there are two loci of rationality we have to distinguish: what selection favors over time, and what best serves an individual’s goals in a particular context. (Here’s the paper if you’re curious.)

But over time I’ve drifted more toward Gigerenzer’s side. His work really sharpened my suspicion that many alleged “fallacies” are artifacts of experimental setups rather than deep flaws in human cognition. Take the conjunction fallacy — the famous “Linda is a bank teller” example. Or the base-rate neglect experiments. Gigerenzer shows, very persuasively in my view, that once you take conversational norms, ecological context, and information formats seriously, people’s answers often make sense. Kahneman and Tversky interpreted deviations from strict probability theory as cognitive irrationality, but they applied a formal calculus that is often detached from how humans actually process information in real-life contexts. Same with fallacy theory. When you strip away context and impose a rigid formal template, you see “irrationality” everywhere. But once you reintroduce the background assumptions people are actually using, the sharp verdict often dissolves. Much of what gets labeled as “irrational” is better understood as fast-and-frugal reasoning operating outside the lab — sometimes successfully, sometimes not.

John Wittenbraker's avatar

One of the root causes is that bias research has become a sort of fad in cognitive science. Everyone seems to be chasing after Dunning and Kruger to capture the kind of fame that comes from naming an “effect.” This is exacerbated by the impulse to slice things ever more thinly in scientific research these days to get more publications.

The discipline does seem to be coming around (perhaps fueled by the replication crisis). See for example Oeberst, A., & Imhoff, R. (2023). Toward Parsimony in Bias Research: A Proposed Common Framework of Belief-Consistent Information Processing for a Set of Biases. Perspectives on Psychological Science, 18(6), 1464-1487.

Windriven's avatar

Having read this I have to say I'm glad you've stopped teaching critical thinking. I hope that you will reread the section you titled "Growing doubts" and, ahem, think critically about what you wrote.

"They began to see fallacies everywhere."

Logical fallacies are ubiquitous. They began to see fallacies everywhere because fallacies *are* everywhere. Communication is rarely a search for understanding; it is an exercise in persuasion and those engaged in persuasion often use whatever argument is available with superficial persuasive power trumping veracity. In my experience (admittedly a dangerous collection of anecdotes) many of those engaged in persuasion often aren't aware of their own logical faux pas.

"Instead of engaging with the substance of an argument ..."

You are of course familiar with the terms sophistry and casuistry? How does one engage with the substance of an argument grounded in fallacies? Life is short. If one's interlocutor can't mount a reasonable argument is one expected to mount it for him?

Yours is a long piece and I am not going to go through it line by line to offer criticism. While the illusion of certainty is the security blanket of the simple mind, we are nonetheless ethically compelled - per Clifford - to pursue truth. In science we do this using the Bayesian successive approximation engine called the scientific method. The same principles can be applied to any inquiry with a possible objective resolution. Critical thinking is the bedrock of the process. A knowledge of logical fallacies is a useful tool in the workshop of critical thinking. It shouldn't be fetishized but it certainly shouldn't be eschewed.

Maarten Boudry's avatar

When you state that “Logical fallacies are ubiquitous” and rhetorically ask how one engages with arguments “grounded in fallacies,” you’re already assuming what’s at issue, namely, that these arguments are fallacies in the strict sense. That’s precisely what my essay disputes. Many of the argumentative moves we’re tempted to classify as fallacies are, on closer inspection, ordinary defeasible inferences whose strength depends on context. Calling them fallacies from the outset risks begging the question (isn't that a "fallacy" on the list? ;-)). And you don't really present evidence that fallacies are ubiquitous, whereas my essay already offered circumstantial evidence that they aren't (because textbooks invariably rely on toy examples).

I’m not saying that bad reasoning isn’t ubiquitous. Of course it is. Nor am I denying that persuasion often trumps truth, or that sophistry is real. The world is full of sloppy, biased, self-serving arguments. What I’m questioning is whether the traditional fallacy taxonomy is the best way to diagnose that problem. When my students “saw fallacies everywhere,” they weren’t suddenly becoming razor-sharp Bayesians cutting through sophistry. They were short-circuiting analysis by reducing subtle arguments to logical form. And you can't blame them, because that's precisely what the concept of "fallacy" invites us to do. Instead of charitably reconstructing the argument, weighing the evidence, checking base rates and background assumptions, they slapped on a Latin label and moved on. That’s not critical thinking; that’s intellectual box-ticking.

You ask: how do you engage with the substance of an argument grounded in fallacies? By specifying what’s wrong in evidential or probabilistic terms. Is the causal inference underdetermined? Are alternative hypotheses ignored? Is the base rate neglected? Is the incentive structure distorting testimony? Those are substantive critiques. “Post hoc!” or “Ad hominem!” is usually shorthand at best and evasion at worst. So no, I’m not eschewing critical thinking. I’m arguing for a more demanding version of it, one that goes beyond formal pattern-matching and actually analyzes why an argument is weak.
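To make the base-rate point concrete, here is a minimal Bayes calculation with invented numbers (a 1% base rate, a test with 90% sensitivity and a 9% false-positive rate — all hypothetical):

```python
# Hypothetical numbers: how a neglected base rate wrecks an intuitive verdict.
base_rate = 0.01      # prior probability of the hypothesis
sensitivity = 0.90    # P(positive evidence | hypothesis true)
false_pos = 0.09      # P(positive evidence | hypothesis false)

# Total probability of seeing the evidence, then Bayes' rule.
p_evidence = sensitivity * base_rate + false_pos * (1 - base_rate)
posterior = sensitivity * base_rate / p_evidence
print(round(posterior, 3))  # well under 10%, despite the "90% accurate" test
```

Intuition says a positive result makes the hypothesis likely; the arithmetic says the posterior stays below ten percent because true cases are so rare. That's the kind of substantive critique a Latin label can't deliver.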

Windriven's avatar

With all due respect, I think you have gone down a rabbit hole that you dug for yourself. The heart of critical thinking isn't mastering the identification of logical fallacies. Facility at that identification is simply a useful tool.

Let's look at your mushroom example:

"Imagine you eat some mushrooms you picked in the forest. Half an hour later, you feel nauseated, so you put two and two together: “Ugh. That must have been the mushrooms.” Are you committing a fallacy? Yes, says your logic textbook. No, says common sense..."

In this case your logic textbook is correct. You are committing a fallacy, one that the physician you visit in the ER will immediately identify and set to the side. The logical error occurs in the third sentence when our hapless mycophile says *must* have been the mushrooms. It could be norovirus. It could be the rare hamburger he had the previous evening. And, of course, it could be the mushrooms. The physician in the ER will take a history and build a differential diagnosis. Then she will work through that ddx ordering tests where appropriate to help decide which diagnosis is most likely and begin treatment there. This is critical thinking in action. The fact that the patient made a logical error is both true and relatively unimportant.

Let's revisit your introduction to your essay:

"So I dutifully taught my students the standard laundry list and then challenged them to put theory into practice. Read a newspaper article or watch a political debate—and spot the fallacies!"

I would argue that puts the cart before the horse. Let's try instead:

So I dutifully taught my students the standard laundry list and then challenged them to read a newspaper article or watch a political debate - and include the use of this new set of tools where appropriate in analyzing the debate critically.

A political debate or article is an especially good testbed. Isolate the core assertion at issue. Perhaps it is taxation. The first question that your student should ask themselves is whether or not there even is a potentially objective answer. If not, everything that follows is masturbation.

The point here is that understanding logical fallacies is critical but only at the appropriate time in an analysis. As I write this I happened to think about Karl Popper and I think the principle of falsifiability - carefully used - works here. If in the logical stream of analyzing an assertion a logical fallacy is identified, it undermines but does not negate the assertion. That is to say an argument, especially in a political essay, may contain one or more logical fallacies. Identifying them is critically important. But once identified one has to ask whether or not the assertion can stand if the logical fallacy is removed. Some logical fallacies may be tangential to the assertion. The heart of critical thinking is stripping an assertion down to the bare metal before analyzing it further. After all, in real life critical thinking isn't a game of gotcha, it is an effort to ascertain the probability of an assertion being true.

Alex's avatar

No one can force you to read the piece, but what they say in the piece is well-trodden ground actually, and not some postmodern sophistry.

"Truth" is a thing that can be studied many ways, from the Aristotelian (true or false with no middle) of the classical days to the Bayesian (probabilities between 0% and 100%), and they obey different rules. Most human logics look like a probability logic, where several of the "fallacies" are NOT forbidden (and math backs this up!)

I find it often the case that people who like reciting fallacies are not involved in investigative work like science or diagnosis, where some of the "fallacies" are indispensable, as in this essay's example of the poisonous mushroom.

Windriven's avatar

I never said that I didn't read the piece. I said that it was too long to go through and critique line by line. So, we're off to a good start, hmm?

"Well-trodden" and postmodern sophistry are not mutually exclusive.

Your second paragraph is simply an echo of points I made in my last paragraph.

After reading your last paragraph I have to ask if you read mine and especially my last sentence. So, stifling laughter as I write this, I have to ask: do you understand the straw man fallacy?

Alex's avatar

Well, it's neat I guess that Persuasion has a new troll, but you may want to aim for more engagement next time. It was a bit too obvious that you're not willing to invest any effort having a conversation, so only a fool would keep going.

Windriven's avatar

Can we agree on the Merriam Webster definition of trolling? "[T]o antagonize (others) online by deliberately posting inflammatory, irrelevant, or offensive comments or other disruptive content."

Maarten Boudry wrote an essay that in my opinion either misunderstands or intentionally mischaracterizes the role of identifying logical fallacies when evaluating an argument. I commented on what I apprehend the general errors in his argument to have been. This, you will note, is how conversations happen. So I'm not sure on what you base your assertion that, "[i]t was a bit too obvious that you're not willing to invest any effort having a conversation." Neither am I sure what I wrote that you imagine to be "inflammatory, irrelevant, or offensive."

In the event, given the above, I have as little interest in continuing this conversation with you as you do with me.

JakeH's avatar

I like this and generally agree. I never liked "absence of evidence isn't evidence of absence." It obviously is if you'd expect evidence, and the phrase is usually deployed in just such situations. I'm less hostile, however, to "correlation does not equal causation" and "ad hominem." People really do make those two mistakes all the time -- mistakes not necessarily in the sense of formal logic but in the sense of overstating the power of the evidence.

I do not think people have a great handle on correlation vs. causation in, say, health reporting or social science reporting or anything involving "studies show." The implication is typically an unjustified causal leap, and, while yes, there is a theory of mechanism, there may well be other explanations and those other possibilities often go unexplored. Some reporting is careful about this but a lot isn't.

Psychology and education are full of this. For example, studies show that those who experienced what's called "authoritative parenting" turn out better (social adjustment, academic performance, etc.). The overwhelming implication of such reportage is that the authoritative parenting caused the kid to be well adjusted. And it does make sense. It's just that other possibilities make sense too. The causal arrow could very well go in the other direction. Kids who grow up to be well-adjusted may have been easier to parent and so lent themselves to authoritative (rather than authoritarian) parenting. There may very well be a third element -- genes, say, or culture -- causing both calm parenting and well-adjusted smarty-pants kids. It's very easy to lose sight of these other possibilities, which is why the "correlation is not causation" reminder is often a worthy one.
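A toy simulation makes the third-variable point vivid. All the numbers here are invented: a single "calm household" factor independently raises both the chance of authoritative parenting and the chance of a well-adjusted kid, parenting itself has zero causal effect, and yet a strong correlation shows up anyway:

```python
import random

random.seed(1)

# Hypothetical confounder model: "calm" drives both variables;
# parenting style has NO causal effect on adjustment here.
n = 100_000
data = []
for _ in range(n):
    calm = random.random() < 0.5
    authoritative = random.random() < (0.8 if calm else 0.3)
    well_adjusted = random.random() < (0.8 if calm else 0.3)  # depends only on calm
    data.append((authoritative, well_adjusted))

# Adjustment rate among authoritative vs. non-authoritative households.
p_given_auth = sum(w for a, w in data if a) / sum(a for a, _ in data)
p_given_not = sum(w for a, w in data if not a) / sum(1 for a, _ in data if not a)
print(round(p_given_auth, 2), round(p_given_not, 2))  # a sizable gap, with no causation
```

The gap between the two rates is exactly the kind of correlation a headline would report as "authoritative parenting works" - and in this made-up world it would be flatly wrong.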

Likewise, in everyday discussion, people often do throw stink on an argument based on the identity of the person in unjustified and/or lame ways. It's especially noticeable in areas where you don't have to rely on trust or expertise, where the validity or problem with the argument is plain and the other person responds not by engaging with those points but rather by resorting to the less persuasive evidence.

Maarten Boudry's avatar

Thanks, this is a thoughtful pushback, and I partly agree. You’re right that my section on post hoc reasoning could have been stronger. It may have sounded as if I was downplaying how central correlation–causation confusion is in pseudoscience and media reporting on health and education. I agree that people routinely overstate what correlational data can show. Your parenting example is a great illustration. The issue isn’t that people are committing some clean logical sin. It’s that they’re treating a plausible causal story as if competing explanations didn’t exist. The real work lies in spelling out mechanisms, ruling out third variables, and clarifying what the study design can and cannot support. The slogan helps — but it doesn’t do the analysis for you. It's more illuminating to frame "post hoc reasoning" in terms of signal detection: generating too many false positives and failing to correct for them. The mistake isn’t a simple formal error, but overconfidence — a failure to consider base rates, confounders, placebo effects, regression to the mean, and alternative explanations.

James Quinn's avatar

One of the benefits of age and experience is the production of the best human interpretive skill available - a difficult to define but remarkably useful quality we usually call common sense. It is the best fallacy detector I know of.

Michael Lipkin's avatar

Very good. Life isn't pure maths, however.

I challenge you to find a counterexample to the Gambler's fallacy. If an event has occurred more frequently in the past, then it is also likely to occur more frequently in the future (by Bayes' rule).

The Gambler's fallacy comes from an individual knowing a little math (e.g., long-run probability = 0.5 for a 'fair coin' - whatever that is!), causing the gambler to make this mistake.

So this one is actually true!

Maarten Boudry's avatar

"If an event has occurred more frequently in the past then it is also likely to occur more frequently in the future". That's naive inductivism; it didn't work for Bertrand Russell's turkey. ;-)

"The turkey found that, on his first morning at the turkey farm, he was fed at 9 a.m. Being a good inductivist turkey he did not jump to conclusions. He waited until he collected a large number of observations that he was fed at 9 a.m. and made these observations under a wide range of circumstances, on Wednesdays, on Thursdays, on cold days, on warm days. Each day he added another observation statement to his list. Finally he was satisfied that he had collected a number of observation statements to inductively infer that “I am always fed at 9 a.m.”. However on the morning of Christmas eve he was not fed but instead had his throat cut."

Maarten Boudry's avatar

Not sure if I understand the question, but weather provides an example where the gambler's fallacy is "rational": after enduring five straight days of rain, it would've been rational for a hunter-gatherer to expect a sunny spell, as weather fluctuations often exhibit statistical clusters. The sun can really be ‘overdue’.
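Here's a toy simulation of that "statistical clusters" idea. The spell-length model is invented for illustration (rainy and sunny spells each last one to six days, uniformly at random): in such an environment, the longer the rain has lasted, the likelier sun becomes tomorrow, so the "gambler's" expectation of a change is well calibrated - unlike with a fair coin.

```python
import random
from collections import defaultdict

random.seed(0)

# Build a long weather sequence from alternating rain/sun spells.
# Hypothetical model: spell lengths are uniform on 1..6 days, so a spell
# has an increasing hazard of ending the longer it has lasted.
days = []
state = "sun"
while len(days) < 200_000:
    days.extend([state] * random.randint(1, 6))
    state = "rain" if state == "sun" else "sun"

# Estimate P(sun tomorrow | k consecutive rain days so far).
ends = defaultdict(int)  # rain runs of length k followed by sun
seen = defaultdict(int)  # rain runs observed at length k
run = 0
for today, tomorrow in zip(days, days[1:]):
    if today == "rain":
        run += 1
        seen[run] += 1
        if tomorrow == "sun":
            ends[run] += 1
    else:
        run = 0

for k in range(1, 7):
    print(k, round(ends[k] / seen[k], 2))  # probability of sun rises with k
```

After one rainy day the chance of sun is modest; after five it's about even; after six (the maximum spell length in this made-up model) it's certain. In a clustered environment, "the sun is overdue" is just correct inference.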

alexsyd's avatar

Welcome to the world of culture. Culture is what's left over after you forgot what you tried to learn.

You are beginning to comprehend the current dominant sacred-victim, entitled-parasite culture and are gravitating toward the older but workable privilege, obligation, honor, divine-order one.

Alex's avatar

The study of fallacies goes back thousands of years and has been carried on by such prestigious institutions as the Catholic Church. This is not a return to antiquity so much as construction of a format of reasoning suitable for the classroom AND the laboratory.

alexsyd's avatar

It's hard to know what your culture is from the inside. It's like asking a fish what is water. The best way to understand the nature of your culture is to find out where all the energy flows. The money, power, and most importantly, what's sacred and cannot be touched or questioned. The author says up front he's become a fallacy studies apostate, an unbeliever. The "format of reasoning" has broken down. Ergo, I welcomed him into the world of learned instinct.