Informal Fallacies

The monk Zuigan used to start every day by saying to himself out loud: “Master, are you there?” And he would answer himself: “Yes sir, I am!” Then he would say, “Better sober up!” Again he would answer, “Yes sir! I’ll do that!” Then he would say, “Look out now; don’t let them trick you!” And he would answer, “Oh no, sir, I won’t! I won’t!”

– Zen Koan

What do you expect from a trickster?

Some basic concepts of logic  •  Deduction and induction • Fallacies

It was March 14, 2003, and PsiTech had some explaining to do.

PsiTech is a company based in Washington that trains people in a skill called remote viewing, which supposedly allows you to go into a trance and then travel through time and space to witness distant events. You can watch a schlocky company promo from PsiTech on YouTube if you like. Listen to PsiTech’s CEO Dane Spotts, who asks rhetorically: “Is it possible to travel through time and space using only your mind? To leap ten, twenty, even hundreds of years into the future? Or the past?”

The answer to all of Spotts’s questions is “no,” but that did not deter what may be the unluckiest psychic organization in history. PsiTech loved to butt in on high-profile crime investigations by offering the services of their time-and-space traveling “special ops” teams. In October 2002, PsiTech’s remote viewers involved themselves in the D.C. Sniper case, and told anyone listening that the assailant was a lone gunman who was overweight, had long, greasy hair, and couldn’t speak English correctly. They were wrong, wrong, wrong, wrong, and wrong.

They seemed to do a little better on January 24, 2003, when their remote viewing target was the “Laci Peterson Disappearance Event” in Modesto, California. PsiTech operatives used their special powers to determine that “Laci was murdered near or is currently submerged and trapped in a body of water, in the vicinity of a bridge, pier, or dam.” They cautioned the public that remote viewing “is a highly specialized skill and the data is not interpretable [sic] to the unlearned eye.” Don’t try this on your own, in other words. All those unlearned eyes who had not completed the company’s “Technical Remote Viewing” courses ($1295.00 each) would have to settle for watching CNN, which had reported two weeks earlier that “detectives searched near San Francisco Bay as part of the investigation into the disappearance of Laci Peterson.”

The San Francisco Bay is a body of water. With piers. There are five bridges that traverse the bay, including the Golden Gate Bridge, which has turned up in numerous photos since its opening in 1937. There are about 1,500 dams in California. As far as I have been able to determine, all of them are near a body of water. When you think about it, traveling back in time to a news show that aired fourteen days in the past, or traveling through space to San Francisco (which I highly recommend, as it is one of my favorite cities) really isn’t all that impressive, and probably not worth the price of a remote viewing course.

PsiTech’s most disastrous fail was in June 2002. The company had started investigating the bizarre kidnapping of 14-year-old Elizabeth Smart of Salt Lake City, Utah. Almost immediately, their special-ops team reported some dreadful news to Elizabeth’s parents. Their daughter was dead. The Smart family would live in agony for another nine months before they learned that Elizabeth was in fact very much alive. She had been kidnapped by a deranged religious fanatic named Brian David Mitchell, who committed this awful crime to fulfill some divine commandment that made sense to no one but Mitchell himself. Elizabeth was quickly reunited with her family.

A thoroughly embarrassed Joni Dourif, PsiTech’s president, published a lengthy apology that included this explanation:

We “assumed” [Elizabeth was dead] because we found her unmoving and lying still in one place. The assumption was backed by follow up sessions containing the same data that she was lying still curled up in one place with the same immediate surroundings.

Nice try. Rather than admitting that remote viewing is just a pricey psychic fantasy, Dourif switched one claim (Elizabeth was lying still) for another (Elizabeth was dead). And she did so after the fact, which pretty much defeats the purpose of remote viewing.

Dourif’s lame excuse for her company’s failure is an example of an informal fallacy of reasoning. A fallacy is a premise or argument which, while it may seem appealing at first glance, contributes nothing to the broader argument one is trying to make; or worse, one that is intended to lead you to a false conclusion.

This particular fallacy is an ad hoc argument (from the Latin, “for this”). It’s when someone adds a premise to an argument not because the premise is true, or supported by evidence, but because it rescues an argument from refutation. You know, someone tells you “Let’s assume, for the sake of argument…” There’s nothing wrong with that if you are just trying to think through a difficult problem by considering all possibilities. But sometimes people will ask you to assume something “for the sake of argument” because they are trying to trick you. There’s something very wrong with that.

James Randi

There are all sorts of informal fallacies. Here’s another one, again courtesy of PsiTech. By 2004, PsiTech’s wild claims and unflattering success rate came to the attention of James Randi, the magician-turned-skeptic who had been busy debunking the paranormal since the early 1970s. Randi offered PsiTech $1 million if they could demonstrate remote viewing under proper observing conditions. “Proper observing conditions” simply means that the possibility of cheating has been eliminated. Here is how PsiTech’s CEO Dane Spotts responded to the challenge in a newsletter to his true-believers:

The deal is rigged. Of course it is. What would you expect from a trickster? Frankly, none of us at PSI TECH ever believed Randi was for real.

The issue at hand was whether remote viewers had any psychic abilities at all, or whether they were just working well-known parlor tricks like cold reading, warm reading, hot reading, safe-guessing, and so on. Whether or not James Randi is a “trickster” was not relevant. PsiTech was free to ignore Randi’s offer, design their own tests, and publish their results for scientific scrutiny. They were charging fees for training others to be remote viewers. Wouldn’t independent confirmation of paranormal ability make their services all the more marketable, even without Randi’s money?

Spotts had found a way to direct his readers’ attention away from the issue at hand by focusing on Randi’s career as an illusionist. This fallacy is called argumentum ad hominem; literally, an argument towards the man. Of course Randi was a trickster. That’s what illusionists do, and that’s why we find them so entertaining. The differences between Randi and PsiTech are (1) Randi never claimed to have supernatural powers, and (2) Randi’s tricks never bombed as badly as PsiTech’s.

Some basic concepts of logic

The bad news is that there are people far more clever in the art of deception than PsiTech is. The good news is that there are also people who are clever in detecting deception. So whether you want to be a trickster or a debunker of tricksters, read on.

Let’s add a few more basic concepts. The online dictionary WordNet offers seven definitions of the word argument. The top four are (1) a contentious speech act or dispute, (2) an assertion offered as evidence that something is true, (3) a discussion in which reasons are advanced for and against some proposition or proposal, and (4) a course of reasoning aimed at demonstrating a truth or falsehood. In the study of logic and rhetoric, the word is generally used in all but the first sense. Although arguments often turn nasty, they don’t have to.

A proposition is any claim that can be characterized as true or false.

A premise is a proposition offered as proof of a particular conclusion.

A syllogism is an argument consisting of one or more premises (almost always more than one) and a conclusion. Here is a simple example of a syllogism:

Premise 1: If today is Friday, then John is in his office.
Premise 2: Today is Friday.
Conclusion: Therefore, John is in his office.

A syllogism is valid if the conclusion follows from the premises. It is sound when it is valid and all of its premises are true. When those two conditions hold, the conclusion is necessarily true. This particular type of syllogism that uses “if…then” statements is called modus ponens, or “method of affirming.” The first proposition in Premise 1 above (“Today is Friday”) is called the antecedent. The second is called the consequent.
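Validity in this sense can be checked mechanically: enumerate every truth-value assignment and confirm that no assignment makes all the premises true and the conclusion false. Here is a minimal sketch in Python (the helper name `implies` is my own; the conditional is encoded as the standard material conditional):

```python
from itertools import product

def implies(p, q):
    """Material conditional: 'if p then q' is false only when p is true and q is false."""
    return (not p) or q

# Modus ponens: P -> Q, P, therefore Q.
# The argument is valid if no assignment makes both premises true
# while the conclusion is false.
valid = all(
    not (implies(p, q) and p and not q)
    for p, q in product([True, False], repeat=2)
)
print(valid)  # True: modus ponens has no counterexample
```

The same exhaustive check works for any propositional syllogism, since there are only finitely many truth-value assignments to test.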

To illustrate the difference between validity and soundness, suppose that you go to the office on Friday, and John from the example above is not there. You would not question your logic in that circumstance. You have reasoned correctly, and your conclusion properly follows from the premises.

What you will have discovered instead is that one of your premises must be false. So you first check your calendar. If today is really Thursday rather than Friday, your problem is solved. If today is Friday, then by process of elimination, you can conclude that your first premise is false. This is a legitimate and useful method of reasoning called reductio ad absurdum, or “reduction to an absurdity.” Apparently John doesn’t have office hours on Friday.

One common fallacy, called affirming the consequent, goes like this:

Premise 1: If today is Friday, then John is in his office.
Premise 2: John is in his office.
Conclusion: Therefore, today is Friday.

To see why this is not a valid deduction, suppose that Mondays, Wednesdays and Fridays are office days for John, and that he works from home the other days of the week. The second premise alone does not rule out John being in his office on a Monday or a Wednesday, and therefore does not logically support the conclusion. It may be factually true that John is in his office, and that today is Friday. If so, it’s not because your logic is good. You’ve merely assumed that the connection between Friday and John’s office runs in both directions, when in fact it does not.
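The same brute-force check exposes the fallacy. This sketch (using the same assumed `implies` helper as before) searches for an assignment where the premises of affirming the consequent hold but the conclusion fails:

```python
from itertools import product

def implies(p, q):
    return (not p) or q  # material conditional

# Affirming the consequent: P -> Q, Q, therefore P.
# A counterexample is an assignment with premises true and conclusion false.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and q and not p
]
print(counterexamples)  # [(False, True)]: it isn't Friday, yet John is in his office
```

The single row found is exactly the Monday-or-Wednesday scenario described above: the conditional and its consequent are both true while the antecedent is false.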

Deduction and induction

I’ve read a number of definitions of two other important terms in logic and rhetoric: inductive and deductive reasoning. One of my favorites is from Trivium Pursuit:

Reasoning can run in two opposite directions. Deductive reasoning moves from a general premise to a more specific conclusion. Inductive reasoning moves from specific premises to a general conclusion. These two methods of reasoning will produce two different kinds of results.

I like these definitions because they are fairly common and slightly wrong, but their imprecision ironically helps clarify induction and deduction. Deduction does not necessarily go from general premises to a specific conclusion. You can validly deduce a specific conclusion from specific premises, as in:

Premise 1: Either John was at the party, or Mary was at the party.
Premise 2: Mary was not at the party.
Therefore: John was at the party.

or a general conclusion from general premises:

Premise 1: All dogs are mammals.
Premise 2: All mammals are mortal.
Therefore: All dogs are mortal.

or even a general conclusion from specific premises:

Premise 1: Kletso is a dog.
Premise 2: Kletso is not vicious.
Therefore: It is false that all dogs are vicious.

I find it helpful to refer back to the origins of the words deduction and induction. Both come from the Latin verb ducere, which means to lead or to guide. The ducts in your house, for example, are the things that guide heated or cooled air around your rooms. If you become a world-renowned biologist, you may get inducted into the National Academy of Sciences. You’ll probably get a big pay raise, which means that more money will be deducted from your paycheck. So deduction is that which you bring out of an argument in the form of a conclusion, and induction is that which you bring into an argument in the form of premises.

Formal logic uses operators to express the relationships between the propositions in syllogisms. The most common operators are negation (¬), conjunction (&), disjunction (∨), the conditional “if . . . then” (→), and the biconditional “if and only if” (↔). You can represent my first syllogism above by letting P stand for “Today is Friday” and Q for “John is in his office”:

P → Q
P
∴ Q

The three-dot symbol means “therefore.” You can find an excellent introduction to formal logic in P.D. Magnus’s (2017) Forallx: An Introduction to Formal Logic (Albany: University at Albany, State University of New York). It is available under a Creative Commons license.
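The five operators map directly onto ordinary Boolean operations. A sketch in Python (the helper names are my own):

```python
def neg(p):       return not p         # negation       (¬P)
def conj(p, q):   return p and q       # conjunction    (P & Q)
def disj(p, q):   return p or q        # disjunction    (P ∨ Q)
def cond(p, q):   return (not p) or q  # conditional    (P → Q)
def bicond(p, q): return p == q        # biconditional  (P ↔ Q)

# The conditional is false only when the antecedent is true
# and the consequent is false:
print([cond(p, q) for p, q in [(True, True), (True, False),
                               (False, True), (False, False)]])
# [True, False, True, True]
```

Note the asymmetry in that last line: a false antecedent makes the conditional true regardless of the consequent, which is why affirming the consequent fails while modus ponens succeeds.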

While deduction has to do with the methods of deriving conclusions from the premises in an argument, induction has to do with the methods of getting premises into arguments in the first place. There are a number of ways of doing this. For example, you can simply make up premises, as I did in my example above about John and his office hours. I didn’t have any particular John in mind in that case, so obviously I don’t know anything about his office hours. There’s nothing wrong with that if you are just illustrating a point.

You can also add a premise to an argument as a postulate, and declare “let’s assume for the sake of argument that this is true and see where it leads us,” or words to that effect. Or you can introduce a premise into an argument after careful observation, either on your own or in collaboration with a community of researchers. For instance: “Gases like carbon dioxide and methane allow shortwave radiation from the sun to pass through the earth’s atmosphere, but then they absorb heat as it reflects off the earth and back towards space,” and “This trapping of heat causes a gradual rise in global temperatures.”

Fallacies

Or you can introduce premises into an argument to deceive people, including – possibly – yourself. People who traffic in bogus ideas have a large stock of fallacies that they use to trick people into believing that what is true is false, and what is false is true. Below is a compendium of the most common of these, with commentary.

ad hoc • appeal to nature • argumentum ad hominem • argumentum ad ignorantiam • argumentum ad populum • argumentum ad verecundiam • anecdotal evidence • cherry-picking • circulus in demonstrando • clustering illusion • equivocation • false dilemma • faulty analogy • gambler’s fallacy • garbage in a pretty pail • genetic fallacy • guilt by association • post hoc • quote-mining • shotgunning • slippery slope • special pleading • straw man • tu quoque • when all else fails

Ad hoc

Even though I mentioned this fallacy in my discussion of PsiTech’s unfortunate psychic profiling, it merits a little more attention because it is one of the most common fallacies, as well as one of the most seductive. An ad hoc fallacy is when you add a premise to your argument only to protect your claims from criticism. When psychics fail to perform magic in front of impartial observers, they often complain that “negative energy” is interfering with their mental powers. They offer no evidence that negative energy even exists, and usually don’t even bother defining what it’s supposed to be. They invent the idea simply to explain away their failure.

Another way to explain ad hoc premises is to say that they are not independently motivated. For instance, suppose that one day you discover that stuff has been stolen from your house. It soon turns up in your neighbor’s garage. Other things being equal, this suggests that your neighbor is the thief. But your neighbor explains that “Well, maybe there is some kleptomaniac going around stealing property and framing others.” Your first question would likely be “Where did this story about a kleptomaniac come from?” You are probably thinking that your neighbor made it up out of whole cloth in an attempt to explain away incriminating evidence. Crooks and charlatans do this all the time, often out of desperation.

Suppose, on the other hand, that there really is a kleptomaniac in your neighborhood, that he has already been caught stealing from other neighbors, that he has admitted to framing people in the past, and that he recently had a dispute with your neighbor, which amounts to motive for a set-up. In other words, there are facts that are true independently of the theft of your property. Even if your stuff had not been stolen, these facts would hold true. In those circumstances, the kleptomaniac argument is not ad hoc. It therefore counts as evidence, other things being equal.

Here is one more illustration of this fallacy. Some years ago, there was a multi-million-dollar research program going on worldwide, though mostly in the USA, to teach apes to talk using sign language. All such efforts ended in failure. The apes mostly used sign language when their trainers, who were not exactly impartial observers, were present with rewards of some kind. Skeptics pointed out that apes like bonobos (pygmy chimps) have never been observed using language in the wild, where it would do them the most good, and that it is therefore unlikely they would have the cognitive wherewithal to acquire a human language in captivity. A group called WhyFiles published an article in which one true-believer was quoted explaining this fact:

She suspects that bonobos are using language in the wild, but since they congregate in trees in groups of about 100, “it’s almost impossible to study them.” And on the ground, they are silent to avoid predators.

Notice in the first place that this supposition brings the whole argument to a grinding halt. If it’s impossible to study bonobos when they are in the trees, how can we know they are talking up there? Quite often, that is the purpose of an ad hoc argument. If a syllogism depends on an unknowable premise, there is no way to draw meaningful conclusions. These suspicions about ape-chatter in the treetops were not motivated by facts that could be independently verified, but simply by the desire to defend the research program from damning criticism.

Appeal to nature

Arguments that appeal to nature declare that anything natural is good, and anything unnatural (that is, made by humans) is bad. There is perhaps no better example of this than the company in Santa Fe, New Mexico, that sells “100% Additive-Free Tobacco.” Tobacco is an “all natural product” (or a “plant,” as some say), and perhaps we should be grateful that at least one cigarette company does not put any more deadly substances into its product. Flu vaccines and antibiotics are man-made. The deadly pathogens they protect us from are natural. Genetically modified crops are somewhere between natural and man-made, but there are no known health risks associated with them. Quite the contrary: they have already saved countless people from starvation.

The appeal to nature fallacy has turned up in some scholarship on nativist-empiricist views of human cognition (that is, the extent to which our mental faculties are innate or acquired). There is a fascinating line of research in cognitive psychology exploring differences in way-finding and spatial abilities between men and women (see, for example, Lawton & Kallai, 2002; Kimichi, Amishav, & Sulitzeanu-Kenan, 2009; Cashdan et al. 2012). Some cognitive scientists have attributed these differences to our hunter-gatherer heritage, a time when men and women needed much different strategies for orienting themselves spatially and geographically. In other words, way-finding may be an innate difference, not a learned one.

Some people might counter that this is a sexist argument: Men and women ought to be treated equally, and we shouldn’t claim that nature has given them unequal cognitive abilities. But the brain is what it is. It is an organ that doesn’t change its nature to please anyone’s political ideologies. It is not racist to say that diabetes is more prevalent among African-Americans than it is among whites, nor that melanoma is more prevalent among whites than among African-Americans. Some kinds of cancer are caused by the environment, and some have a genetic component. Suppose someone told you “I don’t believe that, because we can change the environment, but we can’t change our genetic make-up!” Would you be persuaded? Nature does not abide by our desires.

Argumentum ad hominem

This is the fallacy of arguing against a person rather than against a claim, as PsiTech CEO Dane Spotts did in the example I cited above. In recent history, no one has been as prolific a user of ad hominem arguments as Donald Trump. Trump tweeted this about journalist/lawyer Megyn Kelly and Texas Senator Ted Cruz during the Republican primaries:

Crazy @megynkelly supposedly had lyin’ Ted Cruz on her show last night. Ted is desperate and his lying is getting worse. Ted can’t win!

This particular outburst was prompted by a comment from a spokesperson for Cruz, who had said a day earlier that Trump was “scared to debate Ted Cruz.” Trump also attacked Crazy Joe Biden, Crooked Hillary, Lying James Comey, Little Marco Rubio, Cryin’ Chuck Schumer, Goofy Elizabeth Warren, and so on. Sadly, ad hominem arguments seem to resonate with those who have a short attention span, which makes tweeting an ideal format for this fallacy.

Argumentum ad ignorantiam

Argument to ignorance is a fallacy that seems to be particularly popular among true-believers in UFOs and extraterrestrial visitation. The Latin ignorantiam comes from the verb ignorare, which means “to not know” or “to be unaware of.” It’s worth emphasizing that the meaning of ignorare and its derivatives is quite different from the meaning of the English ignorant, which is synonymous with “stupid” or “uneducated.” There are no such connotations in Latin or the Latinate languages like French and Spanish.

The basic idea behind this fallacy is that not knowing one thing somehow proves another. UFO buffs may show you a blurry picture of something that looks like a flying saucer and ask you “Can you prove this is a hoax?” Maybe you can, maybe you can’t. “No, I can’t” means the same thing as “No, I don’t know what it is.” If you were not present when the picture was taken, there would be no reason why you should know. Does it make sense to say that unless you can explain why some unknown photographer was unable to focus his camera on some unknown object on some undetermined date, we must assume that the Earth has been visited by creatures from outer space?

Sometimes ad ignorantiam can work in ways that are a little harder to detect. If you were to tell me that you don’t believe psychics can predict the future, would I be right in countering “Well, have you ever been to one?” The question implies that you cannot pass judgment on psychic phenomena without personal experience. But even if your first-ever psychic session was unsuccessful, I could always explain that this only proves one particular psychic was ineffective on one particular occasion. Maybe you should try another. And keep trying until someone gets it right. Do all of us have to visit every psychic in every strip mall before concluding they are all bogus?

Here is a comment I found on Reddit about the absurd conspiracy theory called Pizzagate that briefly attracted attention during the 2016 presidential elections in the USA:

PizzaGate has not been disproven, nor publicly discredited, by a single credible expert in the national security or law enforcement world – or in any field, for that matter.

“Has not been disproven” means the same thing as “We don’t know.” Let’s assume, arguendo, that PizzaGate has not been disproven. That still doesn’t prove it happened. This redditor is apparently reasoning that unless skeptics can disprove something to 100% certainty (a near-impossibility in the world we live in), we must treat it as true. Remember where the burden of proof lies. It is not the responsibility of skeptics to prove that such-and-such can’t possibly be true. The burden of proof is on the person making an affirmative claim.

Argumentum ad populum

Argument to the people is when you rely on popular sentiments to prove your claims. Lots of people have tried homeopathic remedies to treat colds and have reported favorable results. So there is our proof, right? Not quite. There was a time when most people believed the Earth was flat, that the Sun revolved around the Earth, and that illness was spread by evil spirits or miasmas. Large numbers of believers do not settle arguments. I found this quote on a site peddling the healing power of magnets:

An estimated 140 million people use magnetic therapy to relieve pain, improve circulation, reduce swelling, minimize stiffness, and increase overall performance. Can they all be wrong?

Yes, they can. Without proper, controlled testing, our personal experiences with medicines mean little. There are no known health benefits associated with magnetic therapy.

Argumentum ad verecundiam

This fallacy, “appeal to authority” in English, depends, as its name implies, on someone’s authority, be it real or imagined, to prove a point. We all like to trust experts, and certainly no one in her right mind would get a medical diagnosis from someone who has never been to medical school.

Even so, smart people get fooled into believing nonsense all the time. History brims with such examples. Isaac Newton was a devoted alchemist, and his writing on that subject is far more voluminous than his scientific work. Sir Arthur Conan Doyle, whose character Sherlock Holmes was the epitome of reason, logic, and objectivity, was himself hopelessly smitten with things paranormal, and was fooled into accepting as authentic some amateurishly forged photos of fairies taken by a couple of teenagers. Russell Targ is a well-educated physicist, and a pioneer in the development of the laser. He is also a true believer in remote viewing and ESP. John Mack, once one of the world’s leading advocates of alien abductions, was on the faculty of the Harvard Medical School. In his excellent book Voodoo Science, Robert Park cites case after case in which an intelligent, bona fide scientist advocated some kind of nonsense. Facts must speak for themselves, authority notwithstanding.

Anecdotal evidence

Anecdotes are second-hand stories that people try to pass off as proof. I believe that the herb ginseng improves my cognitive functions because my neighbor says he used to live down the street from a man whose uncle was once married to a woman whose father had been acquainted with a newspaper reporter who did a story on a man who took ginseng regularly, with good results. Such arguments are also referred to as “hearsay” or, when people use them to promote products and services, as “testimonials.” At most, they are an indication that you might want to look into a claim. However, they do not constitute proof in themselves. Answers in Genesis once published an article suggesting that dinosaurs may not have gone extinct 65 million years ago, contrary to what most schoolchildren are taught. Their claim was repeated in a 2017 book by Billy Crone titled The Truth about Dinosaurs (Pahrump, NV: Get a Life Ministries, 138):

In 1950, cattlemen on the border between the Northern Territory and Queensland claimed losing stock to a strange beast which left mutilated, half-eaten corpses in its destructive wake. A part-Aboriginal tracker also claimed to have seen a bipedal reptile, 7-8 metres (25 feet) tall, moving through the scrub near Lagoon Creek on the Gulf Coast in 1961.

Crone and AiG’s preoccupation with living dinos has to do with their beliefs about the age of the universe, which do not allow for the extinction of anything 65 million years ago, because AiG insists that our planet is a mere 6,000 years old.

The quote above is a good indication of how desperate for evidence true-believers can become. Who are these cattlemen from more than a half-century in the past? Who is the tracker? What makes the beast, of which we have no physical description, so “strange”? Maybe it was just a dingo. A maxim in critical thinking is that you have to give your opponents a fighting chance to refute your ideas. If there is no way to track down evidence, then it is not evidence.

Cherry-picking

This fallacy, a variant of the straw man fallacy, is when you present data in a way that makes your argument look especially appealing, or your adversary’s especially bad. I came across an article in Scientific American titled “Dear Skeptics, Bash Homeopathy and Bigfoot Less, Mammograms and War More” in which author John Horgan tries to convince us that health care in the USA is really, really bad. Says Horgan:

The U.S. spends much more on health care per capita than any other nation in the world. And yet we rank 34th in longevity.

Technically, that’s true. The problem is that longevity figures are tightly bunched at the top. Japan leads with a life expectancy rounded off to 84 years. Seven countries tie for second place at 83, ten are at 82, and eleven are at 81. Just 4.4 years separate the USA from the top spot. At 31st from the bottom are Papua New Guinea and South Africa, at 63 years. That’s 16 years behind the USA, and 13 years ahead of last-place Sierra Leone. You can represent those same statistics by saying that, after rounding, the USA is tied for 6th place in longevity. That makes us look not so bad.

If you are from Texas, you can brag that Texas has lower cancer rates than Florida. However, that’s because Florida has so many more retirees than Texas, and cancer is a disease that correlates strongly with age. If you are from Florida, you can brag that your state has lower age-adjusted cancer rates than Texas, which is also factually true.

Here are the MLB National League East standings, as of May 29, 2018:

Team           Wins   Losses
Atlanta         31      22
Washington      30      22
Philadelphia    29      22
NY Mets         26      25
Miami           20      33

You can cherry-pick these results ordinally, and lament that the poor Mets sit second from the bottom, just ahead of last-place Miami. Or you can cherry-pick by wins and losses, and brag that after 51 games, the Mets have a winning record and are just four games behind the leader.
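Both framings come out of the same numbers. A sketch in Python, using the standard “games behind” formula, (leader’s wins − team’s wins + team’s losses − leader’s losses) / 2:

```python
standings = {
    "Atlanta": (31, 22),
    "Washington": (30, 22),
    "Philadelphia": (29, 22),
    "NY Mets": (26, 25),
    "Miami": (20, 33),
}

def games_behind(team, leader):
    (tw, tl), (lw, ll) = standings[team], standings[leader]
    return ((lw - tw) + (tl - ll)) / 2

# Framing 1: ordinal position in the division.
ordered = sorted(standings, key=lambda t: games_behind(t, "Atlanta"))
print(ordered.index("NY Mets") + 1)        # 4th of 5: second from the bottom

# Framing 2: a winning record, just four games off the lead.
wins, losses = standings["NY Mets"]
print(wins > losses)                       # True
print(games_behind("NY Mets", "Atlanta"))  # 4.0
```

Neither framing is false; the cherry-picking lies in choosing the one that flatters (or damns) your team while quietly ignoring the other.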

Circulus in demonstrando

Circular reasoning, sometimes called begging the question, is when someone merely repeats a premise in the guise of a conclusion. As an illustration, I’ll rework one of my previous syllogisms, in admittedly dull fashion:

Premise 1: If today is Friday, then John is in his office.
Premise 2: Today is Friday.
Therefore: Today is Friday.

Real-life examples are rarely this blatant. But they do turn up when people assume something they are supposed to be proving. In his book The Interpretation of Dreams, Sigmund Freud proposed that all dreams are examples of what he called wish-fulfillment. Your car seems to be plagued by mechanical problems that you can’t afford to fix. Then one night, you have a dream in which your wealthy aunt and uncle buy you a brand-new car. You dreamt what you wished for. I have had dreams like that lots of times. We even have the expression in English “In your dreams!”, which means that your strongest desires will only get satisfied in the unreal scenarios of deep sleep.

But Freud’s critics pointed to an obvious problem. What about nightmares? I have also had all sorts of dreams about things I would never wish for in real life. Here is how Freud responded to his critics in A General Introduction to Psychoanalysis (1917; New York: Sheba Blake Publishing):

Even dreams with a painful content are to be analyzed as the fulfillment of wishes.

The contradiction to my theory in the case of another female patient, the most witty among all my dreamers, was solved in a simpler manner, although according to the scheme that the non-fulfillment of one wish signifies the fulfillment of another.

In spite of their undesirable content, all these dreams must be interpreted as wish-fulfillments.

In other words:

Premise 1: All dreams are examples of wish-fulfillment.
Premise 2: But I had a nightmare.
Premise 3: And all nightmares are dreams.
Premise 4: Your dreams are no different than anyone else’s.
Therefore: All nightmares are also examples of wish-fulfillment.
Therefore: All dreams are examples of wish-fulfillment.

In waking life, it is unproductive to declare an idea to be true, and then cite your own declaration as proof.

The Clustering Illusion

This fallacy is sometimes called the “Texas Sharpshooter.” That comes from an old joke about someone who takes target practice at the side of a barn, and then draws a circle around the bullet holes. It has to do with finding coincidental patterns in random events. I found this example of the clustering illusion:

Why are we getting all these natural disasters like earthquakes, volcanoes, floods, storms and hurricanes? This is a question on many people’s minds, because as you may have noticed, this world seems to be tearing apart at the seams. Take a look at Bible prophecy news and you will see new reports virtually every week about some natural disaster.

The person who wrote this was clearly distressed about current events. And I will grant the part about “floods and storms,” having lived through Hurricane Harvey. But earthquakes, volcanoes, and hurricanes are not new. The worst natural disaster in North American history occurred before any of us were born. So did the world’s most cataclysmic volcanic eruption, and its most devastating earthquake. Only a few remaining humans remember history’s deadliest war. It seems we all have a cognitive availability bias in that the more we hear about certain events, the more likely we are to believe that they are common. Since the media have a financial incentive to report and overstate bad news, we should not be surprised that we sometimes hear lots of it all at once.

Equivocation

Also known as weasel-wording, an equivocal argument is one made in less than explicit terms. The nickname comes from the observation that equivocating is a way of allowing yourself to weasel out of your arguments after you make them, just in case someone proves you wrong. I found the following information on a website promoting acupuncture, a popular but wholly unproven alternative to standard medicine.

Acupuncture is a Chinese therapy that has been used for centuries. It is based on the theory that there is energy, called chi or qi, flowing through your body. Chi is thought to flow along energy pathways called meridians. Acupuncturists believe a blocking or imbalance of the flow of chi at any point on a pathway may result in illness. Chinese medicine practitioners believe acupuncture unblocks and rebalances the flow of chi to restore health.

People often use acupuncture to relieve pain. Western medical researchers who have studied acupuncture believe that it may reduce pain through body chemicals that have calming effects (opioid peptides), or by affecting glands (such as the hypothalamus) that produce substances the body uses.

How many weasel words can you find? Here is my list:

1. It is based on a theory… Do you mean theory in the scientific sense, as in “a coherent group of tested general propositions, commonly regarded as correct, that can be used as principles of explanation and prediction for a class of phenomena,” or in the vulgar sense, meaning “guess or conjecture?”
2. Chi is thought to flow along energy pathways called meridians. Who thinks so? Only acupuncturists? Only the author of this page?
3. Acupuncturists believe a blocking or imbalance of the flow of chi at any point on a pathway may result in illness. Do they know this, or have reason to conclude that it is true? Or do they literally just believe it?
4. Chinese medicine practitioners believe acupuncture unblocks and rebalances… Same problem.
5. People often use acupuncture to relieve pain… Maybe people often use acupuncture only because they have been fooled into believing it’s effective. People also use magical spells and incantations to relieve pain.
6. Western medical researchers who have studied acupuncture believe… Do you mean all western researchers? Some? Two? Apparently not these medical researchers. Or this one. Or these guys. Or her. Not her either. Nor any of these eleven researchers. Not these fourteen.
7. …it may reduce pain.

Or it may not.

False dilemma

In my discussion of deduction above, I mention the logical operator disjunction (∨), which almost translates into English as or. I say “almost” because in English we use or in an inclusionary and an exclusionary sense. When you read on a company’s web site an invitation to “contact us by phone or on the web,” you understand or in the inclusionary sense, as in “feel free to do one or the other, or both.” On the other hand, if you read on the menu at your favorite dive that you can order “fries or rice as your side dish,” you will interpret the or as exclusionary, as in “one or the other, but not both.” In propositional logic, the operator ∨ is used in the inclusionary sense. The exclusionary or is logically equivalent to (P ∨ Q) & ¬ (P & Q). In other words: “Either P or Q, but not both.”
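The difference between the two senses of or can be checked mechanically with a truth table. Here is a small sketch in Python (the variable names are mine, for illustration):

```python
# Compare inclusive disjunction (P v Q) with the exclusive form
# (P v Q) & not(P & Q) across all four truth-value combinations.
for P in (True, False):
    for Q in (True, False):
        inclusive = P or Q
        exclusive = (P or Q) and not (P and Q)
        print(P, Q, inclusive, exclusive)
```

The two senses disagree in exactly one row: when P and Q are both true, the inclusive or is true but the exclusive or is false. In fact, the exclusive or amounts to saying that P and Q have different truth values.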

There are questions to which a yes or no answer is appropriate. Either John is in his office or he isn’t. Other questions, however, allow for a wider range of responses. A false dilemma, also known as the fallacy of the false alternative, a false dichotomy, the either-or fallacy, or black-and/or-white thinking, is when someone wrongly uses or in its exclusionary sense. There is a maxim from the German philosopher Friedrich Nietzsche that has become cliché: “What does not kill me strengthens me.” In other words, an illness or tribulation must either be fatal or make you stronger. It’s nice inspiration in difficult times, but as we all know, there are traumas that weaken people irreversibly without killing them immediately.

False dilemmas are popular among conspiracy theorists, who can usually be counted on to remind us that governments hide information from the public. Here, for instance, is a quote from a web site dedicated to “9-11 Truth”:

Conspiracy theories arise from evidence. After the government releases an explanation of a particular event, a conspiracy theory is only born because evidence exists to disprove their explanation, or at least call it into question. There’s nothing insane about it, unless you define sanity as believing whatever the government tells you. In light of the fact that our government lies to us regularly, I would define believing everything they tell you as utter stupidity.

In other words, either you think everything the government tells you is a lie, or you are stupid or insane. There is no room in the argument for those who think governments lie sometimes, or even often. Below are some other false dilemmas I have made up, in an attempt to offend as many people as possible.

1. If you favor the death penalty, then you agree to allow your father, brother, or son to be executed for a crime he did not commit, just to be on the safe side. If you oppose the death penalty, then you agree to allow all child murderers to be freed, knowing that they will likely kill children again.

2. If you oppose higher taxes, then you will from now on refuse to drive on highways that have been paid for with tax dollars, as a matter of principle. If you support higher taxes, then you will agree to a large pay cut, as a matter of principle.

3. If you support universal health care, then you agree to pay higher taxes and to accept a lower standard of health care from now on. If you oppose universal health care, then you agree that if you lose your job and get sick, we will just let you die.

4. If you support military intervention in Syria, then you believe that we have the right to exterminate any culture that does not share our values. If you oppose military intervention, then you are siding with ISIL, an organization that openly promotes oppression of women, homosexuals, Christians, and Jews.

If these arguments seem unfair, it’s because they are. They all ignore nuance and middle ground.

Faulty analogy

A faulty analogy, also known as “comparing apples and oranges,” tries to establish similarities between things that are in fact fundamentally different. The psychic John Edward once said that psychic intuitions are like radio waves. Sometimes the reception is good; other times it crackles with static. It’s a useful analogy for explaining away psychic intuitions that are imprecise. If your psychic reception is not so good, dig deeper into your mind, which is more or less the equivalent of fiddling with the dial on your radio. When you make a good psychic guess, your reception is at its best.

This ignores some glaring differences between radio stations and the paranormal. If you and I tune our radios to the same country music station, we will both hear the same thing. If we switch to the jazz station, we will both hear jazz. So will everyone else. On the other hand, psychic predictions about things like hurricanes and earthquakes are all over the map. What are we to make of the following analogy from an episode on haunted houses that aired on the CBS News program 48 Hours?

There’s a natural Earth source underneath the house that’s creating a powerful magnetic field. McRaven House might be described as kind of a storage battery for these energies, and people who go into this house are exposing themselves to those energies. And I think that is what’s triggering these experiences.

A house is like a storage battery? We can measure the voltage and amperage of a battery’s electrical current. We can attach batteries to devices and see the effects. Some of us know what happens when we place the terminals of a nine-volt battery on our tongues (don’t try this yourself). Batteries are not similar to houses in the least. Analogies can be useful illustrations to help us make a point or explain an idea, but they are not evidence of anything. And clearly, some illustrations are more helpful than others.

The Gambler’s Fallacy

This is the belief that random events in the present or future are affected by random events from the past. People who gamble at the roulette wheel in casinos will often say “I’m on a roll!” That means “I’ve just won several times, so I am likely to win this time.” Others will say “I’m overdue!” That means “I’ve just lost several times, so I am likely to win this time.” And some say “I am out of money, so I am not likely to play anymore.” Below are three short quizzes that demonstrate how this fallacy can fool us.

First, imagine a lottery game in which you select six numbers, from 1 to 40. The lotto picks a winner by blowing numbered ping-pong balls around inside a machine, and then letting them pop out one by one. Which of the following sequences is a more likely outcome?

1 2 3 4 5 6
19 11 29 17 23 13

Second, suppose you select the second series of numbers, and you win $40 million on one ticket. Congrats! You decide that, just for fun, you’ll play again next month. Do your chances of winning a second time go up or down?

Third, suppose that 19, 11, 29, 17, 23 and 13 are your lucky numbers. So when you play a month after winning your jackpot, you play exactly those numbers a second time. Does that strategy improve your odds of winning or not?

The answer to all three questions is pretty much the same. It doesn’t matter.

Either set of numbers is as likely, or unlikely, to win. “But the first one just looks so improbable,” you may be thinking. It is, as is any sequence, which is why it’s so improbable that any of us will win the lottery. The first sequence may seem particularly unlikely because the numbers are ordered, and everyone believes that lottery drawings are random events rather than ordered ones. But while the balls pop out of the machine randomly, it is a human being who puts them in numerical order. So that arrangement isn’t as random as you might think. Once the balls are out of the machine, there are 720 ways to arrange them. The second set above is also ordered. Those are the first six double-digit prime numbers. I just moved them around to disguise their orderliness.
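The arithmetic behind the 6-from-40 lottery above is easy to verify. A quick sketch in Python:

```python
import math

# Number of distinct 6-number tickets when choosing from 1 through 40.
tickets = math.comb(40, 6)
print(tickets)  # 3838380 possible combinations

# Every ticket -- ordered-looking or not -- wins with the same probability.
print(1 / tickets)

# Once six particular balls are out of the machine, they can be
# arranged in 6! different orders.
print(math.factorial(6))  # 720
```

And the odds next month are exactly the same 1 in 3,838,380, no matter what happened this month: the machine keeps no memory of past draws.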

And if you are lucky enough to win the lotto this month, your odds of winning next month are the same. The air machine that spits out those ping-pong balls doesn’t keep track of what happened in the past. It doesn’t remember that it spat out 19, 11, 29, 17, 23 and 13 last month. In fact, it may not even be the same air machine and the same ping-pong balls. It is just as likely to spit out exactly that set again this month, and again next month.

Some people have strategies for picking winning lottery numbers (the birthdays of loved ones are popular). Authors sell books on how to go about picking winning numbers. A cursory search of a certain book-vending web site recently turned up slightly fewer than 100 such titles. As far as I know, all of the authors of those volumes have made more money in book royalties than in lotto winnings. If you had a foolproof method of winning the lottery, would you share it with others?

Garbage in a pretty pail

There is an old saying that garbage set out in a pretty pail is garbage all the same. Those who promote weird ideas will often dress their claims up in hifalutin prose to make it appear that they are saying something profound. The French philosopher Jacques Derrida, who was once described as among the most influential philosophers of the 20th Century, wrote this:

In all senses of the word writing thus comprehends language. Not that the word ‘writing’ has ceased to designate the signifier of the signifier, but it appears, strange as it may seem, that ‘signifier of the signifier’ no longer defines accidental doubling and fallen secondarity. ‘Signifier of the signifier’ describes the contrary movement of language: in its origin, to be sure, but one can already suspect that an origin whose structure can be expressed as ‘signifier of the signifier’ conceals and erases itself in its own production.

Sometimes the prose is more mystical than this, but the effect is the same. Here’s a quote I found on a web site about Rainbow Star Light Frequencies:

The Crystalline Rainbow Star Light Frequencies renew you in the energy of feeling good. All desired intentions manifest from your feeling good! Starlight The Rainbow Star Light Frequencies feels like daily get up and go for my energetic little red wagon! High Frequency Light Encoded Frequencies that activate and raise your vibration to the degree you are able to allow. The effect is cumulative.

I have no idea what any of this is supposed to mean. It is true that explaining complex arguments often requires technical jargon, which in turn requires that we readers invest more time in understanding another’s ideas. However, obscuring one’s prose to the point of incomprehensibility, whether out of carelessness, pretense, or a desire to frustrate rebuttals, does no one any good. If premises whose truth or falsity is unknown serve no purpose in a syllogism, then neither do premises that don’t mean anything in the first place.

The Genetic Fallacy

The genetic fallacy is related to guilt by association in that it aims to discredit arguments on the basis of their origins rather than on their merits. For instance, Aaron S. Kesselheim and his collaborators reported in the New England Journal of Medicine that physicians were more skeptical of drug research that had been funded by a private pharmaceutical company rather than by a governmental agency like the National Institutes of Health, regardless of the merits of the study itself.

Being wary of conflicts of interest is an indispensable part of critical thinking, to be sure. However, cautious consumers of medical research should bear in mind that private corporations do have strong financial motives for knowing whether or not their own products are safe and effective. The drug manufacturer Merck was hit with a $4.85 billion judgment in a lawsuit over its arthritis drug Vioxx, which was shown to increase the risk of stroke and heart attacks.

Dow Corning, Inc. had to file for bankruptcy protection due to protracted class-action litigation over its silicone breast implants. The left-wing media watch group Fairness and Accuracy in Reporting (FAIR) published these comments about Dow:

Two years after discontinuing the product but just as several implant trials were due to start, Dow Corning’s ad declared: “Here’s what some people don’t want you to know about breast implants.” Studies at “prestigious medical institutions” like Harvard Medical School, the Mayo Clinic, the University of Michigan and Emory University showed “no link between breast implants and disease.” What Dow Corning failed to mention was that implant manufacturers had funded several of the studies.

FAIR is criticizing the Harvard study not for its methodology, but merely because Dow Corning paid for it. In this case, however, FAIR was wrong. All major subsequent studies from Europe, the US, and Australia reached the same conclusion as the Harvard study. Silicone breast implants were never scientifically linked to any health problems. Ultimately, research must stand or fall on its own merits, regardless of who is behind it.

Guilt by association

This fallacy is a variant on ad hominem. Guilt by association is the tactic of impugning others by calling attention to some organization or ideology that your audience will perceive as abhorrent. One of my favorite examples comes from a far-right web site titled Conservapedia (I have edited the examples below for brevity without changing their meaning).

On the 2008 campaign trail, Obama was reading “The Post-American World” by Fareed Zakaria.
While Obama was a candidate, the White House military office assigned him the Secret Service code name “Renegade”. “Renegade” conventionally describes someone who goes against normal conventions of behavior, but its first usage was to describe someone who has turned from their religion.
Obama uses the Muslim Pakistani pronunciation for “Pakistan” rather than the common American one.
In 2012, Obama participated in a $40,000 per plate fundraiser on Holy Thursday, which many devout Christians would find to be inappropriate.

The web site’s authors did not say that Obama was a Muslim. They didn’t have to. They knew that many of their readers would draw that conclusion from these not-so-subtle hints, just as they knew their target audience would be repelled by the thought of a non-Christian president. Apparently, Conservapedia is no longer in business, so rational people still have hope.

Post hoc

The full name for this fallacy is post hoc, ergo propter hoc (“after this, therefore because of this”). This is when you argue that since A occurred prior to B, A must have caused B. To see why this can be a problem, compare the following syllogisms, both of which are valid:

Premise 1: If the dam has broken, then the city will flood.
Premise 2: The dam has broken.
Therefore, the city will flood.

Premise 1: If today is Monday, then tomorrow will be Tuesday.
Premise 2: Today is Monday.
Therefore, tomorrow will be Tuesday.

Although these syllogisms are of the same form (modus ponens), there is an obvious difference in that in the first one we understand that the breaching of the dam causes the city to flood. On the other hand, it sounds quite odd to say that Monday somehow causes Tuesday, even if Tuesday always follows Monday.

Causality is a complex problem that often entails careful study of many variables. If some event A happens before event B, it is not necessarily true that A caused B. The sequence could be coincidental. It could be that some other event caused both, or that A causes B only in conjunction with some other event or events. The post hoc fallacy is especially common in narratives about the alleged health benefits of so-called alternative medicines. Have you heard the joke about the man who caught the 24-hour flu, took some magical herbs, and felt much better the next day? Check out this comment from a seemingly satisfied consumer of a homeopathic treatment for flu and cold symptoms:

I am writing in to let you guys know how much FluGo has helped me. I had a big ice hockey tournament coming up. Last week I woke up with a running nose and I couldn’t stop sneezing, which quickly progressed by mid-morning. My mother uses your ImmunityPlus, and I quickly ordered FluGo online –– all I can say is WOW. My symptoms improved and I could feel my body fighting back. I was able to take part in the tournament and I wasn’t groggy or ‘jittery’ like when I take over–the–counter meds. Thanks so much.

Most of the maladies for which we seek treatment will go away by themselves. The common cold usually lasts about a week. This customer apparently felt symptoms one day and ordered the meds the next day. A safe guess is that they arrived in the mail about three to four days later. Therefore, five or six days likely passed between onset of symptoms and arrival of medicine. The customer then takes the medicine and sure enough, the symptoms soon disappear. It is quite possible that all of this would have happened without treatment. No side effects like grogginess or jitters? Maybe that’s because the product doesn’t do anything at all.

Quote-mining

There is a variant of the straw man argument (see below) called quote-mining. This is when you quote someone’s words out of context, carefully edit quotes to make it appear that someone who disagrees with you actually agrees with you, partially edit someone’s quotes to make that person appear foolish, or whatever. In 2017, the pro-creationist advocacy organization Answers in Genesis (AiG) published an article about the most famous example of quote-mining ever: the passage in the Bible (Psalm 14:1) that says “There is no God.” It really does say that. Check it out. Then read the verse in its entirety. That’s quote-mining.

Ironically, AiG was itself guilty of the second-most famous example ever of quote-mining, when it commented on this sentence from Charles Darwin’s Origin of Species:

To suppose that the eye with all its inimitable contrivances for adjusting the focus to different distances, for admitting different amounts of light, and for the correction of spherical and chromatic aberration, could have been formed by natural selection, seems, I freely confess, absurd in the highest degree.

AiG would have you believe that Darwin is confessing that his own theory of natural selection is absurd. But in the next paragraph (6th edition), he points out that when “it was first said that the sun stood still and the world turned round, the common sense of mankind declared the doctrine false.” Darwin adds that the eye’s astonishing complexity, “insuperable by our imagination, should not be considered as subversive of the theory.”  When you read the passage in its entirety, it is clear that Darwin’s argument was exactly the opposite of what AiG said it was.

Shotgunning

There is a valid inference in propositional logic called disjunction introduction that works like this:

P
∴ P ∨ Q

Recall that inclusionary disjunction means that at least one proposition in a premise is true. If you already know that P is true, then obviously you know that either P or Q is true. Or that P or Q or R is true, and so on. Thus from P alone you can validly deduce the complex proposition ((P ∨ Q) ∨ (R ∨ S)). People who promote nonsense sometimes use disjunctive propositions in similar ways. This trick is called shotgunning.
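You can verify disjunction introduction by brute force; adding disjuncts can never turn a true proposition false, which is exactly what makes shotgunning rhetorically cheap. A minimal check in Python:

```python
from itertools import product

# If P is true, then ((P v Q) v (R v S)) is true no matter
# what truth values Q, R, and S take.
P = True
for Q, R, S in product((True, False), repeat=3):
    assert (P or Q) or (R or S)
print("P alone guarantees the whole disjunction")
```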

When I was a child, my friends and I enjoyed a pastime that involved gravel rather than arguments. We would find a bottle or tin can on the side of a road out in the country, set it up on a fence post, then take a few steps back, and chuck rocks at it. Invariably we grew impatient with the misses, and just picked up a handful of gravel and flung it in the direction of the target, hoping that at least one rock would hit. The tactic usually got the job done, but it was poor testament to our aim.

Remember the special ops team at PsiTech and its psychic investigations into the D.C. Sniper spree? PsiTech actually made 27 separate predictions. Some were so obvious that no rational person, psychic or not, would think otherwise. They said the assailant “drives frequently,” which would rule out anyone moving around the D.C. area on foot, on bicycle, or on mass transit with a high-powered rifle. Some were ambiguous. PsiTech said the assailant “needs a cause” and is not “all there.” Some were unprovable. The assailant “uses talcum powder.”

PsiTech also double-dipped, which is when psychics either make one and the same prediction twice with some slight rewording so that they can claim two hits in case they are right, or else predict that something will and will not happen, thus guaranteeing a hit. For example, they predicted the assailant was “slow,” and that “reasoning is difficult for him.” Those expressions mean the same thing.

The one thing PsiTech didn’t note was that the assailant had an accomplice. That’s a pretty big miss for a time-and-space traveling team. PsiTech threw out so many predictions that you would expect at least one or two would be on the mark. Yet not a single one pointed to a perpetrator.

Criminal defense lawyers are notorious for shotgunning. Maybe my client wasn’t at the crime scene. Maybe the eyewitnesses were hallucinating. Maybe the crime lab botched their DNA testing. Maybe my client lent his shoes to someone, and that’s why their prints are at the crime scene. Maybe, maybe, maybe, maybe. It’s an effective tactic in a courtroom because a lawyer does not have to prove all possibilities to all jurors. If just one maybe resonates with one juror to the point of reasonable doubt, then the trick will have served its purpose. The bottom line is that ten weak arguments do not trump one good one.

Slippery slope

In my discussion of deduction above I used the familiar scheme

P → Q
P
∴ Q

to represent syllogisms like “If today is Friday, then John is in his office. Today is Friday. Therefore, John is in his office.” It is easy to construct syllogisms with a chain of “if…then” premises, as in:

Premise 1: P → Q
Premise 2: Q → R
Premise 3: R → S
Premise 4: S → T
Premise 5: P
∴ T (by skipping over all those intermediate steps)
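The chain itself is deductively valid, since each conditional hands its consequent to the next; a brute-force check over all truth assignments confirms it. A quick sketch in Python (the slippery slope goes wrong not in the form, but in the dubious conditional premises):

```python
from itertools import product

# Material implication: p -> q is false only when p is true and q is false.
def implies(p, q):
    return (not p) or q

# The argument is valid iff the conclusion T holds under every
# assignment that makes all five premises true.
valid = all(
    implies(implies(P, Q) and implies(Q, R) and implies(R, S)
            and implies(S, T) and P,
            T)
    for P, Q, R, S, T in product((True, False), repeat=5)
)
print(valid)  # True -- the form is valid; the premises are where slopes slip
```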

Sometimes people will use arguments like these in which P represents something bad, and the subsequent premises ratchet up the badness at every step. You can see an example in an article by Henry Morris of the Institute for Creation Research:

As evolutionism has become the dominant teaching in our schools and colleges, those evil doctrines and practices whose rationale is based on evolution have inevitably followed. In terms of its impact on society especially a society once founded on principles of Biblical morality as ours was, evolutionism has indeed evolved into evil-utionism (as the British pronounce it!)

What are the “evil doctrines and practices” in question? Morris mentions homosexuality, sexual promiscuity, AIDS, abortion, rape, communism, Nazism, racism, imperialism, and “human greed in general.” In other words, if you teach standard biology in schools, the whole country will go to hell in a handcart. It’s an effective scare tactic for some, but it has nothing to do with the evidence behind evolution.

Special pleading

Special pleading is when someone proposes an exemption from the standards that apply to everyone else. In medical research, the gold standard for determining the safety and efficacy of drugs and other treatments is randomized controlled testing, or RCT, in which experimental subjects are randomly assigned to a treatment group that gets a dose of medicine, to a placebo group that gets some inert substance, or to a control group that gets nothing at all.

Homeopathy has fared none too well in RCTs. An advocacy group called the European Committee for Homeopathy has come up with an imaginative explanation for this. In homeopathy, the ECH says, treatment is usually tailored to the individual. Therefore:

The efficacy of an individualised homeopathic intervention is thus a complex blend of the prescribed medicine together with the other facets of the in-depth consultation and integrated health advice provided by the practitioner; under these circumstances, the specific effect of the homeopathic medicine itself may be difficult to quantify with precision in RCTs.

In other words, we’re special, so we don’t have to do rigorous scientific testing. The same exemption could be invoked by anyone practicing any sort of sham medicine.

Straw man

A straw man argument is one that distorts, exaggerates, or simplifies a claim in order to refute it. The name comes from a metaphor: a straw man is easier to beat up than a real man. One of the most memorable examples this century came from former vice presidential candidate Sarah Palin:

The America I know and love is not one in which my parents or my baby with Down Syndrome will have to stand in front of Obama’s “death panel” so his bureaucrats can decide, based on a subjective judgment of their “level of productivity” in society, whether they are worthy of health care. Such a system is downright evil.

Public debate over any policy is indispensable in an open society. Comments like these are not helpful, however, because there was nothing even remotely resembling a “death panel” in Obama’s health care plan. But it sure is easier to argue against an imaginary plan with a death panel than it is to argue against an actual plan without one.

Of course it’s not just Republicans who try to refute frightening arguments that were never made in the first place. In a 2018 interview with the Fort Worth Star-Telegram, Lupe Valdez, the Sheriff of Dallas County and a Democratic candidate for governor of Texas, offered the following argument against the NRA’s long-held opposition to universal background checks on those who want to buy firearms:

There are some counties in Texas where the first day of hunting, you have to shut down the schools because people are going to go hunting.

The Sheriff’s point is not entirely clear, but she was obviously warning us about the downside of America’s gun culture. Trouble is, she was arguing against a policy that doesn’t exist. Texas schools do not shut down when hunting season begins. Texas has 254 counties, each with its own regulations on hunting. Those regs vary according to prey, so there isn’t really any “first day of hunting” anyway.

Tu quoque

Literally “you as well,” tu quoque is a way of deflecting criticism from yourself by accusing others of having the same faults you have. The Institute for Historical Review is a historical revisionist organization devoted to convincing people that the Jewish Holocaust of the Second World War never happened. One of its most prominent advocates is Mark Weber, who made this comment in a paper titled The Japanese Camps in California:

The post-war mass media has spent years hammering away at the “guilt” of the German people for generally doing nothing while the Jews were being evacuated to the East. How does the German experience compare with the American record of popular enthusiasm for evacuating the West Coast Japanese?
The German defendants at Nuremberg were declared guilty of “crimes against humanity” for, among other things, victimizing members of a group on the basis of ancestry. What responsibility did the countries, including the United States, which set up the International Military Tribunal have in upholding that principle in their own territories?

Weber is comparing the internment of American citizens of Japanese descent to the internment of Jews in about 40,000 prisons during the Second World War. It is true that Americans relocated and detained Japanese-Americans on our own soil after the attack on Pearl Harbor. This was truly a shameful episode in our history. President Reagan signed the Civil Liberties Act in 1988 to compensate the victims and their descendants. President George H. W. Bush formally apologized on behalf of the USA on October 9, 1991.

Even so, Weber’s comparison between the Nazi concentration camps and American internment camps is wrong for many reasons. First, there is the sheer magnitude. Nazi Germany killed about six million Jews during the Holocaust. About 100,000 Japanese-Americans were held in camps. Second, the living conditions in Nazi death camps were horrific almost beyond imagination. Japanese internees were generally well-treated. They suffered no malnutrition and no outbreaks of disease. Children in the camps had schools. Third, Nazis willfully murdered prisoners. Americans did not.

And finally, America’s violation of the civil rights of Japanese-Americans does not disprove, forgive, or even mitigate Nazi genocide. Two wrongs don’t make a right.

When all else fails

There is one other very effective, tried-and-true trick used by people who promote nonsense. It’s called “not telling the truth.” This has to be said circumspectly, because all of us are at times fallible, forgetful, distracted, misinformed, or just plain wrong for whatever reason. There is no shame in any of that.

But there is a vast wasteland of nonsense that begins on the edge of benign error and ends with pure, shameless dishonesty. There are those who are guilty of willful ignorance in that they assiduously avoid information they suspect might contradict a cherished worldview. And there are others who just make stuff up. There are no logical nuances to explain in those cases. It is often not clear whether someone is not telling the truth out of ignorance, self-delusion, or calculated deceit. But some people really are just flat-out dishonest, and there is no more to be said about it. This fallacy is sometimes called “lying.”

So look out. Don’t let them trick you.


P.S.: I found the koan at the top of this page in Zen Buddhism, from the Peter Pauper Press, 1959, 50. The book is apparently no longer in circulation.