All Lessons

Several of these have names in Latin, but I mostly ignored that and used English. If anyone is bothered by my using “he” everywhere, note that “he” is the person arguing fallaciously.

  • Ad Hominem (Argument To The Man)
  • Affirming The Consequent
  • Amazing Familiarity
  • Ambiguous Assertion
  • Appeal To Anonymous Authority
  • Appeal To Authority
  • Appeal To Coincidence
  • Appeal To Complexity
  • Appeal To False Authority
  • Appeal To Force
  • Appeal To Pity (Appeal to Sympathy, The Galileo Argument)
  • Appeal To Widespread Belief (Bandwagon Argument, Peer Pressure, Appeal To Common Practice, Argumentum ad Populum)
  • Argument By Dismissal
  • Argument By Emotive Language (Appeal To The People)
  • Argument By Fast Talking
  • Argument By Generalization
  • Argument By Gibberish (Bafflement)
  • Argument By Half Truth (Suppressed Evidence)
  • Argument By Laziness (Argument By Uninformed Opinion)
  • Argument By Personal Charm
  • Argument By Pigheadedness (Doggedness)
  • Argument By Poetic Language
  • Argument By Prestigious Jargon
  • Argument By Question
  • Argument By Repetition (Argument Ad Nauseam)
  • Argument by Rhetorical Question
  • Argument By Scenario
  • Argument By Selective Observation
  • Argument By Selective Reading
  • Argument By Slogan
  • Argument By Vehemence
  • Argument From Adverse Consequences (Appeal To Fear, Scare Tactics)
  • Argument From Age (Wisdom of the Ancients)
  • Argument From Authority
  • Argument From False Authority
  • Argument From Personal Astonishment
  • Argument From Small Numbers
  • Argument From Spurious Similarity
  • Argument Of The Beard
  • Argument To The Future
  • Bad Analogy
  • Begging The Question (Assuming The Answer, Tautology)
  • Burden Of Proof
  • Causal Reductionism (Complex Cause)
  • Contrarian Argument
  • Changing The Subject (Digression, Red Herring, Misdirection, False Emphasis)
  • Cliche Thinking
  • Common Sense
  • Complex Question (Tying)
  • Confusing Correlation And Causation
  • Disproof By Fallacy
  • Equivocation
  • Error Of Fact
  • Euphemism
  • Exception That Proves The Rule
  • Excluded Middle (False Dichotomy, Faulty Dilemma, Bifurcation)
  • Extended Analogy
  • Failure To State
  • Fallacy Of Composition
  • Fallacy Of Division
  • Fallacy Of The General Rule
  • Fallacy Of The Crucial Experiment
  • False Cause
  • False Compromise
  • Genetic Fallacy (Fallacy of Origins, Fallacy of Virtue)
  • Having Your Cake (Failure To Assert, or Diminished Claim)
  • Hypothesis Contrary To Fact
  • Inconsistency
  • Inflation Of Conflict
  • Internal Contradiction
  • Least Plausible Hypothesis
  • Lies
  • Meaningless Questions
  • Misunderstanding The Nature Of Statistics (Innumeracy)
  • Moving The Goalposts (Raising The Bar, Argument By Demanding Impossible Perfection)
  • Needling
  • Non Sequitur
  • Not Invented Here
  • Outdated Information
  • Pious Fraud
  • Poisoning The Wells
  • Psychogenetic Fallacy
  • Reductio Ad Absurdum
  • Reductive Fallacy (Oversimplification)
  • Reifying
  • Short Term Versus Long Term
  • Slippery Slope Fallacy (Camel’s Nose)
  • Special Pleading (Stacking The Deck)
  • Statement Of Conversion
  • Stolen Concept
  • Straw Man (Fallacy Of Extension)
  • Two Wrongs Make A Right (Tu Quoque, You Too)
  • Weasel Wording

LiteratureReviewHQ interviewed me about this page, and have a podcast.


  • Ad Hominem (Argument To The Man):

    attacking the person instead of attacking his argument. For example, “Von Daniken’s books about ancient astronauts are worthless because he is a convicted forger and embezzler.” (Which is true, but that’s not why they’re worthless.) Another example is this syllogism, which alludes to Alan Turing’s homosexuality: Turing thinks machines think. Turing lies with men. Therefore, machines don’t think. (Note the equivocation in the use of the word “lies”.)

    A common form is an attack on sincerity. For example, “How can you argue for vegetarianism when you wear leather shoes ?” The Two Wrongs Make A Right fallacy is related.

    A variation (related to Argument By Generalization) is to attack a whole class of people. For example, “Evolutionary biology is a sinister tool of the materialistic, atheistic religion of Secular Humanism.” Similarly, one notorious net.kook waved away a whole category of evidence by announcing “All the scientists were drunk.”

    Another variation is attack by innuendo: “Why don’t scientists tell us what they really know; are they afraid of public panic ?” There may be a pretense that the attack isn’t happening: “In order to maintain a civil debate, I will not mention my opponent’s drinking problem.” Or, “I don’t care if other people say you’re [opinionated/boring/overbearing].”

    Attacks don’t have to be strong or direct. You can merely show disrespect, or cut down his stature by saying that he seems to be sweating a lot, or that he has forgotten what he said last week. Some examples: “I used to think that way when I was your age.” “You’re new here, aren’t you ?” “You weren’t breast fed as a child, were you ?” “What drives you to make such a statement ?” “If you’d just listen..” “You seem very emotional.” (This last works well if you have been hogging the microphone, so that they have had to yell to be heard.)

    Sometimes the attack is on the other person’s intelligence. For example, “If you weren’t so stupid you would have no problem seeing my point of view.” Or, “Even you should understand my next point.” Oddly, the stupidity attack is sometimes reversed. For example, dismissing a comment with “Well, you’re just smarter than the rest of us.” (In Britain, that might be put as “too clever by half”.) This is Dismissal By Differentness. It is related to Not Invented Here and Changing The Subject.

    Ad Hominem is not fallacious if the attack goes to the credibility of the argument. For instance, the argument may depend on its presenter’s claim that he’s an expert. (That is, the Ad Hominem is undermining an Argument From Authority.) Trial judges allow this category of attacks.

  • Straw Man (Fallacy Of Extension):
    attacking an exaggerated or caricatured version of your opponent’s position. For example, claiming that anyone who does not support funding for a particular attack submarine program must want to leave us defenseless.

    • Inflation Of Conflict:

      arguing that scholars debate a certain point. Therefore, they must know nothing, and their entire field of knowledge is “in crisis” or does not properly exist at all. For example, two historians debated whether Hitler killed five million Jews or six million Jews. A Holocaust denier argued that this disagreement made his claim credible, even though his death count is three to ten times smaller than the known minimum. Similarly, in “The Mythology of Modern Dating Methods” (John Woodmorappe, 1999) we find on page 42 that two scientists “cannot agree” about which one of two geological dates is “real” and which one is “spurious”. Woodmorappe fails to mention that the two dates differ by less than one percent.

            • Argument From Adverse Consequences (Appeal To Fear, Scare Tactics):

              saying an opponent must be wrong, because if he is right, then bad things would ensue. For example: God must exist, because a godless society would be lawless and dangerous. Or: the defendant in a murder trial must be found guilty, because otherwise husbands will be encouraged to murder their wives. Wishful thinking is closely related. “My home in Florida is one foot above sea level. Therefore I am certain that global warming will not make the oceans rise by fifteen feet.” Of course, wishful thinking can also be about positive consequences, such as winning the lottery, or eliminating poverty and crime.

            • Special Pleading (Stacking The Deck):

              using the arguments that support your position, but ignoring or somehow disallowing the arguments against. Uri Geller used special pleading when he claimed that the presence of unbelievers (such as stage magicians) made him unable to demonstrate his psychic powers.

            • Excluded Middle (False Dichotomy, Faulty Dilemma, Bifurcation):

              assuming there are only two alternatives when in fact there are more. For example, assuming Atheism is the only alternative to Fundamentalism, or being a traitor is the only alternative to being a loud patriot.

            • Short Term Versus Long Term:

              this is a particular case of the Excluded Middle. For example, “We must deal with crime on the streets before improving the schools.” (But why can’t we do some of both ?) Similarly, “We should take the scientific research budget and use it to feed starving children.”

            • Burden Of Proof:

              the claim that whatever has not yet been proved false must be true (or vice versa). Essentially the arguer claims that he should win by default if his opponent can’t make a strong enough case. There may be three problems here. First, the arguer claims priority, but can he back up that claim ? Second, he is impatient with ambiguity, and wants a final answer right away. And third, “absence of evidence is not evidence of absence.”

            • Argument By Question:

              asking your opponent a question which does not have a snappy answer. (Or anyway, no snappy answer that the audience has the background to understand.) Your opponent has a choice: he can look weak or he can look long-winded. For example, “How can scientists expect us to believe that anything as complex as a single living cell could have arisen as a result of random natural processes ?” Actually, pretty well any question has this effect to some extent. It usually takes longer to answer a question than ask it. Variants are the rhetorical question, and the loaded question, such as “Have you stopped beating your wife ?”

            • Argument by Rhetorical Question:

              asking a question in a way that leads to a particular answer. For example, “When are we going to give the old folks of this country the pension they deserve ?” The speaker is leading the audience to the answer “Right now.” Alternatively, he could have said “When will we be able to afford a major increase in old age pensions?” In that case, the answer he is aiming at is almost certainly not “Right now.”

            • Fallacy Of The General Rule:

              assuming that something true in general is true in every possible case. For example, “All chairs have four legs.” Except that rocking chairs don’t have any legs, and what is a one-legged “shooting stick” if it isn’t a chair ? Similarly, there are times when certain laws should be broken. For example, ambulances are allowed to break speed laws.

            • Reductive Fallacy (Oversimplification):

              over-simplifying. As Einstein said, everything should be made as simple as possible, but no simpler. Political slogans such as “Taxation is theft” fall in this category.

            • Genetic Fallacy (Fallacy of Origins, Fallacy of Virtue):

              if an argument or arguer has some particular origin, the argument must be right (or wrong). The idea is that things from that origin, or that social class, have virtue or lack virtue. (Being poor or being rich may be held out as being virtuous.) Therefore, the actual details of the argument can be overlooked, since correctness can be decided without any need to listen or think.

            • Psychogenetic Fallacy:

              if you learn the psychological reason why your opponent likes an argument, then he’s biased, so his argument must be wrong.

            • Argument Of The Beard:

              assuming that two ends of a spectrum are the same, since one can travel along the spectrum in very small steps. The name comes from the idea that being clean-shaven must be the same as having a big beard, since in-between beards exist. Similarly, all piles of stones are small, since if you add one stone to a small pile of stones it remains small. However, the existence of pink should not undermine the distinction between white and red.

            • Argument From Age (Wisdom of the Ancients):

              snobbery that very old (or very young) arguments are superior. This is a variation of the Genetic Fallacy, but has the psychological appeal of seniority and tradition (or innovation). Products labelled “New ! Improved !” are appealing to a belief that innovation is of value for such products. It’s sometimes true. And then there’s cans of “Old Fashioned Baked Beans”.

            • Not Invented Here:

              ideas from elsewhere are made unwelcome. “This Is The Way We’ve Always Done It.” This fallacy is a variant of the Argument From Age. It gets a psychological boost from feelings that local ways are superior, or that local identity is worth any cost, or that innovations will upset matters. An example of this is the common assertion that America has “the best health care system in the world”, an idea that a 2007 New York Times editorial refuted. People who use the Not Invented Here argument are sometimes accused of being stick-in-the-muds. Conversely, foreign and “imported” things may be held out as superior.

            • Argument By Dismissal:

              an idea is rejected without saying why. Dismissals usually have overtones. For example, “If you don’t like it, leave the country” implies that your cause is hopeless, or that you are unpatriotic, or that your ideas are foreign, or maybe all three. “If you don’t like it, live in a Communist country” adds an emotive element.

            • Argument To The Future:

              arguing that evidence will someday be discovered which will (then) support your point.

            • Poisoning The Wells:

              discrediting the sources used by your opponent. This is a variation of Ad Hominem.

            • Argument By Emotive Language (Appeal To The People):

              using emotionally loaded words to sway the audience’s sentiments instead of their minds. Many emotions can be useful: anger, spite, envy, condescension, and so on. For example, argument by condescension: “Support the ERA ? Sure, when the women start paying for the drinks! Hah! Hah!” Americans who don’t like the Canadian medical system have referred to it as “socialist”, but I’m not quite sure if this is intended to mean “foreign”, or “expensive”, or simply guilty by association. Cliche Thinking and Argument By Slogan are useful adjuncts, particularly if you can get the audience to chant the slogan. People who rely on this argument may seed the audience with supporters or “shills”, who laugh, applaud or chant at proper moments. This is the live-audience equivalent of adding a laugh track or music track. Now that many venues have video equipment, some speakers give part of their speech by playing a prepared video. These videos are an opportunity to show a supportive audience, use emotional music, show emotionally charged images, and the like. The idea is old: there used to be professional cheering sections. (Monsieur Zig-Zag, pictured on the cigarette rolling papers, acquired his fame by applauding for money at the Paris Opera.) If the emotion in question isn’t harsh, Argument By Poetic Language helps the effect. Flattering the audience doesn’t hurt either.

            • Argument By Personal Charm:

              getting the audience to cut you slack. Example: Ronald Reagan. It helps if you have an opponent with much less personal charm. Charm may create trust, or the desire to “join the winning team”, or the desire to please the speaker. This last is greatest if the audience feels sex appeal. Reportedly George W. Bush lost a debate when he was young, and said later that he would never be “out-bubba’d” again.

            • Appeal To Pity (Appeal to Sympathy, The Galileo Argument):

              “I did not murder my mother and father with an axe ! Please don’t find me guilty; I’m suffering enough through being an orphan.” Some authors want you to know they’re suffering for their beliefs. For example, “Scientists scoffed at Copernicus and Galileo; they laughed at Edison, Tesla and Marconi; they won’t give my ideas a fair hearing either. But time will be the judge. I can wait; I am patient; sooner or later science will be forced to admit that all matter is built, not of atoms, but of tiny capsules of TIME.” There is a strange variant which shows up on Usenet. Somebody refuses to answer questions about their claims, on the grounds that the asker is mean and has hurt their feelings. Or, that the question is personal.

            • Appeal To Force:

              threats, or even violence. On the Net, the usual threat is of a lawsuit. The traditional religious threat is that one will burn in Hell. However, history is full of instances where expressing an unpopular idea could get you beaten up on the spot, or worse.

              “The clinching proof of my reasoning is that I will cut anyone who argues further into dogmeat.” — Attributed to Sir Geoffery de Tourneville, ca 1350 A.D.

            • Argument By Vehemence:

              being loud. Trial lawyers are taught this rule:

              If you have the facts, pound on the facts. If you have the law, pound on the law. If you don’t have either, pound on the table.

The above rule paints vehemence as an act of desperation. But it can also be a way to seize control of the agenda, use up the opponent’s time, or just intimidate the easily cowed. And it’s not necessarily aimed at winning the day. A tantrum or a fit is also a way to get a reputation, so that in the future, no one will mess with you. This is related to putting a post in UPPERCASE, aka SHOUTING. Depending on what you’re loud about, this may also be an Appeal To Force, Argument By Emotive Language, Needling, or Changing The Subject.

            • Begging The Question (Assuming The Answer, Tautology):

              reasoning in a circle. The thing to be proved is used as one of your assumptions. For example: “We must have a death penalty to discourage violent crime”. (This assumes it discourages crime.) Or, “The stock market fell because of a technical adjustment.” (But is an “adjustment” just a stock market fall ?)

            • Stolen Concept:

              using what you are trying to disprove. That is, requiring the truth of something for your proof that it is false. For example, using science to show that science is wrong. Or, arguing that you do not exist, when your existence is clearly required for you to be making the argument. This is a relative of Begging The Question, except that the circularity there is in what you are trying to prove, instead of what you are trying to disprove. It is also a relative of Reductio Ad Absurdum, where you temporarily assume the truth of something.

            • Argument From Authority:

              the claim that the speaker is an expert, and so should be trusted. There are degrees and areas of expertise. The speaker is actually claiming to be more expert, in the relevant subject area, than anyone else in the room. There is also an implied claim that expertise in the area is worth having. For example, claiming expertise in something hopelessly quack (like iridology) is actually an admission that the speaker is gullible.

            • Argument From False Authority:

              a strange variation on Argument From Authority. For example, the TV commercial which starts “I’m not a doctor, but I play one on TV.” Just what are we supposed to conclude ?

            • Appeal To Anonymous Authority:

              an Appeal To Authority is made, but the authority is not named. For example, “Experts agree that ..”, “scientists say ..” or even “they say ..”. This makes the information impossible to verify, and brings up the very real possibility that the arguer himself doesn’t know who the experts are. In that case, he may just be spreading a rumor. The situation is even worse if the arguer admits it’s a rumor.

            • Appeal To Authority:

              “Albert Einstein was extremely impressed with this theory.” (But a statement made by someone long-dead could be out of date. Or perhaps Einstein was just being polite. Or perhaps he made his statement in some specific context. And so on.) To justify an appeal, the arguer should at least present an exact quote. It’s more convincing if the quote contains context, and if the arguer can say where the quote comes from. A variation is to appeal to unnamed authorities. There was a New Yorker cartoon, showing a doctor and patient. The doctor was saying: “Conventional medicine has no treatment for your condition. Luckily for you, I’m a quack.” So the joke was that the doctor boasted of his lack of authority.

            • Appeal To False Authority:

              a variation on Appeal To Authority, but the Authority is outside his area of expertise. For example, “Famous physicist John Taylor studied Uri Geller extensively and found no evidence of trickery or fraud in his feats.” Taylor was not qualified to detect trickery or fraud of the kind used by stage magicians. Taylor later admitted Geller had tricked him, but he apparently had not figured out how. A variation is to appeal to a non-existent authority. For example, someone reading an article by Creationist Dmitri Kuznetsov tried to look up the referenced articles. Some of the articles turned out to be in non-existent journals. Another variation is to misquote a real authority. There are several kinds of misquotation. A quote can be inexact or have been edited. It can be taken out of context. (Chevy Chase: “Yes, I said that, but I was singing a song written by someone else at the time.”) The quote can be separate quotes which the arguer glued together. Or, bits might have gone missing. For example, it’s easy to prove that Mick Jagger is an assassin. In “Sympathy For The Devil” he sang: “I shouted out, who killed the Kennedys, When after all, it was … me.”

            • Statement Of Conversion:

              the speaker says “I used to believe in X”. This is simply a weak form of asserting expertise. The speaker is implying that he has learned about the subject, and now that he is better informed, he has rejected X. So perhaps he is now an authority, and this is an implied Argument From Authority. A more irritating version of this is “I used to think that way when I was your age.” The speaker hasn’t said what is wrong with your argument: he is merely claiming that his age has made him an expert. “X” has not actually been countered unless there is agreement that the speaker has that expertise. In general, any bald claim always has to be buttressed. For example, there are a number of Creationist authors who say they “used to be evolutionists”, but the scientists who have rated their books haven’t noticed any expertise about evolution.

            • Bad Analogy:

              claiming that two situations are highly similar, when they aren’t. For example, “The solar system reminds me of an atom, with planets orbiting the sun like electrons orbiting the nucleus. We know that electrons can jump from orbit to orbit; so we must look to ancient records for sightings of planets jumping from orbit to orbit also.” Or, “Minds, like rivers, can be broad. The broader the river, the shallower it is. Therefore, the broader the mind, the shallower it is.” Or, “We have pure food and drug laws; why can’t we have laws to keep movie-makers from giving us filth ?”

            • Extended Analogy:

              the claim that two things, both analogous to a third thing, are therefore analogous to each other. For example, this debate:

              “I believe it is always wrong to oppose the law by breaking it.” “Such a position is odious: it implies that you would not have supported Martin Luther King.” “Are you saying that cryptography legislation is as important as the struggle for Black liberation ? How dare you !”

A person who advocates a particular position (say, about gun control) may be told that Hitler believed the same thing. The clear implication is that the position is somehow tainted. But Hitler also believed that window drapes should go all the way to the floor. Does that mean people with such drapes are monsters ?

            • Argument From Spurious Similarity:

              this is a relative of Bad Analogy. It is suggested that some resemblance is proof of a relationship. There is a WW II story about a British lady who was trained in spotting German airplanes. She made a report about a certain very important type of plane. While being quizzed, she explained that she hadn’t been sure, herself, until she noticed that it had a little man in the cockpit, just like the little model airplane at the training class.

            • Reifying:

              an abstract thing is talked about as if it were concrete. (A possibly Bad Analogy is being made between concept and reality.) For example, “Nature abhors a vacuum.”

            • False Cause:

              assuming that because two things happened, the first one caused the second one. (Sequence is not causation.) For example, “Before women got the vote, there were no nuclear weapons.” Or, “Every time my brother Bill accompanies me to Fenway Park, the Red Sox are sure to lose.” Essentially, these are arguments that the sun goes down because we’ve turned on the street lights.

            • Confusing Correlation And Causation:

              earthquakes in the Andes were correlated with the closest approaches of the planet Uranus. Therefore, Uranus must have caused them. (But Jupiter is nearer than Uranus, and more massive too.) When sales of hot chocolate go up, street crime drops. Does this correlation mean that hot chocolate prevents crime ? No, it means that fewer people are on the streets when the weather is cold. The bigger a child’s shoe size, the better the child’s handwriting. Does having big feet make it easier to write ? No, it means the child is older.
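
One way to see how a lurking common cause can manufacture a correlation is to simulate one. The short Python sketch below uses toy numbers of my own (not from any study): “cold weather” pushes hot-chocolate sales up and street crime down, and the two series come out strongly correlated even though neither causes the other.

      # Toy sketch: a hidden common cause ("cold weather") drives both series,
      # producing a strong correlation between two things that do not cause
      # each other. All numbers are invented for illustration.
      import random
      import statistics   # statistics.correlation needs Python 3.10+

      random.seed(0)
      cold  = [random.uniform(0, 1) for _ in range(1000)]        # how cold each day is
      cocoa = [10 + 20 * c + random.gauss(0, 2) for c in cold]   # sales rise with cold
      crime = [50 - 30 * c + random.gauss(0, 2) for c in cold]   # street crime falls with cold

      r = statistics.correlation(cocoa, crime)
      print(f"correlation(cocoa sales, street crime) = {r:.2f}")  # strongly negative
      # The correlation is real, but the cocoa prevents nothing: the whole
      # relationship is carried by the weather variable.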

            • Causal Reductionism (Complex Cause):

              trying to use one cause to explain something, when in fact it had several causes. For example, “The accident was caused by the taxi parking in the street.” (But other drivers went around the taxi. Only the drunk driver hit the taxi.)

            • Cliche Thinking:

              using as evidence a well-known wise saying, as if that is proven, or as if it has no exceptions.

            • Exception That Proves The Rule:

              a specific example of Cliche Thinking. This is used when a rule has been asserted, and someone points out the rule doesn’t always work. The cliche rebuttal is that this is “the exception that proves the rule”. Many people think that this cliche somehow allows you to ignore the exception, and continue using the rule. In fact, the cliche originally did no such thing. There are two standard explanations for the original meaning. The first is that the word “prove” meant test. That is why the military takes its equipment to a Proving Ground to test it. So, the cliche originally said that an exception tests a rule. That is, if you find an exception to a rule, the cliche is saying that the rule is being tested, and perhaps the rule will need to be discarded. The second explanation is that the stating of an exception to a rule proves that the rule exists. For example, suppose it was announced that “Over the holiday weekend, students do not need to be in the dorms by midnight”. This announcement implies that normally students do have to be in by midnight. In either case, the cliche is not about waving away objections.

            • Appeal To Widespread Belief (Bandwagon Argument, Peer Pressure, Appeal to Common Practice):

              the claim, as evidence for an idea, that many people believe it, or used to believe it, or do it. If the discussion is about social conventions, such as “good manners”, then this is a reasonable line of argument. However, in the 1800’s there was a widespread belief that bloodletting cured sickness. All of these people were not just wrong, but horribly wrong, because in fact it made people sicker. Clearly, the popularity of an idea is no guarantee that it’s right. Similarly, a common justification for bribery is that “Everybody does it”. And in the past, this was a justification for slavery.

            • Fallacy Of Composition:

              assuming that a whole has the same simplicity as its constituent parts. In fact, a great deal of science is the study of emergent properties. For example, if you put a drop of oil on water, there are interesting optical effects. But the effect comes from the oil/water system: it does not come just from the oil or just from the water. Another example: “A car makes less pollution than a bus. Therefore, cars are less of a pollution problem than buses.” Another example: “Atoms are colorless. Cats are made of atoms, so cats are colorless.”
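
The arithmetic behind the car/bus example is worth spelling out. The occupancy and emission figures in this Python sketch are illustrative assumptions, not measurements; the point is only that a per-vehicle property need not carry over to the group as a whole.

      # Back-of-envelope sketch with assumed, illustrative numbers: each car
      # pollutes less than each bus, yet per person moved the cars pollute more,
      # because it takes many cars to carry one busload of people.
      car_emissions, bus_emissions   = 1.0, 4.0   # pollution per vehicle (arbitrary units)
      car_passengers, bus_passengers = 1.5, 40    # assumed typical occupancy

      per_rider_car = car_emissions / car_passengers
      per_rider_bus = bus_emissions / bus_passengers
      print(f"per rider: car {per_rider_car:.2f}, bus {per_rider_bus:.2f}")
      # car ~0.67 vs bus ~0.10: the property of each part does not transfer to the whole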

            • Fallacy Of Division:

              assuming that what is true of the whole is true of each constituent part. For example, human beings are made of atoms, and human beings are conscious, so atoms must be conscious.

            • Complex Question (Tying):

              unrelated points are treated as if they should be accepted or rejected together. In fact, each point should be accepted or rejected on its own merits. For example, “Do you support freedom and the right to bear arms ?”

            • Slippery Slope Fallacy (Camel’s Nose):

              there is an old saying about how if you allow a camel to poke his nose into the tent, soon the whole camel will follow. The fallacy here is the assumption that something is wrong because it is right next to something that is wrong. Or, it is wrong because it could slide towards something that is wrong. For example, “Allowing abortion in the first week of pregnancy would lead to allowing it in the ninth month.” Or, “If we legalize marijuana, then more people will try heroin.” Or, “If I make an exception for you then I’ll have to make an exception for everyone.”

            • Argument By Pigheadedness (Doggedness):

              refusing to accept something after everyone else thinks it is well enough proved. For example, there are still Flat Earthers.

            • Appeal To Coincidence:

              asserting that some fact is due to chance. For example, the arguer has had a dozen traffic accidents in six months, yet he insists they weren’t his fault. This may be Argument By Pigheadedness. But on the other hand, coincidences do happen, so this argument is not always fallacious.

            • Argument By Repetition (Argument Ad Nauseam):

              if you say something often enough, some people will begin to believe it. There are some net.kooks who keep reposting the same articles to Usenet, presumably in hopes it will have that effect.

            • Argument By Half Truth (Suppressed Evidence):

              this is hard to detect, of course. You have to ask questions. For example, an amazingly accurate “prophecy” of the assassination attempt on President Reagan was shown on TV. But was the tape recorded before or after the event ? Many stations did not ask this question. (It was recorded afterwards.) A book on “sea mysteries” or the “Bermuda Triangle” might tell us that the yacht Connemara IV was found drifting crewless, southeast of Bermuda, on September 26, 1955. None of these books mention that the yacht had been directly in the path of Hurricane Iona, with 180 mph winds and 40-foot waves.

            • Argument By Selective Observation:

              also called cherry picking, the enumeration of favorable circumstances, or as the philosopher Francis Bacon described it, counting the hits and forgetting the misses. For example, a state boasts of the Presidents it has produced, but is silent about its serial killers. Or, the claim “Technology brings happiness”. (Now, there’s something with hits and misses.) Casinos encourage this human tendency. There are bells and whistles to announce slot machine jackpots, but losing happens silently. This makes it much easier to think that the odds of winning are good.

            • Argument By Selective Reading:

              making it seem as if the weakest of an opponent’s arguments was the best he had. Suppose the opponent gave a strong argument X and also a weaker argument Y. Simply rebut Y and then say the opponent has made a weak case. This is a relative of Argument By Selective Observation, in that the arguer overlooks arguments that he does not like. It is also related to Straw Man (Fallacy Of Extension), in that the opponent’s argument is not being fairly represented.

            • Argument By Generalization:

              drawing a broad conclusion from a small number of perhaps unrepresentative cases. (The cases may be unrepresentative because of Selective Observation.) For example, “They say 1 out of every 5 people is Chinese. How is this possible ? I know hundreds of people, and none of them is Chinese.” So, by generalization, there aren’t any Chinese anywhere. This is connected to the Fallacy Of The General Rule. Similarly, “Because we allow terminally ill patients to use heroin, we should allow everyone to use heroin.” It is also possible to under-generalize. For example,

              “A man who had killed both of his grandmothers declared himself rehabilitated, on the grounds that he could not conceivably repeat his offense in the absence of any further grandmothers.” — “Ports Of Call” by Jack Vance

            • Argument From Small Numbers:

              “I’ve thrown three sevens in a row. Tonight I can’t lose.” This is Argument By Generalization, but it assumes that small numbers are the same as big numbers. (Three sevens is actually a common occurrence. Thirty three sevens is not.) Or: “After treatment with the drug, one-third of the mice were cured, one-third died, and the third mouse escaped.” Does this mean that if we treated a thousand mice, 333 would be cured ? Well, no.
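
The dice claim is easy to check with a little arithmetic. The Python sketch below assumes “a seven” means rolling a total of 7 with two fair dice (probability 1/6); that reading is my assumption, not something stated above.

      # Rough sketch (assumes "a seven" is a 7 with two fair dice, p = 1/6):
      # a run of three sevens is unremarkable, a run of thirty-three is not.
      p = 1.0 / 6.0
      print(f"P(3 sevens in a row)  = {p**3:.4f}")   # about 1 in 216
      print(f"P(33 sevens in a row) = {p**33:.1e}")  # roughly 2e-26
      # Over a few hundred throws, a streak of three is quite likely to happen
      # by chance, which is why it predicts nothing about the next roll.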

            • Misunderstanding The Nature Of Statistics (Innumeracy):

              President Dwight Eisenhower expressed astonishment and alarm on discovering that fully half of all Americans had below average intelligence. Similarly, some people get fearful when they learn that their doctor wasn’t in the top half of his class. (But that’s half of them.)

              “Statistics show that of those who contract the habit of eating, very few survive.” — Wallace Irwin.

Very few people seem to understand “regression to the mean”. This is the idea that things tend to go back to normal. If you feel normal today, does it really mean that the headache cure you took yesterday performed wonders ? Or is it just that your headaches are always gone the next day ? Journalists are notoriously bad at reporting risks. For example, in 1995 it was loudly reported that a class of contraceptive pills would double the chance of dangerous blood clots. The news stories mostly did not mention that “doubling” the risk only increased it by one person in 7,000. The “cell phones cause brain cancer” reports are even sillier, with the supposed increase in risk being at most one or two cancers per 100,000 people per year. So, if the fearmongers are right, your cellphone has increased your risk from “who cares” to “who cares”.
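
Regression to the mean is easy to demonstrate with a toy simulation. In the Python sketch below (my own construction, not from the article), the “headache cure” does literally nothing and the daily pain scores are pure noise, yet people who treat themselves on their worst days almost always feel better the next day.

      # Toy sketch of regression to the mean, with assumed numbers: daily pain
      # is random noise around a personal average. The "cure" is taken only on
      # the worst days, so the next day almost always looks better, even though
      # the cure does nothing at all.
      import random

      random.seed(1)
      days = [random.gauss(3, 1) for _ in range(10000)]   # pain score, average 3

      improved = treated = 0
      for today, tomorrow in zip(days, days[1:]):
          if today > 4.5:                 # bad enough to reach for the "cure"
              treated += 1
              improved += tomorrow < today
      print(f"'cured' by the next day: {improved / treated:.0%}")   # roughly 90% or more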

            • Inconsistency:

              for example, the declining life expectancy in the former Soviet Union is due to the failures of communism. But, the quite high infant mortality rate in the United States is not a failure of capitalism. This is related to Internal Contradiction.

            • Non Sequitur:

              something that just does not follow. For example, “Tens of thousands of Americans have seen lights in the night sky which they could not identify. The existence of life on other planets is fast becoming certainty !” Another example: arguing at length that your religion is of great help to many people. Then, concluding that the teachings of your religion are undoubtedly true. Or: “Bill lives in a large building, so his apartment must be large.”

            • Meaningless Questions:

              irresistible forces meeting immovable objects, and the like.

            • Argument By Poetic Language:

              if it sounds good, it must be right. Songs often use this effect to create a sort of credibility – for example, “Don’t Fear The Reaper” by Blue Oyster Cult. Politically oriented songs should be taken with a grain of salt, precisely because they sound good.

            • Argument By Slogan:

              if it’s short, and connects to an argument, it must be an argument. (But slogans risk the Reductive Fallacy.) Being short, a slogan increases the effectiveness of Argument By Repetition. It also helps Argument By Emotive Language (Appeal To The People), since emotional appeals need to be punchy. (Also, the gallery can chant a short slogan.) Using an old slogan is Cliche Thinking.

            • Argument By Prestigious Jargon:

              using big complicated words so that you will seem to be an expert. Why do people use “utilize” when they could utilize “use” ? For example, crackpots used to claim they had a Unified Field Theory (after Einstein). Then the word Quantum was popular. Lately it seems to be Zero Point Fields.

            • Argument By Gibberish (Bafflement):

              this is the extreme version of Argument By Prestigious Jargon. An invented vocabulary helps the effect, and some net.kooks use lots of CAPitaLIZation. However, perfectly ordinary words can be used to baffle. For example, “Omniscience is greater than omnipotence, and the difference is two. Omnipotence plus two equals omniscience. META = 2.” [From R. Buckminster Fuller’s No More Secondhand God.] Gibberish may come from people who can’t find meaning in technical jargon, so they think they should copy style instead of meaning. It can also be a “snow job”, AKA “baffle them with BS”, by someone actually familiar with the jargon. Or it could be Argument By Poetic Language. An example of poetic gibberish: “Each autonomous individual emerges holographically within egoless ontological consciousness as a non-dimensional geometric point within the transcendental thought-wave matrix.”

            • Equivocation:

              using a word to mean one thing, and then later using it to mean something different. For example, sometimes “Free software” costs nothing, and sometimes it is without restrictions. Some examples:

              “The sign said ‘fine for parking here’, and since it was fine, I parked there.” All trees have bark. All dogs bark. Therefore, all dogs are trees. “Consider that two wrongs never make a right, but that three lefts do.” – “Deteriorata”, National Lampoon

            • Euphemism:

              the use of words that sound better. The lab rat wasn’t killed, it was sacrificed. Mass murder wasn’t genocide, it was ethnic cleansing. The death of innocent bystanders is collateral damage. Microsoft doesn’t find bugs, or problems, or security vulnerabilities: they just discover an issue with a piece of software. This is related to Argument By Emotive Language, since the effect is to make a concept emotionally palatable.

            • Weasel Wording:

              this is very much like Euphemism, except that the word changes are done to claim a new, different concept rather than soften the old concept. For example, an American President may not legally conduct a war without a declaration of Congress. So, various Presidents have conducted “police actions”, “armed incursions”, “protective reaction strikes,” “pacification,” “safeguarding American interests,” and a wide variety of “operations”. Similarly, War Departments have become Departments of Defense, and untested medicines have become alternative medicines. The book “1984” has some particularly good examples.

            • Error Of Fact:

              for example, “No one knows how old the Pyramids of Egypt are.” (Except, of course, for the historians who’ve read records and letters written by the ancient Egyptians themselves.) Typically, the presence of one error means that there are other errors to be uncovered.

            • Argument From Personal Astonishment:

              Errors of Fact caused by stating offhand opinions as proven facts. (The speaker’s thought process being “I don’t see how this is possible, so it isn’t.”) Creationist arguments provide examples. This isn’t lying, quite. It just seems that way to people who know more about the subject than the speaker does.

            • Lies:

              intentional Errors of Fact. In some contexts this is called bluffing. If the speaker thinks that lying serves a moral end, this would be a Pious Fraud.

            • Contrarian Argument:

              in science, espousing some thing that the speaker knows is generally ill-regarded, or even generally held to be disproven. For example, claiming that HIV is not the cause of AIDS, or claiming that homeopathic remedies are not just placebos. In politics, the phrase may be used more broadly, to mean espousing some position that the establishment or opposition party does not hold. This is sometimes done to make people think, and sometimes it is needling, or perhaps it supports an external agenda. But it can also be done just to oppose conformity, or as a pose or style choice: to be a “maverick” or lightning rod. Or, perhaps just for the ego of standing alone:

              “It is not enough to succeed. Friends must be seen to have failed.” — Truman Capote “If you want to prove yourself a brilliant scientist, you don’t always agree with the consensus. You show you’re right and everyone else is wrong.” — Daniel Kirk-Davidoff discussing Richard Lindzen

Calling someone contrarian risks the Psychogenetic Fallacy. People who are annoying are not necessarily wrong. On the other hand, if the position is ill-regarded for a reason, then defending it may be uphill. Trolling is Contrarian Argument done to get a reaction. Trolling on the Internet often involves pretense.

            • Hypothesis Contrary To Fact:

              arguing from something that might have happened, but didn’t.

            • Internal Contradiction:

              saying two contradictory things in the same argument. For example, claiming that Archaeopteryx is a dinosaur with hoaxed feathers, and also saying in the same book that it is a “true bird”. Or another author who said on page 59, “Sir Arthur Conan Doyle writes in his autobiography that he never saw a ghost.” But on page 200 we find “Sir Arthur’s first encounter with a ghost came when he was 25, surgeon of a whaling ship in the Arctic..” This is much like saying “I never borrowed his car, and it already had that dent when I got it.” This is related to Inconsistency.

            • Changing The Subject (Digression, Red Herring, Misdirection, False Emphasis):

              this is sometimes used to avoid having to defend a claim, or to avoid making good on a promise. In general, there is something you are not supposed to notice. For example, I got a bill which had a big announcement about how some tax had gone up by 5%, and the costs would have to be passed on to me. But a quick calculation showed that the increased tax was only costing me a dime, while a different part of the bill had silently gone up by $10. This is connected to various diversionary tactics, which may be obstructive, obtuse, or needling. For example, if you quibble about the meaning of some word a person used, they may be quite happy about being corrected, since that means they’ve derailed you, or changed the subject. They may pick nits in your wording, perhaps asking you to define “is”. They may deliberately misunderstand you:

              “You said this happened five years before Hitler came to power. Why are you so fascinated with Hitler ? Are you anti-Semitic ?”

It is also connected to various rhetorical tricks, such as announcing that there cannot be a question period because the speaker must leave. (But then he doesn’t leave.)

            • Argument By Fast Talking:

              if you go from one idea to the next quickly enough, the audience won’t have time to think. This is connected to Changing The Subject and (to some audiences) Argument By Personal Charm. However, some psychologists say that to understand what you hear, you must for a brief moment believe it. If this is true, then rapid delivery does not leave people time to reject what they hear.

            • Having Your Cake (Failure To Assert, or Diminished Claim):

              almost claiming something, but backing out. For example, “It may be, as some suppose, that ghosts can only be seen by certain so-called sensitives, who are possibly special mutations with, perhaps, abnormally extended ranges of vision and hearing. Yet some claim we are all sensitives.” Another example: “I don’t necessarily agree with the liquefaction theory, nor do I endorse all of Walter Brown’s other material, but the geological statements are informative.” The strange thing here is that liquefaction theory (the idea that the world’s rocks formed in flood waters) was demolished in 1788. To “not necessarily agree” with it, today, is in the category of “not necessarily agreeing” with 2+2=3. But notice that the writer implies some study of the matter, and only partial rejection. A similar thing is the failure to rebut. Suppose I raise an issue. The response that “Woodmorappe’s book talks about that” could possibly be a reference to a resounding rebuttal. Or perhaps the responder hasn’t even read the book yet. How can we tell ? [I later discovered it was the latter.]

            • Ambiguous Assertion:

              a statement is made, but it is sufficiently unclear that it leaves some sort of leeway. For example, a book about Washington politics did not place quotation marks around quotes. This left ambiguity about which parts of the book were first-hand reports and which parts were second-hand reports, assumptions, or outright fiction. Of course, lack of clarity is not always intentional. Sometimes a statement is just vague. If the statement has two different meanings, this is Amphiboly. For example, “Last night I shot a burglar in my pyjamas.”

            • Failure To State:

              if you make enough attacks, and ask enough questions, you may never have to actually define your own position on the topic.

            • Outdated Information:

              information is given, but it is not the latest information on the subject. For example, some creationist articles about the amount of dust on the moon quote a measurement made in the 1950’s. But many much better measurements have been done since then.

            • Amazing Familiarity:

              the speaker seems to have information that there is no possible way for him to get, on the basis of his own statements. For example: “The first man on deck, seaman Don Smithers, yawned lazily and fingered his good luck charm, a dried seahorse. To no avail ! At noon, the Sea Ranger was found drifting aimlessly, with every man of its crew missing without a trace !”

            • Least Plausible Hypothesis:

              ignoring all of the most reasonable explanations. This makes the desired explanation into the only one. For example: “I left a saucer of milk outside overnight. In the morning, the milk was gone. Clearly, my yard was visited by fairies.” There is an old rule for deciding which explanation is the most plausible. It is most often called “Occam’s Razor”, and it basically says that the simplest is the best. The current phrase among scientists is that an explanation should be “the most parsimonious”, meaning that it should not introduce new concepts (like fairies) when old concepts (like neighborhood cats) will do. On ward rounds, medical students love to come up with the most obscure explanations for common problems. A traditional response is to tell them “If you hear hoof beats, don’t automatically think of zebras”.

            • Argument By Scenario:

              telling a story which ties together unrelated material, and then using the story as proof they are related.

            • Affirming The Consequent:

              logic reversal. A correct statement of the form “if P then Q” gets turned into “Q therefore P”. For example, “All cats die; Socrates died; therefore Socrates was a cat.” Another example: “If the earth orbits the sun, then the nearer stars will show an apparent annual shift in position relative to more distant stars (stellar parallax). Observations show conclusively that this parallax shift does occur. This proves that the earth orbits the sun.” In reality, it proves that Q [the parallax] is consistent with P [orbiting the sun]. But it might also be consistent with some other theory. (Other theories did exist. They are now dead, because although they were consistent with a few facts, they were not consistent with all the facts.) Another example: “If space creatures were kidnapping people and examining them, the space creatures would probably hypnotically erase the memories of the people they examined. These people would thus suffer from amnesia. But in fact many people do suffer from amnesia. This tends to prove they were kidnapped and examined by space creatures.” This is also a Least Plausible Hypothesis explanation.
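
The difference between the valid form and the fallacious one can be checked mechanically. The Python sketch below enumerates every truth assignment: modus ponens (“if P then Q, and P, therefore Q”) holds in all cases, while affirming the consequent fails on the row where Q is true but P is false.

      # Sketch: exhaustively check both argument forms over all truth assignments.
      # "If P then Q, and P" does force Q (modus ponens), but "if P then Q, and Q"
      # does not force P -- the row P=False, Q=True is the counterexample.
      from itertools import product

      def implies(p: bool, q: bool) -> bool:
          return (not p) or q

      modus_ponens_ok = all(q for p, q in product([False, True], repeat=2)
                            if implies(p, q) and p)
      affirming_consequent_ok = all(p for p, q in product([False, True], repeat=2)
                                    if implies(p, q) and q)

      print("modus ponens valid:        ", modus_ponens_ok)          # True
      print("affirming consequent valid:", affirming_consequent_ok)  # False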

            • Moving The Goalposts (Raising The Bar, Argument By Demanding Impossible Perfection):

              if your opponent successfully addresses some point, then say he must also address some further point. If you can make these points more and more difficult (or diverse) then eventually your opponent must fail. If nothing else, you will eventually find a subject that your opponent isn’t up on. This is related to Argument By Question. Asking questions is easy: it’s answering them that’s hard. If each new goal causes a new question, this may get to be Infinite Regression. It is also possible to lower the bar, reducing the burden on an argument. For example, a person who takes Vitamin C might claim that it prevents colds. When they do get a cold, then they move the goalposts, by saying that the cold would have been much worse if not for the Vitamin C.

            • Appeal To Complexity:

              if the arguer doesn’t understand the topic, he concludes that nobody understands it. So, his opinions are as good as anybody’s.

            • Common Sense:

              unfortunately, there simply isn’t a common-sense answer for many questions. In politics, for example, there are a lot of issues where people disagree. Each side thinks that their answer is common sense. Clearly, some of these people are wrong. The reason they are wrong is because common sense depends on the context, knowledge and experience of the observer. That is why instruction manuals will often have paragraphs like these:

              When boating, use common sense. Have one life preserver for each person in the boat. When towing a water skier, use common sense. Have one person watching the skier at all times.

If the ideas are so obvious, then why the second sentence ? Why do they have to spell it out ? The answer is that “use common sense” actually meant “pay attention, I am about to tell you something that inexperienced people often get wrong.” Science has discovered a lot of situations which are far more unfamiliar than water skiing. Not surprisingly, beginners find that much of it violates their common sense. For example, many people can’t imagine how a mountain range would form. But in fact anyone can take good GPS equipment to the Himalayas, and measure for themselves that those mountains are rising today. If a speaker tells an audience that he supports using common sense, it is very possibly an Ambiguous Assertion.

            • Argument By Laziness (Argument By Uninformed Opinion):

              the arguer hasn’t bothered to learn anything about the topic. He nevertheless has an opinion, and will be insulted if his opinion is not treated with respect. For example, someone looked at a picture on one of my web pages, and made a complaint which showed that he hadn’t even skimmed through the words on the page. When I pointed this out, he replied that I shouldn’t have had such a confusing picture.

            • Disproof By Fallacy:

              if a conclusion can be reached in an obviously fallacious way, then the conclusion is incorrectly declared wrong. For example,

              “Take the division 64/16. Now, canceling a 6 on top and a six on the bottom, we get that 64/16 = 4/1 = 4.” “Wait a second ! You can’t just cancel the six !” “Oh, so you’re telling us 64/16 is not equal to 4, are you ?”

Note that this is different from Reductio Ad Absurdum, where your opponent’s argument can lead to an absurd conclusion. In this case, an absurd argument leads to a normal conclusion.
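
The “cancel the sixes” trick is a known curiosity (sometimes called anomalous cancellation): the rule is bogus, yet for a handful of fractions it happens to give the right answer. The brute-force Python sketch below finds those fractions, which is exactly the point of this fallacy: a conclusion can be correct even when the argument offered for it is nonsense.

      # Sketch: find every proper two-digit fraction where "cancelling" the shared
      # digit happens to give the right value, e.g. 16/64 -> 1/4. The rule is
      # invalid in general, but these few conclusions are still correct.
      for a in range(10, 100):
          for b in range(a + 1, 100):                         # a/b < 1, both two digits
              a1, a2 = divmod(a, 10)
              b1, b2 = divmod(b, 10)
              if a2 == b1 and b2 != 0 and a * b2 == a1 * b:   # shared middle digit "cancels"
                  print(f"{a}/{b} = {a1}/{b2}")
      # prints 16/64, 19/95, 26/65 and 49/98 (64/16 is just 16/64 turned upside down)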

            • Reductio Ad Absurdum:

              showing that your opponent’s argument leads to some absurd conclusion. This is in general a reasonable and non-fallacious way to argue. If the issues are razor-sharp, it is a good way to completely destroy his argument. However, if the waters are a bit muddy, perhaps you will only succeed in showing that your opponent’s argument does not apply in all cases. That is, using Reductio Ad Absurdum is sometimes using the Fallacy Of The General Rule. However, if you are faced with an argument that is poorly worded, or only lightly sketched, Reductio Ad Absurdum may be a good way of pointing out the holes. An example of why absurd conclusions are bad things:

              Bertrand Russell, in a lecture on logic, mentioned that in the sense of material implication, a false proposition implies any proposition. A student raised his hand and said “In that case, given that 1 = 0, prove that you are the Pope”. Russell immediately replied, “Add 1 to both sides of the equation: then we have 2 = 1. The set containing just me and the Pope has 2 members. But 2 = 1, so it has only 1 member; therefore, I am the Pope.”

            • False Compromise:

              if one does not understand a debate, it must be “fair” to split the difference, and agree on a compromise between the opinions. (But one side is very possibly wrong, and in any case one could simply suspend judgment.) Journalists often invoke this fallacy in the name of “balanced” coverage.

              “Some say the sun rises in the east, some say it rises in the west; the truth lies probably somewhere in between.”

Television reporters like balanced coverage so much that they may give half of their report to a view held by a small minority of the people in question. There are many possible reasons for this, some of them good. However, viewers need to be aware of this tendency.

            • Fallacy Of The Crucial Experiment:

              claiming that some idea has been proved (or disproved) by a pivotal discovery. This is the “smoking gun” version of history. Scientific progress is often reported in such terms. This is inevitable when a complex story is reduced to a soundbite, but it’s almost always a distortion. In reality, a lot of background happens first, and a lot of buttressing (or retraction) happens afterwards. And in natural history, most of the theories are about how often certain things happen (relative to some other thing). For those theories, no one experiment could ever be conclusive.

            • Two Wrongs Make A Right (Tu Quoque, You Too, What’s sauce for the goose is sauce for the gander):

              a charge of wrongdoing is answered by a rationalization that others have sinned, or might have sinned. For example, Bill borrows Jane’s expensive pen, and later finds he hasn’t returned it. He tells himself that it is okay to keep it, since she would have taken his. War atrocities and terrorism are often defended in this way. Similarly, some people defend capital punishment on the grounds that the state is killing people who have killed. This is related to Ad Hominem (Argument To The Man).

            • Pious Fraud:

              a fraud done to accomplish some good end, on the theory that the end justifies the means. For example, a church in Canada had a statue of Christ which started to weep tears of blood. When analyzed, the blood turned out to be beef blood. We can reasonably assume that someone with access to the building thought that bringing souls to Christ would justify his small deception. In the context of debates, a Pious Fraud could be a lie. More generally, it would be when an emotionally committed speaker makes an assertion that is shaded, distorted or even fabricated. For example, British Prime Minister Tony Blair was accused in 2003 of “sexing up” his evidence that Iraq had Weapons of Mass Destruction. Around the year 400, Saint Augustine wrote two books, De Mendacio[On Lying] and Contra Medacium[Against Lying], on this subject. He argued that the sin isn’t in what you do (or don’t) say, but in your intent to leave a false impression. He strongly opposed Pious Fraud. I believe that Martin Luther also wrote on the subject.

  • www

  • Ad Hominem (Argument To The Man):

    attacking the person instead of attacking his argument. For example,
    “Von Daniken’s books about ancient astronauts are worthless
    because he is a convicted forger and embezzler.” (Which is true,
    but that’s not why they’re worthless.)

    Another example is this syllogism, which alludes to Alan Turing’s
    homosexuality:

    Turing thinks machines think.

    Turing lies with men.

    Therefore, machines don’t think.

    (Note the equivocation in the use of
    the word “lies”.)

    A common form is an attack on sincerity. For example, “How can
    you argue for vegetarianism when you wear leather shoes ?” The two wrongs make a right fallacy is related.

    A variation (related to Argument By
    Generalization) is to attack a whole class of people. For example,
    “Evolutionary biology is a sinister tool of the materialistic,
    atheistic religion of Secular Humanism.” Similarly, one notorious
    net.kook waved away a whole category of evidence by announcing
    “All the scientists were drunk.”

    Another variation is attack by innuendo: “Why don’t scientists
    tell us what they really know; are they afraid of public panic ?”

    There may be a pretense that the attack isn’t happening: “In
    order to maintain a civil debate, I will not mention my opponent’s
    drinking problem.” Or “I don’t care if other people say
    you’re [opinionated/boring/overbearing].”

    Attacks don’t have to be strong or direct. You can merely show
    disrespect, or cut down his stature by saying that he seems to be
    sweating a lot, or that he has forgotten what he said last week. Some
    examples: “I used to think that way when I was your age.”
    “You’re new here, aren’t you ?” “You weren’t breast fed
    as a child, were you ?” “What drives you to make such a
    statement ?” “If you’d just listen..” “You seem
    very emotional.” (This last works well if you have been hogging
    the microphone, so that they have had to yell to be heard.)

    Sometimes the attack is on the other person’s intelligence. For
    example, “If you weren’t so stupid you would have no problem
    seeing my point of view.” Or, “Even you should understand my
    next point.”

    Oddly, the stupidity attack is sometimes reversed. For example,
    dismissing a comment with “Well, you’re just smarter than the
    rest of us.” (In Britain, that might be put as “too clever
    by half”.) This is Dismissal By Differentness. It is related to
    Not Invented Here and Changing The Subject.

    Ad Hominem is not fallacious if the attack goes to the credibility
    of the argument. For instance, the argument may depend on its
    presenter’s claim that he’s an expert. (That is, the Ad Hominem is
    undermining an Argument From Authority.) Trial
    judges allow this category of attacks.

  • Needling:

    simply attempting to make the other person angry, without trying to
    address the argument at hand. Sometimes this is a delaying tactic.

    Needling is also Ad Hominem if you insult
    your opponent. You may instead insult something the other person
    believes in (“Argumentum Ad YourMomium”), interrupt, clown to show
    disrespect, be noisy, fail to pass over the microphone, and numerous
    other tricks. All of these work better if you are running things – for
    example, if it is your radio show, and you can cut off the other
    person’s microphone. If the host or moderator is firmly on your side,
    that is almost as good as running the show yourself. It’s even better
    if the debate is videotaped, and you are the person who will edit the
    video.

    If you wink at the audience, or in general clown in their
    direction, then we are shading over to Argument By
    Personal Charm.

    Usually, the best way to cope with insults is to show mild
    amusement, and remain polite. A humorous comeback will probably work
    better than an angry one.

  • Straw Man (Fallacy Of Extension):

    attacking an exaggerated or caricatured version of your opponent’s
    position.

    For example, the claim that “evolution means a dog giving
    birth to a cat.”

    Another example: “Senator Jones says that we should not fund
    the attack submarine program. I disagree entirely. I can’t understand
    why he wants to leave us defenseless like that.”

    On the Internet, it is common to exaggerate the opponent’s position
    so that a comparison can be made between the opponent and Hitler.

  • Inflation Of Conflict:

    arguing that scholars debate a certain point. Therefore, they must
    know nothing, and their entire field of knowledge is “in
    crisis” or does not properly exist at all.

    For example, two historians debated whether Hitler killed five
    million Jews or six million Jews. A Holocaust denier argued that this
    disagreement made his claim credible, even though his death
    count is three to ten times smaller than the known minimum.

    Similarly, in “The Mythology of Modern Dating Methods”
    (John Woodmorappe, 1999) we find on page 42 that two scientists
    “cannot agree” about which one of two geological dates is
    “real” and which one is “spurious”. Woodmorappe
    fails to mention that the two dates differ by less than one percent.

  • Argument From Adverse Consequences (Appeal To Fear, Scare Tactics):

    saying an opponent must be wrong, because if he is right, then
    bad things would ensue. For example: God must exist, because
    a godless society would be lawless and dangerous. Or: the
    defendant in a murder trial must be found guilty, because
    otherwise husbands will be encouraged to murder their wives.

    Wishful thinking is closely related. “My home in Florida is
    one foot above sea level. Therefore I am certain that
    global warming
    will not make the oceans rise by fifteen feet.” Of course, wishful
    thinking can also be about positive consequences, such as winning the
    lottery, or eliminating poverty and crime.

  • Special Pleading (Stacking The Deck):

    using the arguments that support your position, but ignoring or
    somehow disallowing the arguments against.

    Uri Geller used special pleading when he claimed that the presence
    of unbelievers (such as stage magicians) made him unable to
    demonstrate his psychic powers.

  • Excluded Middle (False Dichotomy, Faulty Dilemma, Bifurcation):

    assuming there are only two alternatives when in fact there are more.
    For example, assuming Atheism is the only alternative to
    Fundamentalism, or being a traitor is the only alternative to being a
    loud patriot.

  • Short Term Versus Long Term:

    this is a particular case of the Excluded
    Middle. For example, “We must deal with crime on the streets
    before improving the schools.” (But why can’t we do some of
    both ?) Similarly, “We should take the scientific research budget
    and use it to feed starving children.”

  • Burden Of Proof:

    the claim that whatever has not yet been proved false must be true
    (or vice versa). Essentially the arguer claims that he should win by
    default if his opponent can’t make a strong enough case.

    There may be three problems here. First, the arguer claims priority,
    but can he back up that claim ? Second, he is impatient with
    ambiguity, and wants a final answer right away. And third,
    “absence of evidence is not evidence of absence.”

  • Argument By Question:

    asking your opponent a question which does not have a snappy
    answer. (Or anyway, no snappy answer that the audience has the
    background to understand.) Your opponent has a choice: he can look
    weak or he can look long-winded. For example, “How can
    scientists expect us to believe that anything as complex as a single
    living cell could have arisen as a result of random natural
    processes ?”

    Actually, pretty well any question has this effect to some extent.
    It usually takes longer to answer a question than ask it.

    Variants are the rhetorical
    question, and the loaded question, such as “Have you
    stopped beating your wife ?”

  • Argument by Rhetorical Question:

    asking a question in a way that leads to a particular answer. For
    example, “When are we going to give the old folks of this country
    the pension they deserve ?” The speaker is leading the audience to
    the answer “Right now.” Alternatively, he could have said
    “When will we be able to afford a major increase in old age
    pensions?” In that case, the answer he is aiming at is almost
    certainly not “Right now.”

  • Fallacy Of The General Rule:

    assuming that something true in general is true in every possible
    case. For example, “All chairs have four legs.” Except that
    rocking chairs don’t have any legs, and what is a one-legged
    “shooting stick” if it isn’t a chair ?

    Similarly, there are times when certain laws should be broken. For
    example, ambulances are allowed to break speed laws.

  • Reductive Fallacy (Oversimplification):

    over-simplifying. As Einstein said, everything should be made as
    simple as possible, but no simpler. Political slogans such as
    “Taxation is theft” fall in this category.

  • Genetic Fallacy (Fallacy of Origins, Fallacy of Virtue):

    if an argument or arguer has some particular origin, the argument must
    be right (or wrong). The idea is that things from that origin, or that
    social class, have virtue or lack virtue. (Being poor or being rich
    may be held out as being virtuous.) Therefore, the actual details of
    the argument can be overlooked, since correctness can be decided
    without any need to listen or think.

  • Psychogenetic Fallacy:

    if you learn the psychological reason why your opponent likes an argument,
    then he’s biased, so his argument must be wrong.

  • Argument Of The Beard:

    assuming that two ends of a spectrum are the same, since one can
    travel along the spectrum in very small steps. The name comes from
    the idea that being clean-shaven must be the same as having a big
    beard, since in-between beards exist.

    Similarly, all piles of stones are small, since if you add one
    stone to a small pile of stones it remains small.

    However, the existence of pink should not undermine the distinction
    between white and red.

  • Argument From Age (Wisdom of the Ancients):

    snobbery that very old (or very young) arguments are superior. This
    is a variation of the Genetic Fallacy, but has
    the psychological appeal of seniority and tradition (or innovation).

    Products labelled “New ! Improved !” are appealing to a
    belief that innovation is of value for such products. It’s sometimes
    true. And then there are cans of “Old Fashioned Baked Beans”.

  • Not Invented Here:

    ideas from elsewhere are made unwelcome. “This Is The Way We’ve
    Always Done It.”

    This fallacy is a variant of the Argument From
    Age. It gets a psychological boost from feelings that local ways
    are superior, or that local identity is worth any cost, or that
    innovations will upset matters.

    An example of this is the common assertion that America has
    “the best health care system in the world”, an idea that
    this 2007 New York Times editorial refuted.

    People who use the Not Invented Here argument are sometimes accused
    of being stick-in-the-muds.

    Conversely, foreign and “imported” things may be held out
    as superior.

  • Argument By Dismissal:

    an idea is rejected without saying why.

    Dismissals usually have overtones. For example, “If you don’t
    like it, leave the country” implies that your cause is hopeless,
    or that you are unpatriotic, or that your ideas
    are foreign, or maybe all three. “If you don’t
    like it, live in a Communist country” adds
    an emotive element.

  • Argument To The Future:

    arguing that evidence will someday be discovered which will (then)
    support your point.

  • Poisoning The Wells:

    discrediting the sources used by your opponent. This is a variation of
    Ad Hominem.

  • Argument By Emotive Language (Appeal To The People):

    using emotionally loaded words to sway the audience’s sentiments
    instead of their minds. Many emotions can be useful: anger, spite,
    envy, condescension, and so on.

    For example, argument by condescension: “Support the ERA ?
    Sure, when the women start paying for the drinks! Hah! Hah!”

    Americans who don’t like the Canadian medical system have referred
    to it as “socialist”, but I’m not quite sure if this is
    intended to mean “foreign”, or “expensive”, or
    simply guilty by association.

    Cliche Thinking and Argument By Slogan are useful adjuncts,
    particularly if you can get the audience to chant the slogan. People
    who rely on this argument may seed the audience with supporters or
    “shills”, who laugh, applaud or chant at proper
    moments. This is the live-audience equivalent of adding a laugh track
    or music track. Now that many venues have video equipment, some
    speakers give part of their speech by playing a prepared video. These
    videos are an opportunity to show a supportive audience, use emotional
    music, show emotionally charged images, and the like. The idea is old:
    there used to be professional cheering sections. (Monsieur Zig-Zag,
    pictured on the cigarette rolling papers, acquired his fame by
    applauding for money at the Paris Opera.)

    If the emotion in question isn’t harsh, Argument
    By Poetic Language helps the effect. Flattering the audience
    doesn’t hurt either.

  • Argument By Personal Charm:

    getting the audience to cut you slack. Example: Ronald Reagan. It
    helps if you have an opponent with much less personal charm.

    Charm may create trust, or the desire to “join the winning
    team”, or the desire to please the speaker. This last is greatest
    if the audience feels sex appeal.

    Reportedly George W. Bush lost a debate when he was young, and said
    later that he would never be “out-bubba’d” again.

  • Appeal To Pity (Appeal to Sympathy, The Galileo Argument):

    “I did not murder my mother and father with an axe ! Please don’t
    find me guilty; I’m suffering enough through being an orphan.”

    Some authors want you to know they’re suffering for their beliefs.
    For example, “Scientists scoffed at Copernicus and Galileo; they
    laughed at Edison, Tesla and Marconi; they won’t give my ideas a fair
    hearing either. But time will be the judge. I can wait; I am
    patient; sooner or later science will be forced to admit that all
    matter is built, not of atoms, but of tiny capsules of TIME.”

    There is a strange variant which shows up on Usenet. Somebody
    refuses to answer questions about their claims, on the grounds that
    the asker is mean and has hurt their feelings. Or, that the question
    is personal.

  • Appeal To Force:

    threats, or even violence. On the Net, the usual threat is of a
    lawsuit. The traditional religious threat is that one will burn in
    Hell. However, history is full of instances where expressing an
    unpopular idea could get you beaten up on the spot, or worse.

    “The clinching proof of my reasoning is that I
    will cut anyone who argues further into dogmeat.”

    — Attributed to Sir Geoffery de Tourneville, ca 1350 A.D.

  • Argument By Vehemence:

    being loud. Trial lawyers are taught this rule:

    If you have the facts, pound on the facts.

    If you have the law, pound on the law.

    If you don’t have either, pound on the table.

    The above rule paints vehemence as an act of desperation. But it can
    also be a way to seize control of the agenda, use up the opponent’s
    time, or just intimidate the easily cowed. And it’s not necessarily
    aimed at winning the day. A tantrum or a fit is also a way to get a
    reputation, so that in the future, no one will mess with you.

    This is related to putting a post in UPPERCASE, aka SHOUTING.

    Depending on what you’re loud about, this may also be
    an Appeal To Force, Argument
    By Emotive Language, Needling,
    or Changing The Subject.

  • Begging The Question (Assuming The Answer, Tautology):

    reasoning in a circle. The thing to be proved is used as one of your
    assumptions. For example: “We must have a death penalty to
    discourage violent crime”. (This assumes it discourages crime.)
    Or, “The stock market fell because of a technical
    adjustment.” (But is an “adjustment” just a stock
    market fall ?)

  • Stolen Concept:

    using what you are trying to disprove. That is, requiring the truth of
    something for your proof that it is false. For example, using science
    to show that science is wrong. Or, arguing that you do not exist, when
    your existence is clearly required for you to be making the argument.

    This is a relative of Begging The Question,
    except that the circularity there is in what you are trying to prove,
    instead of what you are trying to disprove.

    It is also a relative of Reductio Ad Absurdum,
    where you temporarily assume the truth of something.

  • Argument From Authority:

    the claim that the speaker is an expert, and so should be trusted.

    There are degrees and areas of expertise. The speaker is actually
    claiming to be more expert, in the relevant subject area, than
    anyone else in the room. There is also an implied claim that expertise
    in the area is worth having. For example, claiming expertise in
    something hopelessly quack (like iridology) is actually an admission
    that the speaker is gullible.

  • Argument From False Authority:

    a strange variation on Argument From Authority.
    For example, the TV commercial which starts “I’m not a doctor,
    but I play one on TV.” Just what are we supposed to conclude ?

  • Appeal To Anonymous Authority:

    an Appeal To Authority is made, but the
    authority is not named. For example, “Experts agree that
    ..”, “scientists say ..” or even “they say
    ..”. This makes the information impossible to verify, and brings
    up the very real possibility that the arguer himself doesn’t know who
    the experts are. In that case, he may just be spreading a rumor.

    The situation is even worse if the arguer admits it’s a rumor.

  • Appeal To Authority:

    “Albert Einstein was extremely impressed with this theory.”
    (But a statement made by someone long-dead could be out of date. Or
    perhaps Einstein was just being polite. Or perhaps he made his
    statement in some specific context. And so on.)

    To justify an appeal, the arguer should at least present an exact
    quote. It’s more convincing if the quote contains context, and if the
    arguer can say where the quote comes from.

    A variation is to appeal to unnamed authorities.

    There was a New Yorker cartoon, showing a doctor and patient. The
    doctor was saying: “Conventional medicine has no treatment for
    your condition. Luckily for you, I’m a quack.” So the joke was
    that the doctor boasted of his lack of authority.

  • Appeal To False Authority:

    a variation on
    Appeal To Authority, but the
    Authority
    is outside his area of expertise.

    For example, “Famous physicist John Taylor studied Uri Geller extensively and found no
    evidence of trickery or fraud in his feats.” Taylor was not
    qualified to detect trickery or fraud of the kind used by stage
    magicians. Taylor later admitted Geller had tricked him, but he
    apparently had not figured out how.

    A variation is to appeal to a non-existent authority. For example,
    someone reading an article by Creationist Dmitri Kuznetsov tried to
    look up the referenced articles. Some of the articles turned out to be
    in non-existent journals.

    Another variation is to misquote
    a real authority. There are several kinds of misquotation. A quote
    can be inexact or have been edited. It can be taken out of
    context. (Chevy Chase: “Yes, I said that, but I was singing a
    song written by someone else at the time.”) The quote can be
    separate quotes which the arguer glued together. Or, bits might have
    gone missing. For example, it’s easy to prove that Mick Jagger is an
    assassin. In “Sympathy For The Devil” he sang: “I
    shouted out, who killed the Kennedys, When after all, it was …
    me.”

  • Statement Of Conversion:

    the speaker says “I used to believe in X”.

    This is simply a weak form of asserting expertise. The speaker is
    implying that he has learned about the subject, and now that he is
    better informed, he has rejected X. So perhaps he is now an
    authority, and this is an implied Argument From
    Authority.

    A more irritating version of this is “I used to think that way
    when I was your age.” The speaker hasn’t said what is wrong with
    your argument: he is merely claiming that his age has made him an
    expert.

    “X” has not actually been countered unless there is
    agreement that the speaker has that expertise. In general, any bald
    claim always has to be buttressed.

    For example, there are a number of Creationist authors who say they
    “used to be evolutionists”, but the scientists who have
    rated their books haven’t noticed any expertise about evolution.

  • Bad Analogy:

    claiming that two situations are highly similar, when they aren’t.
    For example, “The solar system reminds me of an atom, with
    planets orbiting the sun like electrons orbiting the nucleus. We know
    that electrons can jump from orbit to orbit; so we must look to
    ancient records for sightings of planets jumping from orbit to orbit
    also.”

    Or, “Minds, like rivers, can be broad. The broader the river,
    the shallower it is. Therefore, the broader the mind, the shallower it
    is.”

    Or, “We have pure food and drug laws; why can’t we have laws
    to keep movie-makers from giving us filth ?”

  • Extended Analogy:

    the claim that two things, both analogous to a third thing, are
    therefore analogous to each other. For example, this debate:

    “I believe it is always wrong to oppose the law by breaking
    it.”
    “Such a position is odious: it implies that you would not
    have supported Martin Luther King.”
    “Are you saying that cryptography legislation is as important
    as the struggle for Black liberation ? How dare you !”

    A person who advocates a particular position (say, about gun
    control) may be told that Hitler believed the same thing. The clear
    implication is that the position is somehow tainted. But Hitler also
    believed that window drapes should go all the way to the floor. Does
    that mean people with such drapes are monsters ?

  • Argument From Spurious Similarity:

    this is a relative of Bad Analogy. It is
    suggested that some resemblance is proof of a relationship. There is
    a WW II story about a British lady who was trained in spotting German
    airplanes. She made a report about a certain very important type of
    plane. While being quizzed, she explained that she hadn’t been sure,
    herself, until she noticed that it had a little man in the cockpit,
    just like the little model airplane at the training class.

  • Reifying:

    an abstract thing is talked about as if it were concrete. (A possibly
    Bad Analogy is being made between concept and
    reality.) For example, “Nature abhors a vacuum.”

  • False Cause:

    assuming that because two things happened, the first one caused the
    second one. (Sequence is not causation.) For example, “Before
    women got the vote, there were no nuclear weapons.” Or,
    “Every time my brother Bill accompanies me to Fenway Park, the
    Red Sox are sure to lose.”

    Essentially, these are arguments that the sun goes down because
    we’ve turned on the street lights.

  • Confusing Correlation And Causation:

    earthquakes in the Andes were correlated with the closest approaches
    of the planet Uranus. Therefore, Uranus must have caused them. (But
    Jupiter is nearer than Uranus, and more massive too.)

    When sales of hot chocolate go up, street crime drops. Does this
    correlation mean that hot chocolate prevents crime ? No, it means that
    fewer people are on the streets when the weather is cold.

    The bigger a child’s shoe size, the better the child’s handwriting.
    Does having big feet make it easier to write ? No, it means the child
    is older.
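
    A small simulation can make the hot-chocolate example concrete. This
    is only a sketch with invented numbers: a single hidden variable (the
    outdoor temperature) drives both hot-chocolate sales and the number
    of people on the street, so sales and crime end up strongly
    correlated even though neither causes the other.

    import random

    random.seed(0)

    days = []
    for _ in range(365):
        temp = random.uniform(-10, 30)                 # outdoor temperature, Celsius
        cocoa = 200 - 4 * temp + random.gauss(0, 10)   # colder days: more hot chocolate sold
        people = 50 + 3 * temp + random.gauss(0, 5)    # warmer days: more people outdoors
        crime = 0.2 * people + random.gauss(0, 2)      # more people outdoors: more street crime
        days.append((cocoa, crime))

    def correlation(pairs):
        # Pearson correlation of the two columns, computed by hand.
        xs, ys = zip(*pairs)
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in pairs) / n
        sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
        sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
        return cov / (sx * sy)

    # Strongly negative: high cocoa sales go with low crime,
    # yet neither one causes the other.
    print(correlation(days))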

  • Causal Reductionism (Complex Cause):

    trying to use one cause to explain something, when in fact it had
    several causes. For example, “The accident was caused by the taxi
    parking in the street.” (But other drivers went around the
    taxi. Only the drunk driver hit the taxi.)

  • Cliche Thinking:

    using as evidence a well-known wise saying, as if that is proven,
    or as if it has no exceptions.

  • Exception That Proves The Rule:

    a specific example of Cliche Thinking. This
    is used when a rule has been asserted, and someone points out the rule
    doesn’t always work. The cliche rebuttal is that this is “the
    exception that proves the rule”. Many people think that this
    cliche somehow allows you to ignore the exception, and continue using
    the rule.

    In fact, the cliche originally did no such thing. There are two standard
    explanations for the original meaning.

    The first is that the word “prove” meant
    test. That is why the military takes its equipment to a
    Proving Ground to test it. So, the cliche originally said that
    an exception tests a rule. That is, if you find an exception to a
    rule, the cliche is saying that the rule is being tested, and perhaps
    the rule will need to be discarded.

    The second explanation is that the stating of an exception to a
    rule, proves that the rule exists. For example, suppose it was
    announced that “Over the holiday weekend, students do not need to
    be in the dorms by midnight”. This announcement implies that
    normally students do have to be in by midnight. Here is a discussion
    of that explanation.

    In either case, the cliche is not about waving away objections.

  • Appeal To Widespread Belief (Bandwagon Argument, Peer Pressure, Appeal to Common Practice):

    the claim, as evidence for an idea, that many people believe it, or
    used to believe it, or do it.

    If the discussion is about social conventions, such as “good
    manners”, then this is a reasonable line of argument.

    However, in the 1800’s there was a widespread belief that
    bloodletting cured sickness. All of these people were not just wrong,
    but horribly wrong, because in fact it made people sicker. Clearly,
    the popularity of an idea is no guarantee that it’s right.

    Similarly, a common justification for bribery is that
    “Everybody does it”. And in the past, this was a
    justification for slavery.

  • Fallacy Of Composition:

    assuming that a whole has the same simplicity as its constituent
    parts. In fact, a great deal of science is the study of emergent
    properties. For example, if you put a drop of oil on water, there
    are interesting optical effects. But the effect comes from the
    oil/water system: it does not come just from the oil or just from the
    water.

    Another example: “A car makes less pollution than a
    bus. Therefore, cars are less of a pollution problem than buses.”

    Another example: “Atoms are colorless. Cats are made of atoms,
    so cats are colorless.”
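
    The car-versus-bus example is really an arithmetic claim about parts
    and wholes, and it is easy to check. The figures below are invented
    purely for illustration: each car pollutes less than each bus, yet
    the car fleet pollutes more in total, and more per passenger carried.

    # Hypothetical figures, chosen only to illustrate the part-versus-whole arithmetic.
    car_emissions = 1.0     # pollution units per car per day
    bus_emissions = 10.0    # a single bus pollutes more than a single car

    cars, passengers_per_car = 10_000, 1.2
    buses, passengers_per_bus = 100, 40

    # True of each part: one car makes less pollution than one bus.
    assert car_emissions < bus_emissions

    # Not true of the whole: the car fleet pollutes more in total,
    # and more per passenger carried.
    print(cars * car_emissions)                 # 10000.0 total from cars
    print(buses * bus_emissions)                # 1000.0 total from buses
    print(car_emissions / passengers_per_car)   # ~0.83 per passenger
    print(bus_emissions / passengers_per_bus)   # 0.25 per passenger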

  • Fallacy Of Division:

    assuming that what is true of the whole is true of each constituent
    part. For example, human beings are made of atoms, and human beings
    are conscious, so atoms must be conscious.

  • Complex Question (Tying):

    unrelated points are treated as if they should be accepted or
    rejected together. In fact, each point should be accepted or rejected
    on its own merits.

    For example, “Do you support freedom and the right to bear
    arms ?”

  • Slippery Slope Fallacy (Camel’s Nose)

    there is an old saying about how if you allow a camel to poke his
    nose into the tent, soon the whole camel will follow.

    The fallacy here is the assumption that something is wrong because
    it is right next to something that is wrong. Or, it is wrong because
    it could slide towards something that is wrong.

    For example, “Allowing abortion in the first week of pregnancy
    would lead to allowing it in the ninth month.” Or, “If we
    legalize marijuana, then more people will try heroin.” Or,
    “If I make an exception for you then I’ll have to make an
    exception for everyone.”

  • Argument By Pigheadedness (Doggedness):

    refusing to accept something after everyone else thinks it is
    well enough proved. For example, there are still Flat Earthers.

  • Appeal To Coincidence:

    asserting that some fact is due to chance. For example, the arguer
    has had a dozen traffic accidents in six months, yet he insists they
    weren’t his fault. This may be Argument By
    Pigheadedness. But on the other hand, coincidences do happen, so
    this argument is not always fallacious.

  • Argument By Repetition (Argument Ad Nauseam):

    if you say something often enough, some people will begin to believe
    it. There are some net.kooks who keep reposting the same articles to
    Usenet, presumably in hopes it will have that effect.

  • Argument By Half Truth (Suppressed Evidence):

    this is hard to detect, of course. You have to ask questions. For
    example, an amazingly accurate “prophecy” of the
    assassination attempt on President Reagan was shown on TV. But was the
    tape recorded before or after the event ? Many stations did not ask
    this question. (It was recorded afterwards.)

    A book on “sea mysteries” or the “Bermuda
    Triangle” might tell us that the yacht Connemara IV was found
    drifting crewless, southeast of Bermuda, on September 26, 1955. None
    of these books mention that the yacht had been directly in the path of
    Hurricane Iona, with 180 mph winds and 40-foot waves.

  • Argument By Selective Observation:

    also called cherry picking, the enumeration of favorable
    circumstances, or as the philosopher Francis Bacon described it,
    counting the hits and forgetting the misses. For example, a state
    boasts of the Presidents it has produced, but is silent about its
    serial killers. Or, the claim “Technology brings
    happiness”. (Now, there’s something with hits and misses.)

    Casinos encourage this human tendency. There are bells and whistles
    to announce slot machine jackpots, but losing happens silently. This
    makes it much easier to think that the odds of winning are good.

  • Argument By Selective Reading:

    making it seem as if the weakest of an opponent’s arguments was the
    best he had. Suppose the opponent gave a strong argument X and also a
    weaker argument Y. Simply rebut Y and then say the opponent has made
    a weak case.

    This is a relative of Argument By Selective
    Observation, in that the arguer overlooks arguments that he does
    not like. It is also related to Straw Man (Fallacy Of
    Extension), in that the opponent’s argument is not being fairly
    represented.

  • Argument By Generalization:

    drawing a broad conclusion from a small number of perhaps
    unrepresentative cases. (The cases may be unrepresentative because of
    Selective Observation.) For example,
    “They say 1 out of every 5 people is Chinese. How is this
    possible ? I know hundreds of people, and none of them is
    Chinese.” So, by generalization, there aren’t any Chinese
    anywhere. This is connected to the Fallacy Of
    The General Rule.

    Similarly, “Because we allow terminally ill patients to use
    heroin, we should allow everyone to use heroin.”

    It is also possible to under-generalize. For example,

    “A man who had killed both of his grandmothers declared himself
    rehabilitated, on the grounds that he could not conceivably repeat his
    offense in the absence of any further grandmothers.”
    — “Ports Of Call” by Jack Vance
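
    Going back to the “1 out of every 5 people” example: a short
    simulation (with an invented population, purely for illustration)
    shows how knowing hundreds of people proves very little if they are
    all drawn from the same place.

    import random

    random.seed(1)

    # Invented world: 1 person in 5 belongs to group X overall,
    # but the group is concentrated in one region.
    population = ([("region_A", "X")] * 20_000 +
                  [("region_A", "other")] * 5_000 +
                  [("region_B", "other")] * 75_000)   # the speaker lives in region B

    # The speaker "knows hundreds of people" -- all from his own region.
    my_region = [p for p in population if p[0] == "region_B"]
    acquaintances = random.sample(my_region, 500)

    overall = sum(1 for _, g in population if g == "X") / len(population)
    sampled = sum(1 for _, g in acquaintances if g == "X") / len(acquaintances)

    print(overall)   # 0.2 -- one in five, as claimed
    print(sampled)   # 0.0 -- a clustered sample sees none of them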

  • Argument From Small Numbers:

    “I’ve thrown three sevens in a row. Tonight I can’t lose.”
    This is Argument By Generalization, but it
    assumes that small numbers are the same as big numbers. (Three sevens
    is actually a common occurrence. Thirty-three sevens is not.)

    Or: “After treatment with the drug, one-third of the mice were
    cured, one-third died, and the third mouse escaped.” Does this mean
    that if we treated a thousand mice, 333 would be cured ? Well, no.
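
    The three-sevens claim is easy to put numbers on. Assuming two fair
    six-sided dice (6 of the 36 outcomes sum to seven), a run of three
    sevens is roughly a 1-in-216 event, common enough over a long evening
    of rolls; a run of thirty-three sevens is another matter entirely.

    from fractions import Fraction

    # Assuming two fair six-sided dice: 6 of the 36 outcomes sum to 7.
    p_seven = Fraction(6, 36)                      # 1/6

    p_three_in_a_row = p_seven ** 3                # 1/216, about 0.5%
    p_thirty_three_in_a_row = p_seven ** 33        # about 2e-26

    print(p_three_in_a_row, float(p_three_in_a_row))
    print(float(p_thirty_three_in_a_row))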

  • Misunderstanding The Nature Of Statistics (Innumeracy):

    President Dwight Eisenhower expressed astonishment and alarm on
    discovering that fully half of all Americans had below average
    intelligence. Similarly, some people get fearful when they learn that
    their doctor wasn’t in the top half of his class. (But that’s half of
    them.)

    “Statistics show that of those who contract the habit of
    eating, very few survive.” — Wallace Irwin.

    Very few people seem to understand “regression to the
    mean”. This is the idea that things tend to go back to normal.
    If you feel normal today, does it really mean that the headache cure
    you took yesterday performed wonders ? Or is it just that your
    headaches are always gone the next day ?

    Journalists are notoriously bad at reporting risks. For example, in
    1995 it was loudly reported that a class of contraceptive pills would
    double the chance of dangerous blood clots. The news stories mostly
    did not mention that “doubling” the risk only increased it
    by one person in 7,000. The “cell phones cause brain cancer”
    reports are even sillier, with the supposed increase in risk being at
    most one or two cancers per 100,000 people per year. So, if the
    fearmongers are right, your cellphone has increased your risk from
    “who cares” to “who cares”.
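
    The contraceptive-pill scare is a confusion between relative and
    absolute risk, and the arithmetic is short. The baseline figure below
    (roughly one case per 7,000 users) is taken from the text above;
    treat the exact numbers as illustrative.

    baseline_risk = 1 / 7000     # roughly 1 case of dangerous clots per 7,000 users
    relative_risk = 2.0          # "doubles the risk", as reported

    new_risk = baseline_risk * relative_risk
    absolute_increase = new_risk - baseline_risk

    print(f"baseline: {baseline_risk * 100_000:.1f} per 100,000")               # ~14.3
    print(f"doubled:  {new_risk * 100_000:.1f} per 100,000")                    # ~28.6
    print(f"increase: 1 extra case per {round(1 / absolute_increase):,} users") # 1 per 7,000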

  • Inconsistency:

    for example, the declining life expectancy in the former Soviet Union
    is due to the failures of communism. But, the quite high infant
    mortality rate in the United States is not a failure of capitalism.

    This is related to Internal Contradiction.

  • Non Sequitur:

    something that just does not follow. For example, “Tens of thousands
    of Americans have seen lights in the night sky which they could not
    identify. The existence of life on other planets is fast becoming
    certainty !”

    Another example: arguing at length that your religion is of great
    help to many people. Then, concluding that the teachings of your
    religion are undoubtedly true.

    Or: “Bill lives in a large building, so his apartment must be
    large.”

  • Meaningless Questions:

    irresistible forces meeting immovable objects, and the like.

  • Argument By Poetic Language:

    if it sounds good, it must be right. Songs often use this effect to
    create a sort of credibility – for example, “Don’t Fear The
    Reaper” by Blue Oyster Cult. Politically oriented songs should be
    taken with a grain of salt, precisely because they sound good.

  • Argument By Slogan:

    if it’s short, and connects to an argument, it must be an
    argument. (But slogans risk the Reductive
    Fallacy.)

    Being short, a slogan increases the effectiveness of Argument
    By Repetition. It also helps Argument By Emotive Language (Appeal
    To The People), since emotional appeals need to be punchy. (Also,
    the gallery can chant a short slogan.) Using an old slogan is
    Cliche Thinking.

  • Argument By Prestigious Jargon:

    using big complicated words so that you will seem to be an expert.
    Why do people use “utilize” when they could utilize
    “use” ?

    For example, crackpots used to claim they had a Unified Field
    Theory (after Einstein). Then the word Quantum was popular. Lately it
    seems to be Zero Point Fields.

  • Argument By Gibberish (Bafflement):

    this is the extreme version of Argument By
    Prestigious Jargon. An invented vocabulary helps the effect, and
    some net.kooks use lots of CAPitaLIZation. However, perfectly ordinary
    words can be used to baffle. For example, “Omniscience is greater
    than omnipotence, and the difference is two. Omnipotence plus two
    equals omniscience. META = 2.” [From R. Buckminster Fuller’s
    No More Secondhand God.]

    Gibberish may come from people who can’t find meaning in technical
    jargon, so they think they should copy style instead of meaning. It
    can also be a “snow job”, AKA “baffle them with
    BS”, by someone actually familiar with the jargon. Or it could be
    Argument By Poetic Language.

    An example of poetic gibberish: “Each autonomous individual
    emerges holographically within egoless ontological consciousness as a
    non-dimensional geometric point within the transcendental thought-wave
    matrix.”

  • Equivocation:

    using a word to mean one thing, and then later using it to mean
    something different. For example, sometimes “Free software”
    costs nothing, and sometimes it is without restrictions. Some examples:

    “The sign said ‘fine for parking here’, and since it was
    fine, I parked there.”

    All trees have bark.

    All dogs bark.

    Therefore, all dogs are trees.

    “Consider that two wrongs never make a right, but that three
    lefts do.”

    – “Deteriorata”, National Lampoon

  • Euphemism:

    the use of words that sound better. The lab rat wasn’t killed, it was
    sacrificed. Mass murder wasn’t genocide, it was ethnic cleansing.
    The death of innocent bystanders is collateral damage. Microsoft
    doesn’t find bugs, or problems, or security
    vulnerabilities: they just discover an issue with a piece of
    software.

    This is related to Argument By Emotive
    Language, since the effect is to make a concept emotionally
    palatable.

  • Weasel Wording:

    this is very much like Euphemism, except that
    the word changes are done to claim a new, different concept rather
    than soften the old concept. For example, an American President may
    not legally conduct a war without a declaration of Congress. So,
    various Presidents have conducted “police actions”,
    “armed incursions”, “protective reaction strikes,”
    “pacification,” “safeguarding American interests,”
    and a wide variety of “operations”. Similarly, War
    Departments have become Departments of Defense, and untested medicines
    have become alternative medicines. The book “1984” has some
    particularly good examples.

  • Error Of Fact:

    for example, “No one knows how old the Pyramids of Egypt
    are.” (Except, of course, for the historians who’ve read records
    and letters written by the ancient Egyptians themselves.)

    Typically, the presence of one error means that there are other
    errors to be uncovered.

  • Argument From Personal Astonishment:

    Errors of Fact caused by stating offhand opinions
    as proven facts. (The speaker’s thought process being “I don’t
    see how this is possible, so it isn’t.”) An example from
    Creationism is given
    here.

    This isn’t lying, quite. It just seems that way
    to people who know more about the subject than the speaker does.

  • Lies:

    intentional Errors of Fact. In some contexts this is called bluffing.

    If the speaker thinks that lying serves a moral end, this would be
    a Pious Fraud.

  • Contrarian Argument:

    in science, espousing something that the speaker knows is generally
    ill-regarded, or even generally held to be disproven. For example,
    claiming that HIV is not the cause of AIDS, or claiming that
    homeopathic remedies are not just placebos.

    In politics, the phrase may be used more broadly, to mean espousing
    some position that the establishment or opposition party does not
    hold.

    This is sometimes done to make people think, and sometimes it
    is needling, or perhaps it supports an
    external agenda. But it can also be done just to oppose conformity, or
    as a pose or style choice: to be a “maverick” or lightning
    rod. Or, perhaps just for the ego of standing alone:

    “It is not enough to succeed. Friends must be seen to have
    failed.”
    — Truman Capote

    “If you want to prove yourself a brilliant scientist, you
    don’t always agree with the consensus. You show you’re right and
    everyone else is wrong.”
    — Daniel Kirk-Davidoff discussing Richard Lindzen

    Calling someone contrarian risks the
    Psychogenetic Fallacy. People who are annoying
    are not necessarily wrong. On the other hand, if the position is
    ill-regarded for a reason, then defending it may be uphill.

    Trolling is Contrarian Argument done to get a reaction. Trolling
    on the Internet often involves pretense.

  • Hypothesis Contrary To Fact:

    arguing from something that might have happened, but didn’t.

  • Internal Contradiction:

    saying two contradictory things in the same argument. For example,
    claiming that
    Archaeopteryx is a dinosaur with hoaxed feathers, and also saying
    in the same book that it is a “true bird”. Or another
    author who said on page 59, “Sir Arthur Conan Doyle writes in his
    autobiography that he never saw a ghost.” But on page 200 we find
    “Sir Arthur’s first encounter with a ghost came when he was 25,
    surgeon of a whaling ship in the Arctic..”

    This is much like saying “I never borrowed his car, and it
    already had that dent when I got it.”

    This is related to Inconsistency.

  • Changing The Subject (Digression, Red Herring, Misdirection, False Emphasis):

    this is sometimes used to avoid having to defend a claim, or to avoid
    making good on a promise. In general, there is something you are not
    supposed to notice.

    For example, I got a bill which had a big announcement about how
    some tax had gone up by 5%, and the costs would have to be passed on
    to me. But a quick calculation showed that the increased tax was only
    costing me a dime, while a different part of the bill had silently
    gone up by $10.

    This is connected to various diversionary tactics, which may be
    obstructive, obtuse, or needling. For
    example, if you quibble about the meaning of some word a person used,
    they may be quite happy about being corrected, since that means
    they’ve derailed you, or changed the subject. They may pick nits in
    your wording, perhaps asking you to define “is”. They may
    deliberately misunderstand you:

    “You said this happened five years before Hitler came to power. Why
    are you so fascinated with Hitler ? Are you anti-Semitic ?”

    It is also connected to various rhetorical tricks, such as
    announcing that there cannot be a question period because the speaker
    must leave. (But then he doesn’t leave.)

  • Argument By Fast Talking:

    if you go from one idea to the next quickly enough, the audience won’t
    have time to think. This is connected to Changing The Subject
    and (to some audiences) Argument By Personal Charm.

    However, some psychologists say that to understand what you hear,
    you must for a brief moment believe it. If this is true, then rapid
    delivery does not leave people time to reject what they hear.

  • Having Your Cake (Failure To Assert, or Diminished Claim):

    almost claiming something, but backing out. For example, “It may
    be, as some suppose, that ghosts can only be seen by certain so-called
    sensitives, who are possibly special mutations with, perhaps,
    abnormally extended ranges of vision and hearing. Yet some claim we
    are all sensitives.”

    Another example: “I don’t necessarily agree with the
    liquefaction theory, nor do I endorse all of Walter Brown’s other
    material, but the geological statements are informative.” The
    strange thing here is that liquefaction theory (the idea that the
    world’s rocks formed in flood waters) was demolished in 1788. To
    “not necessarily agree” with it, today, is in the category
    of “not necessarily agreeing” with 2+2=3. But notice that
    the writer implies some study of the matter, and only partial rejection.

    A similar thing is the failure to rebut. Suppose I raise an issue.
    The response that “Woodmorappe’s book talks about that”
    could possibly be a reference to a resounding rebuttal. Or perhaps the
    responder hasn’t even read the book yet. How can we tell ? [I later
    discovered it was the latter.]

  • Ambiguous Assertion:

    a statement is made, but it is sufficiently unclear that it leaves
    some sort of leeway. For example, a book about Washington politics did
    not place quotation marks around quotes. This left ambiguity about
    which parts of the book were first-hand reports and which parts were
    second-hand reports, assumptions, or outright fiction.

    Of course, lack of clarity is not always intentional. Sometimes
    a statement is just vague.

    If the statement has two different meanings, this is Amphiboly.
    For example, “Last night I shot a burglar in my pyjamas.”

  • Failure To State:

    if you make enough attacks, and ask enough questions, you may never
    have to actually define your own position on the topic.

  • Outdated Information:

    information is given, but it is not the latest information on the
    subject. For example, some creationist articles about the
    amount of dust on the moon quote a measurement made in the 1950’s.
    But many much better measurements have been done since then.

  • Amazing Familiarity:

    the speaker seems to have information that there is no possible way
    for him to get, on the basis of his own statements. For example:
    “The first man on deck, seaman Don Smithers, yawned lazily and
    fingered his good luck charm, a dried seahorse. To no avail ! At
    noon, the Sea Ranger was found drifting aimlessly, with every man of
    its crew missing without a trace !”

  • Least Plausible Hypothesis:

    ignoring all of the most reasonable explanations. This makes the
    desired explanation into the only one. For example: “I left a
    saucer of milk outside overnight. In the morning, the milk was gone.
    Clearly, my yard was visited by fairies.”

    There is an old rule for deciding which explanation is the most
    plausible. It is most often called “Occam’s Razor”, and it
    basically says that the simplest is the best. The current phrase among
    scientists is that an explanation should be “the most
    parsimonious”, meaning that it should not introduce new concepts
    (like fairies) when old concepts (like neighborhood cats) will do.

    On ward rounds, medical students love to come up with the most
    obscure explanations for common problems. A traditional response is to
    tell them “If you hear hoof beats, don’t automatically think of
    zebras”.

  • Argument By Scenario:

    telling a story which ties together unrelated material, and then
    using the story as proof they are related.

  • Affirming The Consequent:

    logic reversal. A correct statement of the form “if P then
    Q” gets turned into “Q therefore P”.

    For example,

    “All cats die; Socrates died; therefore
    Socrates was a cat.”

    Another example: “If the earth orbits the sun, then the nearer
    stars will show an apparent annual shift in position relative to more
    distant stars (stellar parallax). Observations show conclusively that
    this parallax shift does occur. This proves that the earth orbits the
    sun.” In reality, it proves that Q [the parallax] is
    consistent with
    P [orbiting the sun]. But it might also be
    consistent with some other theory. (Other theories did exist. They
    are now dead, because although they were consistent with a few facts,
    they were not consistent with all the facts.)

    Another example: “If space creatures were kidnapping people
    and examining them, the space creatures would probably hypnotically
    erase the memories of the people they examined. These people would
    thus suffer from amnesia. But in fact many people do suffer from
    amnesia. This tends to prove they were kidnapped and examined by
    space creatures.” This is also a Least
    Plausible Hypothesis explanation.
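
    The pattern can be checked mechanically. The sketch below runs
    through every truth assignment for P and Q: modus ponens (“if P then
    Q; P; therefore Q”) never fails, while affirming the consequent
    (“if P then Q; Q; therefore P”) has a counterexample.

    from itertools import product

    def implies(p, q):
        # Material implication: "if p then q" is false only when p is true and q is false.
        return (not p) or q

    assignments = list(product([True, False], repeat=2))

    # Modus ponens: whenever "if P then Q" and P both hold, does Q hold?
    modus_ponens = all(q for p, q in assignments if implies(p, q) and p)

    # Affirming the consequent: whenever "if P then Q" and Q both hold, does P hold?
    affirming_consequent = all(p for p, q in assignments if implies(p, q) and q)

    print(modus_ponens)           # True  -- valid in every case
    print(affirming_consequent)   # False -- P = False, Q = True satisfies the premises but not P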

  • Moving The Goalposts (Raising The Bar, Argument By Demanding Impossible Perfection):

    if your opponent successfully addresses some point, then say he must
    also address some further point. If you can make these points more and
    more difficult (or diverse) then eventually your opponent must fail.
    If nothing else, you will eventually find a subject that your
    opponent isn’t up on.

    This is related to Argument By Question.
    Asking questions is easy: it’s answering them that’s hard.

    If each new goal causes a new question, this may get to be Infinite
    Regression.

    It is also possible to lower the bar, reducing the burden on an
    argument. For example, a person who takes Vitamin C might claim that
    it prevents colds. When they do get a cold, then they move the
    goalposts, by saying that the cold would have been much worse if not
    for the Vitamin C.

  • Appeal To Complexity:

    if the arguer doesn’t understand the topic, he concludes that nobody
    understands it. So, his opinions are as good as anybody’s.

  • Common Sense:

    unfortunately, there simply isn’t a common-sense answer for many
    questions. In politics, for example, there are a lot of issues where
    people disagree. Each side thinks that their answer is common
    sense. Clearly, some of these people are wrong.

    The reason they are wrong is because common sense depends on the
    context, knowledge and experience of the observer. That is why
    instruction manuals will often have paragraphs like these:

    When boating, use common sense. Have one life preserver for each
    person in the boat.

    When towing a water skier, use common sense. Have one person
    watching the skier at all times.

    If the ideas are so obvious, then why the second sentence ? Why do they
    have to spell it out ? The answer is that “use common sense”
    actually meant “pay attention, I am about to tell you something
    that inexperienced people often get wrong.”

    Science has discovered a lot of situations which are far more
    unfamiliar than water skiing. Not surprisingly, beginners find that
    much of it violates their common sense. For example, many people
    can’t imagine how a mountain range would form. But in fact anyone can
    take good GPS equipment to the Himalayas, and measure for themselves
    that those mountains are rising today.

    If a speaker tells an audience that he supports using common sense,
    it is very possibly an Ambiguous
    Assertion.

  • Argument By Laziness (Argument By Uninformed Opinion):

    the arguer hasn’t bothered to learn anything about the topic. He
    nevertheless has an opinion, and will be insulted if his opinion is
    not treated with respect. For example, someone looked at a picture on
    one of my web
    pages, and made a complaint which showed that he hadn’t even
    skimmed through the words on the page. When I pointed this out, he
    replied that I shouldn’t have had such a confusing picture.

  • Disproof By Fallacy:

    if a conclusion can be reached in an obviously fallacious way, then
    the conclusion is incorrectly declared wrong. For example,

    “Take the division 64/16. Now, canceling a 6 on top and a six on
    the bottom, we get that 64/16 = 4/1 = 4.”
    “Wait a second ! You can’t just cancel the six !”
    “Oh, so you’re telling us 64/16 is not equal to 4, are you ?”

    Note that this is different from Reductio Ad
    Absurdum, where your opponent’s argument can lead to an absurd
    conclusion. In this case, an absurd argument leads to a normal
    conclusion.
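
    The “cancel the six” step is nonsense, but 64/16 really is 4, which
    is exactly what makes the example work. A quick search (a sketch,
    using only the standard library) turns up the handful of two-digit
    fractions where this bogus cancellation happens to give the right
    answer.

    from fractions import Fraction

    # Look for fractions shaped like 64/16: the shared digit sits in the tens
    # place of the numerator and the units place of the denominator, and
    # "cancelling" it happens to leave the correct value.
    lucky = []
    for shared in range(1, 10):          # the digit being "cancelled"
        for top in range(1, 10):         # remaining digit of the numerator
            for bottom in range(1, 10):  # remaining digit of the denominator
                numerator = 10 * shared + top        # e.g. 64
                denominator = 10 * bottom + shared   # e.g. 16
                value_matches = Fraction(numerator, denominator) == Fraction(top, bottom)
                if numerator != denominator and value_matches:
                    lucky.append(f"{numerator}/{denominator} = {top}/{bottom}")

    print(lucky)   # ['64/16 = 4/1', '65/26 = 5/2', '95/19 = 5/1', '98/49 = 8/4']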

  • Reductio Ad Absurdum:

    showing that your opponent’s argument leads to some absurd
    conclusion. This is in general a reasonable and non-fallacious
    way to argue. If the issues are razor-sharp, it is a good way to
    completely destroy his argument. However, if the waters are a bit
    muddy, perhaps you will only succeed in showing that your opponent’s
    argument does not apply in all cases. That is, using Reductio Ad
    Absurdum is sometimes using the Fallacy Of The
    General Rule. However, if you are faced with an argument that is
    poorly worded, or only lightly sketched, Reductio Ad Absurdum may be a
    good way of pointing out the holes.

    An example of why absurd conclusions are bad things:

    Bertrand Russell, in a lecture on logic, mentioned that in the sense
    of material implication, a false proposition implies any proposition.
    A student raised his hand and said “In that case, given that 1 =
    0, prove that you are the Pope”. Russell immediately replied,
    “Add 1 to both sides of the equation: then we have 2 = 1. The
    set containing just me and the Pope has 2 members. But 2 = 1, so it
    has only 1 member; therefore, I am the Pope.”
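
    Behind the joke is the point Russell was making: under material
    implication, a false premise makes the implication true no matter
    what the conclusion says. A two-line check, using the same
    definition of implication as in the sketch under Affirming The
    Consequent:

    def implies(p, q):
        # Material implication: false only when p is true and q is false.
        return (not p) or q

    # With a false premise (such as "1 = 0"), the implication holds for any
    # conclusion at all -- including "I am the Pope".
    print(all(implies(False, q) for q in (True, False)))   # True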

  • False Compromise:

    if one does not understand a debate, it must be “fair” to
    split the difference, and agree on a compromise between the
    opinions. (But one side is very possibly wrong, and in any case one
    could simply suspend judgment.) Journalists often invoke this fallacy
    in the name of “balanced” coverage.

    “Some say the sun rises in the east, some say it rises in the west;
    the truth lies probably somewhere in between.”

    Television reporters like balanced coverage so much that they may
    give half of their report to a view held by a small minority of the
    people in question. There are many possible reasons for this, some of
    them good. However, viewers need to be aware of this tendency.

  • Fallacy Of The Crucial Experiment:

    claiming that some idea has been proved (or disproved) by a pivotal
    discovery. This is the “smoking gun” version of history.

    Scientific progress is often reported in such terms. This is
    inevitable when a complex story is reduced to a soundbite, but it’s
    almost always a distortion. In reality, a lot of background happens
    first, and a lot of buttressing (or retraction) happens
    afterwards. And in natural history, most of the theories are about how
    often certain things happen (relative to some other thing). For those
    theories, no one experiment could ever be conclusive.

  • Two Wrongs Make A Right (Tu Quoque, You Too, What’s sauce for
    the goose is sauce for the gander):

    a charge of wrongdoing is answered by a rationalization that others
    have sinned, or might have sinned. For example, Bill borrows Jane’s
    expensive pen, and later finds he hasn’t returned it. He tells himself
    that it is okay to keep it, since she would have taken his.

    War atrocities and terrorism are often defended in this way.

    Similarly, some people defend capital punishment on the grounds
    that the state is killing people who have killed.

    This is related to Ad Hominem (Argument To The
    Man).

  • Pious Fraud:

    a fraud done to
    accomplish some good end, on the theory that the end justifies the
    means.

    For example, a church in Canada had a statue of Christ which
    started to weep tears of blood. When analyzed, the blood turned out to
    be beef blood. We can reasonably assume that someone with access to
    the building thought that bringing souls to Christ would justify his
    small deception.

    In the context of debates, a Pious Fraud could be a lie. More generally, it would be when an emotionally
    committed speaker makes an assertion that is shaded, distorted or even
    fabricated. For example, British Prime Minister Tony Blair was
    accused in 2003 of “sexing up” his evidence that Iraq had
    Weapons of Mass Destruction.

    Around the year 400, Saint Augustine wrote two books, De Mendacio [On
    Lying] and Contra Mendacium [Against Lying], on this subject. He
    argued that the sin isn’t in what you do (or don’t) say, but in your
    intent to leave a false impression. He strongly opposed Pious Fraud.
    I believe that Martin Luther also wrote on the subject.