Fallacies and other thinking errors
For the most advanced students, the author has included his collection of fallacies and other thinking errors, the many situations in which our minds tend not to perceive and/or handle reality well. Often these are seen in separate lists, one for logic, one for psychology, one for sociology, etc. Here, the author has not attempted any strict separation. In his opinion, these, more than any lack of technology, are what keep us from serious space exploration.
This page is a work-in-progress, and will be developed as the author has time. Suggestions for clarifications and additions are welcome.
Asserting the Consequent
Modus Ponens only works in one direction. If one thing causes (or implies) another, and we know the cause happened, then we know the effect happened. But if we know the effect happened, we cannot assume the cause happened, because there might be other causes of the same effect.
Correct modus ponens: Rain causes wet. It rained. It's wet!
Asserting the consequent: Rain causes wet. It's wet. Rain? Dew? Leaking pipe? Spill?
Denying the Antecedent
As Sata and Neti learned in Book One, Modus Tollens only works in one direction. If one thing causes another, and the effect didn't happen, then we know the cause didn't happen either. But if the cause didn't happen, the effect might still happen from another cause.
Correct modus tollens: Rain causes wet. Not wet? It didn't rain!
Denying the antecedent: Rain causes wet. No rain? Look out for the guy with the bucket!
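Both valid forms and their invalid reversals can be checked mechanically by enumerating every truth assignment. Here is a minimal Python sketch; the `valid` helper and the rain/wet encoding are illustrative, not from the book:

```python
from itertools import product

def valid(premises, conclusion):
    """An inference form is valid only if the conclusion holds in every
    truth assignment that makes all the premises true."""
    for rain, wet in product([True, False], repeat=2):
        world = {"rain": rain, "wet": wet}
        if all(p(world) for p in premises) and not conclusion(world):
            return False
    return True

implies = lambda w: (not w["rain"]) or w["wet"]  # "rain causes wet"

# Modus ponens: rain -> wet, rain; therefore wet.                   (valid)
print(valid([implies, lambda w: w["rain"]], lambda w: w["wet"]))
# Asserting the consequent: rain -> wet, wet; therefore rain.       (invalid)
print(valid([implies, lambda w: w["wet"]], lambda w: w["rain"]))
# Modus tollens: rain -> wet, not wet; therefore not rain.          (valid)
print(valid([implies, lambda w: not w["wet"]], lambda w: not w["rain"]))
# Denying the antecedent: rain -> wet, not rain; therefore not wet. (invalid)
print(valid([implies, lambda w: not w["rain"]], lambda w: not w["wet"]))
```

The invalid forms fail precisely at the "dew or leaking pipe" assignment: no rain, yet wet.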
(Causal) Ad Ignorantiam
Fallacies: If an idea is not proven false, then it must be true. If it is not proven true, then it must be false.
This is an example of a large class of fallacies in which a social situation is confused with reality. The fact is, the ability (or inability) of a person (or even all people) to prove an idea true (or false) has absolutely no bearing on whether the idea really is true (or false).
A subtle variation is the belief that "absence of evidence is evidence of absence." In recent centuries we have made some progress at recognizing just how limited human senses and instruments are, but this fallacy is still very common.
In the causal version, a possible cause is (incorrectly) "proven" to be in play if no alternative can be found, or (incorrectly) "proven" to not be in play if a possible alternative is found.
Ad Populum
Fallacy: If an idea is generally believed true (or false), then it must be true (or false).
Popular belief may have, and often does have, no relationship to reality. Popular belief tells us that the belief has a function in the society, not that it is true (or false).
In choosing actions, "monkey see, monkey do" is a powerful motivation, but may not lead to the wisest course of action, or even to one that has any utility at all.
The Gambler's Fallacy
Fallacy: If an event is statistically "due," then it is more probable than at other times.
If we flip a coin ten times, and get all heads, we are tempted to think that the odds of getting tails on the next flip are higher than usual. No, the odds are still 50% on each flip, because each flip is physically independent of all other flips.
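The independence of each flip is easy to check with a short simulation. The sketch below is illustrative; the streak length and flip count are arbitrary choices:

```python
import random

random.seed(42)

# Flip a fair coin many times, then look only at the flips that follow
# a run of five consecutive heads: tails is still about 50% likely there.
flips = [random.choice("HT") for _ in range(200_000)]

after_streak = []
streak = 0
for f in flips:
    if streak >= 5:              # the previous five flips were all heads
        after_streak.append(f)   # record the supposedly "due" flip
    streak = streak + 1 if f == "H" else 0

tails_rate = after_streak.count("T") / len(after_streak)
print(f"P(tails after 5 heads) ~ {tails_rate:.3f}")  # close to 0.500
```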
Post Hoc or False Cause
Fallacy: If one event precedes another, then it must be the cause of the other.
Many things can be happening when one event precedes another, other than causation. Even if the first event ALWAYS precedes the other, we still have only correlation, not necessarily causation. It often happens that both things are caused by an unknown third event, but that one of the effects happens more slowly than the other.
Discounting Past and Future
We tend to give more weight to the present. This shows up in many ways.
"Old" stories are not interesting. "Old" people are not important. "Ancient history" is irrelevant. "Old news" is of no concern. "Old data" is not as reliable as "new data." Any product that is "new" is assumed to be better than the "old" model.
Most people would rather have $10 today than $20 next year, even though a 100% per annum yield would be considered excellent by any investor.
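The comparison can be made concrete with present-value arithmetic. A small sketch; the 5% discount rate is only an assumed "reasonable" figure:

```python
def present_value(amount, annual_rate, years):
    """What a future payment is worth today, discounted at a yearly rate."""
    return amount / (1 + annual_rate) ** years

# At an assumed 5% discount rate, $20 a year from now is worth ~$19 today:
print(f"${present_value(20, 0.05, 1):.2f}")   # $19.05

# Preferring $10 now over $20 next year implies a 100%/year discount rate:
print(f"${present_value(20, 1.00, 1):.2f}")   # $10.00
```

In other words, taking the $10 only makes financial sense if money loses half its value to you every year.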
We elect political leaders who will spend money today for infrastructure and entitlements, even when the money must be borrowed and paid back, with interest, by our children or grandchildren.
Cognitive Dissonance and Denial
It is very difficult to hold two seemingly conflicting facts, perceptions, ideas, or beliefs in our minds. We get very uncomfortable, and usually deny one of them.
If the conflict is between two facts or perceptions, it usually means we don't fully understand one or both, perhaps because we are analyzing them with the wrong tools. Since applying new tools takes time and energy, and may not even be possible with the tools available, we usually just say, "That can't be!" about one of the two.
If the conflict is between two ideas or beliefs, the most "deeply held" notion (often the one our parents taught us) usually wins, and the other is tossed out, regardless of its validity or utility.
The process of denial during cognitive dissonance seems to be necessary to avoid insanity, even though the "winner" is not always the best choice.
Face-Saving
Similar to cognitive dissonance and denial, face-saving occurs when we fail to meet our own expectations, or others' expectations that we have internalized. We might be surprised and "caught off guard," fail at some task, or fail to understand something. In any case, it hurts, and is embarrassing if anyone else knows. Face-saving behaviors are attempts to "brush it off," and in some manner speak or act as though the failure doesn't matter to us, or we actually meant it to go that way. Cats are masters of the art.
Sour Grapes and Sweet Lemons
When the personal ego is involved, it is common for people to value something less if it is unattainable, more if it is easy to attain. Aesop understood this when he wrote the fable The Fox and the Grapes, in which the grapes were out of reach and thus assumed by the fox to be sour. Lemons grow on small bushes and are easy to pick, so we might assume they are sweet. This is a psychological face-saving device, and works opposite to the economic tendency of the price to rise when something is rare, and fall when it is plentiful. It is an example of cognitive dissonance.
Hasty Generalization
The body of evidence is too small, or not sufficiently representative of the idea in question.
Example: "Bill lied to me. Janet lied to me. Everyone lies to me!"
In the science of statistical analysis, when the sample of data is small, conclusions are unreliable. When the sample size drops to one, the reliability of any conclusions is zero.
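The effect of sample size on reliability can be sketched with the standard error of the mean, which shrinks roughly as one over the square root of the sample size and cannot even be computed from a single observation. The population parameters below are arbitrary:

```python
import math
import random

random.seed(1)

def mean_and_error(sample):
    """Sample mean and its standard error; reliability is undefined for n < 2."""
    n = len(sample)
    if n < 2:
        raise ValueError("a single observation gives no measure of reliability")
    m = sum(sample) / n
    var = sum((x - m) ** 2 for x in sample) / (n - 1)  # sample variance
    return m, math.sqrt(var / n)                        # standard error of the mean

population = [random.gauss(100, 15) for _ in range(100_000)]

for n in (2, 10, 100, 10_000):
    m, se = mean_and_error(random.sample(population, n))
    print(f"n={n:>6}: mean={m:7.2f}  standard error={se:6.2f}")
```

The printed standard errors shrink as the sample grows; the `ValueError` branch is the "sample of one" case.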
Ad Verecundiam
Although relying on an appropriate authority figure is not a perfect method of determining truth, relying on an inappropriate authority is almost useless.
"Dr. Jones says I need surgery."
False Dilemma
The possibilities are presented as exhaustive, but in reality there are other options.
"A, B, or C?"
The possibilities are presented as mutually exclusive, but in reality a combination of them is possible.
"A, B, or C?"
Begging the Question
The argument requires assuming the conclusion to be true. The conclusion often merely restates one of the points of the argument. Such statements are not false; they are just empty arguments that don't bring us any closer to the truth.
Example: "Bob is a bad person because he does things that bad people do."
Loaded or Complex Question
A question that includes an unproven statement as if it were proven.
Example: "Have you quit kicking your dog?" (It is a yes/no question, and if you limit yourself to those options, you will be admitting you kicked your dog even if you did not.)
Loaded Words
Judgmental words used without evidence to back the judgment. Arguments that use loaded words often result in circular reasoning ("begging the question").
Example: "Are you going to believe that incompetent idiot, or this honorable gentleman?"
Slippery Slope or Camel's Nose
In one sense: if there are two points along a path with no significant difference between them, then all distances along the path are insignificant, so it doesn't matter how far we go.
In the opposite sense: we dare not take a small number of steps, because that will necessarily lead to a large number of steps, and that will put us somewhere we don't want to be.
The abortion debate, on both sides, makes constant use of this fallacy.
Equivocation
A word is used in different senses, but the argument is dependent on the assumption that the two uses of the word are the same.
Example: "Every man is free. Kibi is not a man, therefore Kibi is not free." (The first "man" means "human being," and the second means "male.")
Example: "The human heart is unique, so heart transplants should be illegal." (The first "heart" is the mysterious seat of human emotions, the second is a blood pump.)
No True Scotsman
Redefining a term, away from its usual meaning, to avoid a counter-example or counter-argument.
"War is a crime against humanity."
Reverse Causation
Any effect that precedes its cause, including "seeing" the future. Causation, by definition, moves forward in time. There may exist processes that transcend time, but they are not causation.
Bad witch: I see a terrible fate in your future!
Good witch: I see things going on right now that could easily lead to problems later if you don't make some changes.
Omitting steps in a causal chain that are necessary for an understanding of the process. This is another fallacy-like argument that is not exactly invalid, but neither is it helpful.
Example: "I left the oven on, so we're all going to die!"
Projection
Labeling, blaming, or projecting evil onto others when the quality is also present in ourselves.
This is the primary tool of political rhetoric. The fact is, any problem that arises from common human nature cannot be solved by replacing the people in political office with other people.
Fallacy of Cliché
Reliance, in an argument, on the listener's emotional response to a trite or vulgar expression.
Example: "I knew it was bad because it was too good to be true." (Everyone assumes that anything "too good to be true" must not be.)
Is-Ought Fallacy
Confusing the fact that something exists, or is the norm, with the judgment that it is good. This is a primary quality of most social thinking.
Fallacies: Normal/typical is good. Abnormal is bad. "Freaks" are bad. It is, therefore it ought to be. Description implies prescription.
(Circumstantial) Ad Hominem
Attacking the arguer instead of the argument (often used when legitimate attempts at persuasion have failed). Often the expression "consider the source" is used in this way (although it can also be a valid point, if the arguer is a known liar or practical joker).
Example: "He's got some fancy words, but he's a ________, and I don't listen to them." (Fill in your favorite "bad guy.")
The fact is, even fools, and people on the opposite side of the political spectrum, can be right.
The circumstantial version is claiming that the arguer has a conflict of interest, when in reality it has no bearing on the validity of his argument.
Example: "He said the brakes could fail on a Ford, but when I learned he drove a Dodge, I knew he had a conflict of interest." (In reality, it makes perfect sense for someone to drive a different vehicle if he thinks one is unsafe.)
In the Tu Quoque (You Also) version of this fallacy, one person claims the other cannot criticize his reasoning because the other also makes this (or other) reasoning errors. In reality, other reasoning errors have no bearing on the question at hand.
Genetic Fallacy
Arguing for or against something based on its origin. This fallacy is similar to Ad Hominem, but refers to the origin of the idea.
Example: "I can't vote for that. I heard they came up with it in ________." (Fill in your favorite "bad" country, culture, or religion.)
If a situation includes an element of risk or danger, then an emotional element colors our thinking. Most people will give less weight to an answer that includes risk. A few people will give more weight.
The human mind can only deal with so much sensory input or information. When it becomes over-stimulated or over-loaded, we become mentally fatigued and make mistakes we wouldn't otherwise make. At some point, we must either condense or discard some of the stimulation or information. Condensing, summarizing, coding, automating, and other such techniques can work well if done carefully, with an awareness of what information might be getting lost. If done carelessly, the information most essential to the situation at hand can be overlooked.
Faith in Authority
Authoritative persons, offices, and organizations have their fields of expertise and their "state of the art." They also have their limitations (for example, see the Ad Verecundiam fallacy). To make appropriate use of an authority is one thing. To have "faith" in an authority is different. "Faith" is usually defined in philosophy as "belief that outstrips evidence," and so it includes an element of blindness.
Relative Judgment in an Absolute Situation
Making relative judgments is a social process. There is nothing wrong with such judgments in a purely social situation, but they can be much less valid, or completely invalid, when making judgments about the real world.
Example of relative social judgment: "My lawn is greener than Mr. Jones' lawn."
Example with a problem: "I've got 18 amps on this 15 amp electrical circuit, but I'm not going to worry about it because my dad would have put 20 amps."
Belief in Optimism
As an example of relative judgment, often a social value "category" is used to lend weight to an argument, or even to completely exclude competing arguments. Optimism is a very common social value, but many others work similarly.
Example: "First aid kits and emergency supplies are bad because they make us dwell on terrible things, causing those things to be more likely."
It is certainly true that what we dwell upon can have psychological effects, but most situations that call for emergency supplies are independent of our mental states, and therefore better handled with absolute judgments about the real world, instead of relative (social) judgments.
Another example of inappropriate relative judgment, but more on the unconscious level, is the tendency of our minds to perceive and value something less and less when it persists, or recurs, over time. This fallacy appears to operate on both psychological and physiological levels, as our sense organs themselves can become fatigued when perceiving an unchanging stimulus.
An example is the hiker who forgets to fill his water bottle because it's raining, only to discover that as soon as the rain stops, it all soaks into the ground and he has nothing to drink.
The saying "Count Your Blessings!" is a recognition of this tendency, and a hint about how to overcome it. Some tricks may be useful (such as keeping your eyes moving while driving or piloting), but the primary antidote is awareness and will power.
When a person does not wish to think about an idea with an open mind, they might say, "Oh, pooh-pooh!" or some equivalent verbal or body language. Loaded words are often added if a slightly longer explanation is needed. Doing so may cause them to "win" socially or politically, but brings us no closer to understanding reality.
Straw Man
This fallacy involves presenting an idea in a weak or simplistic form that falls down by itself, or is easy to knock down with a few more words.
Example: "The theory of evolution is nothing but the notion that human beings are descended from apes."
The reality is that no one, least of all scientists, thinks this is an accurate description of the theory of evolution. Even the speaker probably knows this, but his purpose is not accuracy, but rather scoring points with people who do not know any better.
Conflict of Interest
Although we are capable of reasoning, there are situations in life that can make us take a position against finding the truth or utility of an idea. Many of those situations involve a "conflict of interest" in which we would lose something (or not gain something) by thinking open-mindedly and reasoning well. Money, power, sex, personal safety, family/clan/community/political loyalties, and many other factors can sway us.
Confirmation Bias is similar, but without a material gain or loss. It involves seeking out or giving greater weight to evidence which supports a pre-existing point of view, and ignoring evidence to the contrary. It is found in all the sciences, especially the social sciences. In an extreme form, sometimes found in mental health institutions, once a patient is diagnosed, all observations of the patient are interpreted to support the diagnosis, no matter how the patient acts. The same thing can happen during a medical or psychological evaluation of a crime victim when the authorities already have a suspect they want to prosecute.
Deep psychological factors, too numerous to list, can cause us to become self-deceptive and engage in non-rational thinking habits, with little or no awareness that we are doing so. In extreme cases, we can become dysfunctional and unable to live independently.
The Tower of Babel
There may be limits to the magnitude or complexity of any collective human undertaking, just as there are for individuals. Whether these limits are imposed by deity, or by nature, is not the important issue. What we do when faced with these limits may have a great impact on our collective future.
In the Tower of Babel story, communication was the limiting factor in the construction of a stone tower. In our world today, we are faced with possible limits to the supply of fossil fuels, the ability of the environment to absorb pollution, the complexity of political and financial systems, and the human population itself.
People have a wide range of responses when faced with limits. One possibility is a temper tantrum. Another is thoughtful study and mature acceptance. Many possibilities lie in between these two extremes.
Thought-Stoppers
Certain concepts that are extremely difficult and/or very uncomfortable for the human mind are sometimes used to "turn off" the minds of the listeners, usually for the purpose of keeping them from detecting flaws in the reasoning of the speaker. Two commonly-used thought-stoppers are "infinite" and "eternal."
"Energy is infinite" probably means "I don't want to think about the limits of energy."
"Hope is eternal" probably means "I refuse to think about whether hope is appropriate or not."
More Is Better
If some is good, more is assumed to be better, and the most is assumed to be best.
Medicine is a good example. The desired pharmaceutical function produced by a substance may only occur at a carefully-calculated dosage, usually relative to the body weight of the patient. Too much may have a stressful or completely different effect, which may be dangerous.
Tragedy of the Commons
The apparent harmlessness of the actions of one individual, in a shared environment, is used to justify the actions of others, or the general actions of all. This is a danger in any environment in which costs are socialized (shared among all) but profits are individual.
If a common pasture will support 100 sheep, and there are 10 shepherds, each may graze 10 sheep. One shepherd realizes that if he has 11 sheep, no one will notice, the pasture will not be visibly affected, and he'll be richer. However, the other shepherds think the same thing; soon the pasture has 110 sheep and is over-grazed and ruined.
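The shepherds' arithmetic can be sketched as a toy model. The yield curve below is hypothetical, chosen only to illustrate the incentive:

```python
def pasture_yield(total_sheep, capacity=100):
    """Wool yield per sheep: full yield up to capacity, then a sharp
    decline as the pasture degrades (a hypothetical curve)."""
    if total_sheep <= capacity:
        return 1.0
    return max(0.0, 1.0 - 0.05 * (total_sheep - capacity))

SHEPHERDS = 10

for cheaters in (0, 1, 10):                 # shepherds grazing an 11th sheep
    total = SHEPHERDS * 10 + cheaters
    per_sheep = pasture_yield(total)
    print(f"{cheaters:>2} cheaters: honest shepherd earns {10 * per_sheep:5.2f}, "
          f"cheater earns {11 * per_sheep:5.2f}")
```

A lone cheater earns more (10.45) than he would by staying honest (10.00), which is exactly the incentive; but when all ten cheat, each earns only 5.50.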
Irrefutability
Once an idea is accepted, any evidence tends to be accepted in support of the idea. Irrefutability appears, at first, to favor an idea, but is actually a serious weakness, giving us no way to test it.
The fallacy of irrefutability has been found to operate strongly in the mental health profession. Once a person is labeled as having a mental illness, the profession has a difficult time dropping or changing the label, even when no evidence can be found to support it (such as in studies in which mentally-healthy people invented a history to get themselves committed, then behaved normally).
Point of View Fallacy
This fallacy involves making statements that require a certain point of view which the speaker does not have, either personally or through the perception of another.
Most statements about the internal state of another person or creature fall into this category. When we judge that a person or animal does or doesn't have "intelligence," "feelings," "wisdom," "remorse," etc., all we can really say is that the person or animal is not showing the behaviors that we would have, or that we assume they should have, if they had that internal state.
Example: "The accused shows no remorse, so he must be guilty and should be imprisoned for life."
Opposite example: "The prisoner cries crocodile tears and brings flowers to the guards, so he must be rehabilitated."
This fallacy was used extensively against racial minorities, and continues to be used against non-human animals. Before the American Civil War, the fact that black people could not read (because they had had no education) was used to argue that they could not think, and so were not "people." Today, people claim that animals do not feel because they do not cry and scream, as we do, when they feel pain or loss. Most animals, of course, do not have the tear ducts and vocal cords to cry and scream as we do.
Sunk Cost
If we have invested a significant amount of treasure, blood, sweat, or tears in something, we can easily become unable to accurately judge its current value. This can prevent us from letting go of something that is no longer doing us any good, that has resources tied up that would be better placed elsewhere, or that is actually harming us.
Example: We buy something, thinking the value will go up. It goes down. We are upset, but the shame of getting $50 for something we invested $100 in, is too great. We keep it. A while later, it is worth $25. We need the money, so we finally sell it.
Example: During the 20th century, we invested vast amounts of money, land, and our very culture in the private automobile. Now we are discovering that the price of fuel is getting too high, and the consequences of pollution too great. Even so, it appears we will be unable to change this transportation system until forced to do so.
Pareto Optimality
The Pareto optimum is the state in which any improvement in one part results in the degradation of one or more other parts. It is closely related to the "zero-sum game" in which for one person to win, another must lose. Throughout the 20th century, it was believed by economists to be the ideal state of an economic system, and best achieved by a "free market." It can also be applied to the selection of parameters in an engineering problem.
Although it may still have uses in engineering, it began to be disproved experimentally for economics in the 1950s. It remains a strong belief in most economic schools, but the recent financial crises and the cessation of economic growth in most or all parts of the world have caused it to come under greater scrutiny than ever before.
Its weakness lies in the fact that its mathematical basis assumes that markets exist for all possible goods so there are no externalities, all markets are in full equilibrium, markets are perfectly competitive, transaction costs are negligible, and market participants have perfect information. These conditions do not exist anywhere in the world, have never existed, and most market participants would rebel if they did, as the result would be a "level playing field."
Since all economic theory of the 19th and 20th centuries now appears to be invalid without the temporary situation of cheap and plentiful energy, the author feels justified in placing the Pareto Optimality, as an economic ideal, with the other fallacies.
Copyright © 2009-2013