Fallacies and other thinking errors
For the most advanced students, the author has included his collection of fallacies and other thinking errors, the many situations in which our minds tend not to perceive and/or handle reality well. Often these are seen in separate lists, one for logic, one for psychology, one for sociology, etc. Here, the author has not attempted any strict separation. In his opinion, these, more than any lack of technology, are what keep us from serious space exploration.
This page is a work-in-progress, and will be developed as the author has time. Suggestions for clarifications and additions are welcome.
Asserting the Consequent
Modus Ponens only works in one direction. If one thing causes (or implies) another, and we know the cause happened, then we know the effect happened. But if we know the effect happened, we cannot assume the cause happened, because there might be other causes of the same effect.
Correct modus ponens: Rain causes wet. It rained. It's wet!
Asserting the consequent: Rain causes wet. It's wet. It must have rained (even though I'm sitting under a leaking pipe).
Denying the Antecedent
As Sata and Neti learned in Book One, Modus Tollens only works in one direction. If one thing causes another, and the effect didn't happen, then we know the cause didn't happen either. But if the cause didn't happen, the effect might still happen from another cause.
Correct modus tollens: Rain causes wet. Not wet? It didn't rain!
Denying the antecedent: Rain causes wet. It didn't rain, so I'll stay dry even under this leaking pipe!
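These four argument forms can be checked mechanically. The sketch below (the rain/wet encoding and the helper names are just illustrations) encodes each form as a formula and tests it against every truth assignment; the two valid forms hold in every case, while the two fallacies fail on the leaking-pipe counterexample:

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    """Material implication: a -> b is false only when a is true and b is false."""
    return (not a) or b

def valid(form) -> bool:
    """A form is valid iff it holds under every truth assignment of rain and wet."""
    return all(form(rain, wet) for rain, wet in product([False, True], repeat=2))

# Modus ponens:             ((rain -> wet) and rain)     entails wet
modus_ponens  = lambda rain, wet: implies(implies(rain, wet) and rain, wet)
# Modus tollens:            ((rain -> wet) and not wet)  entails not rain
modus_tollens = lambda rain, wet: implies(implies(rain, wet) and not wet, not rain)
# Asserting the consequent: ((rain -> wet) and wet)      does NOT entail rain
asserting     = lambda rain, wet: implies(implies(rain, wet) and wet, rain)
# Denying the antecedent:   ((rain -> wet) and not rain) does NOT entail not wet
denying       = lambda rain, wet: implies(implies(rain, wet) and not rain, not wet)

print(valid(modus_ponens), valid(modus_tollens))  # True True
print(valid(asserting), valid(denying))           # False False (the leaking pipe)
```

The counterexample the checker finds for both fallacies is the same one in the stories above: rain is false, but wet is true anyway.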
(Causal) Ad Ignorantiam
An "argument from ignorance," attempting to prove an untested (even untestable) proposition.
Fallacies: If an idea is not proven false, then it must be true. If it is not proven true, then it must be false. If it cannot be proven true or false, then it MUST be (whichever the arguer prefers).
This is an example of a large class of fallacies in which a social situation is confused with reality. The fact is, the ability (or inability) of a person (or even all people) to prove an idea true (or false) has absolutely no bearing on whether the idea really is true (or false).
A subtle variation is the belief that "absence of evidence is evidence of absence." In recent centuries we have made some progress at recognizing just how limited human senses and instruments are, but this fallacy is still very common.
In the causal version, a possible cause is (incorrectly) "proven" to be in play if no alternative can be found, or (incorrectly) "proven" to not be in play if a possible alternative is found.
Ad Populum
Fallacy: If an idea is generally believed true (or false), then it must be true (or false).
Popular belief may have, and often does have, no relationship to reality. Popular belief tells us that the belief has a function in the society, not that it is true (or false).
In choosing actions, "monkey see, monkey do" is a powerful motivation, but may not lead to the wisest course of action, or even to one that has any utility at all.
Gambler's Fallacy
Fallacy: If an event is statistically "due," then it is more probable than at other times.
If we flip a coin ten times, and get all heads, we are tempted to think that the odds of getting tails on the next flip are higher than usual. No, the odds are still 50% on each flip, because each flip is physically independent of all other flips.
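This independence is easy to demonstrate by simulation. The sketch below (the function name, run length, and sample size are arbitrary choices) flips a simulated fair coin a million times and measures how often tails follows a run of five heads; the frequency stays near 50%, streak or no streak:

```python
import random

def tails_after_heads_run(run_len: int, n_flips: int, seed: int = 0) -> float:
    """Estimate P(tails) on the flip immediately following run_len heads in a row."""
    rng = random.Random(seed)
    outcomes = []          # True = tails observed right after a qualifying run
    run = 0                # current count of consecutive heads
    for _ in range(n_flips):
        heads = rng.random() < 0.5
        if run >= run_len:
            outcomes.append(not heads)
        run = run + 1 if heads else 0
    return sum(outcomes) / len(outcomes)

print(round(tails_after_heads_run(5, 1_000_000), 3))  # stays close to 0.5
```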
Post Hoc or False Cause
Fallacy: If one event precedes another, then it must be the cause of the other.
Many things can be happening when one event precedes another, other than causation. Even if the first event ALWAYS precedes the other, we still have only correlation, not necessarily causation. It often happens that both things are caused by an unknown third event, but that one of the effects happens more slowly than the other.
Discounting Past and Future
We tend to give more weight to the present. This shows up in many ways.
"Old" stories are not interesting. "Old" people are not important. "Ancient history" is irrelevant. "Old news" is of no concern. "Old data" is not as reliable as "new data." Any product that is "new" is assumed to be better than the "old" model.
Most people would rather have $10 today than $20 next year, even though a 100% per annum investment yield would be considered excellent by any investor.
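The trade-off can be made precise with a present-value calculation. In this sketch (the 5% rate is only an illustrative market figure), preferring $10 today over $20 a year from now is rational only if your personal discount rate is at least 100% per annum:

```python
def present_value(future_amount: float, annual_rate: float, years: float = 1.0) -> float:
    """Today's worth of a future payment, discounted at a compound annual rate."""
    return future_amount / (1.0 + annual_rate) ** years

print(present_value(20, 0.05))   # at a 5% rate, $20 next year is worth ~$19.05 today
print(present_value(20, 1.00))   # only at a 100% rate does it shrink to 10.0
```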
We elect political leaders who will spend money today for infrastructure and entitlements, even when the money must be borrowed and paid back, with interest, by our children or grandchildren.
The first piece of information you receive about something can easily become a mental "anchor" that colors all future thought about the thing, and is difficult to forget, even when it has obviously become completely irrelevant.
Recency bias is similar, the tendency to overvalue the latest information available. This tendency encourages the general tendency to discount past and future.
Shifting Baseline Syndrome is the tendency of each generation to perceive the state of the ecosystems they encounter in their childhood as normal and natural, even though it might already be in a state of extreme depletion or imbalance.
Cognitive Dissonance and Denial
It is very difficult to hold two seemingly conflicting facts, perceptions, ideas, or beliefs in our minds. We get very uncomfortable, and usually deny one of them.
If the conflict is between two facts or perceptions, it usually means we don't fully understand one or both, perhaps because we are analyzing them with the wrong tools. Since applying new tools takes time and energy, and may not even be possible with the tools available, we usually just say, "That can't be!" about one of the two.
If the conflict is between two ideas or beliefs, the most "deeply held" notion (often the one our parents taught us) usually wins, and the other is tossed out, regardless of its validity or utility.
The process of denial during cognitive dissonance seems to be necessary to avoid insanity, even though the "winner" is not always the best choice.
Similar to cognitive dissonance and denial, face-saving occurs when we fail to meet our own expectations, or others' expectations that we have internalized. We might be surprised and "caught off guard," fail at some task, or fail to understand something. In any case, it hurts, and is embarrassing if anyone else knows. Face-saving behaviors are attempts to "brush it off," and in some manner speak or act as though the failure doesn't matter to us, or we actually meant it to go that way. Cats are masters of the art.
Sour Grapes and Sweet Lemons
When the personal ego is involved, it is common for people to value something less if it is unattainable, more if it is easy to attain. Aesop understood this when he wrote the fable The Fox and the Grapes, in which the grapes were out of reach and thus assumed by the fox to be sour. Lemons grow on small bushes and are easy to pick, so we might assume they are sweet. This is a psychological face-saving device, and works opposite to the economic tendency of the price to rise when something is rare, and fall when it is plentiful. It is an example of cognitive dissonance.
Hasty Generalization
The body of evidence is too small, or not sufficiently representative of the idea in question.
Example: "Bill lied to me. Janet lied to me. Everyone lies to me!"
In the science of statistical analysis, when the sample of data is small, conclusions are unreliable. When the sample size drops to one, the reliability of any conclusions is zero.
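Statistics quantifies this. The half-width of an approximate 95% confidence interval for an observed proportion shrinks only with the square root of the sample size, so with one or two samples the interval spans nearly everything and the conclusion carries almost no information (a standard normal-approximation sketch; the sample sizes are arbitrary):

```python
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% confidence half-width for a proportion observed in n samples."""
    return z * math.sqrt(p_hat * (1.0 - p_hat) / n)

for n in (2, 10, 100, 10_000):
    print(n, round(margin_of_error(0.5, n), 3))
# the interval narrows only as 1/sqrt(n): 0.693, 0.31, 0.098, 0.01
```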
Ad Verecundiam
Although relying on an appropriate authority figure is not a perfect method of determining truth, relying on an inappropriate authority is almost useless:
"Dr. Jones says I need surgery."
But even an appropriate authority only lends weight to an argument, it does not guarantee truth:
"I'm an authority in the field, therefore whatever I say is true!"
False Dilemma
The possibilities are presented as exhaustive, but in reality there are other options.
"A, B, or C?"
The possibilities are presented as discrete and mutually exclusive, but in reality a combination of them is possible.
"A, B, or C?"
In political discourse, this fallacy is used to condemn the ideas of any person whose ideas contain any flaws or unpopular elements. This bad reasoning asserts that if your ideas aren't perfect (and politically-correct), then they must be completely wrong.
A variation is to claim that if you don't accept every detail of one political position, then you must be in complete harmony with every detail of the "opposite" political position.
A good rule to remember to help avoid these false dilemmas: The exact opposite of one bad idea is usually another bad idea. As Aristotle pointed out, a virtue is not the opposite of a vice, but the midpoint between two vices.
Begging the Question or Circular Reasoning
The argument requires assuming the conclusion to be true. The conclusion often merely restates one of the points of the argument. Such statements are not false, they are just empty arguments that don't bring us any closer to the truth.
Example: "Bob is a bad person because he does things that bad people do."
Circular reasoning often makes use of ad ignorantiam by covertly assuming that the arguer's position must be true unless it's disproven, and the opposing position can't be true unless it's proven.
Loaded or Complex Question
A question that includes an unproven statement as if it were proven.
Example: "Have you quit kicking your dog?" (It is a yes/no question, and if you limit yourself to those options, you will be admitting you kicked your dog even if you did not.)
Judgmental words used without evidence to back the judgment. Arguments that use loaded words often result in circular reasoning ("begging the question").
Example: "Are you going to believe that incompetent idiot, or this honorable gentleman?"
Slippery Slope or Camel's Nose
In one sense: if there are two points along a path with no significant difference between them, then all distances along the path are insignificant, so it doesn't matter how far we go.
In the opposite sense: we dare not take a small number of steps, because that will necessarily lead to a large number of steps, and that will put us somewhere we don't want to be.
The abortion debate, on both sides, makes constant use of this fallacy.
Equivocation
A word is used in different senses, but the argument depends on the assumption that the two uses of the word are the same.
Example: "Every man is free. Kibi is not a man, therefore Kibi is not free." (The first "man" means "human being," and the second means "male.")
Example: "The human heart is unique, so heart transplants should be illegal." (The first "heart" is the mysterious seat of human emotions, the second is a blood pump.)
A Definitional Dodge is redefining a term, away from its usual meaning, to avoid a counter-example or counter-argument.
"War is a crime against humanity."
Very slightly different, an Undefinition is the insistence that some issue or other can only be discussed if all participants use a label for it that predefines the outcome of the discussion, by erasing all the points that matter to people on one side of the debate.
Any effect that precedes its cause, including "seeing" the future. Causation, by definition, moves forward in time. There may exist processes that transcend time, but they are not causation.
Bad witch: I see a terrible fate in your future!
Good witch: I see things going on right now that could easily lead to problems later if you don't make some changes.
Omitting steps in a causal chain that are necessary for an understanding of the process. This is another fallacy-like argument that is not exactly invalid, but neither is it helpful.
Example: "I left the oven on, so we're all going to die!"
Labeling, blaming, or projecting evil onto others when the quality is also present in ourselves.
This is the primary tool of political rhetoric. The fact is, any problem that arises from common human nature cannot be solved by replacing the people in political office with other people.
Fallacy of Cliché
Reliance, in an argument, on the listener's emotional response to a trite or vulgar expression.
Example: "I knew it was bad because it was too good to be true." (Everyone assumes that anything "too good to be true" must not be.)
Confusing the fact that something exists, or is the norm, with the judgment that it is good. This is a primary quality of most social thinking.
Fallacies: Normal/typical is good. Abnormal is bad. "Freaks" are bad. It is, therefore it ought to be. Description implies prescription.
(Circumstantial) Ad Hominem
Attacking the arguer instead of the argument (often used when legitimate attempts at persuasion have failed). Often the expression "consider the source" is used in this way (although it can also be a valid point, if the arguer is a known liar or practical joker).
Example: "He's got some fancy words, but he's a ________, and I don't listen to them." (Fill in your favorite "bad guy.")
The fact is, even fools, and people on the opposite side of the political spectrum, can be right.
The circumstantial version is claiming that the arguer has a conflict of interest, when in reality it has no bearing on the validity of his argument.
Example: "He said the brakes could fail on a Ford, but when I learned he drove a Dodge, I knew he had a conflict of interest." (In reality, it makes perfect sense for someone to drive a different vehicle if he thinks one is unsafe.)
In the Tu Quoque (You Also) version of this fallacy, one person claims the other cannot criticize his reasoning because the other also makes this (or other) reasoning errors. In reality, other reasoning errors have no bearing on the question at hand.
Genetic Fallacy
Arguing for or against something based on its origin. This fallacy is similar to Ad Hominem, but refers to the origin of the idea.
Example: "I can't vote for that. I heard they came up with it in ________." (Fill in your favorite "bad" country, culture, or religion.)
If a situation includes an element of risk or danger, then an emotional element colors our thinking. Most people will give less weight to an answer that includes risk. A few people will give more weight.
The human mind can only deal with so much sensory input or information. When it becomes over-stimulated or over-loaded, we become mentally fatigued and make mistakes we wouldn't otherwise make. At some point, we must either condense or discard some of the stimulation or information. Condensing, summarizing, coding, automating, and other such techniques can work well if done carefully, with an awareness of what information might be getting lost. If done carelessly, the information most essential to the situation at hand can be overlooked.
Faith in Authority
Authoritative persons, offices, and organizations have their fields of expertise and their "state of the art." They also have their limitations (for example, see the Ad Verecundiam fallacy). To make appropriate use of an authority is one thing. To have "faith" in an authority is different. "Faith" is usually defined in philosophy as "belief that outstrips evidence," and so it includes an element of blindness.
Relative Judgment in an Absolute Situation
Making relative judgments is a social process. There is nothing wrong with such judgments in a purely social situation, but they can be much less valid, or completely invalid, when making judgments about the real world.
Example of relative social judgment: "My lawn is greener than Mr. Jones' lawn."
Example with a problem: "I've got 18 amps on this 15 amp electrical circuit, but I'm not going to worry about it because my dad would have put 20 amps."
Belief in Optimism
As an example of relative judgment, often a social value "category" is used to lend weight to an argument, or even to completely exclude competing arguments. Optimism is a very common social value, but many others work similarly.
Example: "First aid kits and emergency supplies are bad because they make us dwell on terrible things, causing those things to be more likely."
It is certainly true that what we dwell upon can have psychological effects, but most situations that call for emergency supplies are independent of our mental states, and therefore better handled with absolute judgments about the real world, instead of relative (social) judgments.
Another example of inappropriate relative judgment, but more on the unconscious level, is the tendency of our minds to perceive and value something less and less when it persists, or recurs, over time. This fallacy appears to operate on both psychological and physiological levels, as our sense organs themselves can become fatigued when perceiving an unchanging stimulus.
An example is the hiker who forgets to fill his water bottle because it's raining, only to discover that as soon as the rain stops, it all soaks into the ground and he has nothing to drink.
The saying "Count Your Blessings!" is a recognition of this tendency, and a hint about how to overcome it. Some tricks may be useful (such as keeping your eyes moving while driving or piloting), but the primary antidote is awareness and will power.
When a person does not wish to think about an idea with an open mind, they might say, "Oh, pooh-pooh!" or some equivalent verbal or body language. Loaded words are often added if a slightly longer explanation is needed. Doing so may cause them to "win" socially or politically, but brings us no closer to understanding reality.
Straw Man
This fallacy involves presenting an idea in a weak or simplistic form that falls down by itself, or is easy to knock down with a few more words.
Example: "The theory of evolution is nothing but the notion that human beings are descended from apes."
The reality is that no one, least of all scientists, thinks this is an accurate description of the theory of evolution. Even the speaker probably knows this, but his purpose is not accuracy, but rather scoring points with people who do not know any better.
Conflict of Interest
Although we are capable of reasoning, there are situations in life that can make us (consciously or unconsciously) take a position against finding the truth or utility of an idea. Many of those situations involve a "conflict of interest" in which we would lose something (or not gain something) by thinking open-mindedly and reasoning well. Money, power, sex, personal safety, family/clan/community/political loyalties, and many other factors can sway us.
Confirmation Bias is similar, but without a material gain or loss. It involves seeking out or giving greater weight to evidence which supports a pre-existing point of view, and ignoring evidence to the contrary. It is found in all the sciences, especially the social sciences. In an extreme form, sometimes found in mental health institutions, once a patient is diagnosed, all observations of the patient are interpreted to support the diagnosis, no matter how the patient acts. The same thing can happen during a medical or psychological evaluation of a crime victim when the authorities already have a suspect they want to prosecute.
The Illusory Truth Effect describes the way people are more likely to believe something is true after hearing it said many times. This is due to the fact that the familiar feeling we experience when hearing something we've heard before feels very similar to our experience of knowing that something is true. When we hear a familiar idea, its familiarity provides us with something called cognitive ease, which is the relaxed, unlabored state we experience when our minds aren't working hard at something. We also experience cognitive ease when we are presented with a statement that we know to be true.
Selection or Sampling Bias can occur when choosing or processing samples in any statistical study.
Reporting or Publication Bias usually involves the social, economic, or political pressures that influence the publishing process.
Survivor(ship) or Attrition Bias causes us to pay more attention to those "still in the race" even when the "losers" should be part of the picture.
Pareidolia, Apophenia, and the Clustering Illusion are names for our tendency to see patterns or meanings in data even when the data are completely random. The Hot Hand Fallacy sees a "streak" in an athlete's performance where there is none.
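Random streaks are longer than intuition expects, which is why the "hot hand" is so easy to see. In this sketch (the shot counts, 50% make-rate, and streak threshold are illustrative assumptions), a purely random shooter is simulated over many games; a "hot" run of five or more makes shows up in most 100-shot games:

```python
import random

def longest_run(n_shots: int, p_make: float = 0.5, seed: int = 0) -> int:
    """Longest run of makes for a shooter whose shots are all independent."""
    rng = random.Random(seed)
    best = run = 0
    for _ in range(n_shots):
        if rng.random() < p_make:
            run += 1
            best = max(best, run)
        else:
            run = 0
    return best

games_with_streak = sum(longest_run(100, seed=s) >= 5 for s in range(1000))
print(games_with_streak / 1000)  # most purely random games contain a 5+ "hot streak"
```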
When a scientist forms a hypothesis that is suggested by data he has already collected, many biases can easily enter the process. The Texas Sharpshooter Fallacy involves ignoring differences but stressing similarities, or "drawing the target after the most bullet holes are found."
A Salient Exemplar is something newsworthy that is different enough from daily routine to catch our interest. Because we only read about such events (by choice or because they are the only ones reported), it is easy to then believe that they are typical and events like them constitute the norm, when by definition they generally do not. This mental process is sometimes called the Availability Heuristic.
Deep psychological factors can cause us to become self-deceptive and engage in non-rational thinking habits, with little or no awareness that we are doing so. In extreme cases, we can become dysfunctional and unable to live independently.
The Backfire Effect, skillfully illustrated by Matthew Inman, is the tendency for evidence that contradicts a deeply-held belief to strengthen the belief instead of weakening it.
The Anthropic Principle
The recognition that the "universe" (on whatever scale) appears to be just right for human life because we (human life) are obviously here observing it. If it wasn't right for human life, we wouldn't be here to observe it!
The Tower of Babel
There may be limits to the magnitude or complexity of any collective human undertaking, just as there are for individuals. Whether these limits are imposed by deity, or by nature, is not the important issue. What we do when faced with these limits may have a great impact on our collective future.
In the Tower of Babel story, communication was the limiting factor in the construction of a stone tower. In our world today, we are faced with possible limits to the supply of fossil fuels, the ability of the environment to absorb pollution, the complexity of political and financial systems, and the human population itself.
People have a wide range of responses when faced with limits. One possibility is a temper tantrum. Another is thoughtful study and mature acceptance. Many possibilities lie in between these two extremes.
Thought-Stoppers
Certain words or phrases that combine absurdity and powerful emotions are extremely difficult and uncomfortable for the human mind to process. They are sometimes used to short-circuit the thinking process and "turn off" the minds of the listeners, usually to keep them from detecting flaws in the speaker's reasoning. Two commonly-used thought-stoppers are "infinite" and "eternal."
"Energy is infinite" probably means "I don't want to think about the limits of energy."
"Hope is eternal" probably means "I refuse to think about whether hope is appropriate or not."
"I believe in people."
If some is good, more is assumed to be better, and the most is assumed to be best.
Medicine is a good example. The desired pharmaceutical function produced by a substance may only occur at a carefully-calculated dosage, usually relative to the body weight of the patient. Too much may have a stressful or completely different effect, which may be dangerous.
Tragedy of the Commons
The apparent harmlessness of the actions of one individual, in a shared environment, is used to justify the actions of others, or the general actions of all. This is a danger in any environment in which costs are socialized (shared among all) but profits are individual.
If a common pasture will support 100 sheep, and there are 10 shepherds, each may graze 10 sheep. One shepherd realizes that if he has 11 sheep, no one will notice, the pasture will not be visibly affected, and he'll be richer. However, the other shepherds think the same thing; soon the pasture has 110 sheep, and it is over-grazed and ruined.
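A toy payoff model makes the trap explicit. The numbers below are illustrative assumptions (per-sheep value degrades linearly once the pasture is pushed past capacity): adding an 11th sheep raises the lone defector's payoff, but when every shepherd defects, everyone ends up worse off than under cooperation:

```python
def shepherd_payoff(my_sheep: int, total_sheep: int, capacity: int = 100) -> float:
    """Payoff = my sheep * per-sheep value; value degrades once the commons is overloaded."""
    overload = max(0, total_sheep - capacity)
    value_per_sheep = max(0.0, 1.0 - 5.0 * overload / capacity)  # assumed degradation rate
    return my_sheep * value_per_sheep

print(round(shepherd_payoff(10, 100), 2))  # all cooperate: 10.0 each
print(round(shepherd_payoff(11, 101), 2))  # lone defector gains: 10.45
print(round(shepherd_payoff(11, 110), 2))  # all defect, commons ruined: 5.5 each
```

The individual incentive to defect exists at every step, which is why appeals to each shepherd's self-interest cannot, by themselves, protect the commons.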
Fallacy of Irrefutability
An idea does not allow the possibility of being tested and proven wrong. This might appear, at first, to favor the idea, but it is actually a serious weakness.
Once such an idea is accepted, any evidence tends to be seen as supporting the idea, even evidence that should contradict it.
The fallacy of irrefutability has been found to operate strongly in the mental health profession. Once a person is labeled as having a mental illness, the profession has a difficult time dropping or changing the label, even when no evidence can be found to support it (such as in studies in which mentally-healthy people invented a history to get themselves committed, then behaved normally).
This fallacy also operates in law enforcement and military settings, and any other professions in which egos are bruised if initial findings or plans are not ultimately accepted.
Point of View Fallacy
This fallacy involves making statements that require a certain point of view which the speaker does not have, either personally or through the perception of another.
Most statements about the internal state of another person or creature fall into this category. When we judge that a person or animal does or doesn't have "intelligence," "feelings," "wisdom," "remorse," etc., all we can really say is that the person or animal is not showing the behaviors that we would have, or that we assume they should have, if they had that internal state.
Example: "The accused shows no remorse, so he must be guilty and should be imprisoned for life."
Opposite example: "The prisoner cries crocodile tears and brings flowers to the guards, so he must be rehabilitated."
This fallacy was used extensively against racial minorities, and continues to be used against non-human animals. Before the American Civil War, the fact that black people could not read (because they had had no education) was used to argue that they could not think, and so were not "people." Today, people claim that animals do not feel because they do not cry and scream, as we do, when they feel pain or loss. Most animals, of course, do not have the tear ducts and vocal cords to cry and scream as we do.
Sunk Cost
If we have invested a significant amount of treasure, blood, sweat, or tears in something, we can easily become unable to accurately judge its current value. This can prevent us from letting go of something that is no longer doing us any good, that has resources tied up that would be better placed elsewhere, or that is actually harming us.
Example: We buy something, thinking the value will go up. It goes down. We are upset, but the shame of getting $50 for something we invested $100 in is too great. We keep it. A while later, it is worth $25. We need the money, so we finally sell it.
Example: During the 20th century, we invested vast amounts of money, land, and our very culture in the private automobile. Now we are discovering that the price of fuel is getting too high, and the consequences of pollution too great. Even so, it appears we will be unable to change this transportation system until forced to do so.
Pareto Optimality
The Pareto optimum is the state in which any improvement in one part results in the degradation of one or more other parts. It is closely related to the "zero-sum game," in which for one person to win, another must lose. Throughout the 20th century, economists believed it to be the ideal state of an economic system, best achieved by a "free market." It can also be applied to the selection of parameters in an engineering problem.
Although it may still have uses in engineering, it began to be disproved experimentally for economics in the 1950s. It remains a strong belief in most economic schools, but the recent financial crises and the cessation of economic growth in most or all parts of the world have caused it to come under greater scrutiny than ever before.
Its weakness lies in the fact that its mathematical basis assumes that markets exist for all possible goods so there are no externalities, all markets are in full equilibrium, markets are perfectly competitive, transaction costs are negligible, and market participants have perfect information. These conditions do not exist anywhere in the world, have never existed, and most market participants would rebel if they did, as the result would be a "level playing field."
Since all economic theory of the 19th and 20th centuries now appears to be invalid without the temporary situation of cheap and plentiful energy, the author feels justified in placing the Pareto Optimality, as an economic ideal, with the other fallacies.
Perceptual "Dead Zones"
Many different situations cause some part of reality to be inaccessible to our senses. In fact, there are so many of these "dead zones" that it's amazing we know as much as we do about the real world.
Too fast - When anything moves too fast for us to see, like a hummingbird's wings or a bullet, it's easy to miss it, even when we know it's quite real. When something we don't yet know about moves very fast, we can easily remain completely ignorant of it. Instruments and photography have helped us overcome this perceptual "dead zone" to some degree, but it remains a weakness for us.
Too slow - Just like with "too fast," anything that moves too slowly is a problem for us. For example, in most places, the geological changes that shape the land are happening just as fast as they always have been, in other words, very slowly. Over millions of years, mountains crumble and continents drift. Science, working with animators and other artists, can help us understand these changes.
Too big - The stars and galaxies form patterns that might be better proof of God than any old-fashioned book, and the Earth is, in many ways, a living being, but none of that can be seen from any place where a person can stand. Seeing the very large requires science, art, imagination, and a willingness to set aside our (usually-too-big) egos.
Too small - People have been peering into microscopes for hundreds of years, but most of us do not understand the very small creatures of our world (bacteria, viruses, and many others), so we often do things to kill the good ones and encourage the dangerous ones. At the smallest scale, where atoms and particles dwell, material reality and spiritual forces interact, which scares both science and religion.
Invisible - The air we breathe is easy to forget about -- unless we live in Beijing or Los Angeles, and even there, if we don't look very far away, we can ignore all the poisons. And anyway, the ultimate poison, carbon dioxide, creeping up past 400 parts per million, is invisible, so it must not be a problem, right? Most poisons in our drinking water are also completely invisible. Likewise, we'll never see the electricity in high voltage wires (until we touch them), and the heat being absorbed by the oceans (until climate change kills us).
Too far - Where does the trash go when we throw something "away"? Where does the poop go when we flush the toilet? Out of sight, out of mind. We love animals, so we let someone else, in another country if possible, make our hamburgers and Thanksgiving turkeys. We have improved dangerous factories and eliminated child labor in "modern" societies, so we import stuff from countries where we don't have to see it happen.
Too much - By the time a child grows up in a "modern" society, he or she has seen and heard tens of thousands of deaths, most of fellow human beings, but has never actually felt mortal danger or even serious hunger. In any one year, today, we create (and try to keep up with) as much information as was created from the beginning of history to the year 2000, but have no idea how to tell a story or analyze an argument. This is often called "information overload." In the realm of nutrition, after millions of years having barely enough to eat, "modern" people find themselves with a new problem: how to get hungry in time for the next meal.
This is just a sampling of the perceptual "dead zones" that limit the human mind and human society. Some of these have been partly overcome by science and instruments, but the average person remains very distrustful of science. Whether or not these limitations will someday be our undoing, only time will tell.
Fallacy of Present Uniqueness
When trying to predict the future, even the general shape of the very near future, it is easy and common to commit the Fallacy of Present Uniqueness. People with an interest in the future being different from similar times in the past are prone to say, "This time is different!" We like to think that we are smarter and stronger than people in the past. Where our ancestors failed, we will surely succeed, won't we?
Unfortunately, in most ways we are not smarter and stronger, and will not succeed where they failed. The most classic example is a "financial bubble," a period in which prices are rising rapidly and a large number of people are trying to make some money from the situation. Every financial bubble in the past has "popped," has come to a point where people start pulling their money out of the investment, causing prices to fall quickly, and leaving most investors with huge losses. But still, every time there is a new financial bubble, many voices are heard saying, "This time is different! This wonderful situation will go on and on!" It never does.
But the Fallacy of Present Uniqueness does not operate on all scales. When looking at individual events, another, and completely opposite, fallacy can rear its ugly head. When we commit the Fallacy of Eternal Sameness or Positive Recency, we are assuming that tomorrow MUST be the same as today, or very similar.
It is true that at most points in history, tomorrow is usually, in most ways, very similar to today. But not always. Unexpected things do sometimes occur. In the following series, moving in time from left to right, A through H are points at which predictions are made that event X is about to occur.
A B C D E F G HX
At points A and B, people might take the prediction seriously. At points C and beyond, they tend to lose interest and say, "No, we've heard that before, and it looks like tomorrow is always about like today."
But event X, like most things, cares nothing for predictions, or what we think of those predictions. It just is, driven by the forces that can move it, which are most often beyond human control even when the event is social or political in nature. When the event has its roots in geology, ecology, or the climate, it is even further outside our control.
Fallacy of Composition
Fallacy: what is true of the parts must be true of the whole.
The fact is that any composition (a whole made of parts) may have qualities that were not present in all, or even any, of the parts, and also may NOT have qualities that were present in some, or even all, of the parts.
An excellent example is water, H2O, composed of hydrogen and oxygen: hydrogen is a flammable gas, and oxygen feeds combustion, yet water is used to put out fires.
Hindsight Bias
The inclination, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it.
Platonicity, the more general case, is when a pure, well-defined, and easily discernible idea is accepted, when the reality at hand is more complex.
In the Ludic Fallacy, the relationships in a game or model, with well-defined rules or algorithms, are assumed to be operative instead of the more-complex, messy, unpredictable reality.
Copyright © 2009-2018