Carl Sagan’s Baloney Detection Kit

"In the course of their training, scientists are equipped with a baloney detection kit. The kit is brought out as a matter of course whenever new ideas are offered for consideration" - Carl Sagan

The human understanding is no dry light, but receives infusion from the will and affections; whence proceed sciences which may be called ‘sciences as one would’. For what a man had rather were true he more readily believes. Therefore he rejects difficult things from impatience of research; sober things, because they narrow hope; the deeper things of nature, from superstition; the light of experience, from arrogance and pride; things not commonly believed, out of deference to the opinion of the vulgar. Numberless in short are the ways, and sometimes imperceptible, in which the affections colour and infect the understanding.

– Francis Bacon, Novum Organum, 1620

 


Carl Sagan, 1977 at MIT

In the year of his death, Carl Sagan (November 9, 1934–December 20, 1996) published The Demon-Haunted World: Science as a Candle in the Dark. Chapter 12 is called ‘The Fine Art of Baloney Detection’.

Through their training, Sagan argues, scientists become equipped with a “baloney detection kit”: the tools of reason, scepticism and circumspection. The premise is simple: do ideas stand up to rigorous questioning? And what matters for science also concerns the wider world. There’s no need to label items as ‘fake news’ or to enlist hosts of ‘fact-checkers’, who often push their own agendas while processing social media posts for truth, so long as we keep listening, read widely and question everything.

To help us construct and maintain a critical mind, Sagan sets out methods for distinguishing ideas that count as valid science from those that amount to pseudoscience. We can use his nine-point guide to recognise the fallacious or fraudulent. As he writes:

In the course of their training, scientists are equipped with a baloney detection kit. The kit is brought out as a matter of course whenever new ideas are offered for consideration. If the new idea survives examination by the tools in our kit, we grant it warm, although tentative, acceptance. If you’re so inclined, if you don’t want to buy baloney even when it’s reassuring to do so, there are precautions that can be taken; there’s a tried-and-true, consumer-tested method.

What’s in the kit?

What sceptical thinking boils down to is the means to construct and to understand a reasoned argument and, especially important, to recognize a fallacious or fraudulent argument. The question is not whether we like the conclusion that emerges out of a train of reasoning, but whether the conclusion follows from the premise or starting point and whether that premise is true.

Among the tools:

1. Wherever possible there must be independent confirmation of the “facts.”

2. Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.

3. Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.

4. Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.

5. Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.

6. Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.

7. If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.

8. Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.

9. Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.

 

Goya: The sleep of reason produces monsters, 1799.

 

Before moving to the second part of Sagan’s guide, it’s worth noting that Sagan did not coin the phrase ‘Baloney Detection Kit’. In a 2020 interview for Skeptical Inquirer, Sagan’s wife Ann Druyan was asked about the origin of the phrase. She replied that it came from a need to find common ground:

‘It didn’t really come from Carl. It actually came from a friend of mine named Arthur Felberbaum who died about forty years ago. He and Carl and I once sat down for dinner together. His politics were very left wing, so Carl and Arthur and I were trying to find common ground so that we could have a really good dinner together. And at one point, Arthur said, “Carl, it’s just that I dream that every one of us would have a baloney detection kit in our head.” And that’s where that idea came from.’

And so to the second part of the baloney detection kit. Sagan offers two examples of how prejudice can and must be avoided:

Suppose you’re seasick, and given both an acupressure bracelet and 50 milligrams of meclizine. You find the unpleasantness vanishes. What did it – the bracelet or the pill? You can tell only if you take the one without the other next time you’re seasick. Now imagine that you’re not so dedicated to science as to be willing to be seasick. Then you won’t separate the variables. You’ll take both remedies again. You’ve achieved the desired practical result; further knowledge, you might say, is not worth the discomfort of attaining it.

Often the experiment must be done ‘double-blind’, so that those hoping for a certain finding are not in the potentially compromising position of evaluating the results. In testing a new medicine, for example, you might want the physicians who determine which patients’ symptoms are relieved not to know which patients have been given the new drug. The knowledge might influence their decision, even if only unconsciously. Instead the list of those who experienced remission of symptoms can be compared with the list of those who got the new drug, each independently ascertained. Then you can determine what correlation exists. Or in conducting a police line-up or photo identification, the officer in charge should not know who the prime suspect is, so as not consciously or unconsciously to influence the witness.
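Sagan’s procedure here amounts to comparing two independently ascertained lists, who received the treatment and whose symptoms remitted, only after both are complete. A minimal sketch of that cross-tabulation in Python (the patient IDs and numbers are hypothetical, invented purely for illustration; nothing here comes from the book):

```python
# Two independently ascertained lists, compiled blind to one another.
got_drug = {"p01", "p02", "p03", "p04", "p05"}   # who received the new drug
improved = {"p02", "p03", "p05", "p08"}          # whose symptoms remitted
everyone = {f"p{i:02d}" for i in range(1, 11)}   # all patients in the trial
placebo = everyone - got_drug

# Cross-tabulate only once both lists are complete, so nobody's hopes
# for the drug can colour which patients are judged 'improved'.
for label, group in (("drug", got_drug), ("placebo", placebo)):
    rate = len(group & improved) / len(group)
    print(f"improved on {label}: {rate:.0%}")
```

On these made-up numbers the drug group improves at 60% against 20% for the placebo group; whether such a gap is more than chance is then a question for a significance test, not for the hopes of the experimenter.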

Having seen the nine positive steps to free thought, we now get Sagan’s route to negative freedom, the habits we must rid ourselves of if we are to reason well:

In addition to teaching us what to do when evaluating a claim to knowledge, any good baloney detection kit must also teach us what not to do. It helps us recognize the most common and perilous fallacies of logic and rhetoric. Many good examples can be found in religion and politics, because their practitioners are so often obliged to justify two contradictory propositions.

Stage 2 of the Baloney Detection Kit: 20 things to avoid:

* Ad hominem – Latin for ‘to the man’, attacking the arguer and not the argument (e.g., the Reverend Dr Smith is a known Biblical fundamentalist, so her objections to evolution need not be taken seriously).

* Argument from authority (e.g., President Richard Nixon should be re-elected because he has a secret plan to end the war in Southeast Asia – but because it was secret, there was no way for the electorate to evaluate it on its merits; the argument amounted to trusting him because he was President: a mistake, as it turned out).

* Argument from adverse consequences (e.g., a God meting out punishment and reward must exist, because if He didn’t, society would be much more lawless and dangerous – perhaps even ungovernable. Or: the defendant in a widely publicized murder trial must be found guilty; otherwise, it will be an encouragement for other men to murder their wives).

[* A more cynical formulation by the Roman historian Polybius: Since the masses of the people are inconstant, full of unruly desires, passionate, and reckless of consequences, they must be filled with fears to keep them in order. The ancients did well, therefore, to invent gods, and the belief in punishment after death.]

* Appeal to ignorance – the claim that whatever has not been proved false must be true, and vice versa (e.g., there is no compelling evidence that UFOs are not visiting the Earth; therefore UFOs exist – and there is intelligent life elsewhere in the Universe. Or: there may be seventy kazillion other worlds, but not one is known to have the moral advancement of the Earth, so we’re still central to the Universe). This impatience with ambiguity can be criticized in the phrase: absence of evidence is not evidence of absence.

* Special pleading, often to rescue a proposition in deep rhetorical trouble (e.g., how can a merciful God condemn future generations to unending torment because, against orders, one woman induced one man to eat an apple? Special plead: you don’t understand the subtle Doctrine of Free Will. Or: how can there be an equally godlike Father, Son and Holy Ghost in the same Person? Special plead: you don’t understand the Divine Mystery of the Trinity. Or: how could God permit the followers of Judaism, Christianity and Islam – each in their own way enjoined to heroic measures of loving kindness and compassion – to have perpetrated so much cruelty for so long? Special plead: you don’t understand Free Will again. And anyway, God moves in mysterious ways).

* Begging the question, also called assuming the answer (e.g., we must institute the death penalty to discourage violent crime. But does the violent crime rate in fact fall when the death penalty is imposed? Or: the stock market fell yesterday because of a technical adjustment and profit-taking by investors. But is there any independent evidence for the causal role of ‘adjustment’ and profit-taking; have we learned anything at all from this purported explanation?).

* Observational selection, also called the enumeration of favourable circumstances, or as the philosopher Francis Bacon described it, counting the hits and forgetting the misses* (e.g., a state boasts of the Presidents it has produced, but is silent on its serial killers).

[* My favourite example is this story, told about the Italian physicist Enrico Fermi, newly arrived on American shores, enlisted in the Manhattan nuclear weapons project, and brought face-to-face in the midst of World War Two with US flag officers:

So-and-so is a great general, he was told.

‘What is the definition of a great general?’ Fermi characteristically asked.

‘I guess it’s a general who’s won many consecutive battles.’

‘How many?’

After some back and forth, they settled on five.

‘What fraction of American generals are great?’

After some more back and forth, they settled on a few per cent. But imagine, Fermi rejoined, that there is no such thing as a great general, that all armies are equally matched, and that winning a battle is purely a matter of chance. Then the chance of winning one battle is one out of two, or 1/2; two battles 1/4, three 1/8, four 1/16, and five consecutive battles 1/32, which is about three per cent. You would expect a few per cent of American generals to win five consecutive battles, purely by chance. Now, has any of them won ten consecutive battles…?]

(Fermi’s coin-flip arithmetic is checked in a short sketch after this list.)

* Statistics of small numbers – a close relative of observational selection (e.g., ‘they say 1 out of 5 people is Chinese. How is this possible? I know hundreds of people, and none of them is Chinese. Yours truly.’ Or: ‘I’ve thrown three sevens in a row. Tonight I can’t lose.’).

* Misunderstanding of the nature of statistics (e.g., President Dwight Eisenhower expressing astonishment and alarm on discovering that fully half of all Americans have below average intelligence).

* Inconsistency (e.g., prudently plan for the worst of which a potential military adversary is capable, but thriftily ignore scientific projections on environmental dangers because they’re not ‘proved’. Or: attribute the declining life expectancy in the former Soviet Union to the failures of communism many years ago, but never attribute the high infant mortality rate in the United States (now highest in the major industrial nations) to the failures of capitalism. Or: consider it reasonable for the Universe to continue to exist forever into the future, but judge absurd the possibility that it has infinite duration into the past).

* Non sequitur – Latin for ‘it doesn’t follow’ (e.g., our nation will prevail because God is great. But nearly every nation pretends this to be true; the German formulation was ‘Gott mit uns’). Often those falling into the non sequitur fallacy have simply failed to recognize alternative possibilities.

* Post hoc, ergo propter hoc – Latin for ‘it happened after, so it was caused by’ (e.g., Jaime Cardinal Sin, Archbishop of Manila: ‘I know of… a 26-year-old who looks 60 because she takes [contraceptive] pills.’ Or: before women got the vote, there were no nuclear weapons).

* Meaningless question (e.g., What happens when an irresistible force meets an immovable object? But if there is such a thing as an irresistible force there can be no immovable objects, and vice versa).

* Excluded middle, or false dichotomy – considering only the two extremes in a continuum of intermediate possibilities (e.g., ‘sure, take his side; my husband’s perfect; I’m always wrong.’ Or: ‘either you love your country or you hate it.’ Or: ‘if you’re not part of the solution, you’re part of the problem’).

* Short-term v. long-term – a subset of the excluded middle, but so important I’ve pulled it out for special attention (e.g., we can’t afford programmes to feed malnourished children and educate pre-school kids. We need to urgently deal with crime on the streets. Or: why explore space or pursue fundamental science when we have so huge a budget deficit?).

* Slippery slope, related to excluded middle (e.g., if we allow abortion in the first weeks of pregnancy, it will be impossible to prevent the killing of a full-term infant. Or, conversely: if the state prohibits abortion even in the ninth month, it will soon be telling us what to do with our bodies around the time of conception).

* Confusion of correlation and causation (e.g., a survey shows that more college graduates are homosexual than those with lesser education; therefore education makes people gay. Or: Andean earthquakes are correlated with closest approaches of the planet Uranus; therefore – despite the absence of any such correlation for the nearer, more massive planet Jupiter – the latter causes the former*).

[* Or: children who watch violent TV programmes tend to be more violent when they grow up. But did the TV cause the violence, or do violent children preferentially enjoy watching violent programmes? Very likely both are true. Commercial defenders of TV violence argue that anyone can distinguish between television and reality. But Saturday morning children’s programmes now average 25 acts of violence per hour. At the very least this desensitizes young children to aggression and random cruelty. And if impressionable adults can have false memories implanted in their brains, what are we implanting in our children when we expose them to some 100,000 acts of violence before they graduate from elementary school?]

* Straw man – caricaturing a position to make it easier to attack (e.g., scientists suppose that living things simply fell together by chance – a formulation that wilfully ignores the central Darwinian insight, that Nature ratchets up by saving what works and discarding what doesn’t. Or – this is also a short-term/long-term fallacy – environmentalists care more for snail darters and spotted owls than they do for people).

* Suppressed evidence, or half-truths (e.g., an amazingly accurate and widely quoted ‘prophecy’ of the assassination attempt on President Reagan is shown on television; but – an important detail – was it recorded before or after the event? Or: these government abuses demand revolution, even if you can’t make an omelette without breaking some eggs. Yes, but is this likely to be a revolution in which far more people are killed than under the previous regime? What does the experience of other revolutions suggest? Are all possible revolutions against oppressive regimes desirable and in the interests of the people?).

* Weasel words (e.g., the separation of powers of the US Constitution specifies that the United States may not conduct a war without a declaration by Congress. On the other hand, Presidents are given control of foreign policy and the conduct of wars, which are potentially powerful tools for getting themselves re-elected. Presidents of either political party may therefore be tempted to arrange wars while waving the flag and calling the wars something else – ‘police actions’, ‘armed incursions’, ‘protective reaction strikes’, ‘pacification’, ‘safeguarding American interests’, and a wide variety of ‘operations’, such as ‘Operation Just Cause’. Euphemisms for war are one of a broad class of reinventions of language for political purposes. Talleyrand said, ‘An important art of politicians is to find new names for institutions which under old names have become odious to the public’).
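Fermi’s coin-flip arithmetic, in the footnote to observational selection above, is easy to verify. A minimal sketch in Python (the function and its parameters are my own invention for illustration, not anything from Sagan or Fermi):

```python
import random

def fraction_winning_all(n_generals=100_000, n_battles=5, p_win=0.5):
    """Under Fermi's null hypothesis, every battle a 50/50 coin flip,
    what fraction of 'generals' win all their battles by pure chance?"""
    winners = sum(
        all(random.random() < p_win for _ in range(n_battles))
        for _ in range(n_generals)
    )
    return winners / n_generals

print((1 / 2) ** 5)            # exact: 0.03125, Fermi's 'about three per cent'
print(fraction_winning_all())  # simulated: ~0.031, varying slightly run to run
```

A few per cent of generals look ‘great’ even when skill plays no part at all, which is exactly why counting the hits and forgetting the misses is so seductive.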

In seeking out anti-scientific thought at its most damaging, Sagan moves on to a critique of the tobacco industry and its “massive advertising campaign to portray smoking as advanced and fashionable”:

Part of the success of the tobacco industry in purveying this brew of addictive poisons can be attributed to widespread unfamiliarity with baloney detection, critical thinking and scientific method. Gullibility kills.

He ends the chapter with a note that no rules are infallible:

Knowing the existence of such logical and rhetorical fallacies rounds out our toolkit. Like all tools, the baloney detection kit can be misused, applied out of context, or even employed as a rote alternative to thinking. But applied judiciously, it can make all the difference in the world, not least in evaluating our own arguments before we present them to others.

The upshot is that healthy scepticism guards against passive acceptance and lazy inculcation. Freedom of speech is freedom of thought. Acceptance without doubt, question, challenge and investigation is for fools. And because new ideas can change the world, we must all be free to imagine and express our dreams, and confident enough to see them confronted, tested and, where they fail, undermined by evidence and fact.
