The late Carl Sagan brought us many lasting intellectual treasures, among them an inspiring blend of enthusiasm, imagination, and knowledge about our cosmos. A lesser-known, but still immensely useful, contribution to the never-ending search for knowledge is Sagan’s “Baloney Detection Kit,” a list of mental tools for resisting our ever-present tendency toward “magical” thinking.

Sagan’s final book, The Demon-Haunted World, contains the kit along with many examples from the world of pseudoscience.

On our own quest for rationality and lucidity in legal matters, we lawyers, too, can learn from Sagan’s example. Let us push back against our own lazy legal thinking by recalling the “Baloney Detection Kit.” In every case where we must prove something, in every case where our client makes a factual claim, and in every case where the facts matter – in other words, in all cases, we should apply the following questions:

  • How reliable is the source?
  • Does this source often make similar claims?
  • Have the claims been verified by another reliable source?
  • How does the claim fit with what we know about how the world works?
  • Has anyone gone out of the way to disprove the claim or has only supportive evidence been sought?
  • Does the preponderance of evidence point to the conclusion or another?
  • Is the claimant employing accepted rules of reason and research?
  • Is the claimant offering an explanation of the observed phenomena, or merely denying that any explanation exists?
  • If the claimant proffers a new explanation, does it account for as many phenomena as the old explanation?
  • Do the claimant’s personal beliefs and biases drive the conclusion, or vice versa?

Michael Shermer offers a similar list of tools to debunk nonsense in his book, Why People Believe Weird Things:

  • Anecdotes do not make proof. E.g., 100 Kansas farmers see a UFO, but without corroborating physical evidence the sighting proves nothing. Remember, we are fallible human storytellers.
  • Scientific language does not make science. E.g., sounding impressive can be an end run around missing evidence.
  • Bold statements do not make statements true. E.g., remember “cold fusion”? Enormous claims were made for its power and truth, but supporting evidence was as scarce as hen’s teeth.
  • Heresy does not equal correctness. E.g., they laughed at Copernicus, they laughed at the Wright Brothers, and yes, they also laughed at the Marx Brothers.
  • The burden of proof is always on the person making the claim.
  • Rumors do not equal reality.
  • The unexplained is not inexplicable. E.g., magicians keep their secrets because their tricks are usually simple, just difficult to perceive.
  • Failures are often rationalized away, yet errors get us closer to the truth. Don’t rationalize; admit.
  • After-the-fact reasoning. E.g., a baseball player skips shaving and hits two home runs; it must be the beard! Correlation does not necessarily equal causation.
  • Coincidence. E.g., conjunctions of events without any apparent causal relationship. We do not understand probability very well, and the human mind is built to make easy connections.
  • Representativeness—remember the larger context in which the event occurs, and analyze the event for how representative it is of its class. E.g., the Bermuda Triangle has heavy ship traffic, which makes accidents there more likely.
  • Emotive words and false analogies—tools of rhetoric designed to provoke emotion and obscure rationality.
  • Ad ignorantiam—the appeal to ignorance, i.e., if you can’t disprove it, it must be true: if you can’t disprove the existence of psychic powers, then they must exist. Absurd.
  • Ad hominem—discrediting the claim by attacking the person rather than the argument.
  • Hasty generalization—improper induction or prejudice; a conclusion drawn before the facts are in.
  • Overreliance on authority.
  • Either-or fallacy (fallacy of negation or false dilemma)—dichotomizing the options so that discrediting one position appears to force acceptance of the other.
  • Circular reasoning—the fallacy of redundancy, or begging the question; the conclusion merely restates the premises.
  • Reductio ad absurdum and the slippery slope—carrying an argument to its logical end and reducing it to an absurdity.
  • Effort inadequacy—the desire for simple answers because thinking is hard work. Solutions may be simple, but usually they are not.
  • Ideological immunity—a built-up resistance to changing one’s system of ideas.
  • The observer changes the observed. E.g., an anthropologist studying a tribe, a scientist observing an electron.
  • The equipment constructs the results. E.g., the size of a telescope shapes our beliefs about the size of the universe.