
Lesson 2: Evaluating Evidence - Separating Fact from Fiction

Lesson Objectives

By the end of this lesson, you’ll be able to:

  • Distinguish between different types of evidence and their relative strength
  • Evaluate the credibility of information sources
  • Identify common patterns of misleading evidence
  • Apply a systematic framework for assessing evidence in everyday situations

The Evidence Explosion

We’re living in an age of information overload. Every day, we’re bombarded with claims, statistics, studies, anecdotes, and opinions—all competing for our attention and belief. Your social media feed alone probably contains dozens of “facts” before breakfast, many of them contradicting each other.

In this environment, the ability to evaluate evidence isn’t just an academic skill—it’s a survival tool. Without it, you’re at the mercy of whoever has the flashiest headline, the most compelling story, or simply the last word.

What Counts as Evidence?

Before we can evaluate evidence, we need to understand what it is. Evidence is information that supports (or contradicts) a claim or belief. But not all evidence is created equal. Let’s look at different types, from strongest to weakest:

1. Direct Empirical Evidence

This is information we can observe directly through our senses or reliable measuring instruments.

  • Example: You see dark clouds gathering and feel raindrops on your skin—strong evidence it’s raining.

2. Peer-Reviewed Scientific Research

Studies that have been scrutinized by other experts in the field and published in reputable journals.

  • Example: Multiple large-scale clinical trials showing a medication’s effectiveness.

3. Expert Consensus

When the vast majority of specialists in a field agree on something based on their collective expertise.

  • Example: The consensus among climate scientists that human-caused climate change is occurring.

4. Official Records and Documentation

Information from established institutions with verification processes.

  • Example: Census data, court records, or official government statistics.

5. Journalistic Reporting

Information gathered by professional journalists following standards of verification.

  • Example: An investigative report based on multiple sources and fact-checking.

6. Eyewitness Testimony

First-hand accounts of events, which can be compelling but are also notoriously unreliable.

  • Example: A bystander’s description of a car accident they witnessed.

7. Anecdotal Evidence

Personal stories or isolated examples that may not represent broader patterns.

  • Example: “My grandmother smoked a pack a day and lived to 95.”

8. Testimonials

Statements from individuals about their personal experiences, often used in marketing.

  • Example: “This weight loss supplement changed my life!”

9. Appeals to Tradition or Popularity

Claims based on how long something has been done or how many people believe it.

  • Example: “People have been using this remedy for centuries.”

[Suggested graphic: A pyramid or hierarchy showing these types of evidence from strongest at the top to weakest at the bottom, with brief examples of each.]

The CRAAP Test: Evaluating Information Sources

When assessing any piece of information, the CRAAP test provides a useful framework. Yes, the acronym is deliberately memorable (you’re welcome). It stands for:

Currency

  • When was the information published or last updated?
  • Is it recent enough for your topic? (Medical advice from 1950 probably isn’t.)
  • Has newer information superseded it?

Relevance

  • Does the information directly address your question?
  • Is it at an appropriate level (not too basic or too technical)?
  • Would you be comfortable citing this source to others?

Authority

  • Who is the author/publisher/source?
  • What are their credentials or qualifications?
  • Are they respected in their field?
  • Do they have a potential bias or conflict of interest?

Accuracy

  • Is the information supported by evidence?
  • Has it been reviewed or fact-checked?
  • Can you verify the information in other sources?
  • Does the language or tone seem free of emotion and bias?

Purpose

  • Why was this information created?
  • Is it trying to inform, teach, sell, entertain, or persuade?
  • Is the purpose clearly stated or hidden?
  • How might the purpose affect the reliability of the information?

Red Flags: When to Be Extra Skeptical

While applying the CRAAP test, watch out for these warning signs that evidence might be misleading:

1. Cherry-Picking

Selecting only the data that supports a position while ignoring contradictory evidence.

  • Example: A diet book that cites only the studies showing positive results for its approach.

2. Correlation Confused with Causation

Assuming that because two things happen together, one must cause the other.

  • Example: “Ice cream sales and drowning deaths both increase in summer, so ice cream causes drowning.”
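If you want to see this trap in action, the sketch below (in Python, with entirely made-up numbers) simulates a year in which temperature drives both ice cream sales and drownings. Neither variable affects the other, yet they correlate strongly:

```python
import random
import statistics

# Hypothetical illustration: temperature is a hidden third factor (a
# "confounder") that drives both outcomes. Neither causes the other.
random.seed(42)

temps = [random.uniform(0, 35) for _ in range(365)]             # daily temperature (°C)
ice_cream = [50 + 10 * t + random.gauss(0, 30) for t in temps]  # sales rise with heat
drownings = [0.1 * t + random.gauss(0, 0.5) for t in temps]     # more swimmers, more accidents

r = statistics.correlation(ice_cream, drownings)  # requires Python 3.10+
print(f"correlation between ice cream sales and drownings: r = {r:.2f}")
# Prints a strong positive correlation, even though controlling for
# temperature would make the relationship vanish.
```

The correlation is real; the causal story is not. Whenever two trends move together, ask what third factor might be driving both.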

3. Small or Biased Samples

Drawing broad conclusions from too few examples or non-representative groups.

  • Example: Surveying only college students and applying the results to all adults.
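A brief simulation (a sketch with arbitrary numbers) shows why sample size matters. Poll small groups drawn from a population where exactly half agree, and the estimates swing wildly from sample to sample:

```python
import random

random.seed(0)
true_support = 0.50  # in the full population, exactly half agree

# For each sample size, run 1,000 independent polls and report the spread
# between the most extreme estimates.
for n in (10, 100, 10_000):
    estimates = [
        sum(random.random() < true_support for _ in range(n)) / n
        for _ in range(1_000)
    ]
    print(f"n = {n:>6}: estimates ranged from "
          f"{min(estimates):.0%} to {max(estimates):.0%}")
# Ten-person polls can land almost anywhere; ten-thousand-person polls
# cluster tightly around the true 50%.
```

And no amount of size fixes a biased sample: surveying a million college students still tells you about college students, not all adults.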

4. Misleading Statistics

Numbers presented without context or manipulated to create a false impression.

  • Example: “Product X reduces your risk by 50%!” (But the risk might have gone from 0.002% to 0.001%—technically true but practically meaningless.)
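The arithmetic behind that parenthetical is worth making explicit. A minimal sketch, using the hypothetical figures from the example above:

```python
# Hypothetical figures: risk falls from 0.002% to 0.001% of people affected.
baseline_risk = 0.00002   # 0.002% expressed as a proportion
treated_risk = 0.00001    # 0.001%

relative_reduction = (baseline_risk - treated_risk) / baseline_risk
absolute_reduction = baseline_risk - treated_risk
number_needed_to_treat = 1 / absolute_reduction  # people treated per event avoided

print(f"relative risk reduction: {relative_reduction:.0%}")   # 50% - sounds dramatic
print(f"absolute risk reduction: {absolute_reduction:.5%}")   # 0.00100% - tiny
print(f"people treated to avoid one event: {number_needed_to_treat:,.0f}")  # 100,000
```

Both numbers are true; only the absolute reduction (and the “number needed to treat,” a standard way of expressing it) tells you whether the effect matters in practice. Marketing copy almost always quotes the relative figure.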

5. Appeal to False Authority

Citing experts speaking outside their area of expertise.

  • Example: A famous actor promoting medical treatments.

6. Weasel Words

Vague language that creates an impression without making a verifiable claim.

  • Example: “Studies suggest,” “Experts believe,” “May help,” without specifying which studies or experts.

7. Emotional Manipulation

Using emotional language or imagery to override critical thinking.

  • Example: Frightening anecdotes about rare vaccine side effects that ignore the much greater risks of the disease.

Practical Exercise: Evidence Evaluation

Let’s practice evaluating evidence with some common claims:

  1. “A new study shows that chocolate prevents heart disease.”
    • Questions to ask:
      • Who conducted the study? Was it peer-reviewed?
      • How large was the sample size?
      • Did it show correlation or causation?
      • What type and amount of chocolate was studied?
      • Were there conflicts of interest (e.g., funding from chocolate companies)?
  2. “According to experts, this investment is guaranteed to double your money.”
    • Questions to ask:
      • Which specific experts?
      • What are their credentials and track record?
      • What do they mean by “guaranteed”?
      • What’s the timeframe and risk level?
      • Do they benefit from your investment?
  3. “80% of customers report significant improvement within two weeks.”
    • Questions to ask:
      • How was “significant improvement” defined and measured?
      • Who conducted the survey?
      • How were participants selected?
      • What happened to the other 20%?
      • Was there a control group?

Navigating Scientific Evidence

Scientific research is often considered the gold standard of evidence, but not all science is created equal. Here’s a quick guide to assessing scientific claims:

The Hierarchy of Scientific Evidence

From strongest to weakest:

  1. Systematic reviews and meta-analyses (combining results from multiple studies)
  2. Randomized controlled trials
  3. Cohort studies
  4. Case-control studies
  5. Cross-sectional surveys
  6. Case reports
  7. Expert opinions
  8. Animal or laboratory studies

Questions to Ask About Scientific Claims

  • Has the research been published in a peer-reviewed journal?
  • Has it been replicated by other researchers?
  • Is it a preliminary finding or well-established?
  • Does it show correlation or causation?
  • How large was the study?
  • Were there methodological limitations?
  • Are the conclusions justified by the data?

Dealing with Information Overload

With so much information available, it’s impossible to deeply evaluate every claim you encounter. Here are some practical strategies:

1. Prioritize What Matters

Apply more scrutiny to information that:

  • Will influence important decisions
  • Seems surprising or contradicts established knowledge
  • Comes from unfamiliar sources
  • Is emotionally triggering for you

2. Use Trusted Information Filters

Identify reliable sources in different domains that do good fact-checking and analysis, such as:

  • Fact-checking organizations
  • Reputable science journalism outlets
  • Academic institutions
  • Professional associations in relevant fields

3. Look for Consensus Rather Than Outliers

When experts in a field largely agree on something, there’s usually good reason. Be extra skeptical of claims that go against strong consensus without compelling new evidence.

4. Practice “Satisficing”

You don’t need perfect information for every decision. Sometimes “good enough” evidence is sufficient, especially for low-stakes matters.

Conclusion

Evaluating evidence isn’t about being cynical or dismissing everything you hear. It’s about developing a balanced approach that’s neither gullible nor paranoid—one that helps you navigate a complex information landscape with confidence.

Remember, the goal isn’t to only accept information that’s 100% certain (very little is). Instead, it’s about adjusting your level of belief according to the strength of the evidence. Strong evidence warrants stronger belief; weak evidence suggests holding opinions more tentatively.

In our next lesson, we’ll build on these skills by exploring logical fallacies—the common reasoning errors that can make arguments sound convincing when they’re actually flawed.

[Suggested graphic: A person with an “evidence evaluation toolkit” examining different types of claims with tools like a magnifying glass labeled “CRAAP test” and a scale showing the weight of different evidence types.]

Next Up: Lesson 3 - Spotting Logical Fallacies: Common Reasoning Errors