- Statistics can be reported from polls and studies that were set up so the data can be cherry-picked to prove a specific point. For example, the Colgate commercial asked about all toothpastes but reported only what they wanted it to show.
- Percentages can sound a lot scarier than the raw numbers. A "significant percentage increase" might just mean we went from one person to two people. Not scary at all! (There's a quick sketch of this right after the list.)
- The Simpson fallacy is when someone confounds or confuses cause and effect: the real cause of a problem, or the real reason it went away, gets credited to a false solution. For example, did the Bear Committee remove the bears from Springfield, were there never any bears, or did the bears leave for other reasons? In the podcast it was noted that the name comes from Edward H. Simpson, not The Simpsons TV show.
- Correlation never shows cause and effect on its own. If one statistic seems to track another, the two could be directly related, or they could share a common cause. When they share the same underlying cause, that relationship is called confounding. (A small sketch of a confounding variable follows the list.)
- Standards and definitions are constantly changing as we fine-tune and update terms over time. Because of those changes, we may no longer be able to compare statistics across eras. For example, are there more home runs because the players are better, the materials are different, or the stadiums have changed? It could be all three!
- Misleading charts (a quick chart-scale sketch follows the list):
  - Wrong scales that make the results look dramatic or subdued
  - Pie charts that don't add up to 100%
  - Fancy charts that hide the results
  - Misleading or confusing charts that hide the results
- Data that is cherry-picked, cutting off the points that don't support the argument
- Heartfelt stories that don't represent what actually causes or solves a problem; the tug on the heartstrings leads the audience away from the point
- Availability heuristic: the brain thinks something happens more often because it is seen more often or in more vivid images. People see all the lottery winners but barely hear about the lottery losers, which gives a false impression of the odds.
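To make the percentage point concrete, here is a tiny Python sketch (numbers invented for illustration, not from the episode) showing how a "100% increase" can describe a change of exactly one person.

```python
# A tiny sketch of why a scary-sounding percentage can describe a tiny change.
def percent_increase(before, after):
    """Percent change going from `before` to `after`."""
    return (after - before) / before * 100

# Going from 1 affected person to 2 is a "100% increase": technically true,
# but the raw change is just one person.
print(percent_increase(1, 2))        # 100.0
print(percent_increase(1000, 1001))  # 0.1  (same absolute change of one person)
```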
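Here is a minimal sketch of a confounding variable, using the stock ice-cream-and-sunburns example with made-up numbers (not data from the episode): both rise with temperature, so they correlate strongly with each other even though neither causes the other.

```python
# Two variables driven by the same shared cause end up correlated with each other.
# Requires Python 3.10+ for statistics.correlation.
import random
import statistics

random.seed(0)
temperature = [random.uniform(10, 35) for _ in range(200)]         # the shared cause
ice_cream   = [2.0 * t + random.gauss(0, 5) for t in temperature]  # driven by temperature
sunburns    = [0.5 * t + random.gauss(0, 3) for t in temperature]  # also driven by temperature

# Strongly positive, even though ice cream does not cause sunburns.
print(statistics.correlation(ice_cream, sunburns))
```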
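Finally, a rough matplotlib sketch (values invented) of the wrong-scale trick: the same two bars look nearly identical when the y-axis starts at zero and wildly different when it is truncated.

```python
import matplotlib.pyplot as plt

labels = ["Brand A", "Brand B"]
values = [98, 100]  # only a 2% difference

fig, (honest, misleading) = plt.subplots(1, 2, figsize=(8, 3))

honest.bar(labels, values)
honest.set_ylim(0, 110)          # full scale: the bars look almost identical
honest.set_title("Axis starts at 0")

misleading.bar(labels, values)
misleading.set_ylim(97, 100.5)   # truncated scale: Brand B towers over Brand A
misleading.set_title("Axis starts at 97")

plt.tight_layout()
plt.show()
```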
Challenge
- Look for one correlation that you're told about this week. Did the person who mentioned it try to imply a cause-and-effect relationship between the two things, or did they explain it adequately? Can you figure out how the two things are or aren't related? Or maybe they have a confounding component connecting them, but one doesn't cause the other. See how many of these come up in your week.
Links
- https://smallstepspod.com/?p=3069
- https://statanalytica.com/blog/misuse-of-statistics/
- https://en.wikipedia.org/wiki/Misuse_of_statistics
- https://thedecisionlab.com/biases/availability-heuristic/
- https://www.youtube.com/watch?v=OfVaOqLUbZA – How Not to Fall for Bad Statistics – with Jennifer Rogers
- https://www.youtube.com/watch?v=E91bGT9BjYk – How to spot a misleading graph – Lea Gaslowitz
- https://www.youtube.com/watch?v=bVG2OQp6jEQ – This is How Easy It Is to Lie With Statistics
- https://www.youtube.com/watch?v=ioxWuCd-mn0 – This is how easy it is to manipulate public perception
- https://tylervigen.com/discover – Spurious Correlations Generator