The Extremist Group Fallacy

The Idea Channel has this great video and this follow-up video about logical fallacies: those errors we make, intentionally or inadvertently, when we twist perspectives and facts together to make an argument. The videos got me thinking about a particular fallacy I’ve noticed before, but it wasn’t among those mentioned, so I did some research and decided to write about it. I will call it the extremist group fallacy and develop it in this post.

The extremist group fallacy defined:

The extremist group fallacy is committed when an observable trait is incorrectly ascribed or linked to an entire sample, when the trait is actually only attributable with statistical significance to a group (or subgroup), the “extremists,” within that sample. The rest of the sample, the non-extremists, may exhibit the trait at normal rates on par with the rest of the population, but the behavior of the extremist group artificially inflates the prominence of the trait when the entire sample is considered.

Committing this logical fallacy can lead to the erroneous implication that merely being a member of the greater sample is linked to (or causes) the observed trait, whereas the trait may actually be the result of some other factor that is exclusive to individuals within the extremist group.
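
To make that mechanism concrete, here is a minimal sketch in Python of how a whole-sample trait rate is just a weighted average of the subgroup rates. The numbers and variable names here are illustrative choices of my own, not data from any study:

def sample_rate(p_extremist, rate_extremist, rate_rest):
    # The trait rate across the whole sample is a mixture of the extremist
    # subgroup's rate and the rate among everyone else.
    return p_extremist * rate_extremist + (1 - p_extremist) * rate_rest

# Two samples whose members behave identically within each subgroup; they
# differ only in how large their extremist subgroups are.
print(sample_rate(0.001, 0.10, 0.001))  # ~0.0011
print(sample_rate(0.004, 0.10, 0.001))  # ~0.0014

Even though the non-extremists in both samples behave identically, the sample with the larger extremist subgroup shows a noticeably higher overall rate, and that inflated rate is what gets pinned on the entire sample.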

Related fallacies and concepts:

Fallacies do not exist in a vacuum, so I’d like to address three related ideas before dissecting two examples of the extremist group fallacy.

The extremist group fallacy is like the correlation-does-not-imply-causation fallacy, but it additionally addresses the fact that the correlation itself might sometimes be incomplete or flawed. I also believe it often doesn’t matter whether we are aware that published data show correlation rather than causation, because the seeds of a causal link may already have been planted in the minds of everyone who reads or hears about the correlation.

The hasty generalization fallacy is somewhat similar to but not as specific as the extremist group fallacy. It involves drawing a fallacious conclusion from an incomplete sample.

When I discussed the extremist group fallacy with a friend, he mentioned the concept of outliers in statistics, and I believe this to be a very closely related idea. Outliers are statistical anomalies that are abnormally distant from other data points within a set. Outliers are like an extremist group in that they can skew an analysis of the set, but the idea of outliers is not a fallacy in and of itself.

Examples of extremist group fallacies:

Due to the very nature of the extremist group fallacy, it will be easiest (and the most fun) to invent scenarios involving it rather than to dissect real situations with baggage in tow. If I keep it fictitious, I can make up both the data that leads to the logical fallacy and the data that exposes the fallacy, while limiting complexity and confusion.

Fallacy 1: Study finds Muslims four times as likely to commit acts of terrorism.

Let’s say that this fallacious finding was drawn from an analysis of data showing that Muslims commit acts of terrorism at higher rates than Hindus, Christians, Buddhists, Jews, and atheists. Let’s say that the analysts defined an “act of terrorism” as a religiously or politically motivated hate crime.

Now let’s say that we get a team of extremist group fallacy busters, the EGFB, to turn the analysis toward the extremist groups within all religions, rather than just focusing on religions as a whole. They find that the rates of terrorism within extremist groups are all about equal regardless of religion, let’s say around 1.8 acts for every 1,000 extremist organization members. In other words, extremist Muslims may be just as predisposed to terrorism as extremist Christians or Hindus.

Next, the EGFB analyzes the prominence and size of extremist organizations within each religion, and finds that while only 0.003% of all other religious people are extremists, 0.012% of all Muslims are aligned with some sort of extremist group. Here is the obvious answer as to why the original data revealed higher rates of terrorism among Muslims: Muslims are four times as likely as other religious people to belong to an extremist group. (This is a good time to reiterate that I made this data up.)
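
As a quick sanity check, here is a minimal sketch in Python, using only the made-up numbers from this post, showing that the “four times as likely” headline falls straight out of the difference in extremist-group membership, since the per-extremist rate is identical across religions:

# Made-up figures from this post: 1.8 acts of terrorism per 1,000 extremist
# organization members, regardless of religion.
rate_per_extremist = 1.8 / 1000

extremist_share = {
    "Muslims": 0.00012,          # 0.012% belong to an extremist group
    "other religions": 0.00003,  # 0.003% belong to an extremist group
}

# Assume, as the EGFB found, that non-extremists contribute negligibly, so the
# overall rate is driven almost entirely by extremist membership.
overall = {group: share * rate_per_extremist for group, share in extremist_share.items()}
for group, rate in overall.items():
    print(f"{group}: {rate * 1_000_000:.3f} acts per million members")

print(overall["Muslims"] / overall["other religions"])  # 4.0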

In their continued analysis of extremist organizations all over the world, the EGFB finds that extremist networks, no matter their doctrine, thrive in environments with a particular set of political and economic circumstances. With this data, we postulate that extremist groups flourish in places that lack alternative forms of social cohesion. We note that the areas of the world with large Muslim populations, like the Middle East, seem to exhibit these political and economic circumstances more often than the rest of the world. This makes Muslim regions fertile ground for any kind of extremist group to grow, and Islam happens to be the religion du jour.

Our deeper analysis reveals that the original analysis drew an incomplete correlation. This could have been dangerous, because it would have placed the focus of terrorism on the nature of Islam itself or its followers, rather than on the political and economic conditions that give numbers and power to extremist organizations in Muslim areas.

Fallacy 2: People who regularly use tanning beds are over 30 times as likely to get skin cancer.

Let’s say that this fallacious finding came from a study stating that of the 1% of Americans diagnosed with skin cancer, a staggering 50% (so 0.5% of the population at large) use tanning beds on a regular basis. Let’s say “regular basis” is defined as at least 20 tanning bed visits per year. Let’s say that 3% of the population at large tans on a regular basis, skin cancer diagnosis or not. That means that of the 3% of the population that tans regularly, one-sixth have been diagnosed with skin cancer. Since half of the 1% of Americans diagnosed with skin cancer use tanning beds regularly, the other half (also 0.5% of the population at large) do not use tanning beds but got skin cancer anyway, out of the 97% of the population that does not use tanning beds on a regular basis.

0.5% (don’t use tanning beds, got cancer) out of 97% (don’t use tanning beds) is only about 0.52%, so this study shows that the chance of getting skin cancer while not using a tanning bed is about 0.52%. On the other hand, 0.5% (use tanning beds, got cancer) out of 3% (use tanning beds) is 16.67%, so the breathtaking chance of getting skin cancer if you use a tanning bed regularly is 16.67%. From 0.52% to 16.67% is more than a 30-fold jump, so it’s hard to believe there’s no causal link between tanning bed use and skin cancer with such a strong correlation!
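
For readers who want to check the arithmetic, here is a minimal sketch in Python, again using only the made-up numbers above, that reproduces the 0.52%, the 16.67%, and the roughly 30-fold jump:

# All quantities are fractions of the whole population, using the post's numbers.
cancer = 0.01                      # 1% diagnosed with skin cancer
cancer_and_tanner = 0.5 * cancer   # half of those diagnosed tan regularly (0.5%)
tanners = 0.03                     # 3% of everyone uses tanning beds regularly

rate_tanners = cancer_and_tanner / tanners                       # 0.005 / 0.03
rate_non_tanners = (cancer - cancer_and_tanner) / (1 - tanners)  # 0.005 / 0.97

print(f"skin cancer rate among regular tanning bed users: {rate_tanners:.2%}")      # 16.67%
print(f"skin cancer rate among everyone else:             {rate_non_tanners:.2%}")  # 0.52%
print(f"relative risk: {rate_tanners / rate_non_tanners:.1f}x")                     # ~32.3x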

Let’s call the EGFB. They survey a sample from the 1% who were diagnosed with skin cancer and ask them thorough and exhaustive questions about their personal habits. When they analyze their findings, the folks at the EGFB reveal a sharp divide: 55% of those with skin cancer are addicts who tan obsessively, whether with a tanning bed or the real sun, while the remaining 45% seem to just be genetically predisposed to get skin cancer easily. It should not be surprising to find that tanning bed use is high among tanning addicts, since the real sun is less efficient and predictable at giving them what they want.

The EGFB then analyzes a sample of people who use tanning beds but have not been diagnosed with skin cancer. They find that while 10% of them behave like tanning addicts but somehow avoid getting cancer, the other 90% of tanning bed users are quite moderate in their tanning bed use. This group uses lotion and goggles to protect sensitive areas, normally allows 48 to 72 hours between tanning sessions, and gives their skin ample time to recover when they may have accidentally “overdone it.”

Once again, our deeper analysis reveals that the correlation from the original study was incomplete. While tanning beds play a role in skin cancer for many people, the extremely high rates of skin cancer among tanning addicts, who tend to seek out tanning beds, artificially elevate the prevalence of skin cancer when considering all those who regularly use tanning beds.

Conclusion:

The extremist group fallacy is a logical fallacy that begins with a statistical oversight, one that fails to recognize the outliers within a sample, and draws a misleading correlation, which can be dangerous even when causality is not explicitly stated. Again, my examples are fictitious, but I made them real enough to make it obvious how a correlation can be misinterpreted. The link between Islam and terrorism is assumed to be cause and effect by many people; otherwise Donald Trump would not have attempted to gain popularity in late 2015 by insisting that Muslims should be subjected to expanded surveillance. Tanning beds have been called “cancer in a box,” while tanning salons are forced to pass extra taxes along to their clients based on the assumption that tanning beds themselves, no matter how they’re used, shine cancer directly into your skin.

Awareness of the extremist group fallacy may reveal that social hazards are not so much the direct result of an organization or technology, but rather the result of something more fundamental: erratic human behavior. Believe it or not, this actually ties in with the rest of my work on proprietism, because it’s yet another plea to abandon our mental model of the world as a giant assembly line of institutions, where we can blame an entire group of people for something as though they were a malfunctioning machine in that assembly line. I suggest instead that we see ourselves for what we are: a living, breathing, granular and complex network of mostly good but imperfect people who can, on occasion, be quite extreme.