The saying "There are three kinds of lies: lies, damned lies, and statistics" is often attributed to Mark Twain, though Twain himself credited it to Benjamin Disraeli. Twain had a similar comment of his own: "Facts are stubborn things, but statistics are pliable." Both statements are important to keep in mind when reading reports of causes of and cures for autism spectrum disorder (ASD). In this blog we will look at some key issues in research (and in reports of that research) to consider when deciding credibility.
What types of research do we see related to ASD? There is certainly a lot of research trying to identify causes, in the hopes that possible elimination of the cause could prevent future cases of ASD. There is also research that looks at different treatment methods to see what helps individuals with ASD. Virtually all of this research starts as a single experiment. But a single experiment doesn’t prove anything. It raises a possibility that needs to be examined carefully with other experiments and expanded research to see if the results can be reliably reproduced by others at different times and places.
Many media reports of something "new" (new ways to diagnose, new information about causes, new treatments) describe the results of a single experiment, typically involving a relatively small number of people. These reports are important, because they provide incentives for others to repeat the experiment with different (and hopefully larger numbers of) people to see if they get the same findings. When the experiment has been successfully repeated several times by different researchers, with different and larger groups of subjects, we can begin to consider it proof of something. But what is that something? What do we have to consider when reviewing research to decide for ourselves what it really shows?
Perhaps the first thing to have clear in our minds is the difference between correlation and causation. Correlation is actually about math and statistics. It means that the relationship between two sets of data can be expressed mathematically. Statisticians measure the strength of that relationship with a correlation coefficient, and when it is strong enough, the two sets of data are said to correlate. When we graph both sets of data on the same axes, we can see the relationship pretty clearly. We may hear this as "X is directly related to Y," meaning as X increases so does Y, or "X is inversely related to Y," meaning as X increases, Y decreases. This describes a relationship in the available data. Causation says "X causes Y" or "Y happens because of X." Causation is talking about a cause and an effect. It is a much stronger statement than saying that "X correlates with Y."
How do we know whether a news report is telling us about causation or correlation? Well, anything that connects one thing to another is talking about correlation. In fact, you can't have causation without correlation! Here's where it might start getting a little confusing, so pause and reread the paragraph right before this. Now say "If X causes Y, then X is also strongly correlated with Y." This is a factual statement; it is always a true statement. But the converse (if X is strongly correlated with Y, then X causes Y) is NOT a factual statement; it is actually an example of false logic.
An article from George Mason University explains it this way:
For example, eating breakfast has long been correlated with success in school for elementary school children. It would be easy to conclude that eating breakfast causes students to be better learners. It turns out, however, that those who don’t eat breakfast are also more likely to be absent or tardy — and it is absenteeism that is playing a significant role in their poor performance.
In short, that example shows that eating breakfast is correlated with success in school but does not cause success in school. A third factor, absenteeism, influences both. If we dig through enough data we can find lots of things that are correlated, but which we know logically show only coincidence. I am not going to give you more examples, but you can see a number of them here. The challenge in looking at reports of research comes when the lack of a true causal relationship (different from correlation) is not so clear, as in the example of eating breakfast and success in school. Those are the claims that are often mistakenly made, apparently supported by the data, and then accepted as true.
There are many other things to consider when evaluating news reports and research claims. But if you clearly understand the difference between correlation and causation, you have a good start. We will look at the issues of controlling for variables, sample sizes, and replicability in future blog posts.
As Sean Connery said in The Untouchables, "So endeth the lesson."
Russell J. Bonanno, M.Ed.
TAP Program Manager