This is what I am talking about.
Scott McLeod over at Dangerously Irrelevant has a post that illustrates what I have been trying to say about examining sources and thinking critically about what we read. He took a news release about research on Facebook users and analyzed it. The news release, in proper journalistic style, cited the authors of the study it reported on but did not cite any of the sources used in their research. Scott took the numbers in the press release, though, and discovered that there wasn't much substance to them. He said,
In my mind, the overall generalizations from the study don’t seem to adequately recognize the extremely heavy skew in the non-Facebook group toward graduate students. If I saw that the data (to which she alluded) show that the lower grade trend for grad students was of equivalent size to the undergrad group, then I’d have more confidence in the overall generalizations that are being made in the news release.
Again, this research was presented as fact, and we have no real way, short of stumbling upon a copy of the study itself, to verify anything the release says. Scott makes his opinion on the need to evaluate research like this very clear:
One final note: We all should look at – and think carefully about – any research findings that get reported out like this. We need to ask questions like Does this make gut-level sense? and Are the generalizations limited to the data or overbroad? and What more do I need to know to be confident in these findings?. Being informed consumers of research is critical if we are to make research- and/or data-driven decisions to benefit our students.
When academic research is reported on, as Scott says, we need to ask ourselves what else we would have to know to be confident in the findings. It would be nice if reports like this gave us some access to that additional information. I understand that this may not be possible due to copyright or publication agreements. But it would be nice.