Critical Thinkers Check the Methodology

By Martin Cohen

Part of Critical Thinking Skills For Dummies Cheat Sheet

A key critical thinking skill is the ability to understand, and criticise, a writer’s methodology. When authors write books, conduct studies, or investigate a topic, they operate within a research paradigm (a theoretical framework) that affects how they view and investigate the subject. In formal academic studies, authors discuss the research paradigm up front, so identifying it is straightforward.

But more often, they leave the nature of the chosen paradigm in the background – as a given. So the Critical Reader has to make a specific effort to work it out — and consider how the choice may skew the information reported.

Here are some useful questions to ask when looking at reports and research findings in the broad area of social science:

  • Theoretical or empirical: Is the text primarily concerned with ideas and theories or primarily based on observations and measurements? Most texts mix the two approaches, but Critical Readers need to identify which element should be the primary focus — even if the author seems confused!

  • Nomothetic or idiographic: These grand terms originate from Ancient Greek (nomos means law and idios means own or private) and refer to laws or rules that apply in general in contrast to ones that relate to individuals. Most social research is concerned with the nomothetic — the general case — because even when studying individuals researchers usually hope to generalise the findings to everyone else. Always bear in mind the extent to which entirely valid observations about a particular case can safely be generalised.

  • Cause or correlation: So many people mix up these terms that the error has its own special name — cum hoc ergo propter hoc (Latin for ‘with this, therefore because of this’). In other words, linking things together when the connection between them is unproven. Take a medical example. A recent study of over a million women with breast cancer checked how many were cured by operations to remove suspected cancerous cells. It found that two thirds were still alive ten years later.

    It might seem natural to assume that the survival was due to the treatment, but the study also found that a control group of women given a mock operation (involving no removal of any cells) had an identical survival rate — plus greatly reduced risks of ill-effects from the procedures. Be aware that in experimental studies a built-in bias exists to see causation even when, maybe, none does.

  • Statistical answers or ideological hypotheses: A lot of research is based on probabilities. But working them out is something that even experienced researchers get wrong — perhaps applying the wrong statistical procedure to their data, or overestimating the significance of their findings. Statistics aren’t simple, plain-as-a-pikestaff facts; they’re created, misunderstood, and manipulated, which is why politicians and businesses sometimes seize on them to present a partial picture.