Publication (closed access)
‘Clarity bordering on stupidity’: where’s the quality in systematic review?
Citations: 270
References: 16
Year: 2005
Keywords: Family Medicine, Second Language Writing, Systematic Literature Study, Writing Assessment, Education, Research Ethics, Education Research, Ethical Practice, Language Teaching, Quality Criterion, Quality Review, Discourse Analysis, Language Studies, Literacy Practice, Reliability, Systematic Review, Health Policy, Pedagogy, Writing Studies, Research Synthesis, HIV/AIDS Education, English Writing, Educational Practice, Evidence-based Practice, Education Policy
Abstract

The article presents a critique of the discourse of 'systematic review' in education, as developed and promoted by the EPPI‐Centre at the University of London. Based on a close reading of the exhortatory and instructional literature and 30 published reviews, it argues that the approach degrades the status of reading and writing as scholarly activities, tends to result in reviews with limited capacity to inform policy or practice, and constitutes a threat to quality and critique in scholarship and research. The claims that are made for the transparency, accountability and trustworthiness of systematic review do not therefore, it is argued, stand up to scrutiny. The article concludes that systematic review is animated, not just by dissatisfaction with the uncertainties of educational research (a dissatisfaction that it shares with the 'evidence‐informed movement' with which it is associated), but by a fear of language itself.

Notes

* Education and Social Research Institute (ESRI), Manchester Metropolitan University, Didsbury Campus, 799 Wilmslow Road, Manchester M20 2RR, UK. Email: m.maclure@mmu.ac.uk.

1. The full title of the EPPI‐Centre is The Evidence for Policy and Practice Information and Co‐ordinating Centre (Director: Ann Oakley). The Centre's homepage is http://eppi.ioe.ac.uk/EPPIWeb/home.aspx.

2. This is a version of a paper of the same title presented to the Annual Conference of the British Educational Research Association, Manchester, September 2004. The paper was part of a symposium entitled 'Quality Street and other cul‐de‐sacs: what is "quality" in education and educational research, and where are its cutting edges?'

3. See for example Elliott (2001), Avis (2003), Torrance (2004) and chapters in the second half of Thomas and Pring (2003).

4. Cf. Strathern (2000), Hammersley (2001), Avis (2003).

5. Cf. Oakley: 'the disadvantages of not doing this [i.e., introducing more "codification into the knowledge base"] can … literally be fatal' (2003, p. 22). Oakley is referring here to 'areas such as HIV/AIDS education', but the remark comes in the middle of a paragraph about the general importance of making it more difficult for health and education professionals to 'hide failures'.

6. The original critiques are: Hargreaves (1996), Hillage et al. (1998) and Tooley and Darby (1998).

7. 'How will the [review] group ensure that the methodological and conceptual issues are understood and agreed upon across the review team in time for the protocol to be written?' 'Which computer software are you going to use to manage citations?' 'What will happen if this person is unable to fulfil this role?' The 'Quality assurance' questions in the 'Companion' are written in red typeface and are presumably especially important: 'Please describe the procedures by which the EPPI‐Centre worked with the group to provide external quality assurance of application of inclusion/exclusion criteria'. 'Please describe the procedures by which the EPPI‐Centre worked with the group to provide external quality assurance of application of mapping keywords'. 'Please describe the procedures by which the EPPI‐Centre worked with the group to provide external quality assurance of data extraction'. (EPPI‐Centre, 2003, pp. 6–16).

8. One review team considered (but decided against) departing from the EPPI‐Centre's framework (Moyles & Yates, 2003, p. 39). Another was somewhat equivocal, describing systematic review approvingly as 'a searchlight', but also debating its inflexibility and narrow focus, and its inability to capture the conceptual complexity of their field (Dyson et al., 2002, pp. 53, 54–55). There were also instances of reviewers using their specialist knowledge and skills in ways that went beyond the remit of mere 'synthesis', to incorporate old‐school, 'narrative' reviewing habits such as interpretation and argument: see further below.

9. See for example Derrida (1981), Zeller and Farmer (1999). MacLure (2003, chapter 6) provides an overview of the 'threat of writing', or textuality, and the ways in which philosophers, literary theorists and social scientists have attempted to contain that threat. Systematic review is a particularly extreme reaction to the threat of textuality: see 'Conclusion'.

10. 'How much time/how many studies can [the named reviewer] spend/do?' ('EPPI‐Centre review companion', n.d., p. 16).

11. Not all reviews start out with narrow questions. For instance, Dyson et al. (2002) and Cordingley et al. (2003) both began with broad questions, and their initial searches yielded very large numbers of potentially relevant 'hits'. However, this initial breadth had narrowed dramatically by the final, 'in‐depth review' stage: from 14,692 to 6, and from 13,479 to 17, respectively: see below.

12. Reviewers will, of course, have read those research studies with which they were already familiar, unless the review team are novices to the substantive area—a possibility that is not ruled out in systematic review (cf. Torrance, 2004).

13. Some reviews carried out their in‐depth review on the total corpus of studies included in the keyword map: Harlen and Deakin Crick (2003, 12 studies); Howes et al. (2003, 24 studies); Deakin Crick et al. (2004, 14 studies); Harlen (2004, 30 studies).

14. A total of 30 education reviews were published on the EPPI‐Centre website at this date. However, two of these did not report an 'in‐depth' review stage: EPPI‐Centre (2001) and Fletcher and Lockhart (2003). The format of these two reviews also diverged in other respects from the usual EPPI review structure.

15. Three primary studies were 'subjected to the full EPPI procedures of in‐depth reviewing' (Hall & Harding, 2003, p. 30). There is some ambiguity: the reviewers identified a total of 12 studies for in‐depth review on the grounds of their 'direct relevance to the Teacher Training Agency'; but only three were subjected to the full treatment, because only these gained high enough ratings for quality of empirical evidence.

16. The resulting exclusions included one presumably key journal, since it was listed by the authors in their protocol (Journal of Emotional and Behavioural Disorders). Moreover, if issues, or even whole volumes, were missing from their institution's library, the authors did not try to obtain these from other sources; 14 issues were 'missing', and 17.5 whole volumes (Harden et al., 2003, p. 72).

17. The 'narrower set' of inclusion criteria applied by Harden et al. (2003) excluded all studies that did not employ a control or comparison group or a reversal design, or which employed a sample size of fewer than 20—thus endorsing the bias towards quantitative design as the benchmark of quality in systematic review.

18. Of the eight studies (which are not assigned numerical 'weights' according to the usual EPPI procedure), one was 'of limited relevance' (Bell et al., 2003, p. 17); one was a case study of a single school, from which 'one cannot generalise' (p. 22); one showed 'moderate' and 'negligible' correlations between leadership and student outcomes (p. 17); one found no 'significant, positive relationship' (p. 18); one found that principals made 'a disappointing contribution to student engagement' (p. 19); one found 'indirect' effects (p. 19); one had a faulty design and, in any case, found only a 'weak correlation between leadership and achievement' (p. 20); and one found a 'mediated' (i.e., indirect) relationship between 'transformational leadership' and student achievement in maths (p. 20).

19. For instance, Dyson et al. discuss the features of systematic review that tend towards narrowing of focus and reduced flexibility, referring to the '(understandable) emphasis on tight delineation of review topic, requirement for clear a priori criteria for what forms of evidence can be included and current lack of procedures for "analogical" synthesis'. They also note 'the practical issue of managing a very wide ranging review within a limited timescale and budget' (2002, p. 55).

20. Amanda Coffey, as Discussant to the symposium in which the original version of this paper was presented, cautioned against manufacturing a 'moral panic' about systematic review. I take her point. It may be that this paper over‐estimates the power of the evidence 'movement' and its ability to influence research and scholarship. I think I would still argue that the time has come for concerted opposition rather than covert operations. But I could be wrong.

21. Details of the course can be found online by following the link on the EPPI‐Centre's homepage at http://eppi.ioe.ac.uk/EPPIWeb/home.aspx.

22. www.nerf‐uk.org/funders/systematic/.

Additional information

Notes on contributors: Maggie MacLure, Education and Social Research Institute (ESRI), Manchester Metropolitan University, Didsbury Campus, 799 Wilmslow Road, Manchester M20 2RR, UK. Email: m.maclure@mmu.ac.uk.