Abstract
Bonell et al. discuss the challenges of carrying out randomised controlled trials (RCTs) to evaluate complex interventions in public health, and consider the role of realist evaluation in enhancing this design (Bonell, Fletcher, Morton, Lorenc, & Moore, 2012). They argue for a "synergistic, rather than oppositional relationship between realist and randomised evaluation" and that "it is possible to benefit from the insights provided by realist evaluation without relinquishing the RCT as the best means of examining intervention causality." We present counter-arguments to their analysis of realist evaluation and to their recommendations for realist RCTs. Bonell et al. are right to question whether and how (quasi-)experimental designs can be improved to better evaluate complex public health interventions. However, their paper does not explain how a research design fundamentally built upon a positivist ontological and epistemological position can be meaningfully adapted for use within a realist paradigm. The recommendations for "realist RCTs" do not sufficiently take into account important elements of complexity that pose major challenges for the RCT design, and they ignore key tenets of the realist evaluation approach. We propose that the adjective 'realist' should continue to be used only for studies based on a realist philosophy and whose analytic approach follows the established principles of realist analysis. The approach proposed by Bonell and colleagues would be more accurately called a 'theory-informed RCT', which can indeed help to enhance RCTs.
| Original language | English |
|---|---|
| Journal | Social Science and Medicine |
| Volume | 94 |
| Pages (from-to) | 124-128 |
| Number of pages | 5 |
| ISSN | 0277-9536 |
| DOIs | |
| Publication status | Published - 2013 |
Keywords
- Research
- Public health
- Randomized controlled trials
- Evaluation
- Methodology
- Interventions
- Experimental
- Program design