A joint leap into a future of high-quality simulation research—standardizing the reporting of simulation science

Simulation has offered a practical means to train and rehearse clinical skills for many years. Simulated environments, patients, and related technologies have been used to develop, validate, and maintain a wide range of clinical skills across numerous clinical specialties. In the past 30 years, the field has truly thrived, as evidenced in rapidly evolving simulation technologies; the ever-increasing volume and quality of simulation-based scientific studies; the institution of numerous peer-reviewed outlets for the dissemination of these studies; the number of learned societies dedicated to promoting simulation and their expansive memberships; and the widespread development and availability of clinical educational resources, curricula, and policies centered on the application of simulation. Such simulation-based training applications and interventions within the health professions have been termed an “ethical imperative” [1], and demonstrating proficiency on simulation-based tasks and procedures before performing them on patients in a clinical environment is a trend gaining significant momentum [2–4].

Clinical simulation science is thus past its early developmental stages. Evidence reviews and syntheses are taking stock of where the field is and where it should be heading. From this perspective of a self-reflective science, the paper by Cheng et al. [5] on the extension of existing guidelines to encompass the reporting of simulation research is as valuable as it is timely. Cheng et al. applied an elaborate consensus-building methodology using panels of international experts in the field. In successive stages, they reviewed the existing guidelines for applicability to simulation research and edited them accordingly. In addition to the expert-derived modifications to the guidelines, simulation-specific items were also developed to account for the unique needs of simulation science as new information and methods are generated [6]. These items largely focus on contextual elements of the study design. They include the type of simulator and simulation environment used, the ways that study participants were oriented to it (based on their prior extent of exposure to simulation), the description of the event/scenario used, the challenges presented to participants, and finally, the feedback/debriefing (if any) that was conducted. This comprehensive process and its result have numerous strengths, including its detailed method (which applied an iterative and reflective approach to guideline development), its expert coverage, and good response rates among the consulted experts (95% of whom provided at least partial input, and 75% of whom contributed fully to the process). These strengths outweigh certain limitations of the study, most of which typify inherent aspects of any consensus-building methodology, but which also include the scope of the guideline extension, which excluded qualitative and mixed-methods research, computational simulation studies, and validation studies.

These guidelines were urgently needed. Simulation research is certainly progressing well; however, for the field to achieve maturity, the quality of the reported science is a paramount focus. As Cheng et al. briefly review, simulation research, no differently than other health and medical science work, is often poorly reported and at the very least lacks consistent reporting across similar studies. This poses multiple problems. First, poorly reported research presents a dilemma to readers, making it hard to tell whether a study was well conducted but suboptimally described, or actually poorly conducted (but accurately described). Second, academic reviews and syntheses of inconsistently or poorly reported studies suffer owing to lack of homogeneity and lack of comparability. As a result, capturing the state of the field becomes problematic, slowing forward progress. Third, the reputation of the field is at risk, as inconsistency in reporting style and outcomes creates confusion among wider audiences such as clinical leaders and policy makers.

This issue is not peculiar to simulation. The wider problem of how to increase the value of biomedical, clinical, and health research was the subject of a high-profile 2014 series of articles in the Lancet [7–11]. Poor reporting of research findings is part of this wider problem. It leads to poorly informed planning, selection, and funding decisions about research. It undermines efforts to make sound clinical and educational policy decisions based on extant research. Accurate, systematic, and unbiased reporting should be part of the wider effort to deliver value through novel research. The guidelines offered by Cheng et al. [5] have the potential, if implemented widely, to help address these problems in the simulation community by uniformly improving the quality and consistency of simulation study reports.

As with any guidelines, however, their envisioned positive impact will only materialize if they are suitably implemented. From our collective perspective as editors of peer-reviewed journals within the field, we endorse them for use within the journals we represent. We pledge to encourage our author colleagues to use the guidelines, whenever appropriate, in crafting the studies and writing the manuscripts that they submit to us. We are optimistic that this will happen. Over many years, our field has shown exceptional innovation and commitment to high-quality science. We have witnessed dedication to simulation research and its application by many in our still young field, and we expect that a move toward standardization of research conduct and reporting will be welcomed and used in practice.

Looking to the future, we suggest that as the field matures further, the design and reporting of a variety of study types could integrate good practices from nonsimulation research paradigms. This may be especially true when evaluating complex health care interventions that use simulation approaches in whole or in part. Within the applied health research field, guidelines detail how to study and subsequently report the process via which an intervention was delivered to its intended recipients (e.g., clinical services and their users) [12]. Such detailed “process evaluations”, typical examples of what has been termed “implementation science” [13], allow one to appraise the effectiveness of the intervention’s implementation, the nature of the implementation context, and finally, the mechanisms by which the intervention affected patient outcomes. The novel elements that Cheng et al. [5] developed specifically for the reporting of simulation research fall within these concepts of intervention implementation and context, and they enhance accurate and meaningful interpretation of the results of simulation studies. This means that some simulation studies might be good candidates for reporting detailed process evaluations, so that readers can fully appreciate the educational and/or clinical context, as well as the delivery of the simulation-based intervention. Members of our community should reflect further on these and other elements, with a view to improving not only how simulation studies are reported, but also how they are carried out.

References

  1. Ziv A, Wolpe PR, Small SD, et al. Simulation-based medical education: an ethical imperative. Acad Med. 2003;78:783–8.

  2. Scott DJ, Dunnington GL. The new ACS/APDS skills curriculum: moving the learning curve out of the operating room. J Gastrointest Surg. 2008;12:213–21.

  3. Burden AR, Pardo M Jr, et al. ASA Simulation Education Network offering simulation courses for MOCA® part IV: update from the ASA Editorial Board for Simulation-Based Training (formerly the ASA Committee on Simulation Education). ASA Monitor. 2013;77:50–1.

  4. Anderson A, Baxendale B, Scott L, et al. The National Simulation Development Project: Summary report. Association for Simulated Practice in Healthcare (ASPiH); 2014. Available from www.aspih.org.uk.

  5. Cheng A, Kessler D, MacKinnon R, et al. Reporting guidelines for health care simulation research: extensions to the CONSORT and STROBE statements. Simul Healthc. 2016. In press.

  6. Shneider A. Four stages of a scientific discipline; four types of scientist. Trends Biochem Sci. 2009;34:217–23.

  7. Chalmers I, Bracken MB, Djulbegovic B, et al. How to increase value and reduce waste when research priorities are set. Lancet. 2014;383(9912):156–65.

  8. Ioannidis JP, Greenland S, Hlatky MA, et al. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.

  9. Al-Shahi Salman R, Beller E, Kagan J, et al. Increasing value and reducing waste in biomedical research regulation and management. Lancet. 2014;383(9912):176–85.

  10. Chan AW, Song F, Vickers A, et al. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.

  11. Glasziou P, Altman DG, Bossuyt P, et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet. 2014;383(9913):267–76.

  12. Moore GF, Audrey S, Barker M, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ. 2015;350:h1258.

  13. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1:1.

Funding

Sevdalis’ research was supported by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care South London at King’s College Hospital NHS Foundation Trust. Sevdalis is a member of King’s Improvement Science, which is part of the NIHR CLAHRC South London and comprises a specialist team of improvement scientists and senior researchers based at King’s College London. Its work is funded by King’s Health Partners (Guy’s and St Thomas’ NHS Foundation Trust, King’s College Hospital NHS Foundation Trust, King’s College London and South London and Maudsley NHS Foundation Trust), Guy’s and St Thomas’ Charity, the Maudsley Charity and the Health Foundation. The views expressed are those of the authors and not necessarily those of the NHS, the NIHR, or the Department of Health.

Authors’ contributions

NS conceived the idea and drafted the manuscript. DN, SKE and DMG contributed significant intellectual content, commented on and edited the manuscript. All authors read and approved the final manuscript.

Competing interests

Sevdalis is the Director of London Safety & Training Solutions Ltd, which provides team skills training and advice on a consultancy basis in hospitals and training programs in the UK and internationally. The other authors have no competing interests to report of relevance to this paper.

Author information

Corresponding author

Correspondence to Nick Sevdalis.

Rights and permissions

Open Access Advances in Simulation is pleased to co-publish this article jointly with Simulation in Healthcare. The article is published here under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (http://creativecommons.org/licenses/by-nc-nd/4.0/).

About this article

Cite this article

Sevdalis, N., Nestel, D., Kardong-Edgren, S. et al. A joint leap into a future of high-quality simulation research—standardizing the reporting of simulation science. Adv Simul 1, 24 (2016). https://doi.org/10.1186/s41077-016-0026-x
