
Measuring non-technical skills of anesthesiologists in the operating room: A systematic review of assessment tools and their measurement properties

Start Date and Anticipated Completion Date

October 2017-June 2018

Principal Investigator/Guarantor of Review

Sylvain Boet, MD, PhD, Department of Anesthesiology and Pain Medicine

The Ottawa Hospital, General Campus

501 Smyth Rd, Critical Care Wing 1401

Ottawa, Ontario K1H 8L6

Review Team Members and Organizational Affiliations

Sarah Larrigan, BSc (Candidate) – University of Ottawa

Leonardo Calderon, MD Candidate – University of Ottawa

Henry Liu, MD Candidate – University of Ottawa

Katrina J. Sullivan, MSc – The Ottawa Hospital Research Institute

Nicole Etherington, PhD – The Ottawa Hospital Research Institute

Protocol Development and Contributions

This protocol follows the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) guidelines. The conception, design, and initial drafting of this protocol were led by the core group of investigators identified above.

Organizational Affiliation of Review

Ottawa Hospital Research Institute

Funding Sources/Sponsors

Sylvain Boet was supported by The Ottawa Hospital Anesthesia Alternate Funds Association.

Conflicts of Interest

The authors have no conflicts of interest to declare.

Background and Rationale

Clinical competence in anesthesia requires proficiency in non-technical skills (e.g. communication, leadership, situation awareness).1–3 Despite the importance of technical skills for effective operating room (OR) performance, a large proportion of intraoperative errors, adverse patient outcomes, and mortality is due to deficiencies in non-technical skills.4–10

Assessment of professional practice has become more frequent in recent years as a result of emerging competency-based medical education requirements and an increasing emphasis on accountability of professional performance for patient safety.11 Interpersonal skills, communication skills, leadership, collaboration, situation awareness, and professionalism have been identified by both US and Canadian institutions as core competencies for anesthesiologists.13,14 Measuring non-technical performance in the OR is therefore critical to ensuring the provision of safe, high-quality intraoperative care.

Comprehensive assessment of non-technical skills in clinical practice, however, requires that robust assessment tool(s) be identified. It is currently unknown which tools are the most robust for assessing non-technical skills in anesthesia, and the tools and settings used to evaluate anesthesiologists’ non-technical skills have been inconsistent.

Implications

Identifying the most robust tool(s) for assessing anesthesiologists’ non-technical skills will allow the anesthesia community to standardize future research on evaluation of interventions to improve anesthesiologists’ clinical performance. This will facilitate accurate, generalizable, and effective performance-tracking mechanisms as well as evidence-based education.

Objectives

This systematic review will: (1) summarize the tools used to measure the intraoperative non-technical performance of anesthesiologists; and (2) synthesize the psychometric properties of these tools.

Search strategy and information sources

Literature searches will be conducted by an experienced librarian collaborating closely with the team of investigators. Medline and Medline in Process (via OVID), PsycINFO, CINAHL, Embase (via OVID), and ERIC will be searched with no date or language restrictions. The Medline search strategy will be peer-reviewed by a second information specialist using the PRESS tool. Adjustments will be made to the search for each database to optimize search results. Reference lists of previously published systematic reviews and of included articles will also be searched for additional relevant references.

Eligibility criteria

Inclusion Criteria:

- Studies examine the psychometric properties (i.e. validity and reliability) of tools specifically intended to assess the non-technical skills of anesthesiologists (either trainees or graduates).

- Tools are evaluated within a clinical or simulated intraoperative environment. The intraoperative period is defined as the time from when the patient is physically in the operating room or anesthetic room, where the anesthesiologist performs procedures or administers medication to the patient, until the time the patient leaves the OR. This excludes the pre- and post-operative periods (e.g. recovery room) as well as procedural settings outside the OR (e.g. endoscopy suite, interventional radiology).

- Studies include a quantitative analysis of psychometric properties or a qualitative assessment of forms of validity.

- Tools are developed for objective assessment of skills.

Exclusion Criteria:

- Tools that assess anesthesia assistants, nurse anesthetists, and interprofessional teams.

- Tools that include technical skills items.

- Studies where psychometric assessment is not the primary outcome.

- Studies that evaluate anesthesiologists’ performance but not the assessment tool itself.

- Tools that are developed for subjective (i.e. self-reported) assessment of skills.

Study selection

Titles and abstracts will be screened for eligibility in duplicate by two independent reviewers. Full texts of studies retained at the title and abstract level will then be reviewed. Disagreements at each level of screening will be resolved by consensus discussion or, if needed, with the assistance of another reviewer.

Data extraction

Data extraction will be conducted by one reviewer using an electronic data collection form for all included articles. Extracted information will then be verified by a second reviewer. The final list of included tools will be reviewed by a group of anesthesiologists to determine accuracy and completeness. The data extraction form will collect general article information (e.g. year and study location), demographics of learners (e.g. trainee status), tool design (e.g. name and number of items), and psychometric outcomes (e.g. properties assessed and validation values).
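
As a minimal illustration only, the sketch below shows how such an electronic extraction record could be structured in Python; the field names, groupings, and example values are assumptions made for illustration and do not represent the final extraction form.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ExtractionRecord:
    """One row of the extraction form; field names mirror the
    categories named in the protocol but are illustrative only."""
    # General article information
    first_author: str
    publication_year: int
    study_location: str
    # Learner demographics
    trainee_status: str  # e.g. "resident" or "staff anesthesiologist"
    sample_size: Optional[int] = None
    # Tool design
    tool_name: str = ""
    number_of_items: Optional[int] = None
    setting: str = ""  # "simulation" or "clinical"
    # Psychometric outcomes
    properties_assessed: List[str] = field(default_factory=list)  # e.g. ["inter-rater reliability"]
    reported_values: List[str] = field(default_factory=list)      # e.g. ["ICC = 0.80"]

# A hypothetical record, to be verified by the second reviewer
record = ExtractionRecord(
    first_author="Example", publication_year=2012, study_location="Canada",
    trainee_status="resident", tool_name="ExampleTool", number_of_items=15,
    setting="simulation", properties_assessed=["inter-rater reliability"],
    reported_values=["ICC = 0.80"],
)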

Study quality

Screeners will assess the methodological quality of included studies in duplicate using the COSMIN checklist.22 Disagreements will be resolved through consensus or by a third reviewer as required.
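
For illustration, the short Python sketch below encodes the "worst score counts" rule that the COSMIN scoring system uses to derive a per-box methodological quality rating from the 4-point item ratings;22 the item ratings shown are hypothetical.

# "Worst score counts": each COSMIN box is rated as the lowest of its
# item ratings on the excellent/good/fair/poor scale.
RATING_ORDER = ["poor", "fair", "good", "excellent"]  # worst to best

def cosmin_box_rating(item_ratings):
    """Return the per-box methodological quality rating."""
    return min(item_ratings, key=RATING_ORDER.index)

# Hypothetical item ratings for one box (e.g. reliability)
print(cosmin_box_rating(["excellent", "good", "fair", "excellent"]))  # -> "fair"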

Synthesis

A narrative summary of the types of reliability and validity, psychometric coefficients, validation context (i.e. simulation or clinical), and level of psychometric evidence (i.e. minimal, moderate, or extensive) will be completed for each included assessment tool.

For tools with more than one validation study, a meta-analysis of reliability coefficients (i.e. reliability generalization) will be performed. This type of analysis integrates coefficients from different applications of a reliability test and computes an average estimate.23,24 Statistical models will be chosen based on the heterogeneity present and the guidelines proposed by Sánchez-Meca and colleagues.25
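
As a hedged sketch of this analysis, the Python code below pools Cronbach's alpha coefficients under a DerSimonian-Laird random-effects model after a ln(1 - alpha) (Bonett) transformation, one of the approaches discussed in the reliability generalization literature cited above; the coefficients, sample sizes, and item count are hypothetical, and other reliability coefficients (e.g. intraclass correlations) would be pooled analogously after an appropriate transformation.

import numpy as np

def pool_alpha_random_effects(alphas, sample_sizes, n_items):
    """Random-effects pooling of Cronbach's alpha coefficients."""
    alphas = np.asarray(alphas, dtype=float)
    n = np.asarray(sample_sizes, dtype=float)

    # Bonett transformation and approximate sampling variance of each coefficient
    t = np.log(1.0 - alphas)
    var_t = 2.0 * n_items / ((n_items - 1.0) * (n - 2.0))

    # Fixed-effect pooling and Cochran's Q to quantify heterogeneity
    w = 1.0 / var_t
    t_fixed = np.sum(w * t) / np.sum(w)
    q = np.sum(w * (t - t_fixed) ** 2)
    df = len(alphas) - 1

    # DerSimonian-Laird between-study variance (tau^2) and I^2
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)
    i2 = max(0.0, 100.0 * (q - df) / q) if q > 0 else 0.0

    # Random-effects pooling, then back-transform to the alpha scale
    w_re = 1.0 / (var_t + tau2)
    pooled_alpha = 1.0 - np.exp(np.sum(w_re * t) / np.sum(w_re))
    return pooled_alpha, tau2, i2

# Hypothetical coefficients from three validation studies of the same tool
print(pool_alpha_random_effects([0.82, 0.76, 0.88], [40, 55, 32], n_items=15))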

References

1. Larsson J, Holmström IK. How excellent anaesthetists perform in the operating theatre: a qualitative study on non-technical skills. Br J Anaesth. 2013;110(1):115-121. doi:10.1093/bja/aes359.

2. Crossingham G V, Sice PJA, Roberts MJ, Lam WH, Gale TCE. Development of workplace-based assessments of non-technical skills in anaesthesia. Anaesthesia. 2012;67(2):158-164.

3. Fletcher GCL, McGeorge P, Flin RH, Glavin RJ, Maran NJ. The role of non-technical skills in anaesthesia: a review of current literature. Br J Anaesth. 2002;88:418-429.

4. Hu YY, Arriaga AF, Peyre SE, Corso KA, Roth EM, Greenberg CC. Deconstructing intraoperative communication failures. J Surg Res. 2012;177(1):37-42. doi:10.1016/j.jss.2012.04.029.

5. Anderson O, Davis R, Hanna GB, Vincent CA. Surgical adverse events: A systematic review. Am J Surg. 2013;206(2):253-262. doi:10.1016/j.amjsurg.2012.11.009.

6. Mazzocco K, Petitti DB, Fong KT, et al. Surgical team behaviors and patient outcomes. Am J Surg. 2009;197(5):678-685. doi:10.1016/j.amjsurg.2008.03.002.

7. Catchpole K, Mishra A, Handa A, McCulloch P. Teamwork and error in the operating room: analysis of skills and roles. Ann Surg. 2008;247(4):699-706. doi:10.1097/SLA.0b013e3181642ec8.

8. Weller J, Boyd M. Making a Difference Through Improving Teamwork in the Operating Room: A Systematic Review of the Evidence on What Works. Curr Anesthesiol Rep. 2014;4(2):77-83. doi:10.1007/s40140-014-0050-0.

9. Van Beuzekom M, Boer F, Akerboom S, Hudson P. Patient safety: Latent risk factors. Br J Anaesth. 2010;105(1):52-59. doi:10.1093/bja/aeq135.

10. Stein JE. The Swiss cheese model of adverse event occurrence—Closing the holes. Semin Pediatr Surg. 2015;24(6):278-282. doi:10.1053/j.sempedsurg.2015.08.003.

11. Stodel EJ, Wyand A, Crooks S, Moffett S, Chiu M, Hudson CCC. Designing and Implementing a Competency-Based Training Program for Anesthesiology Residents at the University of Ottawa. Anesthesiol Res Pract. 2015;2015. doi:10.1155/2015/713038.

12. Ebert TJ, Fox CA. Competency-based Education in Anesthesiology. Anesthesiology. 2014;120(1):24-31. doi:10.1097/ALN.0000000000000039.

13. Royal College of Physicians and Surgeons of Canada. Anesthesiology Competencies. Published 2017. Accessed August 9, 2017.

14. Accreditation Council for Graduate Medical Education. The Anesthesiology Milestone Project. Published 2015. Accessed August 9, 2017.

15. Sharma B, Orzech N, Boet S, Grantcharov T. Non-technical skills assessment in the post-operative setting. J Am Coll Surg. 2011;213(3 SUPPL. 1):S122.

16. Whittaker G, Abboudi H, Khan MS, Dasgupta P, Ahmed K. Teamwork Assessment Tools in Modern Surgical Practice: A Systematic Review. Surg Res Pract. 2015;2015:494827. doi:10.1155/2015/494827.

17. Larsson J. Monitoring the anaesthetist in the operating theatre – professional competence and patient safety. Anaesthesia. 2017;72:76-83. doi:10.1111/anae.13743.

18. Shea BJ, Hamel C, Wells GA, et al. AMSTAR is a reliable and valid measurement tool to assess the methodological quality of systematic reviews. J Clin Epidemiol. 2009;62(10):1013-1020. doi:10.1016/j.jclinepi.2008.10.009.

19. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Reprint: Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Phys Ther. 2009;89(9):873-880.

20. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80(10 Suppl):S46-S54. doi:10.1097/00001888-200510001-00015.

21. Google Inc. Google Translate. Published 2016. Accessed December 9, 2016.

22. Terwee CB, Mokkink LB, Knol DL, Ostelo RWJG, Bouter LM, De Vet HCW. Rating the methodological quality in systematic reviews of studies on measurement properties: A scoring system for the COSMIN checklist. Qual Life Res. 2012;21(4):651-657. doi:10.1007/s11136-011-9960-1.

23. Vacha-Haase T, Henson RK, Caruso JC. Reliability Generalization: Moving toward Improved Understanding and Use of Score Reliability. Educ Psychol Meas. 2002;62(4):562-569. doi:10.1177/0013164402062004002.

24. Vacha-Haase T. Reliability generalization: exploring variance in measurement error affecting score reliability across studies. Educ Psychol Meas. 1998;58(1):6-20.

25. Sánchez-Meca J, López-López JA, López-Pina JA. Some recommended statistical analytic practices when reliability generalization studies are conducted. Br J Math Stat Psychol. 2013;66(3):402-425. doi:10.1111/j.2044-8317.2012.02057.x.

26. Weller JM, Bloch M, Young S, et al. Evaluation of high fidelity patient simulator in assessment of performance of anaesthetists. Br J Anaesth. 2003;90(1):43-47. doi:10.1093/bja/aeg002.

27. Langerman A, Grantcharov TP. Are We Ready for Our Close-up? Ann Surg. 2017;XX(Xx):1. doi:10.1097/SLA.0000000000002232.
