Developing a pediatric nursing simulation scenario template in South Korea: applying real-time Delphi methods

Article information

Child Health Nurs Res. 2024;30(2):142-153
Publication date (electronic) : 2024 April 30
doi : https://doi.org/10.4094/chnr.2024.012
1Associate Professor, Department of Nursing, Gangneung-Wonju National University, Wonju, Korea
2Assistant Professor, Department of Nursing, Gangneung-Wonju National University, Wonju, Korea
3Lecturer, Department of Nursing, Gangneung-Wonju National University, Wonju, Korea
Corresponding author: Meen Hye Lee Department of Nursing, Gangneung-Wonju National University, 150 Namwon-ro, Heaungup-Myun, Wonju 26403, Korea TEL: +82-33-760-8658 FAX: +82-33-760-8640 E-MAIL: leemh@gwnu.ac.kr
Received 2024 March 19; Revised 2024 April 15; Accepted 2024 April 21.

Abstract

Purpose

This study aimed to describe the process of developing a validated pediatric nursing simulation scenario template using the real-time Delphi method.

Methods

A panel of 13 pediatric nursing experts participated in a real-time Delphi survey conducted over two rounds. Initially, 83 items were included in the questionnaire focusing on the structure and content of the simulation scenario template. Data analysis involved calculating the content validity ratio (CVR) and the coefficient of variation to assess item validity and stability.

Results

Through iterative rounds of the Delphi survey, a consensus was reached among the experts, resulting in the development of a pediatric nursing simulation scenario template comprising 41 items across nine parts. The CVR values ranged from 0.85 to 1.0, indicating a high consensus among experts regarding the inclusion of all items in the template.

Conclusion

This study presents a novel approach for developing a pediatric nursing simulation scenario template using real-time Delphi methods. The real-time Delphi method facilitated the development of a comprehensive and scientifically grounded pediatric nursing simulation scenario template. Our template aligns with the International Nursing Association for Clinical Simulation and Learning standards, and provides valuable guidance for educators in designing effective simulation scenarios, contributing to enhanced learning outcomes and better preparation for pediatric clinical practice. However, consideration of cultural and contextual adaptations is necessary, and further research should explore alternative consensus criteria.

INTRODUCTION

Simulation education enables repeated learning in a safe environment that resembles clinical situations, allowing learners to experience serious and rare clinical situations without putting patients at risk [1]. This educational method is effective in improving clinical performance abilities, clinical reasoning capacities, and nursing practitioners' confidence [2]. Especially in situations where direct patient care in clinical settings is limited, simulation education plays an essential role in enhancing knowledge, skills, and decision-making abilities [3]. In simulation education, students learn repeatedly through various complex scenarios, and the process of giving and receiving feedback through debriefing can enhance their learning outcomes [4]. This process develops the competencies needed in situations similar to real clinical environments and prepares students for actual patient care [3].

Owing to persistently low birth rates in recent years, the number of births in South Korea fell to 249,000 in 2022, a 4.4% decrease compared with 2021 [5]. This rapid decline in the number of births has reduced opportunities for pediatric nursing practice and education. Moreover, the emergence of new infectious diseases such as coronavirus disease 2019 (COVID-19) has increased the risk of infection, further limiting pediatric nursing practice [6]. In this context, simulation education has been established as a method that provides students with the opportunity to experience various clinical cases and effectively acquire the skills and knowledge necessary for patient care [7].

Simulation education programs in the fields of medicine and nursing began in the 1960s and have evolved significantly since then [8]. For example, simulators for teaching cardiopulmonary resuscitation became widespread during this development, bringing revolutionary changes to medical and nursing education [8]. Pediatric nurses, in particular, require a high level of flexibility and rapid response capabilities to address the diverse and complex health conditions and clinical situations that arise during a child's growth and development [9]. Children may experience physical and mental health issues at different developmental stages that demand special attention and a professional approach to pediatric nursing practice [9]. As a key strategy to meet these demands, simulation education supports better preparation for the complex challenges encountered in pediatric nursing [7].

The components of simulation education generally include educational goals, reproducibility, complexity, cues, and debriefing [10]. Clear educational objectives are necessary to maximize learning outcomes. Reproducibility and complexity concern how realistically the simulation represents clinical situations, and scenarios should be planned to progress from simple to complex [4]. Cues are information provided by instructors to help students effectively approach simulation scenarios, and debriefing is the process in which students think critically about and discuss complex situations with one another, connecting theory, practice, and research to produce positive experiences and learning effects. Debriefing yields the most significant learning effects, including gains in nursing knowledge and technical skills, learner satisfaction, and improvements in critical thinking and confidence [11].

A well-designed template is crucial for the effective implementation of the core components of simulation education. In its standard guidelines, the International Nursing Association for Clinical Simulation and Learning (INACSL) emphasizes the importance of evidence-based design templates that educators can use to design simulation scenarios effectively. Such templates help standardize the simulation process and ensure consistent quality and evidence-based approaches when educators compose and implement scenarios [12].

However, owing to the current absence of universal, effective tools for scientifically validating and evaluating scenarios [13], it is challenging to identify the essential components of a scenario and determine an ideal template structure. The standards and guidelines provided by the INACSL for clinical simulation and learning offer a crucial framework for developing and evaluating pediatric nursing simulation scenarios. However, they do not completely resolve the differences in components and expressions across various scenarios [14]. In addition, although some studies have emphasized the importance of pediatric simulation-based education [9], there has been no research on the development of standardized simulation scenario templates for pediatric nursing.

Therefore, efforts are needed to ensure the consistency and standardization of scenario templates for effective learning and assessment. This includes setting objectives that consider learners' levels, verifying the authenticity of patient information, and specifying effective debriefing and feedback processes. Additionally, according to previous studies, although simulation education is spreading globally, many scenarios published in various countries do not fully comply with the INACSL's standard guidelines. In particular, when template elements are analyzed according to the INACSL standards, the order and expression of components and sub-elements differ across scenarios, indicating limitations in ensuring consistency [15].

To address these issues, the real-time Delphi method can play a crucial role in integrating the opinions of various experts to determine the key components and structure of the template. This approach contributes to determining a balance between educational value and practical feasibility by deriving a broad consensus among experts [16]. This process will allow the development of scientifically validated, realistic, and educationally effective pediatric nursing simulation templates by reconciling differences in opinions during the scenario development process.

The real-time Delphi method enables the collection and analysis of panel members’ opinions through rounds of real-time feedback based on the initial responses of the expert panel [17]. This process ultimately leads to more precise decision-making, playing a pivotal role in the development of pediatric nursing simulation templates. Consequently, this study aimed to use the real-time Delphi method to assemble a panel of experts in pediatric nursing to develop a simulation template. Through this process, it is anticipated that a fundamental structure for pediatric nursing simulations can be established by applying an evidence-based process as the basis for scenario development.

This study describes the process of developing a pediatric nursing simulation template using a real-time Delphi technique.

METHODS

Ethics statement: This study was approved by the Institutional Review Board (IRB) of the Gangneung-Wonju National University (No. GWNUIRB-2023-11). Informed consent was obtained from all participants.

1. Study Design

This study employed a real-time Delphi survey to develop a pediatric nursing simulation template, which involved gathering expert panel opinions and reaching a consensus among them. Reporting of this study was based on the Conducting and Reporting Delphi Studies (CREDES) guidelines [18].

2. Development of Questionnaire

To construct a pediatric nursing simulation scenario template, a content analysis of scenarios developed for current pediatric nursing simulations was conducted [15]. For this directed qualitative content analysis, 32 pediatric nursing simulation scenarios were collected following the PRISMA 2020 method and analyzed. Additionally, content analysis based on the INACSL protocol was performed to analyze the contents of the scenarios. The results led us to establish preliminary principles for developing essential components and contents guided by the INACSL Standards of Best Practice: Simulation. The “INACSL Standards of Best Practice: Simulation” is publicly available and provides specific standards for simulation educators [19]. Our template was structured into nine parts consisting of 59 items. The questionnaire comprised 83 items, including 59 for the template structure, 19 for the scenario theme, and five for the target developmental stage (Table 1).

Contents of the Preliminary Survey Questionnaire for the Pediatric Nursing Simulation Scenario Template

3. Selection of Experts

The expert panel for this study comprised 13 experts with sufficient background knowledge and experience in fields related to the pediatric research topic. The selection criteria for the experts participating in the study were as follows: individuals with 10 or more years of experience in nursing education and practice, individuals with five or more years of experience in conducting simulation training, or researchers in the field of simulation. The researcher invited experts after assessing their suitability as research participants.

4. The Real-Time Delphi Survey Process

A real-time Delphi survey was conducted in two phases. The first survey was conducted from September 22 to 28, 2023, and the second from October 30 to November 10, 2023. A dedicated website (https://k-realtimedelphi.net/) was developed to apply the real-time Delphi method. This website automated tasks that are traditionally performed manually in the Delphi method, such as collecting expert panel responses, coding surveys, conducting statistical analysis, and providing feedback. This enabled the effective participation of both the researchers and the expert panel.

1) Round 1 real-time Delphi survey

The first survey, which used the real-time Delphi method, was structured around the nine parts of the initial template. The experts were asked to review the appropriateness of including each part and to evaluate the items and content of those parts. An example survey question was: “Do you think it is appropriate to include ‘Title’ in part 1. Initial Elements?” The survey responses were structured as Keep, Delete, Modify, or Additional Comments. If the experts felt the contents were appropriate for a part, they selected Keep; if inappropriate, Delete; and if appropriate but needing modifications, they selected Modify or left Additional Comments.

2) Round 2 real-time Delphi survey

The template was revised and enhanced following the results of the first real-time Delphi survey, considering experts’ opinions, the content validity ratio (CVR), and the coefficient of variation (CV). While the structure of the template was maintained, some content within each part was added or modified based on the panelists’ comments or the CVR values. Unlike the first survey, the second-round questions focused on whether each part and its content within the template could be used appropriately. Responses were rated on a 4-point scale ranging from “Strongly Agree” to “Strongly Disagree,” and an optional “Other” section was provided for additional comments.

5. Data Analysis

The data collected through the two rounds of Delphi surveys were analyzed by calculating the CVR and CV for each item using Microsoft Excel. The interpretation of the CVR values followed the criteria set by Lawshe [20]. Figure 1 displays the flowchart of the decision-making process based on the analyzed results.

Figure 1.

Flowchart of real time Delphi survey. CVR, content validity ratio; INACSL, International Nursing Association for Clinical Simulation and Learning.

The minimum CVR required to consider an item valid depends on the number of panel members. For the first Delphi survey with 13 panel members, a CVR value of 0.54 or higher was required to consider the content valid. Therefore, items with a CVR value of less than 0.54 in the first survey were considered to have low content validity and were subject to deletion or modification. A CV of less than 0.5 indicates high consistency in expert responses; thus, such items were considered stable and reliable [20].
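For illustration, the following minimal Python sketch (not the authors' analysis, which was performed in Microsoft Excel) shows how the CVR and CV might be computed for a single item. It assumes, for the sake of example, a 4-point agreement scale on which ratings of 3 or 4 count as endorsement; in round 1, the count of panelists endorsing an item's inclusion would play the role of n_e. The function names and the example ratings are hypothetical.

```python
# Minimal sketch of per-item CVR and CV computation (hypothetical example data).
import numpy as np

def content_validity_ratio(ratings, essential_threshold=3):
    """Lawshe's CVR = (n_e - N/2) / (N/2), where n_e is the number of
    panelists endorsing the item and N is the panel size."""
    ratings = np.asarray(ratings)
    n = ratings.size
    n_e = np.sum(ratings >= essential_threshold)
    return (n_e - n / 2) / (n / 2)

def coefficient_of_variation(ratings):
    """CV = standard deviation / mean; values below 0.5 were interpreted
    as indicating stable, consistent expert responses."""
    ratings = np.asarray(ratings, dtype=float)
    return ratings.std() / ratings.mean()

# Hypothetical ratings from the 13 panelists for one item on the 4-point scale.
item_ratings = [4, 4, 4, 3, 4, 4, 3, 4, 4, 4, 4, 3, 4]
print(f"CVR = {content_validity_ratio(item_ratings):.2f}")  # 1.00: all 13 endorse
print(f"CV  = {coefficient_of_variation(item_ratings):.2f}")  # about 0.11
```

Under the criteria described above, such an item would be retained, since its CVR is at or above the 0.54 threshold for a 13-member panel and its CV is below 0.5.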

RESULTS

1. General Characteristics of the Participants

The general characteristics of the experts who participated in the real-time Delphi survey are presented in Table 2. The average age was 41.38±10.65 years, with a range of 32 to 63 years. Twelve of the 13 participants (92.3%) were female. The total career experience as a nurse was 79.23±65.60 months, with a range of 25 to 254 months, and the career length as a teaching professional in nursing was 159.85±97.24 months, ranging from 19 to 326 months. The most common nursing specialty among the experts was pediatrics, with 10 (66.7%) responses. When asked about their experience with simulation-related research or projects, 11 participants (84.6%) reported being experienced. All experts had experience in developing simulation scenarios and operating simulation education.

General Characteristics of Experts Participating in the Survey (N=13)

2. Assessment of Appropriate Themes and Skill Levels for Pediatric Nursing Simulation Scenarios

The results on the appropriate age for developing pediatric nursing simulation scenarios showed that “Preschooler” was considered the most valid, with a CVR of 1 (Table 3). Regarding the appropriate skill level for each academic year, for freshmen, “Level 1 Novice” had a CVR of 0.67 and a CV of 0.11. For sophomores, “Level 1 Novice” and “Level 2 Advanced Beginner” had CVRs of 0.56 and 0.67 and CVs of 0.28 and 0.33, respectively. For juniors, “Level 3 Competent” had a CVR of 0.67 and a CV of 0.41. For seniors, “Level 3 Competent” and “Level 4 Proficient” each showed a CVR of 0.78 and a CV of 0.37.

Assessment of Appropriate Themes and Skill Levels for Pediatric Nursing Simulation Scenarios

3. Real-time Delphi First and Second Round

The final number of respondents in the first round of the real-time Delphi survey was 13. Initially, the CVR and CV values for each item were calculated. The CVR values ranged from 0.31 to 1, with “Development of scenario: Is it appropriate to include the ‘Theoretical framework’ item?” having the lowest score of 0.31, and “Part 2. Objectives and Expected Outcomes: Do you think it is appropriate to include the following options in the ‘Evaluation’ section?” being the second lowest at 0.46 (Table 4).

Results of the Real-time Delphi for the First Round and Second Round of the Development of Pediatric Nursing Simulation Scenario Template

In the development of pediatric nursing simulation scenarios, the CV values were below 0.5 for all themes, indicating a high level of stability in these assessments (Table 4). However, Kendall’s W value was 0.00072 (χ2=43.65, p=.319), indicating an overall low consensus among the experts in round 1 (Table 4). This may suggest some variability or a lack of consistency among the experts’ opinions. Based on the results of the first-round Delphi process, we removed or modified items with a CVR of 0.54 or below. The remaining items were consolidated, reclassified, or newly generated, yielding 41 items. For all 41 items, the CVR ranged from a minimum of 0.85 to a maximum of 1.0, and the item means and standard deviations, including those of the newly generated items, ranged from 3.31±1.11 to 4.00±0.00, indicating a high level of agreement among the experts. Additionally, Kendall’s W value was 0.34986 (χ2=181.93, p<.001), confirming consistency among the experts’ opinions. Consequently, the Delphi rounds for the template were concluded based on this consensus (Table 4).
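For readers who wish to reproduce this type of consistency check, the sketch below (not the authors' code) computes Kendall's coefficient of concordance W from a raters-by-items rating matrix, together with the associated chi-square statistic χ² = m(n−1)W on df = n−1; with m = 13 experts and n = 41 items, the reported W of 0.34986 corresponds to χ² = 13 × 40 × 0.34986 ≈ 181.93. The rating matrix in the example is randomly generated and purely illustrative.

```python
# Minimal sketch of Kendall's W and its chi-square test (illustrative data only).
import numpy as np
from scipy.stats import rankdata, chi2

def kendalls_w(ratings):
    """ratings: array of shape (m_raters, n_items); returns (W, chi2, p).
    Uses the basic formula without a tie correction, which can understate W
    when many ties occur, as with discrete 4-point ratings."""
    ratings = np.asarray(ratings, dtype=float)
    m, n = ratings.shape
    ranks = np.apply_along_axis(rankdata, 1, ratings)  # rank items within each rater
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    w = 12 * s / (m ** 2 * (n ** 3 - n))
    chi2_stat = m * (n - 1) * w
    p_value = chi2.sf(chi2_stat, df=n - 1)
    return w, chi2_stat, p_value

# Hypothetical example: 13 raters scoring 41 items on a 4-point scale.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 5, size=(13, 41))
w, stat, p = kendalls_w(ratings)
print(f"W = {w:.3f}, chi2 = {stat:.2f}, p = {p:.3f}")
```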

DISCUSSION

This study developed a pediatric simulation template reflecting the views and opinions of 13 pediatric nursing experts using the real-time Delphi technique. This process yielded a final template comprising 41 items across nine parts, deemed suitable for application as a pediatric simulation template (Supplement 1). All 41 items demonstrated high consensus, with CVRs ranging from 0.85 to 1, indicating that their inclusion was considered essential. The CV values for all items were also low, ranging from 0.07 to 0.32, indicating good agreement among the predictions or evaluations expressed by the experts [21].

In addition to evaluating the CV and CVR values, which informed decisions on whether to retain or delete items, qualitative changes were made to the items based on expert feedback, including modifications to wording and format. For example, some categories were modified in part 1 (initial elements). Reflecting a first-round comment on this categorization, we combined the two groups into one group (nurses) for simplicity and conciseness. Specific options were added to certain items in the environment and setting section of part 3; we added the options “pediatric and adolescent ward/pediatric emergency department/NICU/NR/PICU/Others” for the environment. Specific options were also provided for “Equipment required” and “Embedded participants and roles” in part 3. Prompted by the round 1 Delphi survey, these specifications provide more concrete information in the template, guiding educators in preparing the required equipment and setting up the designated environment. Only one item (objectives and expected outcomes: following options for evaluation) was deleted from the full set of 42 items based on the first survey, as it presented a low CVR (0.46) and a relatively high CV (0.27), suggesting deletion. The low CVR may be explained by the presence of a very similar item (objectives and expected outcomes: evaluation), which may have made it appear to be a repeated question.

In addition to the noted changes, two new items were proposed through the first-round survey: self-reflection and discussion in part 7 (Debriefing Planning). This addition is in line with the INACSL Standard, which emphasizes integrating the debriefing process into simulation-based experiences to enhance learning and increase participants’ self-awareness and self-efficacy [12]. Many studies have reported that high self-reflection predicts higher nursing competence and mediates the relationship between students’ anxiety and nursing competence [22,23]. A previous study also reported that students had a more meaningful learning experience when they engaged in self-reflective practices by examining and understanding their emotions, attitudes toward self and others, and reactions to everyday professional situations [24]. Although our study focused mainly on template development and did not include sample questions, future studies may consider adding a guiding comment to facilitate students’ self-reflection in debriefing plans based on the developed scenario content.

The scenario themes were categorized into acute stage care, symptom management, and skill training. Acute stage care and skill training themes generally had high CVR values ranging from 0.89 to 1, indicating that their inclusion was strongly supported. However, symptom management themes showed more varied CVR values, including fever and gastroenteritis (1), fracture (0.78), acute pyelonephritis (0.67), and sickle cell anemia (SCA; 0.44), suggesting the potential for selective inclusion. These findings offer valuable guidance for prioritizing scenario themes in curriculum development. Although we decided to retain SCA in the template, the results imply that SCA remains a subject of debate and can thus be omitted at the educator’s discretion.

The results on appropriate themes indicate which content should be prioritized, offering helpful information when selecting themes for simulation curriculum development. To assess learners’ abilities, academic grades were matched to specific skill levels, and the results indicated that the appropriate skill level increased with academic grade. Because it is difficult to determine the appropriate level of goals and expected outcomes, this type of information could assist educators in establishing suitable objectives and offering assessment tools aligned with learners’ abilities.

The real-time Delphi technique enabled experts to refer to other panelists’ responses and comments while completing their own surveys. In the real-time Delphi procedure, participants were given access to an online survey platform for a defined duration. Upon entering the platform, expert panelists could view their own responses alongside continuously updated, anonymized responses from fellow panelists. The primary novelty of the real-time Delphi investigation lies in the concurrent computation and provision of feedback. Unlike the traditional approach, participants in a real-time Delphi are not restricted to evaluating at fixed intervals (such as rounds) but can alter their viewpoints as frequently as desired within a designated timeframe [17,25].

The Delphi technique has not been widely used for simulation template development, and only a few studies have used it for survey tool development, such as violence and self-reporting triage survey tools [26,27]. In this respect, our study is important because it is the first to apply the real-time Delphi method to simulation template development, providing significant findings through the Delphi process. A noteworthy strength of this study is its application of the INACSL standards and of previous comprehensive findings on pediatric simulation scenarios. For instance, we included the role of the facilitator in our template because INACSL identifies the facilitator’s role as a necessary element in achieving the desired goals throughout scenario-based learning and recommends that a program to train facilitators be provided [15]. Since previous findings have reported that only half of the examined scenarios addressed facilitation, integrating facilitator guidelines into this template serves as a reminder of their significance and underscores the necessity of facilitator training programs. Furthermore, a prompt was displayed in the rightmost section of the template, designed to encourage educators to review each section [15]. INACSL emphasizes the importance of an effective learning experience and the achievement of educational objectives. As one way to achieve this goal, INACSL recommends applying SMART (specific, measurable, attainable, relevant, timely) standards when establishing a simulation template. We included the SMART section in the right column to assist educators in formulating scenario objectives. This addition reminds educators to use action verbs that enable the performance of skills and competencies in scenario-based learning processes [15]. Moreover, we added criteria for expected outcomes and evaluation, such as the psychomotor, cognitive, and affective domains, listed in line with the INACSL guidelines. This reminder serves as a crucial element in ensuring the achievement of the desired learning objectives when completing the template [26,27].

The limitations of this study are as follows. First, we did not account for the applicability of the template to various cultural, educational, or local healthcare system contexts because our primary aim was to develop a universal template without considering specific contexts. For example, this study did not consider a virtual reality education method, potentially rendering it unsuitable for such applications. Future template users should customize their content to align with their local contexts and specific requirements. Second, although this study reached predetermined consensus levels, this is not always recommended because psychometric criteria are typically utilized in positivist methodologies [25]. In fact, some studies opt to select an appropriate level within the data rather than relying solely on preset levels. While achieving consensus may not entirely eliminate the panel bias associated with self-interest issues, investigators should be mindful of this concern when determining consensus levels [27]. Our high CVR results for most items and the consensus reached in the two rounds suggest that controversial issues did not arise during the survey process. However, future studies should establish their own criteria to enhance the credibility of the analyzed data when employing the Delphi method for survey development.

CONCLUSION

This study used the real-time Delphi technique to create a pediatric simulation scenario template aimed at enhancing the quality of pediatric nursing simulation education. As discussed, this study highlights the need for a standardized pediatric scenario template to ensure consistency and effectiveness in learning and assessment. Our study presents significant findings, as it is the first to employ the real-time Delphi technique in crafting a pediatric simulation template following the INACSL guidelines and drawing on comprehensive prior research. Broader dissemination of this template should allow an enhanced version to be developed in the future, incorporating updates suggested by other pediatric nursing education experts.

Notes

Authors' contribution

Conceptualization: all authors; Data collection, Formal analysis: all authors; Writing-original draft: all authors; Writing-review and editing: all authors; Final approval of published version: all authors.

Conflict of interest

No existing or potential conflict of interest relevant to this article was reported.

Funding

This study was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (No. 2021R1A2C1095530).

Data availability

Please contact the corresponding author for data availability.

Acknowledgements

None.

Supplementary material

Supplement 1.

Pediatric Simulation Template

chnr-2024-012-Supplement-1.pdf

References

1. Fletcher JL. AANA Journal course: update for nurse anesthetists--ERR WATCH: anesthesia crisis resource management from the nurse anesthetist's perspective. AANA Journal 1998;66(6):595–602.
2. Shin S, Park JH, Kim JH. Effectiveness of patient simulation in nursing education: meta-analysis. Nurse Education Today 2015;35(1):176–182. https://doi.org/10.1016/j.nedt.2014.09.009.
3. Gaba DM. A brief history of mannequin-based simulation & application. In: Dunn WF, Editor. Simulators in critical care education and beyond. Society of Critical Care Medicine; 2004. p. 7-14.
4. Jeffries PR, Rodgers B, Adamson K. NLN Jeffries simulation theory: brief narrative description. Nursing Education Perspectives 2015;36(5):292–293. https://doi.org/10.5480/1536-5026-36.5.292.
5. Korean Statistical Information Service. Birth rate statistics [Internet]. 2023 [cited 2024 April 14]. Available from: https://kostat.go.kr/board.es?mid=a10301010000&bid=204&list_no=426806&act=view&mainXml=Y.
6. Lee H, Jeon H. Experience in child health nursing practice using virtual simulation in the COVID-19 pandemic. Journal of Korea Society for Simulation in Nursing 2022;10(1):73–87. https://doi.org/10.17333/JKSSN.2022.10.1.73.
7. Díaz-Guio DA, Ríos-Barrientos E, Santillán-Roldan PA, Mora-Martinez S, Díaz-Gómez AS, Martínez-Elizondo JA, et al. Online-synchronized clinical simulation: an efficient teaching-learning option for the COVID-19 pandemic time and beyond. Advances in Simulation 2021;6(1):30. https://doi.org/10.1186/s41077-021-00183-z.
8. Cooper JB, Taqueti VR. A brief history of the development of mannequin simulators for clinical education and training. Postgraduate Medical Journal 2008;84(997):563–570. https://doi.org/10.1136/qshc.2004.009886.
9. Kim E, Song S, Kim S. Development of pediatric simulation-based education - a systematic review. BMC Nursing 2023;22(1):291. https://doi.org/10.1186/s12912-023-01458-8.
10. Kneebone R. Simulation in surgical training: educational issues and practical implications. Medical Education 2003;37(3):267–277. https://doi.org/10.1046/j.1365-2923.2003.01440.x.
11. Jeffries PR. Simulation in nursing education: from conceptualization to evaluation. 2nd ed. New York, NY: National League for Nursing; 2012. p. 25–42.
12. INACSL Standards Committee. INACSL standards of best practice: SimulationSM simulation design. Clinical Simulation in Nursing 2016;12(Suppl):S5–S12. https://doi.org/10.1016/j.ecns.2016.09.005.
13. Shim K, Shin H. The reliability and validity of the lasater clinical judgement rubric in Korean nursing students. Child Health Nursing Research 2015;21(2):160–167. https://doi.org/10.4094/chnr.2015.21.2.160.
14. Mirza N, Cinel J, Noyes H, McKenzie W, Burgess K, Blackstock S, et al. Simulated patient scenario development: a methodological review of validity and reliability reporting. Nurse Education Today 2020;85:104222. https://doi.org/10.1016/j.nedt.2019.104222.
15. Kim EJ, Cho KM, Song SS. Child nursing simulation scenario content analysis: a directed qualitative content analysis. Clinical Simulation in Nursing 2024;87:101488. https://doi.org/10.1016/j.ecns.2023.101488.
16. Gordon TJ. The real-time Delphi method. Futures research methodology — version 3.0 [Internet]. 2009 [cited 2024 January 31]. Available from: https://millennium-project.org/wp-content/uploads/2022/01/05-Real-Time-Delphi.pdf.
17. Aengenheyster S, Cuhls K, Gerhold L, Heiskanen-Schüttler M, Huck J, Muszynska M. Real-time Delphi in practice — a comparative analysis of existing software-based tools. Technological Forecasting and Social Change 2017;118:15–27. https://doi.org/10.1016/j.techfore.2017.01.023.
18. Jünger S, Payne SA, Brine J, Radbruch L, Brearley SG. Guidance on Conducting and REporting DElphi Studies (CREDES) in palliative care: recommendations based on a methodological systematic review. Palliative Medicine 2017;31(8):684–706. https://doi.org/10.1177/0269216317690685.
19. International Nursing Association for Clinical Simulation and Learning (INACSL). Healthcare Simulation Standards of Best PracticeTM [Internet]. 2023 [cited 2024 April 15]. Available from: https://www.inacsl.org/healthcare-simulation-standards-ql.
20. Lawshe CH. A quantitative approach to content validity. Personnel Psychology 1975;28(4):563–575. https://doi.org/10.1111/j.1744-6570.1975.tb01393.x.
21. Ju QY, Huang LH, Zhao XH, Xing MY, Shao LW, Zhang MY, et al. Development of evidence-based nursing-sensitive quality indicators for emergency nursing: a Delphi study. Journal of Clinical Nursing 2018;27(15-16):3008–3019. https://doi.org/10.1111/jocn.14256.
22. Ambrose LJ, Ker JS. Levels of reflective thinking and patient safety: an investigation of the mechanisms that impact on student learning in a single cohort over a 5 year curriculum. Advances in Health Sciences Education 2014;19(3):297–310. https://doi.org/10.1007/s10459-013-9470-8.
23. Pai HC, Ko HL, Eng CJ, Yen WJ. The mediating effect of self-reflection and learning effectiveness on clinical nursing performance in nursing students: a follow-up study. Journal of Professional Nursing 2017;33(4):287–292. https://doi.org/10.1016/j.profnurs.2017.01.003.
24. Colomer J, Pallisera M, Fullana J, Burriel MP, Fernández R. Reflective learning in higher education: a comparative analysis. Procedia - Social and Behavioral Sciences 2013;93:364–370. https://doi.org/10.1016/j.sbspro.2013.09.204.
25. Varndell W, Fry M, Elliott D. Applying real-time Delphi methods: development of a pain management survey in emergency nursing. BMC Nursing 2021;20(1):149. https://doi.org/10.1186/s12912-021-00661-9.
26. Wilkes L, Mohan S, Luck L, Jackson D. Development of a violence tool in the emergency hospital setting. Nurse Researcher 2010;17(4):70–82. https://doi.org/10.7748/nr2010.07.17.4.70.c7926.
27. Fry M, Burr G. Using the Delphi technique to design a self-reporting triage survey tool. Accident and Emergency Nursing 2001;9(4):235–241. https://doi.org/10.1054/aaen.2001.0245.


Table 1.

Contents of the Preliminary Survey Questionnaire for the Pediatric Nursing Simulation Scenario Template

Section Contents
Part 1. Initial elements (13 items) 1. Title, 2. Target learners, 3. Target skill acquiring, 4. Approximate Timing, 5. Prerequisite competencies, 6. Brief description of case, 7. Instructors, 8. Facilitators, 9–13. Following options
Part 2. Objectives and expected outcomes (6 items) 1. Scenario Objectives, 2. Expected outcomes, 3. Evaluation, 4. Specific, measurable, attainable, relevant, timely test, 5–6. Following options
Part 3. Preparation (7 items) 1. Environment and Setting, 2. Fidelity & Patient, 3. Equipment required, 4. Embedded participants and Roles, 5–7. Following options
Part 4. Pre-briefing plan (3 items) 1–2. Briefing lists, 3. Psychologically safe learning environment
Part 5. Case information (8 items) 1. Name, Age, Sex, Weight, Height, Head Circumference, 2. Chief complaints & Concerns, 3. Present Illness & medications, 4. Past health history & medications, 5. Allergies, 6. Family history, 7–8. Attachment
Part 6. Scenario progression (9 items) 1. Initial Stage, 2. Secondary Stage, 3. Third Stage, 4. Final Stage, 5. Time, 6. Patient status, 7. Learner Actions, 8. Modifiers & Trigger to Move to Next state, 9. Facilitator Notes
Part 7. Debriefing planning (6 items) 1. Types, 2. Group size, 3. Gather stage, 4. Analyze stage, 5. Summarize stage, 5. Self-reflection, 6. Discussion
Part 8. Evaluation tools (4 items) 1. Reaction, 2. Learning Outcomes (Pre-test/post-test), 3. Competency checklist, 4. Results (Debriefing)
Part 9. Development of scenario (4 items) 1. Content validity, 2. Reliability, 3. Evidence & reference, 4. Theoretical framework
Theme of scenario (19 items) Acute stage care (7 items), Symptom management (5 items), Skill training (7 items)
Target developmental stage (5 items) Infant, Toddler, Preschooler, Schooler, Adolescent
Appropriate skill level by academic grade (20 items) Skill level by Freshman, Sophomore, Junior, Senior

Table 2.

General Characteristics of Experts Participating in the Survey (N=13)

Variables n (%) M±SD Min Max
Age (year) 41.38±10.65 32 63
Sex Male 1 (7.7)
Female 12 (92.3)
Total career (nurse) experience (month) 79.23±65.60 25 254
Total career (teaching in nursing) experience (month) 159.85±97.24 19 326
Field of specialization (duplicated responses) Elderly 1 (6.7)
Pediatrics 10 (66.7)
Emergency 2 (13.3)
ICU (pediatrics) 2 (13.3)
Experience in simulation-related research or projects, presentations, and publications Yes 11 (84.6)
No 2 (15.4)
Experience in simulation scenario development Yes 13 (100)
No 0 (0)
Experience in operating simulation education Yes 13 (100)
No 0 (0)

ICU, intensive care unit; M, mean; SD, standard deviation.

Table 3.

Assessment of Appropriate Themes and Skill Levels for Pediatric Nursing Simulation Scenarios

Categories Theme CVR CV
Acute stage care Cardiopulmonary resuscitation 1 0.08
Anaphylaxis 0.89 0.20
Asthma 1 0.08
Seizure 0.89 0.19
Sepsis 0.89 0.20
Prematurity 1 0
Hypoglycemia of newborn 1 0.14
Symptom management Fever 1 0
Fracture 0.78 0.30
Gastroenteritis 1 0.08
Sickle cell anemia 0.44 0.37
Acute pyelonephritis 0.67 0.28
Skill training Intravenous insertion 0.89 0.19
Levin tube feeding 1 0.14
Transfusion 0.89 0.20
Postoperation care 0.78 0.32
Developmental assess & communication 1 0.11
Helping babies breathe training 0.67 0.34
Well child check up 1 0.14
Target developmental Stage Infant 0.89 0.11
Toddler 0.78 0.24
Preschooler 1 0.14
Schooler 0.67 0.35
Adolescent 0.67 0.41
Appropriate skill level by academic grade
 Freshman Level 1 Novice 0.67 0.11
Level 2 Advanced beginner 0.22 0.27
Level 3 Competent 0.11 0.40
Level 4 Proficient 0 0.19
Level 5 Expert 0 0.10
 Sophomore Level 1 Novice 0.56 0.28
Level 2 Advanced beginner 0.67 0.33
Level 3 Competent 0.11 0.40
Level 4 Proficient 0.11 0.28
Level 5 Expert 0 0.26
 Junior Level 1 Novice 0.22 0.45
Level 2 Advanced beginner 0.44 0.54
Level 3 Competent 0.67 0.41
Level 4 Proficient 0.33 0.44
Level 5 Expert 0.11 0.51
 Senior Level 1 Novice 0.33 0.30
Level 2 Advanced beginner 0.33 0.35
Level 3 Competent 0.78 0.37
Level 4 Proficient 0.78 0.37
Level 5 Expert 0.33 0.30

CV, coefficient of variation; CVR, content validity ratio.

Table 4.

Results of the Real-time Delphi for the First Round and Second Round of the Development of Pediatric Nursing Simulation Scenario Template

Order in round 1 Contents CVR CV Decision Order in round 2 CVR CV M±SD
1 Part 1. Initial Elements: Title 1 0 Retained 1 1 0.09 3.85±0.38
2 Part 1. Initial Elements: Target learners 0.85 0.09 Revised 2 1 0.11 3.77±0.44
3 Part 1. Initial Elements: Following options for Target learners 0.54 0.30
4 Part 1. Initial Elements: Target skill acquiring 0.62 0.22 Revised 3 1 0.09 3.85±0.38
5 Part 1. Initial Elements: Target skill acquiring 0.62 0.22
6 Part 1. Initial Elements: Approximate Timing 0.92 0.14 Revised 4 1 0.09 3.85±0.38
7 Part 1. Initial Elements: Following options for Approximate Timing 0.92 0.14
8 Part 1. Initial Elements: Prerequisite competencies 0.92 0.14 Revised 5 1 0.09 3.85±0.38
9 Part 1. Initial Elements: Following options for Prerequisite competencies 0.77 0.24
10 Part 1. Initial Elements: Brief description of case 1 0 Revised 6 1 0.13 3.69±0.48
11 Part 1. Initial Elements: Following options for Brief description of case 0.54 0.30
12 Part 1. Initial Elements: Instructors 0.77 0.20 Revised 7 1 0.07 3.92±0.28
13 Part 1. Initial Elements: Facilitators 0.62 0.25
14 Part 2. Objectives and Expected Outcomes: Scenario Objectives 0.85 0.20 Revised 8 1 0.07 3.92±0.28
15 Part 2. Objectives and Expected Outcomes: Following options for Scenario Objectives 0.69 0.13
16 Part 2. Objectives and Expected Outcomes: Expected outcomes 0.92 0.14 Revised 9 0.92 0.17 3.62±0.65
17 Part 2. Objectives and Expected Outcomes: Following options for Expected outcomes 0.62 0.25
18 Part 2. Objectives and Expected Outcomes: Evaluation 0.77 0.20 Revised 10 0.92 0.15 3.77±0.60
19 Part 2. Objectives and Expected Outcomes: Following options for Evaluation 0.46 0.27 Deleted
20 Part 3. Preparation: Environment and Setting 0.92 0.14 Revised 11 1 0.09 3.85±0.38
21 Part 3. Preparation: Following options for Environment and Setting 0.62 0.28
22 Part 3. Preparation: Fidelity & Patient 1 0 Revised 12 1 0 4.00±0.00
23 Part 3. Preparation: Following options for Fidelity & Patient 0.77 0.24
24 Part 3. Preparation: Equipment required 0.92 0.14 Revised 13 0.92 0.15 3.77±0.60
25 Part 3. Preparation: Following options for Equipment required 0.77 0.20
26 Part 3. Preparation: Embedded participants and Roles 0.85 0.20 Retained 14 1 0.09 3.85±0.38
27 Part 4. Pre-briefing Plan: Briefing Lists 0.85 0.15 Revised 15 1 0.07 3.92±0.28
28 Part 4. Pre-briefing Plan: Following options for Briefing Lists 0.77 0.20
29 Part 4. Pre-briefing Plan: Psychologically Safe Learning Environment 0.85 0.20 Retained 16 1 0.13 3.69±0.48
30 Part 5. Case Information: Name, Age, Gender, Weight 0.77 0.24 Retained 17 0.92 0.16 3.69±0.63
31 Part 5. Case Information: Chief complaints 0.92 0.14 Retained 18 1 0 4.00±0.00
32 Part 5. Case Information: Past history & medical, medications 0.77 0.24 Retained 19 1 0.09 3.85±0.38
33 Part 5. Case Information: Allergies 1 0 Retained 20 1 0.11 3.77±0.44
34 Part 5. Case Information: Family history 1 0 Retained 21 1 0.07 3.92±0.28
35 Part 5. Case Information: Physical exam 0.85 0.20 Revised 22 1 0.09 3.85±0.38
36 Part 5. Case Information: Attachment 0.85 0.20
37 Part 5. Case Information: Following options for Attachment 0.77 0.24 Retained 23 1 0.09 3.85±0.38
38 Part 6. Scenario Progression: Categories 0.85 0.20 Revised 24 1 0.07 3.92±0.28
39 Part 6. Scenario Progression: Initial State 1 0
40 Part 6. Scenario Progression: Following options for Initial State 0.77 0.24
41 Part 6. Scenario Progression: Secondary State 0.92 0.14 Revised 25 0.92 0.21 3.77±0.83
42 Part 6. Scenario Progression: Following options for Secondary State 0.85 0.20
43 Part 6. Scenario Progression: Third State 0.85 0.15 Revised 26 0.85 0.31 3.54±1.13
44 Part 6. Scenario Progression: Following options for Third State 0.77 0.20
45 Part 6. Scenario Progression: Final State 1 0 Revised 27 0.92 0.15 3.69±0.85
46 Part 6. Scenario Progression: Following options for Final State 0.77 0.23
47 Part 7. Debriefing Planning: Types 1 0 Retained 28 1 0.23 3.77±0.44
48 Part 7. Debriefing Planning: Following options for Types 0.77 0.23 Retained
49 Part 7. Debriefing Planning: Gather 0.85 0.15 Retained 29 0.92 0.15 3.77±0.60
50 Part 7. Debriefing Planning: Analyze 0.85 0.20 Retained 30 1 0.07 3.92±0.28
51 Part 7. Debriefing Planning: Summarize 0.77 0.20 Retained 31 0.92 0.15 3.77±0.60
52 Part 8. Evaluation Tools: Reaction 0.77 0.24 Retained 34 0.92 0.22 3.69±0.85
53 Part 8. Evaluation Tools: Learning 0.77 0.24 Retained 35 0.85 0.32 3.31±1.11
54 Part 8. Evaluation Tools: Behavior 0.92 0.14 Retained 36 1 0.30 3.77±0.44
55 Part 8. Evaluation Tools: Results 0.77 0.24 Retained 37 0.85 0.27 3.38±0.96
56 Part 9. Development of scenario: Theoretical framework 0.31 0.24 Revised 38 0.92 0.17 3.62±0.65
57 Part 9. Development of scenario: Content validity 0.69 0.17 Retained 39 0.92 0.23 3.62±0.87
58 Part 9. Development of scenario: Reliability 0.62 0.18 Retained 40 0.92 0.23 3.54±0.88
59 Part 9. Development of scenario: Evidence & Reference 0.77 0.20 Retained 41 0.92 0.17 3.62±0.65
Part 7. Debriefing Plan: self-reflection Newly added 32 1 0 4.00±0.00
Part 7. Debriefing Plan: Discussion Newly added 33 1 0.11 3.77±0.44
Round 1: W value=0.00072, χ2 value=43.65, p value=.319; Round 2: W value=0.34986, χ2 value=181.93, p value<.001

CV, coefficient of variation; CVR, content validity ratio; M, mean; SD, standard deviation.