SYSTEMATIC REVIEW article

Front. Med., 16 June 2023
Sec. Healthcare Professions Education
Volume 10 - 2023 | https://doi.org/10.3389/fmed.2023.1124264

Tools for faculty assessment of interdisciplinary competencies of healthcare students: an integrative review

Sharon Brownie1,2,3* Denise Blanchard4,5,6 Isaac Amankwaa3,7 Patrick Broman3 Marrin Haggie3 Carlee Logan3 Amy Pearce3 Kesava Sampath3 Ann-Rong Yan7 Patrea Andersen3,8,9
  • 1School of Health Sciences, Swinburne University, Hawthorn, VIC, Australia
  • 2School of Medicine and Dentistry, Griffith University, Gold Coast, QLD, Australia
  • 3Centre for Health and Social Practice and Centre for Sports Science and Human Performance, Waikato Institute of Technology – Te Pukenga, Hamilton, New Zealand
  • 4School of Nursing, Eastern Institute of Technology – Te Pukenga, Hawkes Bay, New Zealand
  • 5School of Nursing, Paramedicine and Healthcare Sciences, Charles Sturt University, Bathurst, NSW, Australia
  • 6School of Nursing and Midwifery, The University of Newcastle Central Coast Clinical School, Ourimbah, NSW, Australia
  • 7Faculty of Health, University of Canberra, Canberra, ACT, Australia
  • 8School of Nursing, Midwifery and Paramedicine, University of the Sunshine Coast, Sippy Downs, QLD, Australia
  • 9School of Nursing, Midwifery and Social Science, Central Queensland University, Sippy Downs, QLD, Australia

Increasingly, interprofessional teamwork is required for the effective delivery of public health services in primary healthcare settings. Interprofessional competencies should therefore be incorporated within all health and social service education programs. Educational innovation in the development of student-led clinics (SLCs) provides a unique opportunity to assess and develop such competencies. However, a suitable assessment tool is needed to appropriately assess student progression and the successful acquisition of competencies. This study adopts an integrative review methodology to locate and review existing tools utilized by teaching faculty in the assessment of interprofessional competencies in pre-licensure healthcare students. A limited number of suitable assessment tools have been reported in the literature, as highlighted by the small number of studies included. Findings identify the use of existing scales, such as the Interprofessional Socialization and Valuing Scale (ISVS) and the McMaster-Ottawa Scale with the Team Observed Structured Clinical Encounter (TOSCE), plus a range of other approaches, including qualitative interviews and escape rooms. Further research and consensus are needed for the development of teaching and assessment tools appropriate for healthcare students. This is particularly important in the context of interprofessional, community-partnered public health and primary healthcare SLC learning, but it will be of relevance to health students in a broad range of clinical learning contexts.

1. Introduction

Effective interprofessional engagement and collaborative practice are crucial to quality public health and primary healthcare delivery, especially given the growing prevalence of non-communicable illness (1). Fundamental skills of professional teamwork are therefore essential to the preparation of 21st-century health and social workforces (2–5). Although pre-licensure healthcare students need to develop these interprofessional competencies, the educational experience and assessment process are often constrained by profession-specific boundaries and logistical barriers that require specific strategies to address (5–7). There is significant agreement that more work is needed to transform curricula and effectively assess the development of interprofessional competencies throughout the educational experience (8). This requires educators to identify the interprofessional competencies required of members of healthcare teams and to consider carefully how these are taught and assessed (9). Prompted by the development of a student-led clinic in Aotearoa New Zealand, this inquiry was undertaken to identify tools used globally by faculty to evaluate and assess interprofessional competencies in pre-licensure students from two or more healthcare professions. The search sought examples where two or more professions had worked together, rather than tools developed or utilized from the activity and perspective of one profession alone.

2. Background

2.1. Student-led clinics

Student-led clinics (SLCs) are an increasingly widely used model of clinical practice education that increases the involvement of pre-licensure students in hands-on practice, particularly within primary healthcare settings, while providing a broad range of benefits to service users and communities (10). Of particular note, SLCs have been shown to be a helpful delivery model for providing public health and primary healthcare services to underserved and marginalized communities (1, 11, 12). SLCs may involve a single professional group (10) or may be interprofessional in nature (13, 14). The success of SLCs is enhanced by thorough planning of clinical activities, the student experience and competency assessment; such planning is vital if the clinics are interprofessional. While the benefits of interprofessional practice are well understood, the interprofessional education (IPE) dimension adds complexity to the endeavor of establishing an SLC (5, 6). Evidence-based pedagogical approaches are needed to inform the development of clinical placement rotations and experiences.

2.2. Context

The researchers undertaking this review are involved in establishing an interprofessional SLC in the Waikato region of Aotearoa New Zealand. The region's high prevalence of non-communicable diseases such as type 2 diabetes mellitus (T2DM), cardiovascular disease and respiratory illness calls for greater public health awareness and literacy and enhanced primary healthcare (15). An initial feasibility study canvassed the views of community organizations and members, enabling the proposed development to be community-led and aligned with the specific needs of local communities (16). Following community prioritization of need, it was agreed that the proposed SLC would focus on increasing public health awareness and enhancing access to a broad range of primary healthcare services related to T2DM and associated non-communicable diseases. The services are intended to improve health knowledge and care access, and interprofessional delivery helps to address related equity issues (17). This integrative review was designed as part of the planning process for the SLC, to identify competency assessment tool/s currently used by teaching faculty and thereby inform the development of a teaching and assessment tool common to all pre-licensure students participating in the proposed SLC. Relevant professional groups include nursing, midwifery, physiotherapy, osteopathy, social work, counseling, clinical exercise physiology, dietetics, and sport science students.

2.3. Operational definitions

Ambiguity is common, as varied nomenclature is used within the literature to describe concepts of interdisciplinarity and assessment. Definitions were therefore explored as a precursor to this review, with the following adopted for its purposes.

2.3.1. Interdisciplinarity

Interprofessional (IP), interdisciplinary and multidisciplinary practices are inconsistently defined in the literature. IP practice is perhaps best defined as multiple health team members from different professional backgrounds working together in clinical practice (18). In contrast, interdisciplinary practice involves “knowledge sharing” (19) from multiple knowledge bases and collaborating to achieve a shared outcome, typically with an educational focus (20, 21). Multidisciplinary practice is differentiated further, as professionals work from their own knowledge base, with minimal or no knowledge of each other's knowledge bases (19). IP is also often suffixed with education and learning. While IP practice refers to the clinical practice context, IP education and learning “occurs when two or more professions learn about, from and with each other to enable effective collaboration and improve health outcomes” (18) and is the process of preparing people for collaborative IP practice (22). Another important distinction is collaborative practice, in which members of the healthcare team work with people from within their profession, people outside their profession, and multiple other stakeholders, such as patients/clients and their families or non-health members of the team (23). In this review the focus is on assessment of IP practice in a clinical setting and, while this is an interdisciplinary context in which collaborative practice will occur, the term IP is used throughout.

2.3.2. Assessment

This review searched for and appraised appropriate “tools” and “instruments” to inform how best to evaluate or assess IP practice in learners. Assessment “tools” and “instruments” are terms used interchangeably in the literature (24–26), with contradictory definitions positioning assessment instruments as a component of assessment tools and vice versa (27, 28). For this review the terms are treated as interchangeable, and both were included as search terms; however, the term assessment tool is used throughout for consistency.

2.4. Research question

Our interest lies in understanding how competency for interprofessional practice has been measured by teaching faculty among pre-licensure healthcare students in practice settings (as opposed to the assessment of profession-specific competencies). Specifically, we sought to identify existing assessment tools used by faculty to assess the interprofessional competency attainment of pre-licensure healthcare students from two or more professions in clinical learning contexts, which could then be utilized within an interprofessional student-assisted clinic. Thus, this review focused on the following questions:

• What tools have been used by teaching faculty to assess interprofessional competencies of pre-licensure healthcare students experiencing learning in interprofessional contexts (i.e., involving two or more professions)?

• How might identified tools be used to inform development of an assessment instrument for assessing interprofessional competency attainment of healthcare students in clinical learning contexts such as a primary healthcare-focused interprofessional student-led clinic?

3. Method

This review was conducted using an integrative approach as described by Whittemore and Knafl (31). Interprofessional concepts and their associated measurement are complex and context specific (29). No single study type or design can capture all the dimensions of healthcare students' interprofessional competency assessment and the related tools. An integrative review allows methodologically diverse studies to be synthesized to comprehensively understand a particular issue or phenomenon and inform practice or policy (30). Adopting this methodology enables going beyond the narrow focus of traditional systematic reviews to ask broader, practice-based questions that can direct practice-based scientific knowledge (31, 32). The five integrative review stages described by Whittemore and Knafl (31) – (1) problem identification, (2) literature search, (3) data evaluation, (4) data analysis, and (5) presentation – were utilized in this review.

3.1. Inclusion and exclusion criteria

The review's concepts and search terms were based on the PICO/PECO framework (P—Participants, I/E—Intervention/Exposure, C—Comparison and O—Outcomes) (33). The selection criteria are summarized in Table 1. We placed no time restrictions; however, we included only studies published in English. The review includes primary studies only, excluding reviews, books, editorials, letters, and commentaries. Qualitative, quantitative, and mixed methods studies were all included.

Table 1. Inclusion and exclusion criteria.

3.2. Databases and search terms

We searched published materials and gray literature using three broad concepts (healthcare student, assessment, and interprofessional competence) derived from our research question and refined using MeSH terms in Medline. An initial search string was tested in ERIC for relevance: (Pre-registration OR Pre-licensure) AND (Healthcare student OR Healthcare student) AND (postgraduate OR undergraduate) AND (Evaluate OR Assessment OR assessing OR assess OR outcome OR outcomes OR examin* OR evaluate) OR (measurement OR measure OR measuring) AND (Competenc* OR Competent) AND interprofession*) AND tools). This initial strategy was then developed iteratively and tailored across the following databases: CINAHL, PubMed/Medline, Embase, ERIC and ProQuest One Academic. Comprehensiveness of the search scope was achieved by reviewing the reference lists of relevant primary papers and other sources such as Google and Google Scholar. The search strategy is shown in Table 2.
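
To illustrate how the three concept blocks were combined (synonyms joined with OR, concepts joined with AND), a minimal Python sketch is shown below; the term lists are abridged for illustration and the helper function is ours, not part of the reported strategy in Table 2.

```python
# Minimal sketch: building a Boolean search string from the three concept
# blocks described above. Term lists are abridged illustrations, not the
# exact strategy reported in Table 2.

student_terms = ["Pre-registration", "Pre-licensure", "Healthcare student"]
assessment_terms = ["Evaluate", "Assess*", "Measur*", "Outcome*", "Examin*"]
competence_terms = ["Competenc*", "Competent", "Interprofession*"]

def or_block(terms):
    """Join synonyms for one concept with OR and wrap them in parentheses."""
    return "(" + " OR ".join(terms) + ")"

# Concepts are combined with AND so a record must match all three blocks.
search_string = " AND ".join(
    or_block(block) for block in (student_terms, assessment_terms, competence_terms)
)
print(search_string)
```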

Table 2. Search strategy on ProQuest One Academic, ERIC, Medline and Embase, and search results on 25/05/2022.

3.3. Data screening and selection

Identified records from the database and Google searches were imported into Covidence® (34), an online screening and data management software. Automatic removal of duplicates in Covidence was followed by two-stage screening of the unique studies by two sets of independent reviewers (PB, SB, KKS, and IA). Initial screening of titles and abstracts was followed by screening of the full-text articles identified. A third and fourth reviewer (DB and A-RY) then consulted to resolve discrepancies and conflicts between reviewer judgements at each stage of the review process. The screening and conflict resolution process in Covidence was blinded. The search strategy and data screening procedures, reported in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Protocols (PRISMA-P) statement (35), are shown in Table 2 and Figure 1, respectively.
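
Covidence performed the de-duplication and screening; purely as an illustration of the bookkeeping behind the counts in Figure 1 (the record fields and sample entries below are hypothetical, not our data), duplicate removal before screening can be sketched as follows.

```python
# Illustrative only: removing duplicate records before title/abstract screening.
# Field names (doi, title) and the sample records are hypothetical.

def deduplicate(records):
    """Keep the first record seen per DOI, falling back to a normalized title."""
    seen, unique = set(), []
    for record in records:
        key = (record.get("doi") or record.get("title", "")).strip().lower()
        if key and key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

records = [
    {"doi": "10.3402/meo.v20.26691", "title": "Adapting the McMaster-Ottawa scale"},
    {"doi": "10.3402/meo.v20.26691", "title": "Adapting the McMaster-Ottawa scale"},
    {"doi": "", "title": "Psychometric testing of a simulation rubric"},
]

unique = deduplicate(records)
print(f"{len(records)} records identified, {len(unique)} after duplicates removed")
```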

Figure 1. PRISMA flow chart of the study selection process.

3.4. Data extraction and synthesis

Data were extracted and synthesized following Whittemore and Knafl's (31) guidelines. Data extraction involved recording each study's details, research design, aims, ethical considerations, sample population and size, comparative interventions, outcome measures, findings, and limitations. Covidence was used as the primary tool for data extraction. Data were then synthesized by identifying themes and concepts related to the review questions. The synthesis process involved sorting the data into intellectual bins, naming themes, and looking for relationships to guide future studies. The psychometric features of the studies' tools, such as internal consistency, inter-item and item-total correlations, and inter-rater reliability, were examined to assess the quality and reliability of the findings. The key themes and relationships are summarized in Table 5.
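
For readers unfamiliar with the internal-consistency index extracted here, the sketch below shows how Cronbach's alpha is computed from an item-by-respondent score matrix; the data are invented and the code is not part of the review method.

```python
# Worked sketch of Cronbach's alpha (internal consistency) for an invented
# matrix of ratings: rows are students, columns are items of a rating scale.

def cronbach_alpha(scores):
    k = len(scores[0])  # number of items

    def variance(values):  # population variance
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / len(values)

    item_variances = [variance([row[j] for row in scores]) for j in range(k)]
    total_variance = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_variances) / total_variance)

ratings = [  # hypothetical scores on a 3-point scale, four items per student
    [2, 3, 2, 3],
    [1, 2, 1, 2],
    [3, 3, 3, 3],
    [2, 2, 2, 3],
]
print(round(cronbach_alpha(ratings), 2))  # 0.92 for this invented matrix
```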

3.5. Evaluation of data

Including both primary and theoretical literature in an integrative review makes quality appraisal more complex (31). In line with our decision to integrate quantitative, qualitative and mixed methods studies, we adopted the Mixed Methods Appraisal Tool (MMAT), version 2018 (36) for the quality appraisal of eligible studies. Two reviewers (DB and A-RY) independently appraised the quality of the included studies and resolved any disagreements by consensus. Each study's quality appraisal is presented in Figure 2. In keeping with integrative review methods, no eligible study was excluded on the basis of research quality (31, 37).

4. Results

Eight manuscripts were identified for inclusion in the review (38–45); however, two reported activities from the same context. The PRISMA flow chart (Figure 1) outlines the study selection, assessment and inclusion process.

Application of the MMAT version 2018 (36) provided the quality appraisal results shown in Figure 2.

Figure 2. Quality appraisal of the included articles.

In terms of study quality, notable issues exist: sample representativeness is questionable where the sample size was too small (38–40) or was reported inconsistently (41). Selection bias may exist where participants were recruited on a voluntary basis and not all participants were included in the analysis (42). Measurements may also be inappropriate if only one rater is used both for the competency assessment and to assess tool quality (41); bias is reduced when two faculty members rate and compare results rather than relying on the assessment of a single faculty member alone.

Six of the eight studies were based in the United States of America, one in Canada and one in an unstated country. The studies had diverse aims, as shown in Table 3. Approaches ranged from emphasizing the development and delivery of an interprofessional education program to which assessment tools were applied (40, 41) to focusing primarily on testing the assessment tools themselves (38, 39, 42).

Table 3. Characteristics of the included studies.

The interprofessional initiatives assessed in the eight studies were equally diverse and included ongoing interprofessional activities; interprofessional collaboration with community partners; an interprofessional escape room; an interprofessional team-based care rubric; and a Team Observed Structured Clinical Encounter (TOSCE) station focused on stroke (see Table 4).

Table 4. Characteristics of the interprofessional education delivered in the included studies.

A single study (40) reported a multi-site inquiry across five sites; the other studies involved single-site initiatives and evaluations. One study included four participating professions, namely occupational therapy, pharmacy, dentistry and medicine (45), with the remaining studies involving fewer professions, for example, nursing and medicine (42) or nursing and pharmacy (41).

Each research team described their interprofessional assessment tool in detail and evaluated its performance in their specific study context (see Table 5). Five assessment tools were used across the eight studies, none of which were the same, although four were modified from the McMaster-Ottawa scale in different ways (38–40, 52). Two studies evaluated the internal consistency of their assessment tools (the Observed Interprofessional Collaboration [OIPC] and the Indiana University Simulation Integration Rubric [IUSIR], respectively) and reported Cronbach's alphas ranging from 0.79 to 0.91, indicating high reliability (38). Two studies analyzed the inter-rater reliability of the assessment tools (IUSIR and TOSCE) between two and sixteen assessors, respectively (38, 42): Reising et al. reported high accuracy for both individual (92%) and team (94%) assessment using the IUSIR with two assessors, while Lie et al. found lower accuracy for individual (38–81%) than team (50–100%) assessment using the TOSCE with sixteen faculty raters. These two studies also validated the assessment tools. The IUSIR was found to have significant discriminatory capacity to differentiate junior- and senior-level performance (42), whereas with the TOSCE individual, but not team, performance may be over-rated (38).
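
The inter-rater figures quoted above are percentage-style indices; as a simple worked illustration of one such index, percent agreement between two raters, consider the sketch below (the ratings are hypothetical, not data from the included studies).

```python
# Percent agreement between two faculty raters scoring the same eight students
# on a 3-point scale. Ratings are hypothetical.

rater_a = [3, 2, 2, 1, 3, 2, 3, 2]
rater_b = [3, 2, 1, 1, 3, 2, 3, 3]

agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"percent agreement: {agreement:.0%}")  # 75% in this example
```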

Table 5. Characteristics and performance of the assessment tools applied in the included studies.

5. Discussion

The authoring team closely followed Whittemore and Knafl's (31) five integrative review stages in conducting this review: (1) problem identification, (2) literature search, (3) data evaluation, (4) data analysis, and (5) presentation. During the first stage, the team clarified the need to seek, locate and review existing tools utilized by teaching faculty in the assessment of interprofessional competencies relevant to pre-licensure healthcare students. The second through fourth stages of literature search, evaluation and analysis are reported in Sections 3.1 to 3.5, with results presented in Tables 3 and 4. The final presentation of results is aided by the analysis in Table 5 and the ensuing discussion.

Results yielded a paucity of published work in the field. The search focused on identifying examples where faculty had worked together in the development and evaluation of IPE competency assessment tools for pre-licensure students from two or more healthcare professions. The identified tools included the OIPC, a five-item modified TOSCE Scale, the IUSIR, TOSCE modified from the McMaster-Ottawa scale, the Interprofessional Team-based Care Rubric (ITCR), the modified McMaster-Ottawa scale, and others.

The reported consequences of deficits in interprofessional communication and teamwork include increased medical errors, poor patient outcomes and the persistence of embedded health inequalities (17, 41). As early as the 1970s, entities such as the World Health Organization (WHO) and the Institute of Medicine (IOM) highlighted the need for an increased focus on public health and primary healthcare supported by increased collaboration between the professions (53, 54). The IOM Conference of 1972 focused specifically on the transformation of health professional curricula to address the increasingly important need for interprofessional education (53). The ensuing decades have seen continuing calls for curriculum transformation and emphasis on interprofessional education (3, 18, 46, 55, 56), yet significant work remains to be done. A clear finding of this review is that while progress has been made, major gaps persist in curriculum transformation, IPE pedagogy and assessment processes. Additional development and research are needed in respect of the education and assessment of interprofessional competencies among health professionals, including pre-licensure healthcare students (5, 47).

Despite the small volume of work identified in this search, valuable insights were gained regarding assessment tools that could be utilized with pre-licensure healthcare students in an IP SLC service or other clinical learning context. Lie et al. (38) adopted an existing scale, the 9-point McMaster-Ottawa Scale and associated TOSCE tool (44, 48), and converted it to a 3-point scale with behavioral anchors. Participating faculty indicated comfort in assessing up to four students within the 35-minute TOSCE period. However, a leniency error was noted among faculty even after comprehensive training, and it is recommended that two trained faculty raters be included in each TOSCE station (38). The McMaster-Ottawa Scale was also adapted by Forest et al. (44) to develop a three-point scale, with Lie et al. (39) building on these earlier developments; both Forest and Lie reported the usefulness and validity of the McMaster-Ottawa Scale as a basis for development and implementation (39, 43).

In the ITCR approach utilized by Hayes et al. (43), interprofessional practice competency domains were used to inform the criterion standards within the tool. Testing addressed both the level and the content of the scale, with results showing excellent content validity (49). Reising et al. (42) undertook psychometric testing of the IUSIR, a tool developed to measure interprofessional communication during clinical simulation. While useful, the tool is somewhat narrow in focus in that it assesses the interprofessional communication domain only, rather than a broader set of interprofessional competencies. A further limitation is that design and testing of the IUSIR have occurred in simulated contexts only, with its utility in practice contexts yet to be determined.

Foltz-Ramos et al. (41) report the use of an interprofessional escape room to improve and test interprofessional collaboration in pre-licensure nursing and pharmacy students. Escape rooms are a relatively recent teaching innovation that integrates gaming technology with learning, an approach attractive to 21st-century learners (50). Escape rooms require students to cooperate to escape a particular scenario and achieve a good outcome, thereby helping to build teamwork skills. The tool was shown to be effective; however, escape room development requires a high level of technical expertise and resources (41), and because escape rooms are essentially a simulated learning activity, further innovation is required to implement them within the context of clinical rotations such as SLCs (41).

Transforming curricula to strengthen the focus on public health and primary healthcare priorities and to reduce healthcare inequalities must take the student out of the classroom and into the community (51). However, studies reporting IPE assessment in community and SLC settings are uncommon (40). Uniquely, Gentry et al. (40) collaborated with community partners over six months to deliver and assess the interprofessional competencies of pre-licensure students in primary care practice settings. Teams were drawn from ten professional groupings across five universities, with a mixed-method approach taken to education and assessment. The participating community partners were not-for-profit entities delivering services to specific underserved and vulnerable populations. Faculty undertook continuous assessment and provided feedback to students throughout the six-month placement. Faculty assessments included qualitative assessment of IP domains; feedback on student presentations to community partners; use of existing tools, specifically the Interprofessional Socialization and Valuing Scale (ISVS) (57) completed before and after the placement and the McMaster-Ottawa Scale with TOSCE assessments; and analysis of, and feedback on, student reflections.

The ISVS is a 24-item self-report measure focused on the attitudes, behaviors and beliefs that underpin interprofessional socialization. The scale is administered before and after the educational/clinical placement experience with a view to measuring the impact of that experience (57). The McMaster-Ottawa Scale with TOSCE was explicitly developed for assessing interprofessional competencies in primary care, with a view to enabling public health and primary healthcare teams to assess and then improve their team collaboration competencies, patient safety and better outcomes being a major aim (44, 48). In the Gentry et al. (40) study, faculty utilized each of these assessment and feedback tools. Students reported that a major benefit of the experience was getting to know the perspectives of others and working with like-minded people who brought entirely different skill sets (40). Faculty and students also reported greater understanding of and comfort with team-based roles, improved competence in shared decision-making and problem-solving, and greater understanding of and empathy for community needs (40). The mixed-method, community-based approach detailed by Gentry and team aligns well with a community-based, student-led interprofessional health service, the development of which formed the impetus for this search.
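
Because the ISVS is administered before and after a placement, its impact is typically summarized as a pre/post change in score; the sketch below uses invented numbers, not data from Gentry et al. (40).

```python
# Hypothetical pre- and post-placement ISVS-style scores for five students,
# summarized as a mean change (a positive value suggests a shift toward
# interprofessional socialization).

pre = [4.1, 3.8, 4.5, 3.9, 4.2]    # mean item score per student before placement
post = [5.6, 5.1, 5.9, 5.4, 5.7]   # mean item score per student after placement

changes = [after - before for before, after in zip(pre, post)]
mean_change = sum(changes) / len(changes)
print(f"mean pre/post change: {mean_change:.2f}")  # 1.44 here
```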

The identified tools provide valuable insight for the development of an instrument to evaluate interprofessional competency attainment of healthcare students in clinical learning contexts, such as a primary healthcare-focused interprofessional student-led clinic. Although not yet validated in this specific context, the McMaster-Ottawa Scale with TOSCE and the ISVS appear to show the greatest promise for this purpose. The McMaster-Ottawa Scale with TOSCE is designed for assessing interprofessional competencies in primary care settings, enabling teams to evaluate and improve their collaborative skills, ultimately aiming for better patient safety and outcomes (38). The ISVS is a 24-item self-report measure that focuses on attitudes, behaviors, and beliefs underpinning interprofessional socialization (40, 51), and it can be used before and after educational or placement experiences to gauge their impact on students' interprofessional competency development.

When developing an assessment instrument for a primary healthcare-focused interprofessional student-led clinic, it may be beneficial to incorporate elements from these existing tools while adapting them to the specific context and learning objectives of the clinic. A mixed-method approach that combines continuous assessment, feedback loops, and strong community engagement, as demonstrated in the Gentry et al. (40) study, can further enhance competency development and assessment. Utilizing a variety of assessment methods, such as self-report, qualitative assessment, and observed clinical encounters, will provide a comprehensive evaluation of students' interprofessional competency development. Ultimately, ongoing research and evaluation are essential to refine any assessment instrument and ensure its effectiveness in fostering interprofessional competencies in future healthcare professionals.

5.1. Limitations

It is appropriate to note some limitations of this review. Perhaps the most obvious is the possibility that the search did not capture all relevant literature, especially given the heterogeneous terminology used to describe both practice involving representatives of more than one health profession and assessment or measurement instruments. Determining whether a tool was used by teaching faculty to assess students (as opposed to student self-assessment) was also difficult. Including only published articles in the English language may have excluded international examples or tools reported in the gray literature, especially as teaching and learning tools are often informal, evolving and not always well documented. Educators working to promote interprofessional collaboration among health profession students, and formally assessing the results, should be encouraged to share the tools or applications they have built or explored. Additionally, each of the identified works was very different. The majority were based in the USA, and one in Canada, where there is a strong emphasis on interprofessional practice collaboration across all health professional accrediting bodies (47). The lack of global representation in the identified studies is noted as a limitation of this review's findings.

6. Conclusion

Effective interprofessional teamwork is a cornerstone of improved health outcomes and reduced healthcare inequalities. Purposefully designed placement experiences and assessment activities are required to better develop interprofessional competencies among pre-licensure healthcare students and prepare them for practice. A mixed-method assessment approach with continuous feedback loops and strong community engagement aligns well with the planning and delivery of a student-led clinic engaged in delivering public health and primary healthcare services. Existing assessment tools, such as the ISVS and the McMaster-Ottawa Scale with TOSCE, can further guide assessment processes and form the basis of future tool validation studies. Ongoing research and validation studies are needed to inform education and practice developments in faculty assessment of students' interprofessional competencies.

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Author contributions

SB, PA, and PB conceived the evaluative design of the study. SB, DB, A-RY, and IA developed the search strategy. All authors provided substantial contributions to this work and accept accountability for the finished product, participated in the collection of data, contributed to data analysis including COVIDENCE screening and writing of the manuscript, and reviewed and approved final drafts.

Funding

This project was supported by a Trust Waikato Community Impact Grant.

Acknowledgments

The authors acknowledge Ema Tokolahi and Cassie Cook for contributions during the early phase of the review process, and Jia Rong Yap for assistance in editing and preparation of the manuscript.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

1. Fröberg M, Leanderson C, Fläckman B, Hedman-Lagerlöf E, Björklund K, Nilsson GH, et al. Experiences of a student-run clinic in primary care: a mixed-method study with students, patients and supervisors. Scand J Prim Health Care. (2018) 36:36–46. doi: 10.1080/02813432.2018.1426143

2. Cust F. Interprofessional education is an essential component of health professional education. Available online at: https://www.nursingtimes.net/opinion/interprofessional-educationis-an-essential-component-of-health-professional-education-27-04-2021/ (accessed February 15, 2023).

3. Bhutta ZA, Chen L, Cohen J, Crisp N, Evans T, Fineberg H, et al. Education of health professionals for the 21st century: a global independent Commission. Lancet. (2010) 375:1137–8. doi: 10.1016/S0140-6736(10)60450-3

4. Frenk J, Chen L, Bhutta ZA, Cohen J, Crisp N, Evans T, et al. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. (2010) 376:1923–58. doi: 10.1016/S0140-6736(10)61854-5

5. Samarasekera DD, Nyoni CN, Amaral E, Grant J. Challenges and opportunities in interprofessional education and practice. Lancet. (2022) 400:1495–7. doi: 10.1016/S0140-6736(22)02086-4

6. Brewer ML, Flavell HL, Jordon J. Interprofessional team-based placements: the importance of space, place, and facilitation. J Interprof Care. (2017) 31:429–37. doi: 10.1080/13561820.2017.1308318

7. Mladenovic J, Tilden VP. Strategies for overcoming barriers to IPE at a health sciences university. J Interprof Educ Pract. (2017) 8:10–3. doi: 10.1016/j.xjep.2017.05.002

8. Batteson T, Garber SS. Assessing constructs underlying interprofessional competencies through the design of a new measure of interprofessional education. J Interprof Educ Pract. (2019) 16:100195. doi: 10.1016/j.xjep.2018.08.004

9. Frost J, Chipchase L, Kecskes Z, D'Cunha NM, Fitzgerald R. Research in brief: exploring perceptions of needs for the same patient across disciplines using mixed reality: a pilot study. Clin Simul Nurs. (2020) 43:21–5. doi: 10.1016/j.ecns.2020.02.005

10. Kavannagh J, Kearns A, McGarry T. The benefits and challenges of student-led clinics within an Irish context. J Pract Teaching Learn. (2015) 13:58–72. doi: 10.1921/jpts.v13i2-3.858

11. Stuhlmiller CM, Tolchard B. Developing a student-led health and wellbeing clinic in an underserved community: collaborative learning, health outcomes and cost savings. BMC Nurs. (2015) 14:32. doi: 10.1186/s12912-015-0083-9

12. Tokolahi E, Broman P, Longhurst G, Pearce A, Cook C, Andersen P, et al. Student-led clinics in Aotearoa New Zealand: a scoping review with stakeholder consultation. J Multidiscip Healthc. (2021) 14:2053–66. doi: 10.2147/JMDH.S308032

13. Frakes KA, Brownie S, Davies L, Thomas J, Miller ME, Tyack Z. Experiences from an interprofessional student-assisted chronic disease clinic. J Interprof Care. (2014) 28:573–5. doi: 10.3109/13561820.2014.917404

14. Frakes KA, Brownie S, Davies L, Thomas JB, Miller ME, Tyack Z. Capricornia allied health partnership (CAHP): a case study of an innovative model of care addressing chronic disease through a regional student-assisted clinic. Aust Health Rev. (2014) 38:483–6. doi: 10.1071/AH13177

15. Chepulis L, Morison B, Keenan R, Paul R, Lao C, Lawrenson R. The epidemiology of diabetes in the Waikato region: an analysis of primary care data. J Prim Health Care. (2021) 13:44. doi: 10.1071/HC20067

16. Brownie S, Smith G, Broman P, et al. He Kaupapa Oranga Tahi: Working in Partnership to Grow the Health Workforce Through Tauira-Assisted Health Services. Report No.: ISBN 978-1-877510-21–2, 2021. Hamilton: Wintec – Te Pukenga.

17. National Academies of Practice. National Academies of Practice Position Paper: Interprofessional Collaboration. Lexington, KY: National Academies of Practice (2022).

18. World Health Organization (ed). Framework for Action on Interprofessional Education and Collaborative Practice. Geneva, Switzerland: World Health Organization (2010).

19. Hero L-M, Lindfors E. Students' learning experience in a multidisciplinary innovation project. Educ Train. (2019) 61:500–22. doi: 10.1108/ET-06-2018-0138

20. Choi BC, Pak AW. Multidisciplinarity, interdisciplinarity and transdisciplinarity in health research, services, education and policy: definitions, objectives, and evidence of effectiveness. Clin Invest Med. (2006) 29:351–64.

21. Columbia University. Team Healthcare Models. Columbia University [n.d.]. Available online at: https://ccnmtl.columbia.edu/projects/sl2/pdf/glossary.pdf (accessed November 2, 2022).

22. Canadian Interprofessional Health Collaborative. A national interprofessional competency framework. Canadian Interprofessional Health Collaborative. (2010). Available online at: https://phabc.org/wp-content/uploads/2015/07/CIHC-National-Interprofessional-Competency-Framework.pdf (accessed November 2, 2022).

23. Alberta Health Services. Collaborative practice [n.d.]. Available online at: https://www.albertahealthservices.ca/assets/careers/ahs-careers-stu-supporting-interprofessional-placements.pdf (accessed November 2, 2022).

24. Dennis V, Craft M, Bratzler D, Yozzo M, Bender D, Barbee C, et al. Evaluation of student perceptions with 2 interprofessional assessment tools-the collaborative healthcare interdisciplinary relationship planning instrument and the interprofessional attitudes scale-following didactic and clinical learning experiences in the United States. J Educ Eval Health Prof. (2019) 16:35. doi: 10.3352/jeehp.2019.16.35

25. Iverson L, Todd M, Haddad AR, Packard K, Begley K, Doll J, et al. The development of an instrument to evaluate interprofessional student team competency. J Interprof Care. (2018) 32:531–8. doi: 10.1080/13561820.2018.1447552

26. Peltonen J, Leino-Kilpi H, Heikkilä H, Rautava P, Tuomela K, Siekkinen M, et al. Instruments measuring interprofessional collaboration in healthcare—A scoping review. J Interprof Care. (2020) 34:147–61. doi: 10.1080/13561820.2019.1637336

27. Australian Skills Quality Authority. What is the Difference Between an Assessment Tool and an Assessment Instrument? (Clause 1.8) [n.d.]. Available online at: https://www.asqa.gov.au/faqs/what-difference-between-assessment-tool-and-assessment-instrument-clause-18 (accessed November 7, 2022).

28. Industry Network Training & Assessment Resources. Difference between ‘tools' and ‘instruments' – Chapter 3 [n.d.]. Available online at: https://www.intar.com.au/resources/training_and_assessing/section_3/chapter3_developing_assessment_tools/lesson2_tools_and_instruments.htm (accessed November 7, 2022).

29. Wang Z, Song G. Towards an assessment of students' interdisciplinary competence in middle school science. Int J Sci Educ. (2021) 43:693–716. doi: 10.1080/09500693.2021.1877849

30. Broome M. Integrative literature reviews for the development of concepts. In: Rodgers BL and Knafl KA (eds) Concept Development in Nursing: Foundations, Techniques and Applications. Philadelphia: WB Saunders Company (2000), pp.231-250.

31. Whittemore R, Knafl K. The integrative review: updated methodology. J Adv Nurs. (2005) 52:546–53. doi: 10.1111/j.1365-2648.2005.03621.x

32. Soares CB, Hoga LAK, Peduzzi M, Sangaleti C, Yonekura T, Silva DRAD. Integrative review: concepts and methods used in nursing. Revista da Escola de Enfermagem da USP. (2014) 48:335–45. doi: 10.1590/S0080-6234201400002000020

33. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, Ioannidis JPA, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. (2009) 6:e1000100. doi: 10.1371/journal.pmed.1000100

34. Babineau J. Product review: covidence (systematic review software). J Can Health Libr Assoc/Journal de l'Association des bibliothèques de la santé du Canada. (2014) 35:68–71. doi: 10.5596/c14-016

35. McCluskey J, Gallagher AL, Murphy CA. Reflective practice across speech and language therapy and education: a protocol for an integrative review. HRB Open Res. (2021) 4:29. doi: 10.12688/hrbopenres.13234.1

36. Hong QN, Fàbregues S, Bartlett G. The mixed methods appraisal tool (MMAT) version 2018 for information professionals and researchers. Educ Inf . (2018) 34:285–91. doi: 10.3233/EFI-180221

37. Ganong LH. Integrative reviews of nursing research. Res Nurs Health. (1987) 10:1–11. doi: 10.1002/nur.4770100103

38. Lie D, May W, Richter-Lagha R, Forest C, Banzali Y, Lohenry K. Adapting the McMaster-Ottawa scale and developing behavioral anchors for assessing performance in an interprofessional team observed structured clinical encounter. Med Educ Online. (2015) 20:26691. doi: 10.3402/meo.v20.26691

39. Lie DA, Richter-Lagha R, Forest CP, Walsh A, Lohenry K. When less is more: validating a brief scale to rate interprofessional team competencies. Med Educ Online. (2017) 22:1314751. doi: 10.1080/10872981.2017.1314751

40. Gentry C, Espiritu E, Schorn MN, Hallmark B, Bryan M, Prather P, et al. Engaging the community through a longitudinal, interprofessional, interinstitutional experiential learning collaboration. Curr Pharm Teach Learn. (2021) 13:169–76. doi: 10.1016/j.cptl.2020.09.012

41. Foltz-Ramos K, Fusco NM, Paige JB. Saving patient X: a quasi-experimental study of teamwork and performance in simulation following an interprofessional escape room. J Interprof Care. (2021) 3:1–8. doi: 10.1080/13561820.2021.1874316

42. Reising DL, Carr DE, Tieman S, Feather R, Ozdogan Z. Psychometric testing of a simulation rubric for measuring interprofessional communication. Nurs Educ Perspect. (2015) 36:311–6. doi: 10.5480/15-1659

43. Hayes CA, Carzoli JA, Robinson JL. Development and implementation of an interprofessional team-based care rubric to measure student learning in interprofessional education experiences: a pilot study. J Interprof Educ Prac. (2018) 11:26–31. doi: 10.1016/j.xjep.2018.02.003

44. Forest CP, Lie DA, Ma SB. Evaluating interprofessional team performance: a faculty rater tool. MedEdPORTAL. (2016) 12:10447. doi: 10.15766/mep_2374-8265.10447

45. Murray-Davis B, Marshall D, Mueller V. A Team Observed Structured Clinical Encounter (TOSCE) for pre-licensure learners in maternity care: a short report on the development of an assessment tool for collaboration. J Res Interprof Pract Edu. (2013) 3:122–8. doi: 10.22230/jripe.2013v3n1a89

46. Zorek J, Raehl C. Interprofessional education accreditation standards in the USA: a comparative analysis. J Interprof Care. (2013) 27:123–30. doi: 10.3109/13561820.2012.718295

47. McMaster-Ottawa Team. The McMaster-Ottawa Team Observed Structured Clinical Encounter (TOSCE). (2010). Available online at: https://fhs.mcmaster.ca/tosce/documents/tosce_checklist_user_instructions.pdf (accessed December 1, 2022).

48. McMaster-Ottawa Team. McMaster/Ottawa TOSCE (Team Observed Structured Clinical Encounter) Toolkit: An Innovative Tool for Building and Assessing Interprofessional Competencies. Hamilton, ON: McMaster University (2022).

49. Zhang XC, Lee H, Rodriguez C, Rudner J, Chan TM, Papanagnou D. Trapped as a group, escape as a team: applying gamification to incorporate team-building skills through an 'escape room' experience. Cureus. (2018) 10:e2256. doi: 10.7759/cureus.2256

50. Yuan M, Lin H, Wu H, Yu M, Tu J, Lü Y. Community engagement in public health: a bibliometric mapping of global research. Arch Public Health. (2021) 79:6. doi: 10.1186/s13690-021-00525-3

51. King G, Shaw L, Orchard CA, Miller S. The interprofessional socialization and valuing scale: a tool for evaluating the shift toward collaborative care approaches in health care settings. Work. (2010) 35:77–85. doi: 10.3233/WOR-2010-0959

52. Institute of Medicine (ed). Educating for the Health Team. Conference on the Interrelationships of Educational Programs for Health Professionals 1972 October 2–3. Washington, D.C.: Institute of Medicine. (1972).

53. World Health Organization (ed). Alma-Ata: Primary Healthcare. International Conference on Primary Healthcare (PHC), 1978 September 6–12, Alma-Ata, Kazakhstan, USSR: World Health Organization and United Nations Children's Fund.

54. Rifkin SB. Alma Ata after 40 years: primary health care and health for all-from consensus to complexity. BMJ Glob Health. (2018) 3:e001188. doi: 10.1136/bmjgh-2018-001188

55. World Health Organization. Transforming and Scaling Up Health Professionals' Education and Training: World Health Organization Guidelines. Geneva, Switzerland: World Health Organization (2013).

56. Rogers GD, Thistlethwaite JE, Anderson ES, Dahlgren MA, Grymonpre RE, Moran M, et al. International consensus statement on the assessment of interprofessional learning outcomes. Med Teach. (2017) 39:347–59. doi: 10.1080/0142159X.2017.1270441

Keywords: interdisciplinary education, interdisciplinary communication, interprofessional relations, public health, primary healthcare, collaboration, assessment, measurement

Citation: Brownie S, Blanchard D, Amankwaa I, Broman P, Haggie M, Logan C, Pearce A, Sampath K, Yan A-R and Andersen P (2023) Tools for faculty assessment of interdisciplinary competencies of healthcare students: an integrative review. Front. Med. 10:1124264. doi: 10.3389/fmed.2023.1124264

Received: 19 January 2023; Accepted: 25 May 2023;
Published: 16 June 2023.

Edited by:

Muhammad Shahid Iqbal, Prince Sattam bin Abdulaziz University, Saudi Arabia

Reviewed by:

Muhammad Zahid Iqbal, AIMST University, Malaysia
Marie-Claire O'Shea, Griffith University, Australia

Copyright © 2023 Brownie, Blanchard, Amankwaa, Broman, Haggie, Logan, Pearce, Sampath, Yan and Andersen. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Sharon Brownie, sbrownie@swin.edu.au
