- 1Key Laboratory of Molecular Biology in High Cancer Incidence Coastal Chaoshan Area of Guangdong Higher Education Institutes, Department of Cell Biology and Genetics, Shantou University Medical College, Shantou, China
- 2Department of Physiology, Shantou University Medical College, Shantou, China
- 3Department of Gastroenterology, First Affiliated Hospital of Shantou University Medical College, Shantou, China
Background: Problem-based learning (PBL) has emerged as a transformative educational strategy aimed at enhancing critical thinking in medical education. While some studies suggest PBL significantly improves critical thinking skills, others show minimal differences compared to traditional teaching methods. This study aims to synthesize existing research to assess the overall effectiveness of PBL in enhancing critical thinking skills in medical education.
Methods: A comprehensive literature search was conducted using PubMed, the Cochrane Library, and FMRS databases. Included studies were comparative, involved medical students, compared a PBL intervention group with a conventional-method control group, and assessed critical thinking skills. Data extraction and quality assessment were performed by two independent reviewers, using the Cochrane Risk of Bias Tool for RCTs and the ROBINS-I tool for non-randomized studies. Meta-analysis was conducted using Review Manager version 5.3, employing random-effects models due to high heterogeneity. Publication bias was assessed using funnel plots, Egger’s regression, and Begg’s test.
Results: Eleven studies involving 14 to 267 students from various countries were included. The aggregated effect size for PBL versus conventional methods showed a significant improvement in critical thinking skills. Comparison of pre- and post-PBL critical thinking scores also indicated significant improvement post-PBL. Publication bias was assessed and found to be minimal.
Conclusion: PBL is significantly more effective than conventional teaching methods in enhancing critical thinking skills among medical students. These findings support the integration of PBL into medical curricula to foster critical thinking skills, which are crucial for clinical practice.
Introduction
The evolving landscape of medical education demands innovative teaching methodologies to foster critical thinking skills among students, essential for navigating the complexities of clinical practice. Problem-based learning (PBL) has gained prominence as an effective, student-centered approach to developing these skills, particularly in medical education (Alreshidi and Alreshidi, 2023; Manuaba et al., 2022). By engaging students with real-world clinical scenarios, PBL promotes active learning, deeper cognitive processing, and sustained motivation compared to traditional lecture-based approaches (Magdalena et al., 2023).
Critical thinking is essential to medical practice, enabling healthcare professionals to analyze complex information, make informed decisions, and provide high-quality care (Falcó Pegueroles et al., 2021; Khosravizadeh et al., 2022; Papp et al., 2014). The American Philosophical Association defines critical thinking as “purposeful, self-regulatory judgment,” encompassing interpretation, analysis, evaluation, and inference (Dwyer and Walsh, 2020; Wanjari et al., 2020). These cognitive abilities are crucial for medical students, who must navigate increasingly complex clinical settings and provide patient-centered care.
Numerous studies have explored the impact of PBL on critical thinking skills in medical education, yielding inconsistent results. Some research indicates that PBL leads to significant improvements in critical thinking abilities (Deshanty et al., 2023; Sopwan et al., 2018; Tiwari et al., 2006), while other studies suggest minimal or no differences compared to traditional teaching methods (Puranik et al., 2023; Zabit et al., 2016). These inconsistencies highlight the need for a comprehensive systematic review and meta-analysis to assess the overall effectiveness of PBL in enhancing critical thinking skills in medical education.
Prior literature reviews have explored the impact of PBL on critical thinking skills in medical education, but most have either focused narrowly on specific subgroups (e.g., nursing students) or lacked a quantitative meta-analytic synthesis. For example, Wei et al. (2024) reported significant improvements in critical thinking among nursing students using PBL, while Manuaba et al. (2022) found no significant differences in first-year medical students. These discrepancies highlight the need for a comprehensive quantitative review across broader health-related disciplines to evaluate the effectiveness of PBL on critical thinking across varied educational contexts.
This study aimed to fill this gap by synthesizing both randomized and non-randomized studies from multiple countries and disciplines, using robust meta-analytic methods. Furthermore, this study explicitly considered various conceptualizations of critical thinking. We adopted the widely accepted framework by Facione (1990), which defines critical thinking as purposeful, self-regulatory judgment, encompassing interpretation, analysis, evaluation, inference, explanation, and metacognition. We also referenced complementary perspectives, such as Ennis’s (1987) taxonomy of dispositions and abilities, and Paul and Elder’s (2008) model emphasizing intellectual standards. These theories guided both our interpretation of critical thinking and the classification of the measurement tools used in included studies.
This study synthesizes research on the effectiveness of PBL in medical education, focusing on its impact on students’ critical thinking skills. Through a systematic review and meta-analysis, it aims to clarify PBL’s educational outcomes and provide practical recommendations to guide medical curriculum design and improve critical thinking development in diverse student populations.
Materials and methods
Literature search strategy
Data were sourced from PubMed, the Cochrane Library, and the FMRS database. The Foreign Medical Literature Retrieval Service (FMRS), developed by Shenzhen Maitesi Creative Co., Ltd., is a platform for accessing foreign medical information resources. FMRS2020 covers 97% of journals indexed in the SCI database, 90% of journals indexed in EMBASE, and 100% of journals indexed in PubMed.
The same search terms and Boolean operators were applied across all databases to ensure uniformity. Specifically, the following search terms were used uniformly: “Medical*,” “educat*,” “Problem-Based Learning,” “Problem-Based,” “Problem Based Learning,” “Problem Based,” “PBL,” “critical*,” and “think*.”
For PubMed, the Cochrane Library, and FMRS, the same Boolean operators were used to combine the search terms. The only variation arose from the different search syntax requirements of each database; the core search terms and Boolean logic remained unchanged across all three.
For clarity, the detailed search strategies used for each database are presented in Supplementary Table 1.
Study selection
The studies included met the following criteria for comparative studies: (1) Participants: Medical students engaged in health-related studies. For the purposes of this review, “medical students” were defined broadly to include students enrolled in health-related disciplines such as medicine, nursing, dentistry, speech-language pathology, and medical education, fields in which critical thinking and clinical reasoning are essential components of professional education. (2) Intervention group using problem-based learning (PBL). (3) Control group using conventional methods. (4) Outcome measure: Assessment of critical thinking skills. (5) Reported sample size, mean difference, and standard deviation of critical thinking scores. (6) No restriction on publication year or language. Articles retrieved in languages other than English or Chinese were translated using OpenAI’s ChatGPT (version GPT-4, accessed June 2024) to facilitate review and data extraction. Translations were subsequently checked for consistency and relevance to the study objectives.
Exclusion criteria included: (1) Non-peer-reviewed literature, opinion articles, reviews, and conference abstracts. (2) Non-medical students. (3) Studies without assessment of critical thinking. (4) Duplicated or overlapping data. (5) Studies with serious design flaws or incomplete data.
Data extraction and management
Two reviewers independently extracted data from the included studies using a standardized data extraction protocol. Extracted data included study characteristics (authors, publication year, and country), participant characteristics (sample size and major), PBL descriptions, study design, and critical thinking tools (mean differences and standard deviations from baseline to post-test). Discrepancies were resolved through discussion or with a third reviewer if needed.
To ensure transparency, we implemented a consistent approach for extracting quantitative effect sizes. For each outcome (e.g., critical thinking scores), we extracted either mean differences and standard deviations or standardized mean differences, depending on how the results were reported. In cases where multiple outcomes were reported (e.g., comparisons between PBL-first and LBL-first sequences), we treated each outcome as an independent data point in the meta-analysis to maintain statistical independence. When studies reported multiple cohorts or time points, each was treated as an independent batch to preserve the robustness of the pooled estimates. For studies published in languages other than English or Chinese, we translated relevant full-text content into Chinese using advanced translation tools to ensure accurate data extraction.
To ensure consistency and reduce bias, inter-rater reliability for data extraction and quality assessment was measured using Cohen’s kappa, which showed a high level of agreement (κ = 0.86).
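Cohen’s kappa corrects raw percentage agreement for the agreement two raters would reach by chance alone. As an illustration of the statistic (not the authors’ actual script; the function name is ours), a minimal standard-library Python computation:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    # Observed proportion of items on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal proportions per category
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)
```

For example, two reviewers coding the same ten inclusion decisions with one disagreement would yield a kappa well above 0.8, conventionally interpreted as strong agreement.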
Quality assessment
To assess the quality of the included studies, we employed the Cochrane Risk of Bias Tool (Higgins et al., 2011) for randomized controlled trials (RCTs) and the ROBINS-I tool (Sterne et al., 2016; Thomson et al., 2018) for non-randomized studies of interventions. For the RCTs, the Cochrane Risk of Bias Tool evaluated each study based on criteria such as random sequence generation, allocation concealment, blinding of participants and personnel, blinding of outcome assessment, incomplete outcome data, and selective reporting. For the non-randomized studies, the ROBINS-I tool assessed the risk of bias across several domains, including confounding, selection of participants into the study, classification of interventions, deviations from intended interventions, missing data, measurement of outcomes, and selection of the reported result.
The results of the quality assessment were integrated into the sensitivity analysis. Specifically, studies rated as having a high risk of bias in two or more domains were excluded from secondary analyses to evaluate the robustness of pooled effect sizes. These results were clearly presented and considered in the interpretation of the meta-analytic results.
Data synthesis and analysis
Heterogeneity was assessed using the I2 statistic, which represents the percentage of variation across studies attributable to heterogeneity rather than chance, thereby offering an interpretable estimate of its degree (Hemming et al., 2021). I2 values between 25 and 50% indicate low heterogeneity, 50 to 75% moderate, and over 75% high. An I2 value greater than 50% was treated as evidence of substantial heterogeneity. For low-heterogeneity data, a fixed-effect model was used to aggregate effect sizes; for high-heterogeneity data, a random-effects model was applied.
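The I2 calculation and the model-selection rule described above can be sketched in a few lines of Python. This is an illustrative sketch (names are ours, not the RevMan implementation), using the standard definition I2 = (Q − df)/Q where Q is Cochran’s heterogeneity statistic:

```python
def cochran_q_and_i2(effects, std_errors):
    """Cochran's Q and the I^2 statistic from per-study effects and standard errors."""
    weights = [1.0 / se ** 2 for se in std_errors]          # inverse-variance weights
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    # I^2 is truncated at zero when Q falls below its degrees of freedom
    i2 = max(0.0, (q - df) / q * 100.0) if q > 0 else 0.0
    return q, i2

def choose_model(i2):
    """Model-choice rule used in this review: random effects when I^2 > 50%."""
    return "random-effects" if i2 > 50 else "fixed-effect"
```

With identical study effects, Q = 0 and I2 = 0%; widely scattered effects with small standard errors push I2 toward 100%, triggering the random-effects model.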
For studies with independent control group data, we used a between-group comparison approach to assess differences in critical thinking scores between the PBL (problem-based learning) and conventional teaching method groups. In many studies, the conventional teaching method was used before the PBL intervention. Therefore, pre- and post-intervention data were also used to compare critical thinking scores before and after PBL, with the pre-PBL period representing traditional instruction and the post-PBL period representing the intervention. To determine statistical significance, a two-tailed p-value threshold of < 0.05 was set for all analyses.
Where studies reported multiple cohorts, time points, or experimental arms, each was treated as an independent analytical batch in the meta-analysis (see Figures 3, 4 legends).
The analyses described above were conducted using Review Manager version 5.3 (RevMan; The Cochrane Collaboration, London, UK).
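For readers without access to RevMan, the random-effects pooling of mean differences can be sketched as follows. This is an illustrative reimplementation of the DerSimonian-Laird estimator (the random-effects method RevMan 5.3 uses, to our understanding), not the authors’ analysis code:

```python
import math

def random_effects_md(effects, std_errors):
    """DerSimonian-Laird random-effects pooled mean difference with 95% CI."""
    k = len(effects)
    w = [1.0 / se ** 2 for se in std_errors]                # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    # Between-study variance estimate (tau^2), truncated at zero
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Re-weight each study by total (within + between) variance
    w_star = [1.0 / (se ** 2 + tau2) for se in std_errors]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
```

The extra tau^2 term widens the confidence interval relative to a fixed-effect pooling, which is why heterogeneous data such as these yield comparatively broad intervals.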
Publication bias
To ensure the reliability of the meta-analysis results, we assessed potential publication bias using multiple complementary methods, including a funnel plot, Egger’s regression test, and Begg’s test. The funnel plot was generated using Review Manager version 5.3, while Egger’s and Begg’s tests were performed using Python-based statistical scripts to evaluate the symmetry of effect sizes and detect potential small-study effects.
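The core statistics behind these two tests are compact. A minimal Python sketch (illustrative only; the authors’ scripts are not published here, and the function names are ours) of the Egger regression intercept and the Kendall rank correlation underlying Begg’s test:

```python
def egger_intercept(effects, std_errors):
    """Intercept of Egger's regression of standardized effect on precision.
    An intercept far from zero suggests small-study asymmetry."""
    y = [e / se for e, se in zip(effects, std_errors)]  # standardized effects
    x = [1.0 / se for se in std_errors]                 # precisions
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx

def kendall_tau(xs, ys):
    """Kendall's tau-a rank correlation, as used by Begg's test
    (between effect sizes and their variances)."""
    n = len(xs)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (xs[i] - xs[j]) * (ys[i] - ys[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)
```

In practice the intercept and tau would each be accompanied by a significance test; only the point statistics are sketched here.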
Results
Study selection results
We conducted a comprehensive literature search on PBL conducted among medical students, using critical thinking as the primary outcome measure, and identified 11 eligible studies for inclusion (Figure 1).
Study characteristics
All 11 included studies were published between 2004 and 2023, involving sample sizes ranging from 14 to 267 participants across several countries: China (n = 4), South Korea (n = 3), Iran (n = 3), and the USA (n = 1). The participants’ academic majors spanned nursing (n = 7), clinical medicine (n = 1), language pathology (n = 1), medical education and medical library information science (n = 1), and dentistry (n = 1). To assess critical thinking, the included studies used a range of validated instruments, including the California Critical Thinking Disposition Inventory (CCTDI, n = 3), California Critical Thinking Skills Test (CCTST, n = 2), Critical Thinking Disposition Inventory (CTDI, n = 2), Clinical Thinking Ability Evaluation Scale (CTAES, n = 1), Critical Thinking Ability Scale for College Students (CTASCS, n = 1), Concept Map Assessment Profile (CMAP, n = 1), and Oral Triple Jump Assessment (TJA, n = 1). Basic characteristics of the studies included are shown in Table 1. To enhance transparency and comparability, Supplementary Table 2 provides additional study-level details, including participant academic level, intervention duration and format, the dimension of critical thinking assessed (e.g., skills, dispositions, or both), and whether the intervention and control groups were situated in the same instructional environment.
Risk of bias
Of the 11 included studies, three were randomized controlled trials (RCTs). All RCTs had an unclear risk in at least one domain (Figure 2A), with the most common sources of bias related to inadequate blinding procedures (Figure 2C). The non-randomized studies exhibited varying levels of bias as assessed by the ROBINS-I tool. All of the non-randomized studies had an unclear risk in at least one domain, and approximately 37.5% of the studies presented a high risk of bias in one or two domains (Figure 2B). The primary sources of bias in the non-randomized studies were confounding and selection of participants into the study (Figure 2D).

Figure 2. Risk of bias for included studies. (A) Risk of bias summary: each risk of bias item for RCT studies. (B) Risk of bias summary: each risk of bias item for non-RCT studies. (C) Risk of bias graph: each risk of bias item presented as percentages across all RCT studies. (D) Risk of bias graph: each risk of bias item presented as percentages across all non-RCT studies. Sorted by effect size.
Meta-analysis results
For studies comparing critical thinking scores between PBL and conventional method groups, 8 studies were included, yielding 9 batches of data. Among these, the study by Yu et al. (2013) was a crossover-experimental study, which produced 2 independent data batches. Heterogeneity analysis showed high heterogeneity between the PBL and conventional method groups (I2 = 93%), so a random-effects model was used. The aggregated effect size indicated that PBL was significantly more effective than conventional methods (Mean Difference [MD] = 0.98, 95% Confidence Interval [CI] = 0.19–1.77, P = 0.02) (Figure 3).

Figure 3. The analysis of the critical thinking between PBL and conventional method groups. Some studies [e.g., Yu et al. (2013)] contributed multiple batches to the meta-analysis. Each batch represents an independent comparison based on different experimental rounds: Yu et al. (2013) (PBL + LBL): In the first round of the experiment, Group A received PBL and Group B received LBL. Yu et al. (2013) (LBL + PBL): In the second round of the experiment, Group A received LBL and Group B received PBL.
For the analysis of critical thinking scores between pre- and post-PBL, 8 studies were included, yielding 10 batches of data. Among these, the study by Mok et al. (2014) conducted 3 PBL interventions, resulting in 3 independent data batches. The study by Gavgani et al. (2021) conducted PBL interventions for students in 2 different fields: Library and Information Science and Medical Education. Only data from the Medical Education students were included in the meta-analysis, while data from Library and Information Science students were excluded to maintain the focus on medical education contexts. The heterogeneity analysis showed high heterogeneity between pre- and post-PBL (I2 = 81%), so a random-effects model was employed. The aggregated effect size indicated that post-PBL was significantly better than pre-PBL (Mean Difference [MD] = 1.43, 95% Confidence Interval [CI] = 0.24–2.62, P = 0.02) (Figure 4).

Figure 4. The analysis of the critical thinking between pre- and post-PBL. Some studies [e.g., Mok et al. (2014)] contributed multiple batches. For Mok et al. (2014), all batches refer to the same cohort of students, who were followed over three academic years. Each batch reflects a comparison between pre- and post-PBL outcomes at different time points: Mok et al. (2014) year 1: First year—comparison between pre-PBL (baseline) and post-PBL (after initial intervention). Mok et al. (2014) year 2: Second year—same students, followed up one year later. Mok et al. (2014) year 3: Third year—same students, final follow-up.
Publication bias
For studies comparing critical thinking scores between PBL and conventional method groups, the funnel plot shows that the points are not perfectly symmetrical around the mean MD line, which may indicate potential publication bias upon visual inspection (Figure 5A). However, this asymmetry may also result from methodological heterogeneity or small-study effects rather than true publication bias. We subsequently performed Egger’s regression test and Begg’s test for a more rigorous assessment. Egger’s regression test, which analyzes the relationship between effect sizes and their standard errors to detect bias, was not significant (P > 0.05, Figure 5B), indicating no evidence of publication bias. Similarly, Begg’s test, which assesses the correlation between the ranks of effect sizes and their variances, was also not statistically significant (Kendall’s tau: 0.44, P = 0.12), further suggesting the absence of significant publication bias.

Figure 5. The analysis of publication bias. (A) Funnel plot of comparison: PBL vs. control. (B) Egger regression analysis of comparison: PBL vs. control. (C) Funnel plot of comparison: post-test vs. pre-test. (D) Egger regression analysis of comparison: post-test vs. pre-test.
For analyzing critical thinking scores between pre- and post-PBL, the funnel plot also shows that the points are not perfectly symmetrical around the mean MD line (Figure 5C). Egger’s Regression Test showed no significant results (P > 0.05, Figure 5D). Begg’s Test also showed no statistical significance (Kendall’s tau: −0.16, P = 0.60). These results suggest the absence of significant publication bias.
Discussion
Our study aimed to evaluate the effectiveness of PBL in enhancing critical thinking skills in medical education by synthesizing data from 11 studies involving 14 to 267 medical students from China, Korea, Iran, and the USA. The included student groups spanned different academic years and disciplines, including nursing, clinical medicine, language pathology, medical education, medical library information science, and dentistry, which increases the general applicability of the findings.
Our meta-analysis results demonstrated that PBL significantly improved critical thinking skills in medical students compared to conventional teaching methods, as indicated by the aggregated effect size. Similarly, pre- and post-PBL comparisons also showed significant improvements in critical thinking scores.
The current meta-analysis demonstrated a significant advantage of PBL over traditional teaching methods in enhancing critical thinking. However, substantial heterogeneity was observed across studies (I2 > 80%). This variability likely stems from differences in assessment tools (e.g., skills vs. dispositions), educational context, cultural norms, and PBL implementation fidelity. In addition, several moderator variables were inconsistently reported across studies, such as intervention duration, critical thinking dimensions, and regional differences. For example, the intervention duration varied from 6 weeks to 28 weeks in different studies (e.g., Tiwari et al., 2006 used 28 weeks, while Choi, 2004 and Yu et al., 2013 used 6–16 weeks). The differing dimensions of critical thinking assessed (e.g., total score vs. specific dimensions like systematic thinking or evidence-based thinking) also contributed to variability. Furthermore, studies from different regions, such as China, Korea, Iran, and the USA, may reflect cultural and educational differences that impact the development of critical thinking skills.
Although subgroup analyses could help disentangle these sources of variability—such as differences by region, academic discipline, or outcome measurement—we were unable to perform such analyses due to insufficient and inconsistently reported stratified data. We acknowledge this as a limitation and recommend that future research provide more standardized subgroup-level information to support moderator analysis.
In addition to heterogeneity, potential publication bias was considered. Although the funnel plots showed some visual asymmetry, we interpret this with caution. Such asymmetry may arise from small-study effects, outcome measurement variation, or methodological quality differences, rather than true publication bias. Importantly, Egger’s regression test and Begg’s test revealed no significant bias in either comparison, suggesting that the overall pooled results are robust. Nevertheless, future meta-analyses may benefit from a larger and more evenly distributed sample base to improve the reliability of funnel plot interpretations.
Two other meta-analyses have explored the impact of PBL on critical thinking in medical students compared to traditional lectures. Wei et al. (2024) found that PBL had a greater impact on nursing students’ critical thinking skills, while Manuaba et al. (2022) reported no significant difference between PBL and traditional methods in critical thinking assessments among first-year medical students. In Wei et al. (2024), the PBL intervention was often combined with other teaching methods such as simulations, case-based learning, teamwork, concept mapping, and clinical practice. In Manuaba et al.’s (2022) study, the control group included conventional methods such as LBL (lecture-based learning), tutorial learning, and theory-based discussions. In the studies included in our research, the PBL intervention was not combined with other teaching methods (except LBL), and the conventional methods mainly used LBL. These differences in study grouping and research subjects may account for the varying conclusions.
Drawing from educational psychology, PBL promotes critical thinking through several mechanisms: contextual learning encourages knowledge integration, facilitation supports metacognition, and peer collaboration fosters reflective dialog (Schmidt et al., 2007). These features align with social constructivist perspectives, which emphasize that peer interaction fosters critical thinking through shared reasoning and reflection (Gokhale, 1995). However, such benefits may vary across regions—Western students may be more accustomed to discussion-based methods, while in Asian settings, PBL may represent a more novel and impactful shift (Frambach et al., 2012).
Furthermore, several mechanisms may explain why PBL is particularly effective at fostering critical thinking. First, it engages learners in complex, authentic problem-solving, requiring synthesis and judgment. Second, group discussion promotes reasoning, justification, and reflective thinking. Third, PBL encourages learner autonomy through self-directed exploration of knowledge gaps. These mechanisms are supported by empirical findings that show PBL can improve metacognitive awareness and critical thinking dispositions, including truth-seeking and analytical ability (Tiwari et al., 2006; Falcó Pegueroles et al., 2021).
Future studies should explore moderating variables (e.g., cultural context, instructional design), adopt longitudinal or mixed-methods designs, and clarify the alignment between critical thinking/clinical reasoning conceptualization, the PBL learning objectives, and assessment tools. This would help build a more unified understanding of how PBL translates into clinical reasoning and decision-making. Specifically, future research should attempt to use consistent critical thinking assessment tools across studies to perform subgroup analyses, which could reveal how different tools influence PBL outcomes and enhance the reliability and transparency of findings.
Despite these promising findings and theoretical underpinnings, several limitations should be acknowledged. First, some studies did not thoroughly report their randomization methods, which could increase the risk of selection bias. In addition, the geographical and cultural contexts represented were relatively limited, potentially affecting the broad applicability of the results. Finally, although multiple assessment tools were used, the psychometric properties and applicability of each tool require further validation.
Conclusion
Overall, PBL as a teaching method shows significant advantages in improving medical students’ critical thinking skills. This supports its continued integration into health professions education. However, given the substantial heterogeneity observed across studies, future research should aim to standardize outcome measures, align critical thinking assessments with clear conceptual frameworks, and improve methodological rigor through randomized, longitudinal, and mixed-methods designs. Additionally, future studies should explore how cultural context, academic discipline, and instructional design influence the effectiveness of PBL, and assess how improvements in critical thinking translate into clinical reasoning and professional practice. These efforts will help refine the implementation of PBL and maximize its impact on curriculum development and student outcomes.
Data availability statement
The original contributions presented in this study are included in this article/Supplementary material; further inquiries can be directed to the corresponding author.
Author contributions
TS: Conceptualization, Formal Analysis, Investigation, Writing – review and editing. JL: Data curation, Methodology, Writing – review and editing. LM: Data curation, Writing – review and editing. YL: Formal Analysis, Writing – review and editing. QK: Investigation, Writing – review and editing. LX: Conceptualization, Funding acquisition, Methodology, Project administration, Supervision, Validation, Visualization, Writing – original draft.
Funding
The authors declare that financial support was received for the research and/or publication of this article. This research was funded by the National Natural Science Foundation of China (NSFC), grant number 32000456 and Natural Science Foundation of Guangdong province, grant number 2023A1515010434.
Acknowledgments
We thank Stanley L. Lin for the English language editing.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The authors declare that no Generative AI was used in the creation of this manuscript.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Supplementary material
The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2025.1565556/full#supplementary-material
References
Alreshidi, R., and Alreshidi, F. S. (2023). The effectiveness of problem-based learning in improving critical thinking and problem-solving skills in medical students: A systematic review of fifteen years’ experience (2005-2019). Middle East J. Family Med. 7, 10–75. doi: 10.5742/MEWFM.2023.95256077
Choi, E., Lindquist, R., and Song, Y. (2014). Effects of problem-based learning vs. traditional lecture on Korean nursing students’ critical thinking, problem-solving, and self-directed learning. Nurse Educ. Today 34, 52–56. doi: 10.1016/j.nedt.2013.02.012
Choi, H. (2004). [The effects of PBL (Problem-Based Learning) on the metacognition, critical thinking, and problem solving process of nursing students] [Korean]. Taehan Kanho Hakhoe Chi 34, 712–721. doi: 10.4040/jkan.2004.34.5.712
Deshanty, N. A., Kartikowati, R. S., and Syabrus, H. (2023). The use of the problem-based learning (PBL) model in improving students’ critical thinking skills on economy learning subjects at sma negeri 2 karimun. J. Pendidikan Dan Pengajaran 7, 586–598. doi: 10.33578/pjr.v7i3.9402
Dwyer, C. P., and Walsh, A. (2020). An exploratory quantitative case study of critical thinking development through adult distance learning. Educ. Technol. Res. Dev. 68, 17–35. doi: 10.1007/s11423-019-09659-2
Ennis, R. H. (1987). “A taxonomy of critical thinking dispositions and abilities,” in Teaching thinking skills: Theory and practice, eds I. J. B. Baron and R. J. Sternberg (New York, NY: W. H. Freeman & Times Books & Henry Holt & Co).
Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Millbrae, CA: California Academic Press.
Falcó Pegueroles, A., Rodríguez Martín, D., Ramos Pozón, S., and Zuriguel Pérez, E. (2021). Critical thinking in nursing clinical practice, education and research: From attitudes to virtue. Nurs. Philos. 22:e12332. doi: 10.1111/nup.12332
Frambach, J. M., Driessen, E. W., Chan, L. C., and van der Vleuten, C. P. (2012). Rethinking the globalisation of problem-based learning: How culture challenges self-directed learning. Med. Educ. 46, 738–747. doi: 10.1111/j.1365-2923.2012.04290.x
Gholami, M., Moghadam, P. K., Mohammadipoor, F., Tarahi, M. J., Sak, M., Toulabi, T., et al. (2016). Comparing the effects of problem-based learning and the traditional lecture method on critical thinking skills and metacognitive awareness in nursing students in a critical care nursing course. Nurse Educ. Today 45, 16–21. doi: 10.1016/j.nedt.2016.06.007
Gokhale, A. A. (1995). Collaborative learning enhances critical thinking. J. Technol. Educ. 7:22. doi: 10.21061/jte.v7i1.a.2
Hemming, K., Hughes, J. P., McKenzie, J. E., and Forbes, A. B. (2021). Extending the I-squared statistic to describe treatment effect heterogeneity in cluster, multi-centre randomized trials and individual patient data meta-analysis. Statist. Methods Med. Res. 30, 376–395. doi: 10.1177/0962280220948550
Higgins, J. P. T., Altman, D. G., Gøtzsche, P. C., Jüni, P., Moher, D., Oxman, A. D., et al. (2011). The Cochrane Collaboration’s tool for assessing risk of bias in randomised trials. BMJ-Br. Med. J. 343, 889–893. doi: 10.1136/bmj.d5928
Khosravizadeh, O., Ahadinezhad, B., Maleki, A., Hashtroodi, A., Vosoughi, P., and Kohan, S. (2022). Path analysis of critical thinking effect on self-efficiency among nursing and medical students. Evid. Based Health Policy Manag. Econ. 6, 235–244.
Lee, J., and Son, H. K. (2021). Comparison of learning transfer using simulation problem-based learning and demonstration: An application of papanicolaou smear nursing education. Int. J. Environ. Res. Public Health 18:1765. doi: 10.3390/ijerph18041765
Magdalena, I., Fadhillahwati, N. F., Amalia, R., and Farhana, S. (2023). Improving mathematics learning outcomes through problem-based learning (PBL) learning model for 4th grade students. Rev. Multidiscipl. Educ. Cult. Pedagogy 2, 78–82. doi: 10.55047/romeo.v2i2.692
Manuaba, I. B. A. P., No, Y., and Wu, C. (2022). The effectiveness of problem based learning in improving critical thinking, problem-solving and self-directed learning in first-year medical students: A meta-analysis. PLoS One 17:e0277339. doi: 10.1371/journal.pone.0277339
Mok, C. K., Whitehill, T. L., and Dodd, B. J. (2014). Concept map analysis in the assessment of speech-language pathology students’ learning in a problem-based learning curriculum: A longitudinal study. Clin. Ling. Phonet. 28, 83–101. doi: 10.3109/02699206.2013.807880
Ozturk, C., Muslu, G. K., and Dicle, A. (2008). A comparison of problem-based and traditional education on nursing students’ critical thinking dispositions. Nurse Educ. Today 28, 627–632. doi: 10.1016/j.nedt.2007.10.001
Papp, K. K., Huang, G. C., Lauzon Clabo, L. M., Delva, D., Fischer, M., Konopasek, L., et al. (2014). Milestones of critical thinking. Acad. Med. 89, 715–720. doi: 10.1097/ACM.0000000000000220
Paul, R., and Elder, L. (2008). The miniature guide to critical thinking: Concepts and tools, 5th Edn. Santa Barbara, CA: Foundation for Critical Thinking Press.
Puranik, C. P., Pickett, K., and de Peralta, T. (2023). Evaluation of problem-based learning in dental trauma education: An observational cohort study. Dental Traumatol. 39, 625–636. doi: 10.1111/edt.12870
Schmidt, H. G., Loyens, S. M. M., Van Gog, T., and Paas, F. (2007). Problem-based learning is compatible with human cognitive architecture: Commentary on Kirschner, Sweller, and Clark (2006). Educ. Psychol. 42, 91–97. doi: 10.1080/00461520701263350
Sopwan, I. D., Soetisna, U., and Redjeki, S. (2018). Implementation of PBL model to enhance critical thinking skills and argumentation skills of students. Edubiol. J. Penel. Ilmu Dan Pendidikan Biol. 6, 94–98. doi: 10.25134/edubiologica.v6i2.2369
Sterne, J. A., Hernán, M. A., Reeves, B. C., Savović, J., Berkman, N. D., Viswanathan, M., et al. (2016). ROBINS-I: A tool for assessing risk of bias in non-randomised studies of interventions. BMJ-Br. Med. J. 355:i4919. doi: 10.1136/bmj.i4919
Thomson, H., Craig, P., Hilton-Boon, M., Campbell, M., and Katikireddi, S. V. (2018). Applying the ROBINS-I tool to natural experiments: An example from public health. Syst. Rev. 7:15. doi: 10.1186/s13643-017-0659-4
Tiwari, A., Lai, P., So, M., and Yuen, K. (2006). A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med. Educ. 40, 547–554. doi: 10.1111/j.1365-2929.2006.02481.x
Wanjari, D. S., Vagha, D. S., and Mahakalkar, D. C. (2020). Assessment of critical thinking skills of postgraduate students by using the critical self-thinking inventory for clinical examination. Glob. J. Med. Res. 20, 33–40.
Wei, B., Wang, H., Li, F., Long, Y., Zhang, Q., Liu, H., et al. (2024). Effectiveness of problem-based learning on development of nursing students’ critical thinking skills: A systematic review and meta-analysis. Nurse Educ. 49, E115–E119. doi: 10.1097/NNE.0000000000001548
Yu, D., Zhang, Y., Xu, Y., Wu, J., and Wang, C. (2013). Improvement in critical thinking dispositions of undergraduate nursing students through problem-based learning: A crossover-experimental study. J. Nurs. Educ. 52, 574–581. doi: 10.3928/01484834-20130924-02
Zabit, M. N. M., Karagiannidou, E., and Zachariah, T. Z. (2016). Teaching business in Malaysia and the use of PBL to nurture students’ critical thinking: A case study of Sultan Idris Education University. Geografia 12, 1–11.
Gavgani, V., Hazrati, H., and Sohrabi, Z. (2021). [Effect of problem-based learning and reasoning tests on learners’ critical thinking skills before and after the educational intervention in graduate students of basic sciences]. Depict. Health 12, 34–43. doi: 10.34172/doh.2021.05
Keywords: problem-based learning (PBL), medical education, critical thinking, meta-analysis, higher education
Citation: Su T, Liu J, Meng L, Luo Y, Ke Q and Xie L (2025) The effectiveness of problem-based learning (PBL) in enhancing critical thinking skills in medical education: a systematic review and meta-analysis. Front. Educ. 10:1565556. doi: 10.3389/feduc.2025.1565556
Received: 23 January 2025; Accepted: 19 May 2025;
Published: 04 June 2025.
Edited by:
Emilia Isabel Martins Da Costa, University of Algarve, Portugal
Reviewed by:
Rita Payan Carreira, University of Évora, Portugal
Hugo Ferreira Rebelo, University of Évora, Portugal
Ida Bagus Amertha Putra Manuaba, Udayana University, Indonesia
Copyright © 2025 Su, Liu, Meng, Luo, Ke and Xie. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Lingzhu Xie, lzxie@stu.edu.cn
†These authors have contributed equally to this work