- 1Vice Presidency for Lifelong Learning and Future Education, Tecnologico de Monterrey, Monterrey, Mexico
- 2Institute for the Future of Education, Tecnologico de Monterrey, Monterrey, Mexico
Introduction: This study assesses an andragogical teaching model, INSPIRA, deployed in continuing education (CE) programs at a private university in Mexico. Workshops have been conducted to train CE facilitators (instructors) on this model, which was first implemented in 2016. This research aims to evaluate the effectiveness of these teaching model workshops and to identify areas for improvement based on facilitators’ perceptions, drawing on the theories of competency development and job satisfaction.
Methods: A mixed-methods approach was utilized, incorporating historical data, surveys, and focus groups. Historical data were gathered from institutional records, including teaching evaluations for 536 facilitators conducted both before and after their training on the teaching model. From this group, 158 facilitators completed a researcher-developed survey assessing the impact of the training. Furthermore, two focus groups were held with 16 facilitators in total.
Results: The results indicated a statistically significant enhancement in teaching evaluations following the training, with mean evaluation scores increasing by 0.82 points on a 10-point scale (p ≤ 0.001). More than 60% of facilitators reported improvements in teaching clarity, relevance, and practicality. Approximately 65% expressed that the topic became more applicable to their work, while only 30% observed increases in visibility, recognition, or income. Qualitative feedback indicated that facilitators appreciated the model’s clarity, structure, and focus, but suggested that workshops should be tailored to specific continuing education programs, supplemented with ongoing support, and regularly updated.
Discussion: In conclusion, the workshops effectively improved facilitators’ teaching competency, although the job satisfaction components need strengthening. The significant increase in teaching evaluations is associated with facilitators’ improvements in structuring sessions, applying adult learning principles, and using technology, which boosted their confidence and professional identity. However, these benefits did not always translate into recognition or income, underscoring the need for institutional support. Tailoring workshops to different modalities and providing certifications could further enhance implementation and facilitator engagement. The reliance on self-reported data and the study’s single institutional context limit the generalizability of its findings.
Introduction
Adult education
An adult is defined as someone who is financially independent and responsible for their personal life (Chang, 2025). Teaching and learning activities aimed at adults are called adult education, which includes various forms of learning and educational initiatives (basic education, training, continuing education, etc.) as well as approaches (professional, recreational, or informational) (Chang, 2025). In adult education and training for professional development, trainers from diverse fields teach content in their respective areas and facilitate training across different sectors. The scope covers any sector offering adult education and training activities, including companies and public and private institutions (Chang, 2025). Participation rates in this type of adult education, which sometimes aligns with a broader lifespan approach, are generally low, especially among vulnerable and marginalized groups who face various barriers (Belete et al., 2022). In practice, adult education is often reduced to serving international economic competition and labor market needs, an instrumental view that has been criticized as narrow and driven by neoliberal discourses and practices (Duke, 2015).
In Africa, obtaining certificates does not necessarily reflect genuine commitment or interest and can create a false sense of security about the real impact of educational efforts; professional development should therefore not only provide educational opportunities but also foster authentic interest and relevance to practitioners’ work for meaningful growth (Ndlovu et al., 2025). In Asia, some institutions view learning and community as an ongoing process in which each person’s self is continually regenerated in relation to others, promoting collaboration to build a new community (Duke, 2015). In Latin America, adult education discussions focus on processes of observing, interacting, and engaging with others through social activities, highlighting daily events and emphasizing that teaching should consider characteristics such as being self-directed, experiential, and transformative (Barrantes-Elizondo, 2022). Regardless of these or newer approaches (whether spirituality, narrative, critical theory, postmodernism, feminist perspectives, or non-Western traditions), these frameworks recognize the complexities of adult identities and social contexts (Barrantes-Elizondo, 2022).
Background and context
For organizations to remain competitive and relevant in today’s fast-paced society, employees rely on formal Continuing Education (CE) (Laal et al., 2014; Kaplan, 2016). Our Institution, a Mexican private university, offers various formal postgraduate CE options and recognizes adult learners who complete them by issuing certificates. It also offers workshops to facilitators (teachers) and course coordinators to develop skills in andragogy (the study of how and why adults learn).
One key workshop certifies facilitators in the INSPIRA andragogical model, implemented in 2016. The model is designed to foster long-term learning and professional impact, guided by the following core elements:
• Inspiration—Arouse interest and encourage active participation.
• Nourishment—Provide relevant and applicable content that has the potential to transform.
• Significance—Learners create meaning through active participation in activities of repetition, reflection, discussion, argumentation, and action that generate a connection with their reality and previous knowledge.
• Practice—Validate and complement knowledge, develop the ability to apply it, and generate a sense of achievement that fosters retention of the concept and confidence in its application.
• Integration—Integrate the concepts learned, connecting with existing ones and modifying necessary neural structures to achieve long-term learning.
• Real Challenges—Apply learning to real-life challenges, with defined objectives, activities, timelines, and concrete deliverables.
• Advice—Accompany the application through coaching, monitoring, and tutoring.
This model was designed to incorporate Marzano’s educational taxonomy (Marzano and Kendall, 2007), Knowles’ theory of self-directed learning (Knowles et al., 2020), the Kirkpatrick evaluation model (Kirkpatrick and Kirkpatrick, 2016), and findings from neuroscience research (Andreatta, 2019). Unlike these models, INSPIRA emphasizes practical application in the workplace, learner motivation, and the evaluation of job performance improvements. Its strength lies in contextualizing the social component in Latin America, including diagnosis of expectations, practice-oriented training modules (micro-modules), practical application in class (experiential phase and resources for practice), cases applied at work as challenges, pre- and post-feedback and evaluation, follow-up (continuous support), and mentoring.
Research gap
Despite the implementation of the INSPIRA model by many facilitators in Latin America, the effectiveness of the workshop has not been formally evaluated. Few studies have assessed facilitator education programs (for instructors/trainers from diverse fields teaching content in their respective areas), their long-term impacts, and their effectiveness (Finsterwald et al., 2013; Balwant, 2020). This underscores the need for comprehensive, context-sensitive evaluations that bridge theory and practice (Dixon et al., 2005).
Study objectives
The objective of this study is to evaluate the effectiveness of the INSPIRA workshops for facilitators and identify areas for improvement based on facilitator perceptions. Specifically, the study seeks to answer the following research questions:
• What indicators reflect INSPIRA’s effectiveness and areas for improvement?
• What skills have facilitators developed through INSPIRA workshops?
• What benefits do facilitators perceive from their participation in INSPIRA workshops?
Addressing these objectives can provide insights to enhance professional development models for facilitators in CE.
Theoretical framework
Our work is guided by two theoretical frameworks: competency development, under which facilitators are trained to increase their capability to teach other adults in their own fields, and job satisfaction, which considers how facilitators, commonly recruited from industry, feel supported by the hiring institution in their teaching endeavors.
Competency development in professional training
Competency development is essential in professional training, aiming to effectively enhance teaching practices (Finsterwald et al., 2013). Facilitator training programs grounded in solid theoretical frameworks and evaluated using evidence-based criteria have a significant impact on the effectiveness of adult education (U.S. Department of Education, 2015). Moreover, evaluating facilitator preparedness (through surveys, focus groups, and interviews) helps ensure competence. This is especially important in the rapidly growing adult-oriented online education contexts where facilitator preparedness is often insufficient (Darling-Hammond et al., 2010; Donavant, 2009; Florea, 2014). These considerations directly relate to the INSPIRA model’s goal of enhancing facilitator competencies for improved teaching effectiveness.
Job satisfaction in andragogical models
From a job satisfaction perspective, employees who experience a sense of insignificance in their roles may become demotivated and dissatisfied (Scholten et al., 2022). Lencioni’s theory (2007) suggests that factors such as job satisfaction, motivation, and career development positively influence employee performance, whereas anonymity, irrelevance, and lack of measurement contribute to job dissatisfaction. Facilitator training addressing these factors can enhance employee engagement and reduce turnover, particularly in continuing education contexts (Safrit and Owen, 2010; Maity, 2019). Thus, evaluating job satisfaction through the INSPIRA model provides insights into facilitators’ professional motivation and engagement with the Institution.
Literature review
Previous studies emphasize the development of facilitators’ competency as a central component of successful lifelong learning (LLL) initiatives. According to Patten and Galvan (2019), practitioner-driven evaluation approaches rely on empiricism (gaining knowledge through observation), which supports the iterative design of adult training approaches based on evidence from real-world practice, “professional wisdom and educators’ individual experiences and consensus” (U.S. Department of Education, 2015).
Research also links job satisfaction with facilitator performance. Safrit and Owen (2010) applied Lencioni’s model to high-turnover roles, such as in CE programs, advocating for training to mitigate the causes of job dissatisfaction. However, literature on professional development program evaluation (Kirkpatrick and Kirkpatrick, 2016) typically focuses on immediate or short-term outcomes, overlooking long-term effectiveness, and rarely incorporates facilitators’ perspectives (Balwant, 2020).
Some authors highlight the value of more context-sensitive evaluations that connect theory with practical implementation, particularly interpretive case studies exploring under-researched phenomena, such as training effectiveness, within specific institutional settings (Dixon et al., 2005; Ponelis, 2015). This aligns with the practitioner research approach, in which those who design and implement training take an active role in its investigation (Borko et al., 2007).
Despite the recognition of competency development and job satisfaction as vital factors, existing literature lacks an evaluation of these aspects from practitioners’ perspectives (coordinators or those responsible for maintaining CE offerings), particularly concerning facilitators and program developers (de Jong and Emmelkamp, 2000). Few studies directly link these frameworks to the effectiveness of facilitator training models such as INSPIRA. This gap highlights the need for empirical research to explore facilitators’ perceptions and experiences regarding competency and job satisfaction for the continuous improvement of CE offerings.
Methodology
Study design
This study followed a mixed-methods, empirical, and exploratory design. The approach aligns with naturalistic inquiry, emphasizing contextual and subjective insights from participants (Athens, 2010; Neuman, 1989). The research flow summary is illustrated in Figure 1.
Participants and sampling
A convenience sample from a single institution was selected due to ease of access and time constraints. The total population consisted of 536 facilitators who had completed the INSPIRA workshop by April 2023. An anonymized dataset was extracted from institutional records showing facilitators’ teaching evaluations before and after the workshop. Of this group, 158 facilitators completed an online survey anonymously, and 16 agreed to participate in one of two focus groups (N1 = 9, N2 = 7). The demographic breakdown of survey and focus group participants is shown in Tables 1, 2.
Survey design and data collection
A researcher-designed, five-part survey was administered using Google Forms over 3 weeks in April 2023 (see the Supplementary Material). It contained eight questions, combining Likert scale items, multiple choice, and one open-ended question. The five sections covered: (1) demographics; (2) perceptions of INSPIRA’s impact using a 12-item Likert scale (1 = strongly disagree to 5 = strongly agree); (3) multiple-choice questions aligned with INSPIRA objectives and job satisfaction elements (Lencioni, 2007); (4) institutional impact based on a 4-item Likert scale (Scholten et al., 2022); and (5) an open-ended comment section.
All facilitators trained in INSPIRA were contacted via institutional e-mail with the invitation to participate in the survey, which contained a consent form. This consent form explained that participation was voluntary, anonymous, and without consequences for non-participation.
In the same e-mail, facilitators were also invited to participate in virtual focus groups. Two groups, comprising a total of 16 participants, were formed based on availability. Sessions were conducted on Zoom and lasted approximately 1 h. They included guided questions about INSPIRA’s impact on teaching practices, learning design, emotional connection, and perceived organizational support (Supplementary Material). Questions were adapted or rotated when responses became redundant or reached saturation. Discussions were recorded, transcribed in Spanish, anonymized, and translated into English for analysis.
Data analysis
Descriptive statistics and inferential tests were conducted using Excel and Jamovi. A Shapiro–Wilk test was used to assess normality. Paired-sample tests (t-tests or, when normality was violated, Wilcoxon signed-rank tests) were used to compare facilitators’ teaching evaluations before and after the workshop. ANOVA (or the Kruskal-Wallis test when its assumptions were not met) and Pearson correlations were used to explore relationships between survey responses and demographic variables. Thematic analysis was applied to the open-ended survey responses and focus group transcripts, following the guidelines of Hennink et al. (2019). Manual coding involved three progressive rounds by two independent reviewers to ensure inter-coder reliability.
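For readers who wish to reproduce this type of pre/post comparison, the sketch below illustrates the general workflow described above in Python (scipy). It is not the authors’ actual script (the analyses were run in Excel and Jamovi), and the file and column names (evaluations.csv, eval_pre, eval_post) are hypothetical placeholders.

```python
# Minimal sketch of the pre/post comparison described above.
# Hypothetical data layout: one row per facilitator, with columns
# "eval_pre" and "eval_post" holding teaching-evaluation scores (0-10).
import pandas as pd
from scipy import stats

df = pd.read_csv("evaluations.csv")          # hypothetical file name
pre, post = df["eval_pre"], df["eval_post"]

# 1. Normality check on the paired differences (Shapiro-Wilk).
w, p_norm = stats.shapiro(post - pre)

# 2. Paired comparison: t-test if the differences look normal,
#    otherwise the Wilcoxon signed-rank test (as reported in the Results).
if p_norm > 0.05:
    stat, p_val = stats.ttest_rel(post, pre)
else:
    stat, p_val = stats.wilcoxon(post, pre)

print(f"Shapiro-Wilk W = {w:.3f} (p = {p_norm:.3g}); paired-test p = {p_val:.3g}")
```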
Ethical considerations
Informed consent, anonymity, and confidentiality were upheld throughout the study. Participants could withdraw at any time without consequence. The study was reviewed and supported by the Vice Presidency for Lifelong Learning and Future Education.
Results
This study examines the effectiveness and areas for improvement of the INSPIRA workshop from the facilitators’ perspectives, drawing on two theoretical frameworks: competence development and job satisfaction. This section is organized by the three data collection tools used.
Quantitative results: facilitator evaluations before and after INSPIRA
Evaluation data from CE learners demonstrated a notable improvement in facilitators’ performance following attendance at the INSPIRA workshop. As shown in Table 3 and Figure 2, only 43.66% of facilitators received evaluations above nine before INSPIRA, but this number nearly doubled to 80.78% after the workshop.
Table 3. Descriptives of the evaluation made by adult learners on the facilitators’ teaching, before and after the facilitator took the INSPIRA workshop (N = 536).
Figure 2. Percentage distribution of facilitators’ teaching evaluations (on a scale from 0 to 10) before and after participating in the INSPIRA workshop.
Given the non-normal distribution (W = 0.915, p < 0.001), a Wilcoxon signed-rank test confirmed a significant difference in evaluations, with a meaningful effect size (Kerby, 2014; Tak and Ercan, 2023). These findings (Table 4) suggest that the workshop had not only a statistically significant effect but also a practically meaningful impact on facilitator performance.
Table 4. Results of the paired samples Wilcoxon rank test comparing the facilitators’ teaching evaluations before and after they took the INSPIRA workshop.
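As an illustration of the effect-size logic behind the Kerby (2014) citation above, the following minimal sketch computes the matched-pairs rank-biserial correlation via the simple difference formula. It assumes the same hypothetical pre and post arrays as the Data analysis sketch and is not the authors’ code.

```python
import numpy as np
from scipy import stats

def rank_biserial(pre, post):
    """Matched-pairs rank-biserial correlation via Kerby's (2014)
    simple difference formula: r = favorable - unfavorable rank share."""
    diff = np.asarray(post) - np.asarray(pre)
    diff = diff[diff != 0]                    # drop zero differences, as Wilcoxon does
    ranks = stats.rankdata(np.abs(diff))      # rank the absolute differences
    plus = ranks[diff > 0].sum()              # rank sum favoring improvement
    minus = ranks[diff < 0].sum()             # rank sum favoring decline
    return (plus - minus) / (plus + minus)    # ranges from -1 to +1
```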
Survey insights: facilitator perceptions
Of the 536 facilitators invited, 158 responded, yielding a response rate of 29.48%. Figure 3 illustrates the diverse backgrounds of respondents, including consultancy, education, and organizational fields, with the largest group teaching in two of the four programs. Only 27% of facilitators taught across all four delivery formats.
Figure 3. The professional background of 158 facilitators certified in the INSPIRA model (A,B) and their participation in different delivery programs (C–F). (A) Fields or areas where facilitators have served as professionals. (B) Facilitators’ career study areas (other areas include politics, law, economy, and marketing). (C) Distribution of the facilitators who teach in only one program (N = 33). (D) Time distribution of the facilitators who teach in two programs (N = 42). (E) Time distribution of the facilitators who teach in three programs (N = 29). (F) Time distribution of the facilitators who teach in all four programs (N = 36). The time distribution for the remaining respondents (15) could not be calculated since the sum exceeded 100%.
In response to question 6, facilitators most commonly selected three to four positive impacts of INSPIRA (35% of respondents). Figure 4 shows that over 60% agreed that INSPIRA improved content clarity, dynamism, and relevance. Fewer respondents (below 50%) felt it improved their domain expertise or learners’ satisfaction. Approximately 65% agreed that INSPIRA enhanced the relevance of their work, but only 30% felt it increased their visibility or income.
Figure 4. Number of facilitators selecting the specified option among the ten provided in Section 3 of the survey, regarding how the INSPIRA workshop has affected their competencies (P) and sense of job satisfaction (O) (N = 158).
Internal consistency was high (Cronbach’s alpha = 0.9729), confirming the reliability of Likert scale responses. As shown in Figure 5 and Table 5, the highest-rated items included session design, facilitator training, and perceptions of institutional support. Lower-rated items included NPS improvement and commitment to work.
Figure 5. Distribution of facilitators’ Likert scale (1 to 5) responses (%) to survey questions in Sections 2 and 4 about the INSPIRA model’s impact on competencies and work satisfaction variables (N = 158). Each variable number matches those listed in Table 5.
Table 5. Descriptive statistics of the Likert-scale responses concerning the impact of the INSPIRA workshop on facilitators’ competencies and job satisfaction (n = 158).
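For reference, the internal-consistency figure reported above can be reproduced with the standard Cronbach’s alpha formula. The sketch below is a minimal illustration, assuming a hypothetical respondents-by-items matrix of Likert scores rather than the study’s actual data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores."""
    k = items.shape[1]                            # number of scale items
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars / total_var)
```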
Regarding demographics, a Pearson correlation analysis (see Table A1) revealed limited relationships between survey items and demographic variables. However, significant associations emerged for facilitators teaching real-time virtual classes (VC). Given non-normality (W = 0.755, p < 0.001), a Kruskal-Wallis test revealed that facilitators who spent 31–100% of their teaching time in VC reported stronger perceptions of institutional support (p ≤ 0.05; Table 6), with a small to medium effect size (Saha and Paul, 2023). Figure A1 shows the distribution of Likert-scale responses on work commitment, institutional support, and confidence in the institution, comparing three groups of facilitators by the share of their teaching time spent in VC: 0–10% (N = 72), 11–30% (N = 41), and 31–100% (N = 45).
Table 6. Results of the Kruskal-Wallis test comparing three groups of facilitators by the share of their teaching time spent in VC (0–10%: N = 72; 11–30%: N = 41; 31–100%: N = 45) on their perceptions of work commitment, institutional support, and confidence in the institution.
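A minimal sketch of this group comparison is shown below, assuming a hypothetical survey table with columns vc_share (percentage of teaching time in VC) and support (Likert rating of institutional support). It illustrates the Kruskal-Wallis procedure only; it is not the authors’ analysis script.

```python
import pandas as pd
from scipy import stats

survey = pd.read_csv("survey.csv")   # hypothetical file with survey responses

# Bin facilitators by the share of real-time virtual (VC) teaching time.
bins, labels = [0, 10, 30, 100], ["0-10%", "11-30%", "31-100%"]
survey["vc_group"] = pd.cut(survey["vc_share"], bins=bins, labels=labels,
                            include_lowest=True)

# Compare institutional-support ratings across the three VC groups.
groups = [g["support"].dropna() for _, g in survey.groupby("vc_group", observed=True)]
h, p = stats.kruskal(*groups)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3g}")
```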
Qualitative results: open-ended question and focus group findings
The open-ended question “Please provide any additional comments about the INSPIRA workshop” was answered by only 95 facilitators. The top ten codes highlight key themes identified by participants: “Congratulations” (N = 13), followed by “Certification” (N = 9) and “Actualization” and “Unconformity” (N = 7 each). Other common themes included “Technology,” “Flexibility,” “Content,” and “Length” (N = 6 each), while “Accompaniment” and “Relevance” were mentioned less frequently (N = 4 each). These responses reflect a mix of positive feedback and constructive criticism regarding the training experience. Common themes included clarity, structure, and the need for customization. Suggestions included more flexibility, content updates, impact measurement, and technology support.
Themes from the focus groups were categorized into three domains: Andragogy, Technology, and Stakeholders (see Table 7). The top codes included active learning, teaching strategies, feedback, and learner profiles. A 75% saturation rate was achieved in the first group, approaching the 80% recommended for small samples (Hennink et al., 2019).
Participants highlighted several strengths of the INSPIRA model, particularly its structured approach and its impact on both facilitators and learners. One facilitator noted:
“One of the main issues that has personally benefited me from the INSPIRA model is having structure, having my time very well defined, and knowing how I can impact even the participants of the courses and work sessions.”
Others emphasized the value of relevant and andragogically sound content, as reflected in the comment:
“The nourishment [step] is very important because we are going to content that is andragogically accepted by the adult, which adds. This part has decreased the amount of straw in the courses.”
Increased learner engagement was also mentioned:
“I achieve more than 80% of people with a camera on after the first cycle [of INSPIRA] because they feel you are speaking personally or focusing the course on each one's needs.”
In addition, the importance of content mastery and delivery style was underscored:
“I also agree with the relevant content and the mastery and knowledge of the subject. […] One point that stands out is the dynamism of how the class is conducted and the dynamics of the practice.”
At the same time, facilitators identified areas for improvement. Some expressed concern about the misalignment between participant needs and organizational expectations, stating:
“I almost always notice differences between the participants and the organization's expectations or needs. […] The impact is transforming lives in that sense.”
Others highlighted the need for better customer service throughout the training process:
“We have gone into a problem, which is customer service. This is a wake-up call. […] There should be an instrument that can be a form where you raise the entire profile of the participant, the expectations.”
Finally, the potential of using perceptual learning assessments was mentioned to improve instructional design:
“Note that perhaps the theme of this perceptual approach is to understand the participants' learning: kinesthetic, visual, or acoustic. If it can be measured with instruments, it can be a good alternative.”
These reflections underscore both the positive outcomes of the INSPIRA model and the need for ongoing adaptation to enhance its effectiveness.
Discussion
This section discusses the effectiveness and opportunity areas of the INSPIRA workshop, structured around the research questions and integrating both quantitative and qualitative results.
Key findings and interpretation
Figure 6 shows a summary of the results of this study.
The INSPIRA model shows promise in enhancing facilitators’ teaching competencies, as evidenced by the statistically significant improvement in facilitator evaluations following the workshop. The quantitative results indicated clear gains in learners’ perceptions, especially in content clarity, dynamism, and applicability. These align with qualitative insights from the focus groups and open-ended survey responses, in which facilitators emphasized improved structure, session design, and emotional connection with learners. As some facilitators indicated:
“Sometimes solving cases helps that emotional connection a lot, because it also improves their well-being, and that is an essential element that they must take away from the training. Well-being is a key aspect of what we strive for at INSPIRA; we aim for people to leave with strengthened competencies. This, in turn, fosters a change in habits or observable behaviors, as well as positive experiences with their collaborators, boss, colleagues, and others. The connection has helped me a lot, especially taking it to an experiential part.”
“We want this, which did us so much good, to impact them in the same way, at least in the issue of emotional management and leadership.”
Facilitators reported benefits across various domains, including active learning, engagement strategies, and technology use. However, they also identified shortcomings in applying the final steps of the INSPIRA model, particularly in “Real Challenges” and “Advice.” This limitation may stem from the short duration of CE courses or varying organizational contexts. Other studies have similarly emphasized the importance of tailoring content to the specific needs of adult learners and organizations (Galehdar et al., 2020).
Survey data indicated that facilitators teaching in fully virtual (VC) environments found INSPIRA especially valuable. This was reinforced by focus group comments describing increased participant engagement and relevance of course design in online settings. Thus, INSPIRA’s model appears well-suited to address the complexities of online adult education (Mott, 2009).
It is important to highlight that, in the job satisfaction sphere, we used only four Likert-scale items, which were not drawn from established instruments covering this dimension (e.g., Spector, 2022; University of Minnesota, 2025). Nevertheless, some facilitators acknowledged feeling part of something larger after being trained in the INSPIRA model:
“Each of us gives very different topics, health, technology, leadership. Although they are different topics, I feel that they somehow homogenize us as facilitators of the TEC and also serve as a distinction when we teach elsewhere. […]. Yes, it makes a difference when someone has been instructed in this way. With this method, you do notice the difference when you give your session,”
“My presentation would have been very different if I had not had these modules. I could not evaluate a before-and-after comparison since I had not done it on Zoom before. However, after taking [INSPIRA], I felt more prepared to lead a super session, and it went well. So, I thank Tecnologico de Monterrey for this preparation.”
Addressing the research questions
1. What are the indicators of INSPIRA’s effectiveness and opportunity areas? The primary indicator of INSPIRA’s effectiveness was the statistically significant increase in facilitators’ teaching evaluations following completion of the workshop. This was supported by improvements in content clarity, relevance, and teaching dynamism. Opportunity areas included the limited application of the final steps of the model, due to time constraints, organizational diversity, and a lack of contextual data. Facilitators also expressed a need for updated content and more flexibility in the model’s design.
2. What skills have the CE facilitators developed with the INSPIRA workshop? Facilitators reported gains in structuring sessions, designing learning cycles, enhancing emotional connection, and incorporating active learning. The workshop helped improve their ability to apply adult learning principles and use technology effectively in virtual settings. These outcomes reflect a meaningful development of pedagogical and andragogical competencies. This aligns with the view that effective trainers combine teaching skills with industry knowledge (Leow et al., 2023) and should be supported in cultivating LLL mindsets (Todd, 2002).
3. What benefits does the INSPIRA program bring to the CE facilitators? Facilitators noted an increase in confidence, professional identity, and ability to deliver impactful courses. Many appreciated the institutional support and peer-learning opportunities. However, these benefits did not always translate into increased visibility, recognition, or income, highlighting a need for institutional strategies that reinforce motivation through tangible rewards.
Theoretical contribution and implications
Unlike traditional training program evaluations that often exclude the perspectives of those implementing instruction, this research centers on the lived experience of facilitators and is analyzed from the practitioners’ perspective (de Jong and Emmelkamp, 2000). The findings suggest that institutions aiming to improve facilitator effectiveness, especially in adult and continuing education, may adopt training models that integrate neuroscience, andragogy, and practical applications, such as INSPIRA. Emphasizing emotional connection, structured learning cycles, and relevance to learners’ real-world challenges appears to benefit facilitator performance significantly.
To improve implementation, institutions could consider offering differentiated versions of the INSPIRA workshop tailored to specific program modalities (e.g., in-person, virtual) and content types. Further, integrating learner profile assessments at the course design stage may enhance the applicability of INSPIRA’s final stages. Formal certification mechanisms and ongoing support would also incentivize facilitator engagement, aligning with recommendations in recent literature (Yaqub et al., 2021; Via et al., 2019; Mott, 2009).
Study limitations and future research
This work can serve as a brief communication of the possible benefits of the INSPIRA workshop for facilitators, but it requires replication with larger samples before the findings can be generalized. Thus, the conclusions of this study must be interpreted in light of the following methodological limitations. First, the small size and uneven participation in the focus groups (16 participants in total) may not capture the full diversity of experiences. Additionally, the absence of longitudinal follow-up restricts insight into the workshop’s long-term effects on teaching performance and learner outcomes. Future studies should incorporate longitudinal designs to assess the retention and transfer of competencies over time. Another limitation is that the reliance on self-reported data introduces the risk of social desirability bias, and this work is also subject to nonresponse bias.
Some areas of the social sciences struggle to achieve adequate survey response rates. Most report rates of 10–35%, while others, such as organizational research and entrepreneurship, report rates of 48% and 39%, respectively (Scheaf et al., 2023). Nonresponse bias limits the generalizability of results and distorts relationships among variables: factors linked to nonresponse may be spuriously correlated with key variables, leading to inaccurate estimates of means or correlations (Scheaf et al., 2023).
In our work, the response rate was 29.48%. Possible contributing factors include an inadequate perception of the study, a lack of time or of access to technology to complete the survey online, and incorrect e-mail addresses, among others. Using Slovin’s formula (Adhikari, 2021), this sample size implies that, at a 95% confidence level, the margin of error of the results is approximately ±7%, increasing to about ±9% at a 99% confidence level (see the worked check below).
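A worked check of these figures, under the usual assumptions behind Slovin’s formula (maximum variability, p = 0.5, and an approximately 95% confidence level), with N = 536 facilitators and n = 158 respondents:

```latex
% Slovin's formula, n = N / (1 + N e^2), solved for the margin of error e
e = \sqrt{\frac{N - n}{nN}}
  = \sqrt{\frac{536 - 158}{158 \times 536}}
  \approx 0.067 \quad (\approx \pm 7\%)
```

Scaling this value by the ratio of the 99% and 95% critical values (2.576/1.96) gives roughly ±9%, consistent with the figure reported above.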
Although nonresponse bias has less impact on standardized mean differences between groups (Scheaf et al., 2023), we propose conducting cross-sectional studies with a larger population, in which it would be possible to measure the aspects that proved relevant in this exploratory study, namely the organizational factors of commitment, support, and confidence. Additionally, the time elapsed since each facilitator’s training in the INSPIRA model should be considered in future work to ensure internal validity.
On the other hand, in the case of social desirability bias, focus group participants might feel pressured to respond positively, especially in interviewer-led studies where social desirability demands are more evident (Oceno, 2025). Consequently, and because our work was based on a single institution (without controlling for other institutional factors, e.g., additional training or accumulated experience), our future studies will employ a survey approach that operationalizes the qualitative findings of this study and covers other regions of Latin America to strengthen external validity. A strategy using randomized incentives could also be employed (Dutz et al., 2021).
As higher education systems adapt to the demands of LLL and the green and digital transitions (UNESCO, 2023), initiatives like INSPIRA offer a valuable model for enhancing facilitator effectiveness. Looking forward, integrating artificial intelligence, adaptive learning tools, and participant feedback loops will be essential to maintaining the model’s relevance and reach across national and Latin American contexts.
Conclusion
This study examined the effectiveness of the INSPIRA workshop from the perspective of CE facilitators at a private university in Mexico. Quantitative findings showed statistically significant improvements in facilitator evaluations following the workshop, particularly in the clarity, dynamism, and relevance of the content. Qualitative data reinforced these findings, suggesting enhanced instructional design, stronger emotional connections with learners, and a shift toward more structured and learner-centered teaching. However, areas such as the “Real Challenges” and “Advice” stages of the INSPIRA model require better integration.
The study suggests the value of adopting evidence-based andragogical approaches in professional development programs. The INSPIRA model uses neuroscience-based methods, learner-centered approaches, and real-world relevance to promote metacognitive awareness and reflection, aligning with Marzano’s self-system thinking. It also adheres to Knowles’ adult learning principles by emphasizing autonomy, relevance, and immediate use.
By directly involving CE facilitators as key informants, this research examines their perceptions of competency development and job satisfaction. These two core areas are often acknowledged but rarely measured empirically in LLL training models. The findings demonstrate that INSPIRA supports professional growth and reinforces the significance of these theoretical frameworks in practice.
These insights underscore the importance of continuous improvement and responsiveness to facilitators’ needs. Institutions should tailor facilitator training to different program modalities (virtual or in-person), provide learner profile data to improve contextualization, and incentivize training completion through certification pathways.
While these findings primarily inform institutional practices within the studied context, they may inspire other universities to explore context-sensitive, competency-based facilitator training rooted in adult learning theory and reflective practice.
Data availability statement
The datasets presented in this article are not readily available because they were requested solely for analysis (in the case of the teacher evaluation), not for publication in raw form, and they contain information that could compromise the privacy of research participants (in the case of the survey and the focus groups). Requests to access the datasets should be directed to the corresponding author, who can prepare an anonymized version of the datasets after obtaining the necessary institutional permission.
Ethics statement
Ethical approval was not obtained for this study, as it was considered part of the continuous improvement of the strategies followed by the Vice Presidency for Lifelong Learning and Future Education. However, the study adheres to the ethical principles outlined in the Declaration of Helsinki. The study involved anonymous participation via an online survey and the anonymization of the two focus group results. Given these precautions and the adherence to ethical standards for informed consent and data protection, the study aligns with international ethical guidelines. The studies were conducted in accordance with the local legislation and institutional requirements. Written informed consent for participation was not required because completing the survey implied consent to participate in the study, as outlined at the beginning of the questionnaire. Participants were informed of the research’s purpose, assured anonymity, and told that their responses would be used solely for research purposes. Additionally, participants in the focus groups were explicitly informed that their involvement was voluntary and that their responses would remain anonymous.
Author contributions
IdCTM: Conceptualization, Writing – review & editing, Project administration, Writing – original draft, Methodology, Investigation. JNMB: Writing – review & editing, Supervision, Resources, Conceptualization. PV-V: Conceptualization, Methodology, Writing – review & editing, Visualization, Data curation, Writing – original draft, Formal Analysis.
Funding
The author(s) declare that financial support was received for the publication of this article.
Acknowledgments
The authors would like to acknowledge the financial support of the Writing Lab, Institute for the Future of Education, Tecnologico de Monterrey, Mexico, in the production of this work.
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Generative AI statement
The authors declare that Gen AI was used in the creation of this manuscript. To help in the writing process, Grammarly AI prompts were used.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Supplementary material
The Supplementary material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2025.1601508/full#supplementary-material
References
Adhikari, G. P. (2021). Calculating the sample size in quantitative studies. Scholars J. 2021, 14–29. doi: 10.3126/scholars.v4i1.42458
Andreatta, B. (2019). Wired to grow: Harness the power of brain science to master any skill. 2nd Edn. Santa Bárbara, CA: 7th Mind Publishing.
Athens, L. (2010). Naturalistic inquiry in theory and practice. J. Contemp. Ethnogr. 39, 87–125. doi: 10.1177/0891241609343663
Balwant, P. T. (2020). Training and development of instructor-leadership: an instructional systems design approach. J. Hum. Serv. Train. Res. Pract. 6:3.
Barrantes-Elizondo, L. (2022). A look at how philosophical perspectives of adult education shape the practice at Universidad Nacional, Costa Rica. Rev. Electr. Educ. 26, 1–14. doi: 10.15359/ree.26-3.31
Belete, S., Duke, C., Hinzen, H., Owusu-Boampong, A., and Khau, H. P. (2022). Community learning Centres (CLCs) for adult learning and education (ALE): development in and by communities. Int. Rev. Educ. 68, 259–290. doi: 10.1007/s11159-022-09954-w
Borko, H., Liston, D., and Whitcomb, J. A. (2007). Genres of empirical research in teacher education. J. Teach. Educ. 58, 3–11. doi: 10.1177/0022487106296220
Chang, B. (2025). The changing landscape of adult education: Historical impacts and future directions. New York, NY: Routledge.
Darling-Hammond, L., Newton, X., and Wei, R. C. (2010). Evaluating teacher education outcomes: a study of the Stanford teacher education Programme. J. Educ. Teach. 36, 369–388. doi: 10.1080/02607476.2010.513844
de Jong, G. M., and Emmelkamp, P. M. G. (2000). Implementing a stress management training: comparative trainer effectiveness. J. Occup. Health Psychol. 5, 309–320. doi: 10.1037/1076-8998.5.2.309
Dixon, R., Meier, R. L., Brown, D. C., and Custer, R. L. (2005). The critical entrepreneurial competencies required by instructors from institution-based enterprises: a Jamaican study. J. Ind. Teach. Educ. 42, 25–51.
Donavant, B. W. (2009). The new, modern practice of adult education: online instruction in a continuing professional education setting. Adult Educ. Q. 59, 227–245. doi: 10.1177/0741713609331546
Duke, C. (2015). Lost soul or new dawn? Lifelong learning lessons and prospects from East Asia. J. Adult Contin. Educ. 21, 72–88. doi: 10.7227/JACE.21.1.6
Dutz, D., Huitfeldt, I., Lacouture, S., Mogstad, M., Torgovitsky, A., and van Dijk, W. (2021). Selection in surveys: Using randomized incentives to detect and account for nonresponse bias (No. w29549). Cambridge, MA: National Bureau of Economic Research.
Finsterwald, M., Wagner, P., Schober, B., Lüftenegger, M., and Spiel, C. (2013). Fostering lifelong learning – evaluation of a teacher education program for professional teachers. Teach. Teach. Educ. 29, 144–155. doi: 10.1016/j.tate.2012.08.009
Florea, R. (2014). Teaching methods in adult education. An appraisal of the effectiveness of methods used in training future teachers. Procedia. Soc. Behav. Sci. 142, 352–358. doi: 10.1016/j.sbspro.2014.07.684
Galehdar, N., Ehsani, M., Irajpour, A., and Jafari-Mianaei, S. (2020). Evaluation of in-person continuing education programs from the perspective of ward nurses. J. Educ. Health Promot. 9:258. doi: 10.4103/jehp.jehp_58_20
Hennink, M. M., Kaiser, B. N., and Weber, M. B. (2019). What influences saturation? Estimating sample sizes in focus group research. Qual. Health Res. 29, 1483–1496. doi: 10.1177/1049732318821692
Kaplan, A. (2016). Lifelong learning: conclusions from a literature review. Int. Online J. Primary Educ. 5:2.
Kerby, D. S. (2014). The simple difference formula: an approach to teaching nonparametric correlation. Compr. Psychol. 3:11. doi: 10.2466/11.IT.3.1
Kirkpatrick, J. D., and Kirkpatrick, W. K. (2016). Kirkpatrick’s four levels of training evaluation. Alexandria, VA: ATD Press.
Knowles, M. S., Holton, E., and Swanson, R. A. (2020). The adult learner: The definitive classic in adult education and human resource development. 9th Edn. London: Routledge.
Laal, M., Laal, A., and Aliramaei, A. (2014). Continuing education; lifelong learning. Procedia Soc. Behav. Sci. 116, 4052–4056. doi: 10.1016/j.sbspro.2014.01.889
Lencioni, P. (2007). The three signs of a miserable job: A fable for managers (and their employees). 1st Edn. San Francisco, CA: Jossey-Bass.
Leow, A., Chua, S., Billett, S., and Le, A. H. (2023). Employers’ perspectives of effective continuing education and training in Singapore. High. Educ. Skills Work Based Learn. 13, 217–232. doi: 10.1108/HESWBL-05-2022-0115
Maity, S. (2019). Identifying opportunities for artificial intelligence in the evolution of training and development practices. J. Manage. Dev. 38, 651–663. doi: 10.1108/JMD-03-2019-0069
Marzano, R. J., and Kendall, J. S. (2007). The new taxonomy of educational objectives. 2nd Edn. Thousand Oaks, CA: Corwin Press.
Mott, V. W. (2009). “Evolution of adult education: is our future in E-learning?” in Handbook of research on E-learning applications for career and technical education. ed. V. Wang (Hershey, PA: IGI Global), 791–804.
Ndlovu, N., Erasmus, R. T., and Zemlin, A. E. (2025). Narrative review: continuous professional development training programmes in Africa and their limitations. Afr. J. Lab. Med. 14:8. doi: 10.4102/ajlm.v14i1.2602
Neuman, D. (1989). Naturalistic inquiry and computer-based instruction: rationale, procedures, and potential. Educ. Technol. Res. Dev. 37, 39–51. doi: 10.1007/BF02299055
Oceno, M. (2025). How social desirability bias impacts the expression of emotions. Polit. Sci. Res. Methods 2025, 1–10. doi: 10.1017/psrm.2025.10016
Ponelis, S. R. (2015). Using interpretive qualitative case studies for exploratory research in doctoral studies: a case of information systems research in small and medium enterprises. Int. J. Doctoral Stud. 10, 535–550. doi: 10.28945/2339
Safrit, R. D., and Owen, M. B. (2010). A conceptual model for retaining county extension program professionals. J. Ext. 48, 1–10. doi: 10.34068/joe.48.02.02
Saha, I., and Paul, B. (2023). Essentials of biostatistics and research methodology. 4th Edn. Kolkata, India: Academic Publishers.
Scheaf, D. J., Loignon, A. C., Webb, J. W., and Heggestad, E. D. (2023). Nonresponse bias in survey-based entrepreneurship research: a review, investigation, and recommendations. Strateg. Entrepreneurship J. 17, 291–321. doi: 10.1002/sej.1453
Scholten, M., Correia, M. F., Esteves, T., and Gonçalves, S. P. (2022). No place for pointless jobs: how social responsibility impacts job performance. Sustainability 14:31. doi: 10.3390/su141912031
Tak, A. Y., and Ercan, I. (2023). Ensemble of effect size methods based on meta fuzzy functions. Eng. Appl. Artif. Intell. 119:105804. doi: 10.1016/j.engappai.2022.105804
Todd, B. (2002). Short, instructional module to address lifelong learning skills. 2002 Annual Conference Proceedings, 7.996.1–7.996.7. doi: 10.18260/1-2--10081
U.S. Department of Education (2015). Evidence-based instruction and teacher induction. Adult Education and Literacy. Available online at: https://lincs.ed.gov/professional-development/resource-collections/profile-843 (Accessed October 5, 2025).
UNESCO (2023) International trends of lifelong learning in higher education: research report. UNESCO Institute for Lifelong Learning [online]. Available online at: https://unesdoc.unesco.org/ark:/48223/pf0000385339 (Accessed April 3, 2024).
University of Minnesota (2025) Minnesota Satisfaction Questionnaire. Available online at: https://vpr.psych.umn.edu/node/26 (Accessed October 5, 2025).
Via, A., Attwood, T. K., Fernandes, P. L., Morgan, S. L., Schneider, M. V., Palagi, P. M., et al. (2019). A new pan-European train-the-trainer programme for bioinformatics: pilot results on feasibility, utility and sustainability of learning. Brief. Bioinform. 20, 405–415. doi: 10.1093/bib/bbx112
Keywords: andragogy, educational innovation, higher education, lifelong learning, stakeholders
Citation: Torres Mata IdC, Miranda Becerra JN and Vázquez-Villegas P (2025) Exploring the impact of a training program for continuing education facilitators—an empirical journey. Front. Educ. 10:1601508. doi: 10.3389/feduc.2025.1601508
Edited by:
Pedro Tadeu, CI&DEI-ESECD-IPG, Portugal
Reviewed by:
Nilakantan Ananthakrishnan, Sri Balaji Vidyapeeth University, India
Nuša Erman, Faculty of Information Studies Novo Mesto, Slovenia
Copyright © 2025 Torres Mata, Miranda Becerra and Vázquez-Villegas. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Patricia Vázquez-Villegas, paty.vazquez@tec.mx