ORIGINAL RESEARCH article

Front. Psychol., 14 July 2025

Sec. Educational Psychology

Volume 16 - 2025 | https://doi.org/10.3389/fpsyg.2025.1607024

Effects of psychology teachers’ didactic performance on student didactic performance

  • 1Facultad de Psicología, Universidad Nacional Federico Villarreal, Lima, Peru
  • 2Departamento de Educación y Humanidades, Universidad Nacional José María Arguedas, Andahuaylas, Peru
  • 3Facultad de Psicología, Universidad Nacional Mayor de San Marcos, Lima, Peru
  • 4Facultad de Medicina, Universidad Nacional Federico Villarreal, Lima, Peru

There is abundant literature on students’ evaluations of their teachers’ didactic performance. However, few studies have investigated the effect of teachers’ didactic performance on students’ own didactic performance using self-reports grounded in the substantive theory underlying such measurements. This study presents a cross-sectional, predictive analysis of the effects of teacher didactic performance (TDP) on student didactic performance (DPPS), examined through structural models that incorporate primary, second-order, and mediating factors within a causal framework. A total of 757 psychology students from a Peruvian public university (171 males and 586 females), selected by non-probabilistic sampling, participated in the study. Two scales were administered: one assessing student perception of the teacher’s didactic performance and one for students’ self-assessment of their own didactic performance. The structural regression model analyzing the direct and indirect effects of the six TDP criteria on the six DPPS criteria presented satisfactory fit indices: χ2(1048) = 2569.701, CFI = 0.928, TLI = 0.923, RMSEA = 0.044, SRMR = 0.064. This model shows that the indirect effects of teacher performance, mediated by DPPS criteria such as precurrent to learning, illustration-participation, pertinent practice, and feedback-improvement, have a joint impact of 77% on the student criterion evaluation-application (transfer of disciplinary competencies). The second structural model, analyzing the direct and indirect effects of the two second-order TDP factors (teaching and formative assessment) on the six DPPS criteria, also presented adequate fit indices: χ2(1061) = 2564.619, CFI = 0.929, TLI = 0.924, RMSEA = 0.043, SRMR = 0.058. Together, the two second-order factors had indirect effects with an overall impact of 69% on the criterion evaluation-application. Finally, the third model, which incorporates two chained mediators, analyzes the effects of the two second-order quantitative TDP factors on the two second-order quantitative DPPS factors. This model highlights that the highest-ranking indirect effect between teaching and the student criterion improvement-application occurs when formative assessment serves as a mediator. It is concluded that the possibility for students to improve, apply, and transfer their professional competencies to the solution of disciplinary problems depends on the optimization of formative assessment, which is linked to the teaching factor of the teacher’s didactic performance.

1 Introduction

Psychoeducational processes, particularly those related to teaching and learning in university education, involve teacher–student interactions, with planned and intentional actions by the teacher to promote, regulate, and enhance the acquisition of academic-professional skills and competencies, as well as skills and values for sustainable development, in students (Morales et al., 2017; Silva et al., 2014). These teaching behaviors during teaching and learning processes in higher education involve skills and competencies in the field of teaching practice within a given discipline. They are forms of teacher behavior in teaching situations; that is, actions deployed by the teacher during didactic interaction to promote learning in their students (Bazán-Ramírez et al., 2022a; Kaplan and Owings, 2001; Mou et al., 2022).

These actions are part of the teacher’s instructional and assessment practices during the learning processes in a particular discipline (Bell et al., 2012; Serbati et al., 2020; Liu and Cohen, 2021; Mou et al., 2022). Such teaching practices are susceptible to evaluation from the perspectives of the student body, educational managers and administrators, educational research itself, and also from the self-evaluation of the teachers themselves. This evaluative practice, employing various methods and techniques, is commonly referred to as teacher performance evaluation.

Didactic performance is a term that describes the synchronous and asynchronous relationships between teachers and students during disciplinary and professional training in practical or theoretical-practical settings. According to Dees et al. (2007), teaching and learning are complex human encounters characterized by a teacher-student dialogue, in which the teacher’s role is crucial in creating an environment conducive to learning. This teacher-student encounter constitutes interactions, which, according to Prince et al. (2020), involve the didactic (teaching) actions of the teacher and the participation and reflection of the students (student actions). These didactic interactions encompass teaching performances (the teacher’s didactic performance) and learning performances (the student’s didactic performance) (Bazán-Ramírez et al., 2022a; Bazán-Ramírez et al., 2023). The teacher’s didactic performance and the student’s didactic performance are functionally related.

In the case of Spain, Gómez and Rumbo (2023) have referred to the Program to Support the Evaluation of the Teaching Activity of University Teaching Staff, known as DOCENTIA, which assesses the following dimensions of teacher performance in teaching: teaching planning, teaching development, and associated results. Similarly, teaching dedication is considered transversally and is a precondition for evaluation. On the other hand, based on the conception of 19 university teachers in Portugal, Grácio et al. (2023) identified five dimensions (competencies) of university teacher performance: pedagogy and didactics, multiplicity of knowledge, relational aspects, position concerning knowledge and teaching, and technical and scientific qualification.

In a recent systematic review study on the teaching competencies of university teachers, Moreira et al. (2023) analyzed 51 scientific articles published between 2009 and 2019 and identified three domains of instructional performance: didactic competence, communicative competence, and digital competence. Taking only didactic competence, they identified six domains or competencies: (1) ability to use different active teaching methods, strategies, or techniques to support student learning; (2) ability to structure and manage the course or lessons; (3) ability to implement fair and motivating learning assessment; (4) ability to create a supportive classroom climate and manage the classroom effectively; (5) coaching and mentoring ability; and (6) possess strong content knowledge.

Given the diversity of theoretical approaches to defining and characterizing didactic performance, the present study, based on the interbehavioral perspective in psychology (Ibáñez and Ribes, 2001; Irigoyen et al., 2011; Kantor, 1975; Morales et al., 2013; Silva et al., 2014), postulates that the teacher’s didactic performance refers to the behaviors and practices deployed in the didactic interaction to promote, regulate, and mediate student learning. The teacher establishes the teaching-learning conditions, structures the teaching and learning content and activities, sets the achievement criteria to be met by students, and mediates the students’ relationship with the learning object (Silva et al., 2014). Models derived from this perspective, which have been empirically tested, have made it possible to assess six dimensions of teacher didactic performance as perceived by students: competence exploration, explicitness of criteria, illustration, supervision of practices, feedback, and evaluation (Bazán-Ramírez et al., 2021).

Student didactic performance is conceived as the actions exercised by students in response to the demands of pedagogical interaction, aiming to develop or evoke behaviors, skills, and competencies in accordance with the criteria established by the teacher (Morales et al., 2017). Some models of student didactic performance, including self-assessment, have been tested in undergraduate psychology and biology (Bazán-Ramírez et al., 2023) and with graduate students in educational sciences (Bazán-Ramírez et al., 2022b). These studies have confirmed six dimensions of student didactic performance: precurrent to learning, identification of criteria, illustration-participation, pertinent practice, feedback-improvement, and evaluation-application.

1.1 Problem and justification of the study

When there are no studies that directly identify the functional relationships between the dimensions of teachers’ didactic performance and the dimensions of students’ didactic performance during didactic interactions, for example, as reported by Velarde-Corrales and Bazán-Ramírez (2019), self-report studies can provide an approximation of what occurs in didactic interactions with respect to the actions of teachers and students. In interactive teaching-learning relationships, although students’ performance corresponds functionally with the teacher’s didactic performance (Bazán-Ramírez et al., 2022a; Bazán-Ramírez et al., 2023), it is also important to note that the teacher’s performance precedes the deployment of student behaviors, because the teacher’s functions of directing, managing, and facilitating learning depend on various dispositional factors, such as the competencies to design an instructional plan and to use technological resources (Morales et al., 2017; Silva et al., 2014).

Although several studies have examined student ratings of teaching performance, very few are based on a substantive theory that allows testing latent-variable explanatory models to understand how the teacher didactic performance criteria perceived by students affect students’ self-rated didactic performance.

In line with the above, the present study aims to fill this knowledge gap by answering the following research question: How do the criteria of teacher didactic performance, as perceived by students, affect the self-evaluated didactic performance of psychology students at a public Peruvian university?

This research seeks to contribute to the analysis of both mastered competencies and deficiencies in the didactic performance of teachers and students, so that such competencies can be strengthened through training programs that promote reflection on, and self-evaluation of, the performance criteria used in class, leading teachers and students to make changes and improve their teaching and learning strategies.

In accordance with the research problem formulated, the planned research objectives were: (1) to assess the validity based on the internal structure of the construct and the reliability of the instruments; (2) to describe the didactic performance of psychology teachers and students; (3) to evaluate the effect of teacher didactic performance criteria on the didactic performance of psychology students at a public university, as perceived by the student body; (4) to evaluate the indirect effect of the second-order latent factors of teacher performance, teaching and formative assessment, on the student performance latent factor evaluation-application; and (5) to evaluate the indirect (mediational) effects of the second-order quantitative factors formative assessment and learning on the relationship between teaching (a teacher performance competency factor) and evaluation-application (a student performance competency factor).

2 Materials and methods

2.1 Design and variables

The study employs a non-experimental, cross-sectional, multivariate, and predictive design (Hair et al., 2008). The central hypothesis of the study is that teacher didactic performance criteria have direct and indirect effects on student didactic performance criteria (see Figure 1).

Figure 1. Model of the direct and indirect effects on didactic performance in university students.

2.1.1 Variables of the didactic performance of the teacher as perceived by students

2.1.1.1 Competence exploration

The teacher evaluates the aptitudes, knowledge, and prior skills of their students to determine how well they can learn a specific teaching topic.

2.1.1.2 Explicitness of criteria

The teacher clearly states the expected achievement criteria for student learning and explains how these criteria are to be met.

2.1.1.3 Illustration

The teacher illustrates to the students how the activities are developed, explains what they consist of, and what is required to solve a particular problem within the framework of professional work.

2.1.1.4 Supervision of practices

The teacher monitors the students’ work and accompanies them in solving problems.

2.1.1.5 Feedback

The teacher gives the learner feedback on the achievement criteria implicit in their learning process, correcting them and supporting them in meeting those criteria.

2.1.1.6 Evaluation

The teacher contrasts the learner’s actual level of performance with the performance established as a learning objective in the teacher’s initial formulation.

2.1.2 Variables of the student’s self-perceived didactic performance

2.1.2.1 Precurrent to learning

The student demonstrates the knowledge and competencies necessary to begin a subject or topic.

2.1.2.2 Identification of criteria

The student identifies and explains what the achievement criteria made explicit by the teacher are and how they must be satisfied.

2.1.2.3 Illustration-participation

The student solves problems as a professional in the discipline would, participating in class in accordance with the structure learned in class.

2.1.2.4 Pertinent practice

The student carries out the activities planned and established in the corresponding learning unit, under the guidance and supervision of the teacher.

2.1.2.5 Feedback-improvement

The student describes in class what was done, how it was done, and in what context, recognizing successes and mistakes in order to correct the latter, according to the various criteria specified by the teacher.

2.1.2.6 Evaluation-application

The student demonstrates mastery of the theoretical-conceptual content of the topic addressed, in relation to the knowledge and competencies expected and their compliance criteria.

2.2 Population and sample

The population consists of psychology students enrolled in 2024, from the first to the sixth year of study, at a public university in Lima. It comprises students of both sexes between 18 and 30 years of age.

The sample comprised 757 students (171 males and 586 females). The sample size was estimated using the structural equation modeling calculator (Soper, 2024) for an effect size of 0.20, a statistical power of 0.90, and a 95% confidence level (α = 0.05). The restrictions on access to population resources and the lack of a sampling frame made randomized sampling unfeasible, which is why non-probability convenience sampling was employed.

A total of 72 teachers were evaluated, 38 of whom were women. Regarding academic degree, 3 held a bachelor’s degree, 38 a master’s degree, and 31 a doctorate. Regarding teaching category, 37 were assistant, 21 associate, and 14 principal professors.

Inclusion criteria: acceptance of informed consent.

Exclusion criteria: students undergoing medical or psychological treatment.

2.3 Instruments

2.3.1 Scale of student perception of the didactic performance of the teachers (EDDO)

The scale reported and validated by Bazán-Ramírez et al. with Peruvian students of educational sciences and of biological sciences was used (Bazán-Ramírez et al., 2022b; Bazán-Ramírez et al., 2023). This scale aims to identify the didactic performances deployed by the teacher during didactic interaction, as perceived by the students, in terms of patterns or frequencies of occurrence, with the goal of enhancing the teacher’s didactic practice and ultimately improving the teaching-learning process.

This self-report consists of 24 items, distributed in six factors (competence exploration, explicitness of criteria, illustration, supervision of practices, feedback, and evaluation) with four response options (never, almost never, almost always, and always).

In a sample of 552 biology students from a public university in Peru, satisfactory evidence of validity based on the internal structure of the construct has been reported (CFI and TLI > 0.95, RMSEA and SRMR < 0.06) as well as internal consistency coefficients with McDonald’s omega between 0.84 and 0.91 (Bazán-Ramírez et al., 2023).

In the present study, the psychometric properties of the EDDO scale were examined in the sample of psychology students. Table 1 presents the results of the confirmatory factor analysis (CFA) used to examine the evidence of validity based on the internal structure of the construct. The multifactorial model presented very good fit indices (χ2 = 763, df = 237, p < 0.001, CFI = 0.963, TLI = 0.957, SRMR = 0.029, RMSEA = 0.059 [0.05, 0.06]). The parameter estimates show high factor loadings (≥ 0.69), and the interfactor covariances ranged from 0.78 to 0.94. Regarding reliability, both Cronbach’s alpha and ordinal omega yielded coefficients of 0.98, indicating high precision in the scores estimated by the scale.

Table 1. Confirmatory factor analysis and internal consistency reliability of the EDDO.
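Although the authors’ analysis scripts are not published with the article, results of the kind summarized above (and the analogous ones for the EADDE in the next subsection) can be obtained with standard R tooling. The sketch below uses lavaan and semTools with robust ML; the data object (eddo_data) and item names (ce1–ev4) are illustrative placeholders, not the authors’ variable names.

```r
# Minimal sketch (assumed item labels): six-factor CFA of the EDDO with robust ML
library(lavaan)
library(semTools)

eddo_model <- '
  CompetenceExploration =~ ce1 + ce2 + ce3 + ce4
  ExplicitnessCriteria  =~ ec1 + ec2 + ec3 + ec4
  Illustration          =~ il1 + il2 + il3 + il4
  SupervisionPractices  =~ sp1 + sp2 + sp3 + sp4
  Feedback              =~ fb1 + fb2 + fb3 + fb4
  Evaluation            =~ ev1 + ev2 + ev3 + ev4
'

fit_eddo <- cfa(eddo_model, data = eddo_data,   # eddo_data: 757 x 24 item responses (placeholder)
                estimator = "MLM")              # robust (Satorra-Bentler) maximum likelihood

# Fit indices of the kind reported in Table 1
fitMeasures(fit_eddo, c("chisq.scaled", "df", "cfi.robust",
                        "tli.robust", "rmsea.robust", "srmr"))

standardizedSolution(fit_eddo)  # standardized loadings and interfactor correlations
reliability(fit_eddo)           # alpha and omega per factor (semTools)
```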

2.3.2 Self-Assessment Scale of Student Didactic Performance (EADDE)

Self-Assessment Scale of Student Didactic Performance (EADDE) was validated with Peruvian students from educational sciences and biological sciences (Bazán-Ramírez et al., 2022b; Bazán-Ramírez et al., 2023). The scale comprises 24 items organized into six subscales or dimensions of student didactic performance: precurrent to learning, identification of criteria, illustration-participation, pertinent practice, feedback-improvement, and evaluation-application. Each subscale presents four items with graded responses (never, sometimes, almost always, always). In a study by Bazán-Ramírez et al. (2023) involving biology students from a Peruvian university, the scale demonstrated adequate construct validity (CFI and TLI ≥ 0.95, RMSEA and SRMR < 0.08) and reliable factor scores (ω between 0.88 and 0.93).

For use in the present study, the psychometric properties of the EADDE scale were examined in the sample of psychology students (see Table 2).

Table 2. Confirmatory factor analysis and internal consistency reliability of the EADDE.

According to the global fit indices, the factorial model of the EADDE presents an adequate fit (χ2 = 842, df = 237, p < 0.001, CFI = 0.939, TLI = 0.929, SRMR = 0.044, RMSEA = 0.063 [0.05, 0.07]); that is, the internal structure of the construct is reproduced as established by theory. Table 2 also shows that the items saturate on their factors with high loadings. The interfactor covariances vary between moderate and strong. With respect to reliability, the scale as a whole, as well as its dimensions, presents ordinal alpha and omega values that denote high precision in the measured scores.

2.4 Procedure

The instruments were administered both in person and virtually, using a form designed in Google Drive. For the virtual administrations, the URL links were sent to the study population’s email addresses and WhatsApp accounts. At the beginning of the form, an informative text was presented outlining the research objectives, potential benefits and risks, and the request for informed consent. In the face-to-face administration, consent was printed and signed in person; in the virtual administration, consent was obtained and signed online.

The data were collected anonymously, treated confidentially, coded, and entered into a database. The data collection period ran from July 1 to August 30, 2023.

2.5 Data analysis

In the first phase, the psychometric properties of the research instruments were reviewed with the study sample.

SPSS version 27 was used for descriptive and comparative analyses. Both psychometric analyses and structural regression model analyses to examine the direct and indirect effects of teacher didactic performance on students’ didactic performance criteria used the freely distributed software R version 4.3.1 and RStudio version 2023.06.2.

The two SEM models were estimated using the robust maximum likelihood (MLM) method, since in SEM the evaluation of structural parameters (interfactor relationships and standard errors), in addition to fit indices, is of primary interest. Robust ML was used because it is more advantageous than the WLSMV estimator (suggested for ordinal data) when the sample is smaller than 1,000 cases (Li, 2016); even when instruments have two to four response categories, robust ML offers unbiased estimates of factorial correlations (Rhemtulla et al., 2012). The recommended robust fit indices were used to assess the overall fit of the structural models (Hu and Bentler, 1999; Keith, 2019): the Comparative Fit Index (CFI) and the Tucker-Lewis Index (TLI) denote adequate fit when their values are ≥ 0.90 and good fit when they are ≥ 0.95; the Root Mean Square Error of Approximation (RMSEA) denotes adequate fit when its value is ≤ 0.08 and good fit when it is ≤ 0.05; and the Standardized Root Mean Square Residual (SRMR) denotes adequate fit when its value is ≤ 0.08 and good fit when it is ≤ 0.06.
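As a convenience, the robust indices named above can be extracted from any fitted lavaan model and checked against these cut-offs. The helper below is a sketch, not part of the published analysis, and assumes a lavaan fit object such as the one from the CFA sketch above.

```r
# Sketch: extract robust fit indices and compare them with the cut-offs cited
# in the text (Hu and Bentler, 1999; Keith, 2019)
library(lavaan)

check_fit <- function(fit) {
  idx <- fitMeasures(fit, c("cfi.robust", "tli.robust", "rmsea.robust", "srmr"))
  data.frame(
    index    = c("CFI", "TLI", "RMSEA", "SRMR"),
    value    = round(as.numeric(idx), 3),
    adequate = as.logical(c(idx[1] >= .90, idx[2] >= .90, idx[3] <= .08, idx[4] <= .08)),
    good     = as.logical(c(idx[1] >= .95, idx[2] >= .95, idx[3] <= .05, idx[4] <= .06))
  )
}

# check_fit(fit_eddo)  # e.g., applied to the CFA sketched above
```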

Mediation analyses were performed between the second-order factors, used as metric variables, for both teacher didactic performance as perceived by students and students’ self-assessed didactic performance, using the PROCESS macro version 4.2 for SPSS (Hayes, 2022). The results of this analysis were based on bootstrap resampling with 5,000 samples. According to Hayes (2022), confidence intervals (CIs) are significant when they do not contain zero. In mediational models, bootstrap-based CIs make it possible to identify indirect effects and to determine, among significant mediations, their relative ordering.
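The serial two-mediator model estimated with PROCESS (model 6) can be expressed equivalently as a path model with bootstrap confidence intervals. The lavaan sketch below is an assumed re-expression using observed composite scores; the variable names (teaching, form_eval, learning, improve_app) and the data frame `scores` are placeholders, not the authors’ SPSS syntax.

```r
# Sketch of a serial (two-mediator) mediation with 5,000 bootstrap resamples
library(lavaan)

med_model <- '
  form_eval   ~ a1 * teaching
  learning    ~ a2 * teaching + d21 * form_eval
  improve_app ~ cp * teaching + b1 * form_eval + b2 * learning

  ind1  := a1 * b1         # teaching -> formative assessment -> improvement-application
  ind2  := a2 * b2         # teaching -> learning -> improvement-application
  ind3  := a1 * d21 * b2   # teaching -> formative assessment -> learning -> improvement-application
  total := cp + ind1 + ind2 + ind3
'

fit_med <- sem(med_model, data = scores,        # scores: data frame of composite scores (placeholder)
               se = "bootstrap", bootstrap = 5000)
parameterEstimates(fit_med, boot.ci.type = "perc", level = 0.95)
```

As described above, an indirect effect is taken as significant when its 95% bootstrap CI excludes zero.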

Both structural regression models of latent variables, which are part of structural equation modeling (SEM), and mediation analysis with quantitative variables are techniques for analyzing direct and indirect effects in the relationships between variables. According to Keith (2019), in cross-sectional studies, these techniques correspond to weak causality relationships. In this case, weak causality relationships are configured when three conditions are met (Byrne, 2010; Keith, 2019): (1) the existence of a functional relationship between variables, (2) the cause precedes the effect in a real or logical way in time and (3) the relationship is not spurious. It is worth noting that strong causality is typically associated with experimental studies.

3 Results

3.1 Students’ perception of the didactic performance of teachers

Table 3 shows the students’ perception of their teachers’ didactic performance in the six competency criteria.

Table 3. Descriptive analysis of the teacher’s didactic performance for each criterion according to the student body.

According to the confidence intervals of the means and the Z-scores, the criteria rated most highly are feedback, supervision of practice and learning activities, explicitness of criteria, and illustration. In contrast, the competency criteria requiring optimization are competence exploration and evaluation.

3.2 Students’ self-assessment of their teaching performance

According to the Z-scores and confidence intervals of the means shown in Table 4, the competency performance criteria that the students developed best are feedback-improvement, pertinent practice, and identification of criteria. The performance criteria that need to be corrected and strengthened are precurrent to learning, illustration-participation, and evaluation-application.

Table 4. Descriptive analysis of students’ self-assessment of their didactic performance for each criterion.
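The comparisons in Tables 3 and 4 rest on per-criterion means, their 95% confidence intervals, and standardized (Z) means. One plausible way to compute such a summary is sketched below; `crit_scores` is an assumed data frame with one column of composite scores per criterion, and the Z column standardizes the criterion means relative to one another (one possible reading of the Z-scores reported).

```r
# Sketch: per-criterion means, 95% CIs, and standardized means
describe_criteria <- function(crit_scores) {
  rows <- lapply(names(crit_scores), function(v) {
    x  <- crit_scores[[v]]
    m  <- mean(x, na.rm = TRUE)
    se <- sd(x, na.rm = TRUE) / sqrt(sum(!is.na(x)))
    data.frame(criterion = v, mean = m,
               ci_lower = m - 1.96 * se, ci_upper = m + 1.96 * se)
  })
  out   <- do.call(rbind, rows)
  out$z <- as.numeric(scale(out$mean))  # criterion means standardized across criteria
  out[order(-out$z), ]                  # highest-rated criteria first
}
```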

3.3 Effect of teacher didactic performance criteria on the didactic performance of psychology students at a public university

Figure 2 presents the results of the structural regression analysis, which tests the proposed hypothesis that the teacher’s didactic performance criteria have both direct and indirect effects on the student’s didactic performance criteria. The structural model presents satisfactory fit indices: χ2(1051) = 2400.336, CFI = 0.936, TLI = 0.932, RMSEA = 0.041 [0.039, 0.043], SRMR = 0.055. Therefore, the data support the validity of the structural model.

Figure 2. Latent structural regression model of the effects of teacher didactic performance on the didactic performance of psychology students. The values (bold) shown in the figure are the structural regression coefficients. *p < 0.05, **p < 0.01, ***p < 0.001.

The resulting model (Figure 2) shows that the teacher’s competence exploration has a direct effect of 47% on the precurrent-to-learning actions performed by the students. Likewise, the teaching performance factors competence exploration and explicitness of criteria have a direct impact of 49% on the students’ identification of criteria. The student criterion illustration-participation receives an impact of 87% from the teaching performance criteria illustration (direct effect) and explicitness of criteria (indirect effect), as well as from the direct effect of the precurrents to learning deployed by the students. The fourth student performance criterion, pertinent practice, presents an R2 of 0.90 as a result of the direct effects of the teacher’s supervision of practices and of the student criterion illustration-participation. The endogenous factor feedback-improvement, corresponding to student performance, is affected directly (feedback) and indirectly (supervision of practice and learning activities) by the teaching performance criteria (R2 = 0.82). Finally, the indirect effects of teaching performance through mediators corresponding to student didactic performance criteria, namely precurrents to learning, illustration-participation, and feedback-improvement, plus the direct effect of the teacher’s evaluation factor, have a joint impact of 73% (total R2 of the model) on the student criterion evaluation-application.
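For readers who wish to reproduce a model of this form, the structural part of the paths described above can be written in lavaan syntax as follows. This is a hedged reconstruction based only on the paths named in the text: the measurement models (the =~ blocks, as in the CFA sketch above) are omitted, the factor names are illustrative, and the published model in Figure 2 may contain additional paths.

```r
# Hedged reconstruction of the structural regressions described for Model 1
structural_part <- '
  Precurrent        ~ CompetenceExploration
  Identification    ~ CompetenceExploration + ExplicitnessCriteria
  IllustrationPart  ~ Illustration + Precurrent              # explicitness of criteria acts indirectly
  PertinentPractice ~ SupervisionPractices + IllustrationPart
  FeedbackImprove   ~ Feedback + PertinentPractice           # supervision of practices acts indirectly
  EvalApplication   ~ Evaluation + Precurrent + IllustrationPart + FeedbackImprove
'
# fit_model1 <- sem(paste(measurement_part, structural_part),  # measurement_part: the =~ blocks
#                   data = items, estimator = "MLM")
```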

The latent correlations (Table 5) between teacher teaching performance criteria and student teaching performance criteria are positive and correspond to a large effect size (r > 0.50).

Table 5. Matrix of correlations of latent factors of teacher performance and student performance.

The second structural model (see Figure 3) configures the six teaching performance criteria into two second-order factors (teaching and formative assessment), which are directly and indirectly related to the six student performance criteria. The examined model presents satisfactory overall fit indices: χ2(1061) = 2757.704, CFI = 0.920, TLI = 0.915, RMSEA = 0.046 [0.044, 0.048], SRMR = 0.064. Therefore, the evidence supports the validity of the model. Figure 3 shows that the most significant indirect effect runs from the teaching factor to evaluation-application through a path involving double serial mediation (identification of criteria and illustration-participation). The other teaching performance factor, formative assessment, channels its indirect effects on evaluation-application through two chained mediators (pertinent practice and feedback-improvement). In summary, the two second-order factors account for 68% of the explained variance in student didactic performance, specifically in the evaluation-application criterion.

Figure 3. Latent structural regression model of the effects of teaching and formative assessment on didactic performance criteria in psychology students. The values (bold) shown in the figure are the structural regression coefficients. *p < 0.05, **p < 0.01, ***p < 0.001, ns = not significant (p > 0.05).
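In lavaan, second-order factors of the kind used in this model are specified by letting first-order factors load on a higher-order factor. The fragment below is purely illustrative: the article does not spell out in this section which of the six teacher criteria load on teaching and which on formative assessment, so the grouping shown is an assumption for demonstration only.

```r
# Illustrative second-order measurement structure (grouping of criteria is assumed)
second_order_part <- '
  Teaching            =~ CompetenceExploration + ExplicitnessCriteria + Illustration
  FormativeAssessment =~ SupervisionPractices + Feedback + Evaluation
'
```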

3.4 Multiple mediation analysis

To determine the indirect effects between the teacher’s second-order factors and the second-order factors of student didactic performance, a third model was examined. Table 6 presents, through regression analysis (Hayes model 6), the impact of the mediating variables formative assessment (M1) and learning (M2) on the relationship between teaching as a teaching competency (X) and the application-transfer competency (Y) in psychology students.

Table 6. Multiple mediation analysis with two mediating variables in a causal chain.

The first regression model shows that the teaching variable has a statistically significant effect on the first mediating variable (a1 = B = 0.84, p < 0.001). The second regression model presents results for the second mediating variable (learning): both the teaching variable (a2 = B = 0.23, p < 0.001) and mediating variable 1 (d21 = B = 0.40, p < 0.001) exert significant effects on learning. The third regression analysis shows that the independent variable (c′ = B = −0.08, p < 0.01) and the two mediating variables, formative assessment (b1 = B = 0.46, p < 0.001) and learning (b2 = B = 0.56, p < 0.001), exert statistically significant effects on the dependent variable. The fourth analysis shows that the total effect of the model is mediated (c = B = 0.62, p < 0.001).

Finally, Table 6 indicates that the three indirect effects are statistically significant, as shown by the 95% confidence intervals. The contrast results make it possible to rank the indirect effects. In this sense, indirect effect 1, which operates through the first mediating variable (see the path with the thickest arrows in Figure 4), is superior to the other two indirect effects. The second most significant indirect effect corresponds to the pathway involving both mediating variables in sequence (X → M1 → M2 → Y), as illustrated in Figure 4.

Figure 4. Mediational effects model of the relationship between teaching and the application-transfer competency in psychology students. The values shown in the figure are unstandardized regression coefficients. **p < 0.01, ***p < 0.001; c: total effect; c′: direct effect; X = independent variable; Y = dependent variable; M1 = mediating variable 1; M2 = mediating variable 2.
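A quick arithmetic check using the unstandardized coefficients reported above reproduces the three indirect effects and their ordering (the values agree with the total-effect decomposition up to rounding).

```r
# Worked check of the indirect effects implied by the reported coefficients (Hayes model 6)
a1 <- 0.84; a2 <- 0.23; d21 <- 0.40; b1 <- 0.46; b2 <- 0.56
c_total <- 0.62; c_direct <- -0.08

ind1 <- a1 * b1          # X -> M1 -> Y        ~ 0.39 (largest)
ind2 <- a2 * b2          # X -> M2 -> Y        ~ 0.13 (smallest)
ind3 <- a1 * d21 * b2    # X -> M1 -> M2 -> Y  ~ 0.19

c(ind1 = ind1, ind2 = ind2, ind3 = ind3,
  total_indirect = ind1 + ind2 + ind3,   # ~ 0.70
  c_minus_cprime = c_total - c_direct)   # ~ 0.70
```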

4 Discussion

The present study provides updated information on the current state of the variables analyzed in the context of Peruvian higher education, and it describes and explains how the teacher’s didactic performance criteria affect or regulate the didactic performance of students in the academic and scientific training of future psychology professionals.

A first important contribution of the study, at the methodological level, was to provide new evidence on the accuracy (reliability) and suitability of the instruments for deriving correct interpretations and diagnoses from their scores, supported by the evidence of validity based on the internal structure of the construct that was obtained. As Muñiz (2018) points out, instruments with suitable validity and reliability not only allow the development of new research that expands the frontiers of scientific knowledge but also facilitate the work of professionals who, as agents of social change, seek to train better-quality graduates capable of constituting an efficient workforce.

The results of our univariate analysis of the teachers’ didactic performance highlight feedback, supervision of practice and learning activities, explicitness of criteria, and illustration as the criteria used most efficiently; this indicates that students’ perception of these competency criteria has significant implications for their educational process. When comparing these results with those of Bazán-Ramírez et al. (2023), there is agreement regarding the explicitness of criteria, but not regarding the evaluation criterion.

In terms of the students’ didactic performance competency criteria, feedback-improvement, pertinent practice, and identification of criteria stood out as those best developed and most efficiently used, whereas the performance criteria that need to be corrected and strengthened are precurrent to learning, illustration-participation, and evaluation-application. These results partially agree with Bazán-Ramírez et al. (2023), who identified pertinent practice and identification of criteria, but not feedback-improvement, as the outstanding performance criteria among undergraduate biology students.

In accordance with the objectives concerning the effect of the teacher’s didactic performance criteria, as perceived by students, on the students’ didactic performance, three predictive models were tested. The interpretation of the data is mainly supported by the conceptual framework of the study, given that there are no similar investigations with which to contrast the results, with the exception of the only precedent, carried out with a sample of graduate students in education (Bazán-Ramírez et al., 2022a), which is close to the present study.

The first model evaluated, through structural regression analysis of first-order latent variables, the direct and indirect effects of the six teacher didactic performance criteria on the six student didactic performance criteria. The central finding of this model indicates that the indirect effects of teacher performance, mediated by the student performance criteria precurrent to learning, illustration-participation, and feedback-improvement, had a joint impact of 73% on the student criterion evaluation-application (transfer of disciplinary competencies). This means that the possibility of transferring performance parameters to new problems and situations (evaluation-application) depends on the fulfillment of the student didactic performance criteria and on the teacher’s didactic performance as perceived by the student body.

Specifically, the model shows that the teacher’s exploration of competencies has a direct effect on the precurrent-to-learning actions deployed by students. This implies that the teacher, by evaluating the skills and competencies necessary for learning the topic of a class about to begin, not only encourages students to deploy their potential capabilities for new learning but also helps them identify what they need to know and what the didactic situation requires.

It was also identified that the teaching performance factors competence exploration and explicitness of criteria have a direct impact on students’ identification of criteria. This means that the evaluation of the student’s potential knowledge and capabilities, together with the explicitness of the disciplinary and didactic criteria that students must satisfy with their performance, allows them to adjust their didactic performance (what they will do, how, when, and where) and to reach the expected achievement based on the established criteria.

Another important finding concerns the explicitness of criteria on the part of the teacher and the adjustment of the student’s didactic performance based on the identification of these criteria. In this way, the student is able to create and utilize strategies and resources to learn the forms of action expected by the teacher.

It is worth highlighting the direct and significant impact that the teacher’s supervision of practices has on the criterion of pertinent practice exercised by the students. This means that the student’s didactic adjustment in class, based on learning situations regulated by the teacher (in which moment-by-moment supervision and correction of the trainee’s performance are possible), is conducive to the student demonstrating competent performance adjusted to the requirements and achievement criteria.

Finally, it has been found that feedback (which puts students in contact with their own behavior and its possible variants) allows them to monitor to what extent they are in a position to meet the achievement criteria, how close they are, and what adjustments they need to make in order to satisfy the expected achievement criteria.

In the analysis carried out with the second structural model on psychology students’ perception of the teacher’s didactic performance, the results indicate that the most relevant indirect-effect route runs from the teaching factor to evaluation-application through a path involving double mediation via identification of criteria and illustration-participation. For the teacher performance competency criterion formative assessment, the indirect effects are channeled in series through two mediators: pertinent practice and feedback-improvement. Together, the two second-order factors of teacher didactic performance have an overall impact of 68% on the most important terminal criterion of student didactic performance, evaluation-application. Ibáñez and Ribes (2001), as well as Kantor (1975), stated that didactic performance depends on the teacher’s ability to facilitate student learning through effective and meaningful interactions. This vision is complemented by Morales et al. (2013, 2017), who note that didactic performance not only implies knowledge transmission but also fosters an environment conducive to learning in which skills, attitudes, and values are integrated. Additionally, Bazán-Ramírez et al. (2022a), when analyzing teaching-learning conditions in higher education, emphasized the importance of the teacher’s didactic performance factors that mediate the relationship between the student’s didactic performance and the object of learning.

In the third model, the results of the multiple mediation analysis, which considers formative assessment and learning as mediating variables, show that these factors exert indirect effects with different degrees of relevance between teaching and the transfer of disciplinary competencies in students. The fundamental indirect route to optimize and enhance the possibility that students can apply and transfer the acquired knowledge and competencies is through the optimization of formative assessment, a consequence of the teacher’s didactic performance in teaching. In contrast, although the indirect route through the learning mediator shows significant indirect effects between teaching and the possibility of improvement-application of competencies, it is the one that would have the least impact on the formative quality that enables the transfer and application of competencies in solving disciplinary problems.

4.1 Implications for practice

The results of the research could be useful in optimizing curricular experiences, both in theory and practice hours, that benefit the teaching-learning process oriented toward the acquisition and strengthening of skills and academic-professional competencies. In university teacher training, the results would also have implications in the following contexts: (1) educational innovation: to reinforce quality education where teacher training at the higher level involves greater active participation of students, as well as to allow teachers to articulate their didactic competencies in teaching actions. (2) feedback: the mechanism involved would facilitate the conscious self-demand that usually accompanies the self-evaluation of the learner’s own abilities according to the teaching framework established by the teacher. (3) tutoring and educational guidance: this guidance procedure could directly influence the student’s academic commitment and performance by facilitating the identification of mediating factors that enhance or limit learning.

The social impact in terms of the training of professionals (a task entrusted by society to universities) is not only local but also national and international in scope. For example, according to the background reviewed, there is interest in understanding didactic teacher-student interaction in professional programs in biology (Peru) and in education and psychology (Mexico), so the findings of this study will make it possible to contrast educational realities and to share experiences with these institutions in order to optimize the training quality of professionals, who will be the agents of social and economic change in the country in the near future.

5 Limitations

Some of the most important limitations of the study include the use of non-probabilistic sampling, which undermines the external validity of the research; although a large sample was taken (two-thirds of the population) to reduce selection bias in the units of analysis, it is recommended that future studies use randomized samples. The fact that the sample corresponds to psychology students from a single public university can also be considered a limitation; future studies should draw samples from several universities and different professional programs. Likewise, another limitation lies in the use of self-reports to measure the variables, given possible social desirability bias among respondents, although this undesired effect is minimized by valid and reliable scores, as well as by quality control of extreme data through multivariate centroid analysis with the Mahalanobis distance. To strengthen the findings, in addition to replicating the study, it would be worthwhile to use an observational methodology to measure the variables (Velarde-Corrales and Bazán-Ramírez, 2019). Despite these limitations, the present study is relevant because few studies offer models to explain the interactions or effects between the didactic performance criteria of teachers and the performance of students in the university context, especially in a sample of psychology students; in this sense, the present study helps fill this knowledge gap and provides a basis for future basic and applied research.

6 Conclusion

1. According to the structural regression model with first-order factors, three student performance criteria (illustration-participation, pertinent practice, and feedback-improvement) act as mediators of the indirect effects of the teacher performance criteria on the student’s competence criterion of evaluating and applying solutions to disciplinary problems of the profession.

2. Between the second-order factors of the teacher’s didactic performance (teaching and formative assessment) and the student’s evaluation-application criterion, there are indirect effects regulated by the participation of two mediators in a causal chain corresponding to student performance criteria. The route of the first mediation is teaching → identification of criteria → illustration-participation → evaluation-application, while the route of the second mediation is formative assessment → pertinent practice → feedback-improvement → evaluation-application.

3. The mediational analysis of the second-order quantitative factors highlights that the highest-ranking indirect effect between teaching, as a teacher competence, and the student performance criterion improvement-application is the one in which formative assessment participates as a mediator. This means that the possibility for students to improve, apply, and transfer their professional competencies to the solution of disciplinary problems depends on the optimization of formative assessment, which is linked to the teaching factor of the teacher’s didactic performance.

Data availability statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics statement

The studies involving humans were approved by Comité de Ética de la Unidad de Investigación, Innovación y Emprendimiento de la Facultad de Psicología, Universidad Nacional Federico Villarreal, INFORME2·W.CAPA. The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.

Author contributions

WC-L: Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing. AB-R: Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing. LM-F: Conceptualization, Data curation, Formal analysis, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – review & editing. EB: Data curation, Formal analysis, Funding acquisition, Methodology, Project administration, Resources, Supervision, Validation, Visualization, Writing – review & editing. EH-G: Formal analysis, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – original draft. WM-U: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Resources, Validation, Visualization, Writing – original draft. CB-V: Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Software, Visualization, Writing – review & editing. DG-R: Data curation, Investigation, Resources, Supervision, Visualization, Writing – review & editing.

Funding

The author(s) declare that no financial support was received for the research and/or publication of this article.

Acknowledgments

We thank the students and teachers who participated in the study sample.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

The author(s) declared that they were an editorial board member of Frontiers, at the time of submission. This had no impact on the peer review process and the final decision.

Generative AI statement

The author(s) declare that no Gen AI was used in the creation of this manuscript.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Bazán-Ramírez, A., Ango-Aguilar, H., Cárdenas-López, V., Anaya-González, R. B., Capa-Luque, W., and Bazán-Ramírez, M. A. (2023). Self-reporting of teacher-student performance in virtual classroom interactions in biological sciences during the SARS-CoV-2/COVID-19 pandemic. Sustain. For. 15:16198. doi: 10.3390/su152316198

Bazán-Ramírez, A., Capa-Luque, W., Bello-Vidal, C., and Quispe-Morales, R. (2022a). Influence of teaching and the teacher's feedback perceived on the didactic performance of Peruvian postgraduate students attending virtual classes during the COVID-19 pandemic. Front. Educ. 7:818209. doi: 10.3389/feduc.2022.818209

Bazán-Ramírez, A., Pérez-Morán, J. C., and Bernal-Baldenebro, B. (2021). Criteria for teaching performance in psychology: invariance according to age, sex, and academic stage of Peruvian students. Front. Psychol. 12:764081. doi: 10.3389/fpsyg.2021.764081

Bazán-Ramírez, A., Quispe-Morales, R., De La Cruz-Valdiviano, C., and Henostroza-Mota, C. (2022b). Teacher-student performance criteria during online classes due to COVID-19: self-report by postgraduate students in education. Eur. J. Educ. Res. 11, 2101–2114. doi: 10.12973/eu-jer.11.4.2101

Bell, C. A., Gitomer, D. H., McCaffrey, D. F., Hamre, B. K., Pianta, R. C., and Qi, Y. (2012). An argument approach to observation protocol validity. Educ. Assess. 17, 62–87. doi: 10.1080/10627197.2012.715014

Byrne, B. M. (2010). Structural equation modeling with AMOS. 2nd Edn. New York, NY: Routledge/Taylor and Francis Group.

Dees, D. M., Ingram, A., Kovalik, C., Allen-Huffman, M., McClelland, A., and Justice, L. (2007). A transactional model of college teaching. Int. J. Teach. Learn. High. Educ. 19, 130–140. Available online at: https://eric.ed.gov/?id=EJ901291

Gómez, T. F., and Rumbo, B. (2023). Estudio en clave comparada de la evaluación del profesorado universitario [A comparative study on teaching performance assessment at higher education]. Profr. Rev. Curríc. Form. Profr. 27, 373–397. doi: 10.30827/profesorado.v27i1.21576

Grácio, L., Aguiar, H., Pire, S. H., and Carapeto, M. J. (2023). Teaching and quality of teaching: conceptions of higher education professors in Sao Tome and Principe. Front. Educ. 8:1144147. doi: 10.3389/feduc.2023.1144147

Hair, J., Anderson, R., Tatham, R., and Black, W. (2008). Análisis Multivariante [multivariate analysis]. 5th Edn. Madrid: Pearson Prentice Hall.

Hayes, A. F. (2022). Introduction to mediation, moderation, and conditional process analysis. A regression based approach. 3rd Edn. New York, NY: Guilford Press.

Hu, L. T., and Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct. Equ. Model. 6, 1–55. doi: 10.1080/10705519909540118

Ibáñez, B. C., and Ribes, E. (2001). Un análisis interconductual de los procesos educativos (an interbehavioral analysis of educational processes). Rev. mex. psicol. 18, 359–371. Available online at: https://psycnet.apa.org/record/2003-99047-008

Irigoyen, J. J., Acuña, K. F., and Jiménez, M. Y. (2011). “Interacciones didácticas en educación superior. Algunas consideraciones sobre la evaluación de desempeños [Didactic interactions in higher education: Some considerations on performance evaluation],” in Evaluación de desempeños académicos [Evaluation of academic performance], eds. J. J. Irigoyen, K. F. Acuña, and M. Y. Jiménez (Universidad de Sonora), 73–96. Available online at: https://bit.ly/3n3R79r

Kantor, J. R. (1975). Education in psychological perspective. Psychol. Rec. 25, 315–323. doi: 10.1007/BF03394321

Kaplan, L. S., and Owings, W. A. (2001). Teacher quality and student achievement: recommendations for principals. NASSP Bull. 85, 64–73. doi: 10.1177/019263650108562808

Keith, T. Z. (2019). Multiple regression and beyond. An introduction to multiple regression and structural equation modeling. 3rd Edn. New York, NY: Routledge/Taylor and Francis Group.

Li, C. H. (2016). Confirmatory factor analysis with ordinal data: comparing robust maximum likelihood and diagonally weighted least squares. Behav. Res. Methods 48, 936–949. doi: 10.3758/s13428-015-0619-7

Liu, J., and Cohen, J. (2021). Measuring teaching practices at scale: a novel application of text-as-data methods. Educ. Eval. Policy Anal. 43, 587–614. doi: 10.3102/01623737211009267

Morales, G., Alemán, M., Canales, C., Arroyo, R., and Carpio, C. (2013). Las modalidades de las interacciones didácticas: entre los disensos esperados y las precisiones necesarias [the modalities of didactic interactions: between expected dissents and necessary precisions]. Conductual 1, 73–89. doi: 10.59792/DSKU8151

Morales, G., Peña, B., Hernández, A., and Carpio, C. (2017). Competencias didácticas y competencias de estudio: Su integración funcional en el aprendizaje de una disciplina [Didactic competencies and competencies of study: Its functional integration in the learning of a discipline]. Altern. Psicol. 21, 24–35. Available online at: https://alternativas.me/competencias-didacticas-y-competencias-de-estudio-su-integracion-funcional-en-el-aprendizaje-de-una-disciplina/

Moreira, M. A., Arcas, B. R., Sánchez, T. G., García, R. B., Melero, M. J. R., Cunha, N. B., et al. (2023). Teachers' pedagogical competences in higher education: a systematic literature review. J. Univ. Teach. Learn. P 20, 90–123. doi: 10.53761/1.20.01.07

Mou, C., Tian, Y., Zhang, F., and Zhu, C. (2022). Current situation and strategy formulation of college sports psychology teaching following adaptive learning and deep learning under information education. Front. Psychol. 12:766621. doi: 10.3389/fpsyg.2021.766621

Muñiz, J. (2018). Introducción a la Psicometría. Teoría clásica y TRI [Introduction to psychometrics: Classical theory and IRT]. Madrid: Pirámide.

Prince, M., Felder, R., and Brent, R. (2020). Active student engagement in online STEM classes: approaches and recommendations. Adv. Eng. Educ. 8, 1–25. Available online at: https://advances.asee.org/wp-content/uploads/Covid%2019%20Issue/Text/2%20AEE-COVID-19-Felder.pdf

Rhemtulla, M., Brosseau-Liard, P. É., and Savalei, V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychol. Methods 17, 354–373. doi: 10.1037/a0029315

Serbati, A., Aquario, D., Da Re, L., Paccagnella, O., and Felisatti, E. (2020). Exploring good teaching practices and needs for improvement: implications for staff development. J. Educ. Cult. Psychol. Stud. (ECPS J.). 21, 43–64. doi: 10.7358/ecps-2020-021-serb

Silva, H. O., Morales, G., Pacheco, V., Camacho, A. G., Garduño, H. M., and Carpio, C. A. (2014). Didáctica como conducta: una propuesta para la descripción de las habilidades de enseñanza [Didactics as behavior: a proposal for the description of teaching skills]. Rev. Mex. Anal. Conducta. 40, 32–46. doi: 10.5514/rmac.v40.i3.63679

Soper, D. S.. (2024). A-priori sample size calculator for structural equation models. Available online at: https://www.danielsoper.com/statcalc

Velarde-Corrales, N. M., and Bazán-Ramírez, A. (2019). Sistema observacional Para analizar interacciones didácticas en clases de ciencias en bachillerato [Observacional system to analyze didactic interactions in science classes in bachelor]. Rev. Investig. Psicol. 22, 197–216. doi: 10.15381/rinvp.v22i2.16806

Keywords: didactic performance, mediation, teaching, learning, formative assessment

Citation: Capa-Luque W, Bazán-Ramírez A, Mayorga-Falcón LE, Barboza-Navarro E, Hervias-Guerra E, Montgomery-Urday W, Bello-Vidal C and García-Ramírez DR (2025) Effects of psychology teachers’ didactic performance on student didactic performance. Front. Psychol. 16:1607024. doi: 10.3389/fpsyg.2025.1607024

Received: 06 April 2025; Accepted: 18 June 2025;
Published: 14 July 2025.

Edited by:

Ramón García Perales, University of Castilla-La Mancha, Spain

Reviewed by:

Alberto Rocha, Higher Institute of Educational Sciences of the Douro, Portugal
Rocío Díaz Zavala, National University of Saint Augustine, Peru

Copyright © 2025 Capa-Luque, Bazán-Ramírez, Mayorga-Falcón, Barboza-Navarro, Hervias-Guerra, Montgomery-Urday, Bello-Vidal and García-Ramírez. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Aldo Bazán-Ramírez, abazanramirez@gmail.com; Walter Capa-Luque, wcapa@unfv.edu.pe
