- 1School of Education and Human Development, University of Virginia, Charlottesville, VA, United States
- 2Department of Special Education, Early Childhood, and Prevention Science, College of Education and Human Development, University of Louisville, Louisville, KY, United States
- 3Department of Special Education, School of Education and Human Sciences, University of Kansas, Lawrence, KS, United States
We conducted a synthesis of recent systematic reviews with a specific focus on the professional learning aspects of behavior-specific praise (BSP). We examined training procedures and support types aimed at facilitating effective BSP implementation, addressing the following research questions: What procedures were used in training the intervention agent to deliver BSP? To what degree were training integrity and social validity of training procedures assessed? What feedback did intervention agents receive about their implementation of BSP (e.g., emailed graph of BSP rate; self-monitored rate of BSP)? We examined implications of these findings for informing future efforts to support educators implementing BSP, given effective implementation is a critical skill for reinforcing prosocial behaviors, fostering positive school cultures, and mitigating issues associated with exclusionary practices. Results indicated many studies included training for educators on how to implement praise and provide feedback effectively, yet few reported training integrity. We found social validity was primarily assessed through surveys and interviews. Few studies included checks for understanding for educators, such as permanent products measuring knowledge, to identify areas for support. Most studies used in-person, verbal BSP, with some using written notes. We discuss limitations and future directions, suggesting results from this review may be useful for informing future professional learning efforts to assist teachers in implementing BSP, a low-intensity strategy for supporting students and educators and promoting a positive culture within educational settings.
Introduction
Educators face challenges supporting students with diverse needs, including those with emotional and behavioral disorders (EBD), particularly since disruptions like the COVID-19 pandemic (Lane et al., 2021). Even prior to the pandemic, the prevalence of EBD was substantial, with estimates suggesting ∼20% of youth in the United States experience mild-to-moderate forms (Forness et al., 2012). Yet, most of these students do not receive special education supports, with less than 1% receiving services under the eligibility category of emotional disturbance (U.S. Department of Education, 2021). This means all educators, including general educators, must be prepared to provide high quality academic instruction while also implementing proactive behavioral strategies to promote students’ social and emotional wellbeing. This preparation requires effective training strategies to establish requisite skills to prevent and respond to challenging behavior. In this synthesis, we examined current research on implementation of one such strategy, behavior-specific praise (BSP), with a specific focus on procedures used to train and support educators in using BSP in an effective and sustained manner.
Behavior-specific praise: definitions and multiple uses
Behavior-specific praise is a foundational strategy often implemented to promote prosocial behaviors and reinforce previously taught expectations (Jolivette, 2013). BSP is a brief acknowledgment specifying a positive behavior (praising student effort versus focusing on ability) delivered after the behavior’s occurrence (Brophy, 1981). BSP works through social positive reinforcement, which increases the future likelihood of behavior through attention in the form of praise (Cooper et al., 2007). BSP is also an effective method for developing and increasing the probability of rule-governed behavior, and is considered a generalized reinforcer (Cooper et al., 2007). This means BSP becomes associated with a range of other reinforcers, which can increase its value across multiple settings. Although BSP is relatively simple to implement, these multiple aspects make it a powerful tool for promoting students’ ability to navigate the world through rules used to access rewarding experiences, avoid encountering punitive contingencies (e.g., negative social interactions with adults and peers), and prevent negative outcomes later in life.
As a low-intensity strategy, BSP can be used to efficiently create a positive school climate throughout the school day (Lane et al., 2015). Specifically, BSP focuses on desirable behaviors and can serve as an alternative to ineffective or potentially punitive disciplinary practices. For example, when challenging behavior occurs, teachers can avoid reprimands and exclusionary consequences by re-stating expectations, acknowledging other students in the area currently demonstrating expected behavior, and increasing the rate of reinforcement when the student engaging in challenging behavior begins demonstrating expected behavior. By specifying and giving attention to prosocial behaviors, teachers remind students who are off-task of expectations and prompt them to obtain attention by engaging in expected behavior. This procedure leverages the potential power of social reinforcement without requiring punishment.
Behavior-specific praise statements are systematically delivered by educators to reinforce positive behavior through specific, labeled praise that is immediate, context-specific, and tied to observable behavior. For example, instead of general praise such as “Good job,” teachers are encouraged to provide more specific and informative feedback like, “I appreciate how you raised your hand before speaking, that helps keep our discussion organized.”
Additionally, BSP can be provided in a flexible manner, allowing for adaptation based on individual characteristics, contextual fit, and cultural differences. For example, teachers may adapt the method of reinforcement, such as providing BSP in public or in private, via verbal interactions, or via written notes (Caldarella et al., 2011). Educators can also implement BSP to recognize the prosocial behavior of other adults, which further contributes to building a positive school climate, offers modeling of healthy relationships to students, and can increase use of effective teaching practices in the context of training, coaching, and mentoring (Lane et al., 2015; Pérez et al., 2023). With BSP serving multiple vital functions in educational contexts, educators often require high-quality professional learning and ongoing support (e.g., coaching) to support effective and sustained implementation.
Training and support for behavior-specific praise
High-quality professional learning typically consists of more than just a one-time training experience for educators. To date, studies have shown that even the most effective professional development practices do not always result in the desired changes in teacher behavior, indicating that ongoing support after professional development is often essential (Hemmeter et al., 2011). Effective training often includes follow-up coaching from specialists, training in self-monitoring, and/or peer coaching support. One example of effective professional learning developed by Simonsen et al. (2014) is the Multitiered System for Professional Development. This model customizes levels of professional learning support based on observed needs of teachers, aiming to maximize time and resource efficiency. The framework consists of three tiers: Tier 1 (Universal Support) includes a 30-min didactic professional learning session on a classroom management skill for all teachers, with opportunities for questions and practice, and tools for self-monitoring. Tier 2 (Targeted Support) is for teachers who do not respond to universal professional learning. It involves a 20-min session with a coach who reteaches the skill, provides visual performance feedback, and offers suggestions for improvement, with ongoing feedback for at least a week. For teachers who continue to need support, Tier 3 (Intensive Individualized Support) is provided, which includes intensive coaching, action planning, goal setting, and modeling. This tiered approach adjusts the intensity of professional learning to meet teachers’ needs, enhancing both efficiency and effectiveness. While the tiered framework provides a useful structure for implementation of practices and programs in the school setting, it serves as a conceptual model rather than a strictly evidence-based practice (EBP). The framework is informed by research on multi-tiered systems of support, which requires further empirical validation in different contexts, with varying levels of implementation fidelity, and with consideration of proactive practices such as BSP across Tiers 1 through 3.
Ideally, teachers receive professional learning, coaching, and follow-up support based on what schoolwide data sources (e.g., treatment integrity, social validity, and systematic screening) show as an area for development. Social validity, or what teachers think about the goals, procedures, and outcomes of an intervention, may be helpful in identifying goals for professional learning and coaching support with buy-in from faculty and staff (Allen, 2021). Treatment integrity data, the extent to which a plan was delivered as intended, can be useful to inform refinement of implementation efforts. For example, a discrepancy between teacher and observer ratings of low-intensity strategy implementation in the classroom, like BSP, can be a coaching and learning opportunity (Buckman et al., 2021). Training integrity, the extent to which training was implemented as planned (Erath et al., 2020), is another important data source to examine to ensure professional learning efforts are delivered as high-quality experiences. Yet, these data sources are not always collected or reported in published literature. This is a gap in need of attention, given how useful it would be to know more details about the training, reinforcement, and feedback educators receive when it comes to supporting sustainable changes in teaching practices that ultimately impact student learning, behavior, and social-emotional wellbeing.
Behavior-specific praise: examining the evidence-base
There is substantial evidence supporting efficacy and feasibility of using BSP with a range of students and in a variety of contexts to increase likelihood of a wide array of behaviors (e.g., engagement, social interaction, and work completion; Ennis et al., 2020b), including three recent reviews of the evidence for BSP. Two reviews found this strategy has been studied across grade and age spans as well as in traditional and non-traditional school settings (Ennis et al., 2020b; Royer et al., 2019). Royer et al. (2019) established BSP as a potentially evidence-based practice, applying an 80% weighted criterion of the Council for Exceptional Children (Council for Exceptional Children [CEC], 2014) quality indicators. Results suggested BSP demonstrates strong potential as an EBP, but more studies meeting quality indicators and including sufficient numbers of participants are needed to achieve full EBP status.
Ennis et al. (2020a) conducted another systematic review to examine the impact of coaching in different modalities (e.g., oral, written, visual, self-monitored, and bug-in-ear) on educators’ rate of BSP. Authors applied Council for Exceptional Children [CEC] (2014) standards and determined coaching teachers and other educators to increase BSP met criteria to be considered an evidence-based practice. Thus, there is evidence to suggest educator training and coaching (including self-coaching through self-monitoring strategies) can effectively increase the rate of teacher-delivered BSP (Ennis et al., 2020a). Ennis et al. (2020b) also conducted a comprehensive map of the literature on studies of BSP implemented in schools over the last 50 years to understand contexts in which BSP was effective. The authors found most studies took place in general education classrooms with students who showed challenging behaviors. They also found three categories of research on BSP including teacher-delivered BSP, student-delivered BSP to peers, and professional learning for educators to increase BSP delivery. Although findings from these systematic reviews are insightful for determining effectiveness of BSP as a practice, and effectiveness of training on BSP, there is more to be learned from included studies. Specifically, there is a need to extract precise information about procedures for training and supporting implementation used in this body of research, as this information can be used to inform future efforts to refine and scale up training of this essential practice.
Purpose
The purpose of this review was to conduct additional analyses of recent systematic reviews conducted by Ennis et al. (2020a,b) and Royer et al. (2019). The three selected reviews were chosen due to their comprehensive scope, methodological rigor, and relevance to current BSP practices. While more recent reviews exist, these were prioritized for their in-depth analyses and direct applicability to the study’s aims. However, we acknowledge the limitation of not incorporating more recent literature and address its implications in the discussion section. Specifically, our intent was to identify detailed components of professional learning for BSP, focusing on training procedures (e.g., training integrity) and the types of supports educators received as part of and after the initial training (e.g., specific praise and performance feedback, follow-up coaching) to effectively and sustainably implement BSP. We address three questions: What procedures were used in training the intervention agent to deliver BSP? To what degree were training integrity and social validity of training procedures assessed? What feedback did intervention agents receive about their implementation of BSP (e.g., emailed graph of BSP rate; self-monitored rate of BSP)?
Method
Inclusion criteria
Studies in this synthesis were included in the three previous systematic reviews of BSP (Ennis et al., 2020a,b; Royer et al., 2019) and met the following criteria. First, we included studies investigating effects of BSP as an independent variable (IV, e.g., effects of BSP on student outcomes) or dependent variable (DV, e.g., effects of coaching teachers to increase BSP). Studies had to define praise as an acknowledgment of student behavior that named the behavior being acknowledged, with examples to confirm. If studies did not define specific praise, they were excluded. We only included studies where BSP was a primary DV or IV of interest because BSP is often a component of an intervention package but not the focus of the investigation. If other strategies were a part of the study, studies had to report separate data for BSP to meet inclusion criteria. Second, we included studies when they (a) took place in a Pre-K–12 traditional school setting, (b) used an experimental or quasi-experimental design, inclusive of group or single case designs (SCDs), and (c) were published in a peer-reviewed journal. We focused on traditional school settings given the wide variety of student needs in alternative education settings and different service-delivery models (e.g., individualized behavior intervention plans; Jolivette, 2013; Leone and Weinberg, 2012). Descriptive studies, illustrations, and meta-analyses were not included.
Coding procedures
Authors of previous BSP systematic reviews were trained to code articles as part of the procedures in those prior reviews, where training procedures are described (i.e., Ennis et al., 2017, 2020b; Royer et al., 2019). In sum, training included coding studies from the prior reviews, then meeting to discuss discrepancies and come to agreement until 85% inter-rater agreement (IRA) or higher was established for three consecutive articles.
To code additional variables in this research synthesis, the first author reviewed coding procedures and schema with the second author. Both authors coded one article together, discussing results to come to agreement. Then both authors independently coded one article and met to discuss and resolve disagreements, obtaining 97.02% interrater agreement across 48 variables in the study. The first author independently coded all articles and the second author coded 25% of articles for reliability purposes (IRA = 97.12%; range = 89.58%–100%).
Inter-rater agreement for each additional variable coded as part of this synthesis was calculated across 25% of studies for (a) intervention agent training, including training integrity and social validity of training procedures; and (b) performance feedback. We used IRA data as a training procedure to inform the reliability of our coding and as evidence of calibration between coders. See Table 1 for detailed IRA results.
Descriptive coding
The first author contacted authors of previous BSP systematic reviews and obtained coded variables for each article’s (a) purpose, (b) research questions, (c) type of study (e.g., SCD; multiple baseline, and reversal), (d) level of student populations participating in the intervention (e.g., early childhood, elementary, middle, and high school), (e) strategy for delivering BSP (e.g., educator training, coaching, and self-monitoring), (f) method of delivery (e.g., verbal, written, and peer), (g) design methodology (e.g., type of SCD or group), (h) BSP recipients (e.g., teacher and student), (i) grade level, (j) disability category (e.g., at risk, general education, and specific learning disability), (k) generalization, (l) maintenance, (m) social validity, (n) IV and delivery components, (o) intervention agent, (p) DV, and (q) study outcomes. In previous BSP systematic reviews, a second author checked accuracy of coding for 35.85% of articles (IRA = 98.62%).
To answer research questions for this research synthesis, we created an additional database to code more details of (a) intervention agent training, including training integrity and social validity of training procedures; and (b) performance feedback provided to intervention agents during training. We refer to training integrity (Erath et al., 2020) as the extent to which the procedures used to train the intervention agent were implemented as planned (also referred to as implementation fidelity by Ledford and Gast, 2018). We differentiate training integrity from treatment integrity, the accuracy with which the intervention was delivered as planned (Buckman et al., 2021; see also Ledford and Gast, 2018). We coded each variable using an absolute coding scheme where 1 = reported and 0 = not reported. The second author checked reliability of coding for 25% of studies using cell-by-cell agreement, dividing agreements by the total number of cells, and multiplying by 100 to obtain a percentage.
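To illustrate the cell-by-cell agreement calculation described above, the following is a minimal sketch in Python (assumed, not the authors’ actual tooling); the variable names and codes are hypothetical, using the 1 = reported / 0 = not reported scheme.

```python
# Minimal sketch (assumed) of cell-by-cell inter-rater agreement:
# agreements divided by the total number of cells, multiplied by 100.

def cell_by_cell_agreement(coder_a: dict, coder_b: dict) -> float:
    """Percent agreement across all coded cells (1 = reported, 0 = not reported)."""
    agreements = sum(coder_a[var] == coder_b[var] for var in coder_a)
    return agreements / len(coder_a) * 100

# Hypothetical codes for one study across four training variables
coder_a = {"rationale": 1, "modeling": 0, "practice": 1, "praise_feedback": 0}
coder_b = {"rationale": 1, "modeling": 0, "practice": 1, "praise_feedback": 1}

print(f"IRA = {cell_by_cell_agreement(coder_a, coder_b):.2f}%")  # IRA = 75.00%
```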
Intervention agent training
We coded variables to capture frequency, dosage, rationale, steps, materials, discussion, practice, praise, and feedback provided during training for the intervention agent; passive and active checks for understanding measuring teachers’ knowledge of BSP; and whether studies measured social validity of training procedures (e.g., surveys or rating scales, interviews). Additionally, we captured whether training integrity was assessed and reported, who created and completed such measures, how often it was assessed, and whether training integrity data were provided back to the trainers as feedback on their performance.
Performance feedback
We coded variables to determine whether performance feedback was provided to intervention agents during training, type of performance feedback (e.g., visual, verbal, written, and self-monitored), frequency, modality, and timing (e.g., immediate and delayed).
Analyses
Results of database coding were analyzed descriptively to report percentages for each variable. IRA for this step was 100.00%.
Results
We coded 57 studies from 52 articles for descriptive variables to synthesize research on BSP intervention agent training and performance feedback. Table 2 lists student participants’ descriptive characteristics. Table 3 summarizes specific training components and materials. Table 4 reflects performance feedback educators received as part of intervention agent training.
Training components of BSP
Training of intervention agents for implementing BSP was reported in 39 studies (75.00%; e.g., Gage et al., 2017). Authors in 31 studies (60.78%) reported definitions of BSP in training (e.g., Hemmeter et al., 2011), and 23 studies (45.10%) included rationale and benefits for using reinforcement (e.g., Duchaine et al., 2011). The training mode most reported was in person in 23 studies (46%; e.g., Allday et al., 2012). Authors in nine studies (17.65%) included modeling (e.g., Hawkins and Heflin, 2011). Nine studies included discussion during training (17.31%), while 14 studies included opportunities to practice or role play (27.45%). Authors in 13 studies (25%) included corrective feedback for educators during training, while 5 (9.43%) reported using praise feedback, either general or behavior specific (Houghton et al., 1990).
No studies reportedly included a pre-training measure of BSP knowledge (0%), while six studies (11.54%) included a passive check for understanding (e.g., Kalis et al., 2007; Reinke et al., 2007) and four studies (7.69%) reported an active check for understanding (e.g., verbal checks; identifying praise statements using a script for self-evaluation and operational definitions of praise until 80% agreement; demonstrating until proficient; direct observation) (e.g., Fullerton et al., 2009; Duncan et al., 2013; Horton, 1975; Wright et al., 2012).
Training integrity and social validity
First, we found teachers largely delivered the BSP intervention to students across studies (77.19%), with teacher interns (10.53%), peers (10.53%), and paraprofessionals (3.51%) serving as intervention agents in fewer studies. However, researchers implemented the training to support the intervention agents in delivering BSP in most studies (40; 76.92%), while few studies had teachers (5.77%) or administrators (3.85%) deliver professional learning opportunities.
In nine studies (17.31%; e.g., Dufrene et al., 2012), authors described measuring training integrity, differentiated from qualitative (e.g., journal and notes) or subjective statements (e.g., “two co-trainers were present to ensure fidelity”), and in seven studies (13.46%) authors reported results (Simonsen et al., 2014). In all studies with training integrity measures (100%), researchers collected those data. Additionally, training integrity measures were all (100%) created by researchers. Feedback was not provided in any studies (0%) after training integrity was measured. In all of these studies, authors reported training integrity was measured once (100%).
Social validity of overall study procedures was assessed in 33 studies (61.40%), and it was mostly assessed after the intervention (e.g., Fullerton et al., 2009), in 30 of these studies (90.91%). Some studies assessed social validity before and after the intervention (6.0%; e.g., Simonsen et al., 2010). Social validity specific to intervention agent training was assessed in six studies (11.54%; e.g., Reinke et al., 2008; Capizzi et al., 2010).
Feedback support
We did not find any studies in which feedback was provided to trainers after training integrity data were collected (0%). Performance feedback of the intervention was assessed in 42 studies (80.77%; e.g., Pisacreta et al., 2011; Stormont et al., 2007). The types of performance feedback provided included visual or graphed (19.23%; e.g., Cossairt et al., 1973), verbal in-person (34.61%; Madsen et al., 1968), text/written (36.54%; e.g., Reinke et al., 2008), and self-monitored (9.62%; e.g., Alexander et al., 2012). The frequency of performance feedback varied across studies: daily (48.01%), weekly (11.54%), after each observation (15.38%), and once (1.92%; DaFonte and Capizzi, 2015). Furthermore, performance feedback was delivered immediately after observation (42.31%; e.g., Armstrong et al., 1988), after a delay of unspecified length (21.15%), later the same day (9.62%), the next day (1.92%), more than 1 day later (1.92%), via self-monitoring (1.92%; e.g., Horton, 1975), and live synchronously (7.69%; e.g., Dufrene et al., 2014). Table 4 summarizes these findings.
Discussion
Behavior-specific praise has been studied across grade spans, in both traditional and non-traditional settings (Ennis et al., 2020b). Additionally, BSP is an easy-to-implement, low-intensity strategy to reinforce positive behavior. The ongoing implementation of BSP may effectively promote socially important goals for students, like increasing engagement and decreasing behaviors which negatively affect learning (Royer et al., 2019). BSP is a highly adaptable strategy within tiered systems of support and can be used to support all students (at Tier 1), modified to provide additional support to some students or groups of students (at Tier 2), or used as a component of intensive interventions (at Tier 3) to support a few students with more significant needs. Furthermore, recent evidence shows BSP may significantly predict meaningful student outcomes (Gage et al., 2017). Given the impact of this relatively easy and cost-effective low-intensity strategy to reinforce important behaviors, we emphasize the need for educators to regularly implement BSP to create supportive learning environments and foster positive relationships for both students and educators.
In this research synthesis, we examined the training procedures used to implement BSP, the assessment of training integrity, and the types of feedback provided to educators. Findings indicate BSP training procedures varied considerably across studies, with some implementing structured coaching models and others relying on self-paced or workshop-based approaches (Allday et al., 2012; Simonsen et al., 2014). The assessment of training integrity was reported inconsistently, highlighting the need for stronger fidelity monitoring (cf. Dufrene et al., 2012; Duncan et al., 2013). Additionally, feedback to educators also played a role in BSP implementation, with studies incorporating both direct coaching and performance-based assessments to reinforce skills (Reinke et al., 2008; Simonsen et al., 2013). We further discuss critical insights into the effectiveness of BSP training and highlight areas for improvement in professional learning, implementation fidelity, and tracking for future researchers and educators alike. The role of coaching and follow-up support has been particularly emphasized as essential for maintaining implementation fidelity (Barton et al., 2016; Gage et al., 2017).
Training components of BSP
Findings revealed significant variability in how BSP training was delivered across studies. Some programs incorporated structured coaching with real-time feedback (Duchaine et al., 2011; Simonsen et al., 2014), while others relied on self-guided materials or one-time workshops (Capizzi et al., 2010; Hemmeter et al., 2011). Studies that included ongoing coaching and follow-up support tended to report higher implementation fidelity, reinforcing existing research on effective professional learning (Gage et al., 2017; Reinke et al., 2007). These findings suggest that BSP training may benefit from adopting a more standardized approach that includes ongoing coaching and practice-based feedback to enhance teacher competence and long-term implementation (Ploessl and Rock, 2014; Rathel et al., 2014). Future researchers could examine the optimal balance between initial training and sustained support to determine the most effective model for BSP training. Some evidence shows training approaches that incorporate tiered professional learning systems and systematic performance feedback yielded stronger implementation outcomes (Simonsen et al., 2017; Thompson et al., 2012). These findings highlight the importance of integrating multiple training components to ensure both initial competency and sustained implementation. It would also be informative to further investigate how various training modalities influence educator adherence and skill acquisition over time.
To inform effective and relevant professional learning, one important source of data may be treatment integrity, or the extent to which a plan or intervention was delivered as intended. In this context, treatment integrity refers to the extent to which BSP itself, as the intervention or treatment, was implemented as intended. These data can be most useful to inform specific areas for additional refinement of implementation efforts, such as areas of discrepancy between teacher and observer ratings of teacher-implemented low-intensity strategies in the classroom (Buckman et al., 2021). Beyond treatment integrity, it may also be very useful and informative for researchers and educators to track training integrity, or the extent to which training components and steps were implemented as intended. Training integrity is specifically about how training support, or professional learning, was delivered to educators who would later implement BSP. It is important to know the integrity of training opportunities educators received to learn strategies, such as BSP, before they are expected to implement them in the classroom with integrity. Moreover, effective professional development, such as tiered training approaches, has been shown to increase BSP use and sustainability (Thompson et al., 2012). Future researchers should further investigate how various training modalities influence educator adherence and skill acquisition over time.
Training integrity and social validity
The assessment of training integrity and social validity was reported inconsistently across reviewed studies. While some studies incorporated fidelity measures to track BSP implementation, others did not specify how training effectiveness was monitored (Dufrene et al., 2012; Duncan et al., 2013). Training integrity is a key predictor of implementation success (Simonsen et al., 2014), yet only a few studies (e.g., Dufrene et al., 2012) systematically evaluated whether teachers applied BSP as intended after training (i.e., treatment integrity). Training integrity may also be useful for administrators and coaches to inform their own part in building capacity and ensuring implementation sustainability. We found some authors measured training integrity, yet feedback was not reportedly provided to trainers after collecting those data. It is fundamental to not only collect integrity data, but to also use them to support others in refining practices after training occurs. Furthermore, previous studies have shown that school-based interventions for educators with ongoing coaching and fidelity monitoring yield higher rates of sustained implementation (Lochman et al., 2012). Including fidelity tracking provides educators with accountability structures that may continue to reinforce consistent application.
Social validity, or what educators think about the procedures, goals, and outcomes for training, can also be a useful source of information for data-informed professional learning. Given relatively few studies included measures on the impressions of educators about training, we emphasize including these types of measures in future studies of professional learning efforts to support educators. Additionally, social validity measures appeared underutilized, despite their importance in assessing how educators perceive BSP training and its relevance to their instructional practices (Reinke et al., 2008; Capizzi et al., 2010). When educators have a voice in the way training support is delivered, it is more likely the procedures, goals, and outcomes will be aligned with their actual needs and preferences. Even though social validity has been mostly measured using surveys and interviews, it is fundamental to also consider use as a behavioral marker of social validity. This means educators’ use of BSP serves as an indicator of its acceptability, and higher social validity may often lead to greater implementation fidelity. As highlighted by Royer et al. (2019), effective BSP implementation not only enhances teachers’ own engagement but also helps prevent challenging behaviors from occurring, reinforcing its value as a proactive classroom intervention. Furthermore, educators could focus on using students’ affect or other positive outcomes such as collateral effects on prosocial behaviors as broader and more verifiable measures of social validity. By embedding comprehensive training integrity, implementation fidelity, and social validity assessments into future studies, educators can ensure that BSP training is both effective and aligned with specific contextual needs.
Feedback support
We found few studies provided checks for understanding for educators who participated in training to implement BSP. It is noteworthy that role play (including feedback) and checks for understanding, whether active or passive, are important components of professional learning, yet they were not used in most studies. A measure or permanent product of what educators know can be helpful for determining areas for support and additional feedback. Measures of perceived and actual knowledge can be used by coaches and other educators to clarify misunderstandings and ensure educators achieve a certain criterion of knowledge needed to implement a practice. Educator training evaluations serve as a critical tool for refining professional development. By assessing both perceived and actual knowledge acquisition, we can tailor future training to focus on areas where teachers need additional support, ensuring more effective and sustainable BSP implementation (Gage et al., 2017; Hemmeter et al., 2011). Research has demonstrated that structured coaching and performance feedback significantly enhance teacher implementation fidelity and long-term adherence to EBPs (Hawkins and Heflin, 2011; O’Handley et al., 2018). Additionally, studies indicate that training strategies incorporating video feedback and self-monitoring can improve teacher effectiveness and reduce variability in implementation (Pinter et al., 2015).
Professional learning models that embed follow-up support and reinforcement strategies have been linked to greater retention and application of behavioral interventions in classroom settings (Simonsen et al., 2017; Pisacreta et al., 2011). To optimize training outcomes, future research should examine the impact of various training modalities and identify the most effective balance between initial instruction and ongoing coaching for educators. Educators should receive coaching and follow-up support based on decisions made with schoolwide data sources like treatment and training integrity, social validity, and knowledge checks to sustain implementation of practices.
The role of feedback in professional learning emerged as a critical factor in BSP training. Studies that incorporated structured coaching and performance-based feedback reported higher levels of teacher confidence and adherence to BSP strategies with fidelity, and potentially enhance sustainability (Ploessl and Rock, 2014; Rathel et al., 2014; Reinke et al., 2007). In contrast, studies that relied on one-time feedback sessions or self-reflection showed less consistent implementation fidelity (Pinter et al., 2015). These findings align with research indicating that ongoing, formative feedback enhances adult learning and skill retention (Pisacreta et al., 2011). Future training models should integrate consistent, high-quality feedback mechanisms, such as video modeling, peer observation, and real-time coaching, to reinforce BSP application and improve long-term outcomes (O’Handley et al., 2018; Ploessl and Rock, 2014). Additionally, performance feedback has been shown to enhance sustainability of training outcomes, further reinforcing the need for systematic data collection in BSP implementation (Dufrene et al., 2014). By embedding structured data-based feedback into professional learning frameworks, educators can receive the necessary support to sustain BSP practices effectively.
Limitations and future directions
This research synthesis has some limitations. First, we did not conduct an additional search to capture the most recent studies beyond those included in the last three reviews. This means there are possibly more studies of training educators to implement BSP that we did not include. We did not contact publishers of journals to ask if there are more recent studies, as the purpose of this synthesis was to further investigate specific training and feedback components of studies included in previous systematic reviews and a map of the literature. Additionally, inconsistencies in fidelity reporting reduce the ability to draw definitive conclusions about which training models lead to the highest quality BSP implementation. Future research should explore standardized fidelity measures and examine the scalability of BSP training in different school contexts.
We encourage more researchers to embed training integrity and social validity measures in studies of professional learning opportunities offered to educators. Evaluating the extent to which training was implemented as intended, and more importantly, giving people feedback using data can inform more precise and complete efforts in supporting educators when learning new strategies. Using social validity data can also better inform the learning process with input from those receiving professional learning to make the process more valid and potentially useful.
Another important feature of efforts to support educators in learning is perceived and actual knowledge (Oakes et al., 2018). We also encourage professional learning leaders and researchers to include some measure of knowledge to gauge understanding, and to use those data as an opportunity to refine the process and materials or content offered. Just as educators must progress monitor and assess student knowledge, educators themselves would benefit from such assessments to support their own professional learning journeys with knowledge checks.
One expected finding was that most studies delivered BSP verbally in person and some studies used praise notes (Teerlink et al., 2017). It is now possible to deliver praise in different ways that were not common before, such as through virtual platforms and asynchronously. Furthermore, we invite educators to find creative ways to use both nonverbal and verbal cues to acknowledge students to be more inclusive of students who may not be able to understand or process language the same way most students do—or perhaps even to praise students whose first language is not English. Even though a gesture would not qualify as BSP by itself, Kelly et al. (2014) demonstrated that combining non-verbal gestures (smiling, high fives, and reduced physical distance) with verbal praise enhanced reinforcing effects and boosted engagement, as the non-verbal cues became conditioned reinforcers through consistent pairing with the descriptive feedback provided (Dozier et al., 2012). Effective use of praise depends on contextual and individual student factors, particularly students’ language and communication skills, which are critical considerations when supporting all students, and in particular students with EBD (Chow and Hollo, 2021). BSP relies heavily on language processing; therefore, understanding all factors that may influence a student’s response is essential for tailoring praise strategies to meet diverse student needs. Additionally, it is important to also consider how to adjust the learning environment to make BSP an effective reinforcer, regardless of students’ learning histories or abilities. When BSP is not effective, a more thorough environmental assessment can help identify additional supports a student may need beyond BSP. Overall, the idea of noticing and reinforcing the positive is a way to shift away from punitive and exclusionary practices and move toward building a more proactive and positive culture to promote behaviors in line with a school’s values and purpose. Also, educators can support each other using BSP when they notice each other’s efforts implementing the plan through email, texts, and notes instead of relying on verbal and in-person feedback, which may not always be feasible.
While it was not possible within the scope of this review to directly explore the connection between BSP training and student outcomes, we encourage educators and researchers to carefully examine the effects of BSP in different contexts. It would be helpful to determine the least amount of training and support necessary for educators to successfully implement BSP with fidelity in a way that yields positive effects on students. Educators can also work together to implement high rates of BSP to support the diverse needs of students; we propose districts offer professional learning for educators to expand their views and fundamental understandings of the principles of reinforcement. In the same way educators equip themselves with proactive and preventive strategies to promote engagement and learning with students, they can also encourage each other with acknowledgments for efforts to promote a positive culture built on core values.
Summary
We examined professional learning strategies to support educators in effectively implementing BSP, a foundational and proactive approach for promoting prosocial student behavior. BSP is a low-intensity, research-based strategy that reinforces or increases the future likelihood of expected behavior and helps foster a positive classroom and school climate. We found educators were trained through a range of methods, including modeling, role-play, and feedback, with follow-up coaching often enhancing long-term implementation. Performance feedback and professional support were used to sustain BSP use, helping educators integrate the practice within daily routines to further encourage student engagement and ultimately reduce the need for reactive, exclusionary discipline practices. We underscore the critical role of social validity in supporting successful BSP implementation in schools. Professional learning must extend beyond teaching “how” to deliver praise—it must also address “why” such strategies matter to consistently and meaningfully apply BSP. Understanding the rationale behind BSP can potentially increase buy-in, thus helping educators see its value in reinforcing student success and ultimately building positive relationships. Furthermore, our findings suggest data-informed, educator-centered professional learning—paired with intentional focus on the principles of reinforcement—can empower teachers to use reinforcement as a proactive tool.
The foundational principles of reinforcement may be leveraged to help shift toward a more proactive approach, particularly as educators navigate increasing responsibility to support students academically, behaviorally, and socially. Data-informed professional learning—centered on elements like treatment integrity, training fidelity, and social validity—may enhance educators’ knowledge and confidence in applying low-intensity practices like BSP. Future research should continue refining professional learning frameworks within tiered models of prevention, continuing efforts to embed ongoing coaching, structured feedback, and fidelity monitoring. It would be particularly interesting to notice patterns across different educational cultures and geographical areas to have a clearer roadmap for optimizing BSP training. Additionally, exploring how different professional learning models influence long-term use of BSP across various school contexts may further inform best practices. Having a firm understanding of how learning occurs and rethinking assumptions about reinforcement and strategies like BSP can create opportunities to bridge the gap between expectations and support. By considering the broader mechanisms of learning, we, as researchers and educators, can better align our efforts in meeting the needs of learning communities.
Author contributions
PP: Conceptualization, Methodology, Supervision, Writing – original draft, Writing – review & editing. DR: Conceptualization, Methodology, Supervision, Writing – original draft, Writing – review & editing. MB: Conceptualization, Methodology, Writing – original draft, Writing – review & editing. KL: Conceptualization, Methodology, Supervision, Writing – original draft, Writing – review & editing.
Funding
The author(s) declare that financial support was received for the research and/or publication of this article. This synthesis was funded in part by the Institute of Education Sciences, U.S. Department of Education (R324N190002: PI Lane) and Office of Special Education Programs, KU RITE Project (H32D160080; PI Lane).
Conflict of interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Publisher’s note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Author disclaimer
Opinions expressed herein do not necessarily reflect the position of the U.S. Department of Education, and as such, endorsements should not be inferred.
Footnotes
*Included in the synthesis of studies for this review.
References
Alexander, M., Williams, N. A., and Nelson, K. L. (2012). When you can’t get there: Using video self-monitoring as a tool for changing the behaviors of pre-service teachers. Rural Special Educ. Quar. 31, 18–24. doi: 10.1177/875687051203100404
Allday, R. A., Hinkson-Lee, K., Hudson, T., Nielsen-Gattie, S., Kleinke, A., and Russel, C. S. (2012). Training general educators to increase behavior-specific praise: Effects on students with EBD. Behav. Disord. 37, 87–98. doi: 10.1177/019874291203700203
Allen, G. E. (2021). Social validity and Tier 1 practices in high schools. Doctoral dissertation, University of Kansas, ERIC: Washington, D.C.
*Andrews, L., and Kozma, A. (1990). Increasing teacher praise and altering its distribution to children of differing on-task levels. Can. J. Behav. Sci. 22, 110–120. doi: 10.1037/h0078897
Armstrong, S. B., McNeil, M. E., and Van Houten, R. (1988). A principal’s inservice training package for increasing teacher praise. Teach. Educ. Special Educ. 11, 79–94. doi: 10.1177/088840648801100301
Barton, E. E., Fuller, E. A., and Schnitz, A. (2016). The use of email to coach preservice early childhood teachers. Top. Early Childhood Special Educ. 36, 78–90. doi: 10.1177/0271121415612728
*Barton, E. E., Pribble, L., and Chen, C. I. (2013). The use of e-mail to deliver performance-based feedback to early childhood practitioners. J. Early Intervent. 35, 270–297. doi: 10.1177/1053815114544543
*Briere, D. E., Simonsen, B., Sugai, G., and Myers, D. (2015). Increasing new teachers’ specific praise using a within-school consultation intervention. J. Positive Behav. Intervent. 17, 50–60. doi: 10.1177/1098300713497098
*Brock, M. E., and Beaman-Diglia, L. E. (2018). Efficacy of coaching preschool teachers to manage challenging behavior. Educ. Treatment Children 41, 31–48. doi: 10.1353/etc.2018.0001
Brophy, J. (1981). Teacher praise: A functional analysis. Rev. Educ. Res. 51, 5–32. doi: 10.3102/00346543051001005
Buckman, M. M., Lane, K. L., Common, E. A., Royer, D. J., Oakes, W. P., Allen, G. E., et al. (2021). Treatment integrity of primary (Tier 1) prevention efforts in tiered systems: Mapping the literature. Educ. Treatment Children 44, 125–144. doi: 10.1007/s43494-021-00049-
Caldarella, P., Christensen, L., Young, K. R., and Densley, C. (2011). Decreasing tardiness in elementary school students using teacher-written praise notes. Intervent. School Clin. 47, 104–112. doi: 10.1177/1053451211414186
Capizzi, A. M., Wehby, J. H., and Sandmel, K. N. (2010). Enhancing mentoring of teacher candidates through consultative feedback and self-evaluation of instructional delivery. Teach. Educ. Special Educ. 33, 191–212. doi: 10.1177/0888406409360012
Chow, J. C., and Hollo, A. E. (2021). Language skills of students with emotional and behavioral disorders. Intervent. Sch. Clin. 58, 46–50. doi: 10.1177/10534512211047584
Cossairt, A., Hall, R. V., and Hopkins, B. L. (1973). The effects of experimenter’s instructions, feedback, and praise on teacher praise and student attending behavior. J. Appl. Behav. Anal. 6, 89–100. doi: 10.1901/jaba.1973.6-89
Cooper, J. O., Heron, T. E., and Heward, W. L. (2007). Applied behavior analysis, 2nd Edn. London: Pearson.
Council for Exceptional Children [CEC] (2014). Council for exceptional children standards for evidence-based practices in special education. Arlington, VA: Council for Exceptional Children.
DaFonte, A. M., and Capizzi, A. (2015). Module-based approach: Training paraeducators on evidence-based practices. Phys. Disabil. Educ. Related Serv. 34, 31–54. doi: 10.14434/pders.v34i1.13823
Dozier, C. L., Iwata, B. A., Thomason-Sassi, J., Worsdell, A. S., and Wilson, D. M. (2012). A comparison of two pairing procedures to establish praise as a reinforcer. J. Appl. Behav. Anal. 45, 721–735. doi: 10.1901/jaba.2012.45-721
Duchaine, E. L., Jolivette, K., and Fredrick, L. D. (2011). The effect of teacher coaching with performance feedback on behavior-specific praise in inclusion classrooms. Educ. Treatment Children 34, 209–227. doi: 10.1353/etc.2011.0009
Dufrene, B. A., Lestremau, L., and Zoder-Martell, K. (2014). Direct behavioral consultations: Effects on teachers’ praise and student disruptive behavior. Psychol. Schools 51, 567–580. doi: 10.1002/pits.21768
Dufrene, B. A., Parker, K., Menousek, K., Zhou, Q., Harpole, L. L., and Olmi, D. J. (2012). Direct behavioral consultation in Head Start to increase teacher use of praise and effective instruction delivery. J. Educ. Psychol. Consult. 22, 159–186. doi: 10.1080/10474412.2011.620817
Duncan, N. G., Dufrene, B. A., Sterling, H. E., and Tingstrom, D. H. (2013). Promoting teachers’ generalization of intervention use through goal setting and performance feedback. J. Behav. Educ. 22, 325–347. doi: 10.1007/s10864-013-9173-5
Ennis, R. P., Royer, D. J., Lane, K. L., and Dunlap, K. D. (2020a). The impact of coaching on teacher-delivered behavior-specific praise in pre-K–12 settings: A systematic review. Behav. Disord. 45, 148–166. doi: 10.1177/0198742919839221
Ennis, R. P., Royer, D. J., Lane, K. L., and Dunlap, K. D. (2020b). Behavior-specific praise in pre-K–12 settings: Mapping the 50-year knowledge base. Behav. Disord. 45, 131–147. doi: 10.1177/0198742919843075
Ennis, R. P., Royer, D. J., Lane, K. L., and Griffith, C. E. (2017). A systematic review of precorrection in PK-12 settings. Educ. Treat. Child. 40, 465–495. doi: 10.1353/etc.2017.0021
Erath, T. G., DiGennaro Reed, F. D., Sundermeyer, H. W., Brand, D., Novak, M. D., Harbison, M. J., et al. (2020). Enhancing the training integrity of human service staff using pyramidal behavioral skills training. J. Appl. Behav. Anal. 53, 449–464. doi: 10.1002/jaba.608
Forness, S. R., Kim, J., and Walker, H. M. (2012). Prevalence of students with EBD: Impact on general education. Beyond Behav. 21, 3–11.
Fullerton, E. K., Conroy, M. A., and Correa, V. I. (2009). Early childhood teachers’ use of specific praise statements with young children at risk for behavioral disorders. Behav. Disord. 34, 118–135. doi: 10.1177/019874290903400302
Gage, N. A., MacSuga-Gage, A. S., and Crews, E. (2017). Increasing teachers’ use of behavior-specific praise using a multitiered system for professional development. J. Posit. Behav. Intervent. 19, 239–251. doi: 10.1177/1098300717693568
Hawkins, S. M., and Heflin, L. J. (2011). Increasing secondary teachers’ behavior-specific praise using a video self-modeling and visual performance feedback intervention. J. Posit. Behav. Intervent. 13, 97–108. doi: 10.1177/1098300709358110
*Haydon, T., and Musti-Rao, S. (2011). Effective use of behavior- specific praise: A middle school case study. Beyond Behav. 20, 31–39.
Hemmeter, M. L., Snyder, P., Kinder, K., and Artman, K. (2011). Impact of performance feedback delivered via electronic mail on preschool teachers’ use of descriptive praise. Early Childhood Res. Quar. 26, 96–109. doi: 10.1016/j.ecresq.2010.05.004
*Hollingshead, A., Kroeger, S. D., Altus, J., and Trytten, J. B. (2016). A case study of positive behavior supports-based interventions in a seventh-grade urban classroom. Prevent. School Failure: Alternat. Educ. Children Youth 60, 1–8. doi: 10.1080/1045988x.2015.1124832
Horton, G. O. (1975). Generalization of teacher behavior as a function of subject matter specific discrimination training. J. Appl. Behav. Anal. 8, 311–319. doi: 10.1901/jaba.1975.8-311
Houghton, S., Wheldall, K., Jukes, R., and Sharpe, A. (1990). The effects of limited private reprimands and increased private praise on classroom behaviour in four British secondary school classes. Br. J. Educ. Psychol. 60, 255–265. doi: 10.1111/j.2044-8279.1990.tb00943.x
Jolivette, K. (2013). Challenges of conducting empirically-rigorous intervention and evaluative research in juvenile correctional facilities: Suggestions for the field. J. Correct. Educ. 64, 37–50.
Kalis, T. M., Vannest, K. J., and Parker, R. (2007). Praise counts: Using self-monitoring to increase effective teaching practices. Prevent. School Failure 51, 20–27. doi: 10.3200/psfl.51.3.20-27
*Keller, C. L., Brady, M. P., and Taylor, R. L. (2005). Using self evaluation to improve student teacher interns’ use of specific praise. Educ. Train. Dev. Disabil. 40, 368–376.
Kelly, M. A., Roscoe, E. M., Hanley, G. P., and Schlichenmeyer, K. (2014). Evaluation of assessment methods for identifying social reinforcers. J. Appl. Behav. Anal. 47, 113–135. doi: 10.1002/jaba.107
Lane, K. L., Cabell, S. Q., and Drew, S. V. (2021). A productive scholar’s guide to respectful, responsible inquiry during the COVID-19 pandemic: Moving forward. J. Learn. Disabil. 54, 388–399. doi: 10.1177/00222194211023186
Lane, K. L., Menzies, H. M., Ennis, R. P., and Oakes, W. P. (2015). Supporting behavior for school success: A step-by-step guide to key strategies. New York, NY: Guilford Press.
Ledford, J. R., and Gast, D. L. (2018). Single case research methodology: Applications in special education and behavioral sciences, 3rd Edn. England: Routledge, doi: 10.4324/9781315150666
Leone, P., and Weinberg, L. (2012). Addressing the unmet educational needs of children and youth in the juvenile justice and child welfare systems. Washington, DC: Center for Juvenile Justice Reform, Georgetown University Public Policy Institute.
Lochman, J. E., Powell, N. P., Boxmeyer, C. L., Andrade, B., Stromeyer, S. L., and Jimenez-Camargo, L. A. (2012). Adaptations to the Coping Power program’s structure, delivery settings, and clinician training. Psychotherapy 49, 135–142. doi: 10.1037/a0027165
Madsen, C. H., Becker, W. C., and Thomas, D. R. (1968). Rules, praise, and ignoring: Elements of elementary classroom control. J. Appl. Behav. Anal. 1, 139–150. doi: 10.1901/jaba.1968.1-139
*Martella, R. C., Marchand-Martella, N. E., Young, K. R., and MacFarlane, C. A. (1995). Determining the collateral effects of peer tutor training on a student with severe disabilities. Behav. Modif. 19, 170–191. doi: 10.1177/01454455950192002
*Moffat, T. K. (2011). Increasing the teacher rate of behaviour specific praise and its effect on a child with aggressive behaviour problems. Kairaranga 12, 51–58. doi: 10.54322/kairaranga.v12i1.152
*Morgan, R. L., Menlove, R., Salzberg, C. L., and Hudson, P. (1994). Effects of peer coaching on the acquisition of direct instruction skills by low-performing pre-service teachers. J. Special Educ. 28, 59–76. doi: 10.1177/002246699402800105
*Myers, D. M., Simonsen, B., and Sugai, G. (2011). Increasing teachers’ use of praise with a response-to-intervention approach. Educ. Treatment Children 34, 35–59. doi: 10.1353/etc.2011.0004
*Nelson, J. A. P., Caldarella, P., Young, K. R., and Webb, N. (2008). Using peer praise notes to Increase the social involvement of withdrawn adolescents. Teach. Except. Children 41, 6–13. doi: 10.1177/004005990804100201
O’Handley, R. D., Dufrene, B. A., and Whipple, H. (2018). Tactile prompting and weekly performance feedback for increasing teachers’ behavior-specific praise. J. Behav. Educ. 27, 324–342. doi: 10.1007/s10864-017-9283-6
Oakes, W. P., Schellman, L. E., Lane, K. L., Common, E. A., Powers, L., Diebold, T., et al. (2018). Improving educators’ knowledge, confidence, and usefulness of functional assessment-based interventions: Outcomes of professional learning. Educ. Treat. Child. 41, 533–565. doi: 10.1353/etc.2018.0028
Pérez, P., Gil, H., Artola, A., Royer, D. J., and Lane, K. L. (2023). Behavior-specific praise: empowering teachers and families to support students in varied learning contexts. Prevent. School Failure: Alternat. Educ. Children Youth 67, 83–90. doi: 10.1080/1045988x.2023.2181303
Pinter, E. B., East, A., and Thrush, N. (2015). Effects of a video-feedback intervention on teachers’ use of praise. Educ. Treatment Children 38, 451–472. doi: 10.1353/etc.2015.0028
Pisacreta, J., Tincani, M., Connell, J. E., and Axelrod, S. (2011). Increasing teachers’ use of a 1:1 praise-to-behavior correction ratio to decrease student disruption in general education class- rooms. Behav. Intervent. 26, 243–260. doi: 10.1002/bin.341
Ploessl, D. M., and Rock, M. L. (2014). eCoaching: The effects on co-teachers’ planning and instruction. Teach. Educ. Special Educ. 37, 191–215. doi: 10.1177/0888406414525049
Rathel, J. M., Drasgow, E., Brown, W. H., and Marshall, K. J. (2014). Increasing induction-level teachers’ positive- to-negative communication ratio and use of behavior- specific praise through e-mailed performance feedback and its effect on students’ task engagement. J. Posit. Behav. Intervent. 16, 219–233. doi: 10.1177/1098300713492856
*Rathel, J. M., Drasgow, E., and Christle, C. C. (2008). Effects of supervisor performance feedback on increasing preservice teachers’ positive communication behaviors with students with emotional and behavioral disorders. J. Emot. Behav. Disord. 16, 67–77. doi: 10.1177/1063426607312537
Reinke, W. M., Lewis-Palmer, T., and Martin, E. (2007). The effect of visual performance feedback on teacher use of behavior-specific praise. Behav. Modif. 31, 247–263. doi: 10.1177/0145445506288967
Reinke, W. M., Lewis-Palmer, T., and Merrell, K. (2008). The classroom check-up: A classwide teacher consultation model for increasing praise and decreasing disruptive behavior. School Psychol. Rev. 37, 315–332. doi: 10.1080/02796015.2008.12087879
*Rogers-Warren, A., and Baer, D. M. (1976). Correspondence between saying and doing: Teaching children to share and praise. J. Appl. Behav. Anal. 9, 335–354. doi: 10.1901/jaba.1976.9-335
Royer, D. J., Lane, K. L., Dunlap, K. D., and Ennis, R. P. (2019). A systematic review of teacher-delivered behavior-specific praise on K-12 student performance. Remed. Special Educ. 40, 112–128. doi: 10.1177/0741932517751054
Simonsen, B., Freeman, J., Dooley, K., Maddock, E., Kern, L., and Myers, D. (2017). Effects of targeted professional development on teachers’ specific praise rates. J. Posit. Behav. Intervent. 19, 37–47. doi: 10.1177/1098300716637192
Simonsen, B., MacSuga, A. S., Fallon, L. M., and Sugai, G. (2013). The effects of self-monitoring on teachers’ use of specific praise. J. Posit. Behav. Intervent. 15, 5–15. doi: 10.1177/1098300712440453
Simonsen, B., MacSuga-Gage, A. S., Briere, D. E., Freeman, J., Myers, D., Scott, T. M., et al. (2014). Multitiered support framework for teachers’ classroom-management practices: Overview and case study of building the triangle for teachers. J. Posit. Behav. Intervent. 16, 179–190. doi: 10.1177/1098300713484062
Simonsen, B., Myers, D., and DeLuca, C. (2010). Teaching teachers to use prompts, opportunities to respond, and specific praise. Teach. Educ. Special Educ. 33, 300–318. doi: 10.1177/0888406409359905
*Smith, S. C., Lewis, T. J., and Stormont, M. (2011). The effectiveness of two universal behavioral supports for children with externalizing behavior in Head Start classrooms. J. Posit. Behav. Intervent. 13, 133–143. doi: 10.1177/1098300710379053
Stormont, M. A., Smith, S. C., and Lewis, T. J. (2007). Teacher implementation of precorrection and praise statements in head start classrooms as a component of a program-wide system of positive behavior support. J. Behav. Educ. 16, 280–290. doi: 10.1007/s10864-007-9040-3
*Sutherland, K. S., Wehby, J. H., and Copeland, S. R. (2000). Effect of varying rates of behavior-specific praise on the on-task behavior of students with EBD. J. Emot. Behav. Disord. 8, 2–8. doi: 10.1177/106342660000800101
Teerlink, E., Caldarella, P., Anderson, D. H., Richardson, M. J., and Guzman, E. G. (2017). Addressing problem behavior at recess using peer praise notes. J. Posit. Behav. Intervent. 19, 115–126. doi: 10.1177/1098300716675733
Thompson, M. T., Marchant, M., Anderson, D., Prater, M. A., and Gibb, G. (2012). Effects of tiered training on general educators’ use of specific praise. Educ. Treatment Children 35, 521–546. doi: 10.1353/etc.2012.0032
U.S. Department of Education (2021). Supporting child and student social, emotional, behavioral, and mental health. Washington, D.C: U.S. Department of Education.
Keywords: behavior-specific praise (BSP), professional learning (PL), evidence-based practices (EBPs), positive behavioral intervention and supports (PBIS), applied behavior analysis (ABA)
Citation: Pérez P, Royer DJ, Buckman MM and Lane KL (2025) Supporting educators to implement behavior-specific praise: a research synthesis. Front. Educ. 10:1443322. doi: 10.3389/feduc.2025.1443322
Received: 03 June 2024; Accepted: 20 May 2025;
Published: 10 September 2025.
Edited by:
James Martin Boyle, University of Strathclyde, United Kingdom
Reviewed by:
Sarah Wilkinson, University of Southern Maine, United States
Rhonda Miller, Coastal Carolina University, United States
Copyright © 2025 Pérez, Royer, Buckman and Lane. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Paloma Pérez, hks5sk@virginia.edu
†ORCID: Paloma Pérez, orcid.org/0000-0001-9380-7054; David James Royer, orcid.org/0000-0003-2882-1049; Mark Matthew Buckman, orcid.org/0000-0001-9332-0940; Kathleen Lynne Lane, orcid.org/0000-0001-6364-838X