ORIGINAL RESEARCH article

Front. Sports Act. Living, 20 January 2023
Sec. Sports Coaching: Performance and Development
Volume 4 - 2022 | https://doi.org/10.3389/fspor.2022.1066378

The 3Ps: A tool for coach observation

  • 1School of Health and Human Performance, Faculty of Science and Health, Dublin City University, Dublin, Ireland
  • 2Grey Matters Performance Ltd., Stratford upon Avon, United Kingdom
  • 3Moray House School of Education and Sport, The University of Edinburgh, Edinburgh, United Kingdom
  • 4Insight SFI Centre for Data Analytics, Dublin City University, Glasnevin, Dublin, Ireland

There is growing recognition of the value of “in situ” coach development practice across a variety of sporting contexts. Unfortunately, however, there remains a limited number of tools available with which to observe coaching practice. In this study, we pilot and test a quasi-systematic tool for observation in the form of the 3Ps. Drawing on a range of representational perspectives, the theoretically neutral labels of “procedure”, “planning”, and “process” were developed for the purpose of holistic observation. In order to test the tool, a group of experienced coach development practitioners (n = 10) integrated the tool into their practice over a 12-month programme of professional development. Those participants subsequently took part in semi-structured interviews, in which they expressed a strong sense of acceptability, perceived effectiveness, and positive opportunity cost. We propose that the 3Ps tool presents a holistic and practically useful means of observing coaches’ professional judgment and decision making. We also suggest future directions for the researcher who seeks to generate evidence in a naturalistic coaching context.

1. Introduction

Recent years have seen significant growth of the role of the coach developer (CD), with conceptual and practical advances highlighting the value of coaches receiving support to develop their practice (1). The term CD has often been used interchangeably with coach educator, mentor and, in some cases, tutor. More recently, the role has been defined as: “expert support practitioners who plan for, implement, and sustain strategies and interventions in support of skilled performance in sport coaching” (2, p. 4). This definition distinguishes the scope of the CD from other roles [e.g., coach mentor – (3)], and highlights the more active, embedded, and complex pedagogic role of the CD. Notably, given the inclusion of “expert” in the definition, it is important to consider how this process can best be achieved.

One approach to the advancement of coaching practice advocated for in the literature is professional judgement and decision making [PJDM – (4, 5)]. PJDM is built on the foundation of an “it depends” approach to practice: recognising the inherent complexity of the biopsychosocial circumstances of coaching, and the need to take into account a wide range of contextual and individual demands when coaching (6). Although founded on a decision-making perspective, PJDM recognises the messy social world of coaching practice, in which multiple agendas and priorities influence a coach's practice (7). Crucially, the approach adopts a “for” coaching perspective: one that seeks to equip coaches with the resources to enhance their practice (8), rather than examining the process from a theoretical standpoint. This distinguishes PJDM from other approaches in the coaching sphere, which have aimed to describe what coaching is based on its ties to a particular epistemological or ontological orientation (9). So, rather than aiming to model the coaching process, PJDM instead offers a philosophy for practice, and suggests a practically oriented basis from which coaches can make decisions (10). From this foundation, researchers have identified coach decision-making as an area of research interest (11–13), and PJDM has been adopted as a model for practice in multiple domains – including forming the basis for coaching standards in high performance and coach development (2, 14).

Although sometimes misunderstood as a licence for practitioners to ignore research findings, PJDM is in fact the essence of research-informed practice – a concept more broadly accepted in learning and development (15). This recognises that tightly controlled experimental designs are unlikely to be possible, or generalisable, given the biopsychosocial complexity of coaching (e.g., 16). Operationalising PJDM relies on several core constructs (6). These include: engaging in nested planning with the bigger picture in mind (17); identifying intentions for impact (18); utilising declarative knowledge to support decision making (19); and adaptive intuitive practice, making adjustments that account for the broader whole (10). Specifically, when applied to the PJDM of the CD, there are a number of cognitive demands, including understanding of the context, the coach, adult learning, the curriculum, process and practice, and of self (20). Therefore, PJDM does not consider coaching, or coach development, to simply be a cognitive process. It draws on multiple perspectives in order to enhance the coach's ability to act in the messy social world (cf. 21).

1.1. Observing coaching practice

In order to optimise the process of coach learning, there has been increasing emphasis on “in situ” support, with CDs observing coaching practice alongside offering developmental input for the coach. Behavioural approaches have long been used for the interrogation of coaching practice, including early work observing the practices of John Wooden (22) and use of the Coaching Behaviour Assessment System (23). Subsequently, behavioural tools have been used both as a means of gathering empirical data to understand the work of the coach (24), and in practice for feedback to a coach. A more contemporary approach has been the Coach Analysis Intervention System [CAIS – (25)], proving popular both in research and in pockets of CD practice. The advantages of behavioural approaches have been emphasised because of their capacity to provide data on the work of a coach and how they interact with athletes (26). Similarly, behavioural analysis can act as a means of enhancing a coach's awareness of their actions (27), especially when a coach lacks the basic self-awareness necessary to engage in realistic discussions about their practice (28, 29).

Despite these advantages, however, several limitations also hold for behavioural analysis of coaching (30, 31). Although this was never the intended purpose of behavioural analysis, behaviours cannot be used to predict effective coaching, nor do they offer a holistic view of coaching practice (32). This lack of holism presents a challenge for those looking to understand the relation between a coach's learning design, their coaching approach, and the experience of athletes (e.g., 33). Similarly, behavioural analysis has predominantly been used to observe coaches during training sessions (26) and, as such, it cannot cater to a fuller range of activities, especially when coaching practice extends beyond the track/field/court to a range of “off-field” settings (34). Naïve application has also seen behavioural analysis used to enforce “the right way” of coaching. Although strongly discouraged in the literature (35), this remains a feature of the practical coaching discourse and an active discussion point among policy makers who seek to audit coaching practice. The limitations of this naïve approach have long been recognised, with the suggestion that behavioural analysis should instead be used to check for congruence against a coach's intentions (36).

More recent scholarship advocates for the use of dialogic pedagogy when interrogating a coach's practice (37). This dialogic approach may allow for researchers to utilise some of the tools that have been developed with the purpose of helping coaches to plan for and reflect on their practice, for example, the Coaching Planning, Practice and Reflective Framework (38, 39), and the “Big 5” structured approach to critical reflection (40). However, if these are employed as a means of generating feedback, there may be limited utility for the coach who is more fully aware of the pedagogical strategies that they are deploying (41). In short, whilst offering utility, behavioural observation remains subject to the critique that:

When the tasks that people are doing are complex, it is not enough to simply observe people's actions and behaviours—what they do. It is important to find out how they think, and what they know, how they organise and structure information and what they seek to understand better (42, p. 3).

For both practice and research, this leaves a choice of observational tools that all use behavioural analysis as a base, with all its acknowledged strengths and limitations. Until now, no alternative observational tool has been developed that supports the CD, coach, or researcher in critically observing practice through a PJDM lens. In essence, there is no approach that enables a view on the most fundamental feature of PJDM: the “why?” of practice (43). Therefore, this study had two overarching aims. First, to generate a tangible observation framework of coaching practice, grounded in a PJDM approach. Second, to explore the practical acceptability of the tool in an applied setting (acceptability being: “the extent to which people delivering or receiving [an] intervention consider it to be appropriate, based on anticipated or experienced cognitive and emotional responses to the intervention” (44, p. 4)). Specifically, we wanted to understand concurrent and retrospective acceptability for observing coaching practice. The focus for our research was therefore on whether practitioners understood the nature of the intervention, the opportunity cost compared to other approaches, and the perceived effectiveness and impact on the practitioner (44).

2. Method: piloting the tool

Drawing on the PJDM perspective, we aimed to develop a practical tool that offered utility to practitioners across a range of perspectives, without tying the CD to a specific decision-making paradigm. As such, the 3Ps (procedure, planning, and process) are theoretically neutral labels with which to categorise elements of a coach's practice. They were developed by drawing on a range of representational theories (cf. 45), including information processing (46), macrocognition (47), and literature from the active inference perspective (48).

Earlier conceptualisations of PJDM drew on both the dual systems approach to decision making and constructs from the macrocognitive paradigm (5). The dual systems perspective suggests that system 1 decisions are fast, frugal, and intuitive; with system 2 decisions being slower and more deliberative (49). Macrocognition offers a variety of cognitive functions used by practitioners including recognition primed decision making, sensemaking, projection, knowledge construction and flexecution (47). Between dual systems and macrocognitive perspectives, there is coherence and agreement on the core elements, with differences being more “emotional” than intellectual (50, p. 518). The key difference is that, by capturing a range of different cognitive processes, the macrocognitive perspective promotes a view that expert decision-making is not a reduction of bias or mistakes, but instead rests upon discoveries, insights, and the use of mental models in action (51). Across both bodies of work, there is also recognition that faster, more intuitive processes are not distinct from more deliberate processes, but are interlinked, and that actions typically result from the relative weighting of both systems (52).

Since the development of the original PJDM work, there has been significant growth in the active inference paradigm, an approach with different epistemological roots (53). Active inference explicitly seeks not to challenge previous psychological frameworks, but instead to underpin key constructs from the psychological literature (54). Active inference suggests that all human behaviour follows the imperative of minimising the surprise of sensory observations with the active control of action–perception loops.1 Behaviour is theorised to result from deliberative and habitual processes, the contribution of each depending on the level of experience in context and relative investment of cognitive resources (54). Deliberative processes are those oriented towards the reduction of uncertainty for epistemic purposes; they are more versatile, but slower and more costly (55). Habitual processes are those based on stimulus–response associations, considered to be fast, but inflexible, and promoted by increased experience in a specific context (56). Habit formation can also occur through the observation of goal-directed behaviour and the engagement of deliberative processes (57, 58).

At their root, these approaches clearly offer different perspectives; but a degree of overlap would seem face valid (e.g., 50, 54). Any attempt to amalgamate different positions may lack coherence at their ontological roots; yet, from a pragmatic perspective, where one is interested in the “cash value” of knowledge (59), this overlap may be of value to practitioners (60). Thus, in real-world settings, we suggest that there is value in a multiple model approach (61). On this basis, we wanted to develop an approach that directly addressed the issue of moving beyond what a coach does, to an understanding of their decision-making processes, incorporating both deliberative and more intuitive decisions. As such, the 3Ps are not novel psychological constructs, but instead theoretically neutral labels representing the different processes that a coach might engage in. These, in turn, provide lenses through which a coaching environment can be observed, both in and out of session, allowing for inferences to be made about a coach's mental world. This acknowledges the inherent limitations of observational analysis and the subsequent interpretive role of the observer. It encourages observers to cast themselves in the mental world of the coach and athlete during observation, with the aim of better understanding a coach's practice. The Ps are defined, and examples of the underpinning theories for each element given, in Table 1.

Table 1. An overview of perspectives informing the 3Ps.

The tool explicitly recognises the more competency-based and administrational elements of practice that could fairly be seen as “best practice”; that is, the basic competencies that could be identified as being demonstrably “good” or “bad” – termed here as “procedure”. “Planning” groups the slower, more deliberative elements of practice and expertise, while “Process” refers to the more intuitive elements that underpin adaptive progress towards intentions (75). Built into the approach is an acknowledgement that there is no arbitrary line between the constructs categorised as either process or planning. Indeed, any action taken by a coach is likely to have both faster, more intuitive elements and slower, more deliberative underpinnings. This is explicitly recognised by all of the constructs that inform the 3Ps structure. We can, however, make useful generalisations about styles of cognition, as has also been the case in the literature (52). Categorisation is a matter for the professional judgement of the observer and should be subject to later triangulation. In addition, there is no arbitrary line to be drawn between a coach's approach on or off the dojo/court/pitch/track. For example, a feedback conversation with parents following a training session will rely on the same processes as those used during the session. In practice, this enables CDs to observe all elements of coaching, within and beyond the training or competition context.

2.1. Intervention and testing the 3Ps

Following the development of the framework, we wanted to pilot the use of the 3Ps tool, testing its utility and acceptability in coach development practice. Adopting a similar approach to one previously used to pilot coach development tools, the study was conducted in the context of an advanced CD training programme (40). A participant group of 10 experienced practitioners each opted into the development programme via their national sporting organisation (9 male and 1 female, mean age = 40.5 years, SD = 6.74), with a mean 21.4 years’ coaching experience (SD = 5.53), and a mean 11.8 years’ experience working to support coach learning (SD = 4.81). All participants held previous qualifications as a coach educator or mentor, eight held the highest coaching award available in their main sport, and seven held master's degrees in an area related to sports coaching. Participants in the sample were supporting coaches on the full spectrum of experience, from relative novice to highly experienced professional; and across the domains of children's participation, talent development, and performance (35). Given the limited population of coach development professionals, no further demographic information is offered in order to protect the anonymity of the participants.

Over the course of a 12-month programme of development, participants engaged in training to support their practice as CDs. This included support for broader elements of CD practice such as: understanding PJDM as an approach, various pedagogical approaches, and feedback/debrief processes. Specifically, it also offered an introduction to the 3Ps, their theoretical underpinning, and suggested means of practice from the author team. Part of this support advised CDs that observations could be conducted in multiple ways. For the purpose of the 3Ps, CDs were advised to apply a structure to their field notes, based on a split of three columns, one representing each category. In addition, CDs were advised to add a time stamp to each observation, enabling the capture of observations made prior to what would traditionally be considered “in session” or competition (e.g., conversations with athletes and other coaches prior to training). In order to support the multiple lens approach, CDs were advised to deliberately switch between different frames of reference during observation, in line with each “P”. It was recommended that CDs review these field notes following an observation, with the aid of any video or audio captured from the coaching environment. Concurrent with this “training”, CDs had the opportunity to embed the 3Ps in their practice. Support was offered by the first and second authors (JT and ÁM) acting as “meta coach developers”, with two “in situ” visits to observe the practice. These visits were supported by follow-up training interventions on other elements of CD practice, along with retrieval of initial ideas. Although by opting into the programme the participants could be considered as motivated to develop their professional practice, they were under no obligation to integrate the 3Ps in their work, nor deviate from their normal practice.
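To make the suggested note-taking structure more concrete, the sketch below illustrates one possible way of representing the three time-stamped “columns” described above. This is a minimal, purely illustrative example in Python; it was not part of the programme materials, and all class, function, and category names are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

# Hypothetical category labels mirroring the three lenses described above.
CATEGORIES = ("procedure", "planning", "process")

@dataclass
class Observation:
    """A single time-stamped field note, tagged with one of the 3Ps."""
    timestamp: datetime
    category: str  # one of CATEGORIES
    note: str

@dataclass
class SessionLog:
    """All notes gathered around a single coaching session or visit."""
    coach: str
    observations: List[Observation] = field(default_factory=list)

    def add(self, category: str, note: str) -> None:
        """Record a note, stamping it with the current time."""
        if category not in CATEGORIES:
            raise ValueError(f"category must be one of {CATEGORIES}")
        self.observations.append(Observation(datetime.now(), category, note))

    def by_category(self, category: str) -> List[Observation]:
        """Return the 'column' of notes for one lens, in the order recorded."""
        return [o for o in self.observations if o.category == category]

# Example usage: notes captured before and during a session.
log = SessionLog(coach="Coach A")
log.add("process", "Pre-session chat adapts warm-up after athlete reports a heavy week")
log.add("procedure", "Equipment and safety checks completed before athletes arrive")
log.add("planning", "Session design links this block to a competition six weeks away")

for cat in CATEGORIES:
    print(cat, [o.note for o in log.by_category(cat)])
```

In practice, the same structure could equally be kept as three columns in a notebook or spreadsheet; the point is simply that each observation carries a time stamp and is assigned to one of the three lenses, so notes can later be reviewed both chronologically and by category.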

2.2. Data collection

Following protocol approval by the Dublin City University Ethics Committee, the first and second authors (JT and ÁM) conducted semi-structured interviews with the CD participants to understand their existing practice, concurrent and retrospective acceptability of the tool, and creative practice. A semi-structured interview guide was constructed in order to gauge each participant's understanding of the intervention, opportunity cost compared to other approaches, perceived effectiveness, and impact on self-efficacy of the practitioner. Specifically, we sought to understand (i) the CD's previous observational practice; (ii) if and how they had used the 3Ps; (iii) changes to their coaching observations; (iv) if they had made adaptations to the tool; and (v) perceived changes in their wider professional practice. Interview questions included: “Can you outline your historic approach to the observation of coaches?”, “If you have, how have you used the 3Ps in your practice?”, “Has your practice changed in any way because of the 3Ps?”, “Have you, or do you foresee any advantageous adaptations of the tool?”, “What are the relative strengths and weaknesses of the tool?” Interviews were conducted following the completion of the wider development programme, in order to reduce any perceived need for impression management. It was made clear to participants that they were in no way obliged to take part in the research as part of the development programme. However, the richness of interview data was enhanced by the history of interaction between researchers and participants, and the accompanying understanding of professional practice and their social context. All interviews were conducted using video-conferencing technology (Zoom Video Communications, San Jose, CA, USA, Version 5.7) at a time to suit the participants. The interviews were audio-recorded and subsequently transcribed verbatim.

2.3. Data analysis

Due to the relative novelty of the subject, a deductive and then inductive content analysis was conducted to examine the acceptability of the 3Ps. This process reflected the assertion of Braun and Clarke (76) that coding and data analysis involve a combination of processes. The deductive phase enabled the identification of information to address the research question; the inductive phase ensured the analysis was open-ended and that respondent-meaning was emphasised. Data analysis using QSR NVivo software took place in five steps: (1) Following transcription, the transcripts were read and re-read for familiarisation. This enabled the first author (JT) to become more familiar with the content, with notable material being highlighted and annotations made. (2) In this next step, quotations were deductively tagged in relation to (a) whether the 3Ps was a tangible observation framework of coaching practice, grounded in a PJDM approach, and (b) the extent to which a group of experienced CD practitioners accepted the tool. (3) Following this, a thorough inductive content analysis was performed, which involved moving recursively between open coding, focused coding and organising categories into higher order themes. (4) Constant comparison across different participant interviews and critical reflection was used to guide this analytical process (77, 78).

2.4. Trustworthiness

As is recommended by Nowell and colleagues (79), there is a need to demonstrate that qualitative research is conducted methodically and rigorously. In doing so, trustworthiness can be achieved through establishing credibility, transferability, dependability, confirmability, an audit trail, and deploying a reflexive approach (79). To this end, the second author (ÁM), another experienced qualitative researcher, acted as a theoretical sounding board to encourage reflection and consideration of the broader dataset (80). The third author (DC) also acted as an overall critical friend, challenging broader theme-generation (81). In addition, the first author (JT) maintained a reflexive journal for the purpose of reflecting on the power dynamics during data collection, and the role of the researcher in shaping knowledge-generation. The diary also served as an audit trail, documenting methodological steps, the interpretation of data, and managing the analysis process (79). We ask the reader to judge the credibility and transferability of findings, based on the significant use of participant data and the thick descriptions presented in the results.

3. Results

Results are represented by six themes, supported by direct quotes from CDs throughout. CDs described their previous practice, the impact on their “in situ” practice, the impact on broader professional practice, the perceived advantages and disadvantages, and any innovative use of the tool.

3.1. Previous practice

CDs described the use of three predominant approaches to their practice: the use of competency checklists, structured behavioural observation, and unstructured general observations (Table 2). CDs reported a general sense of dissatisfaction with these approaches, with a feeling that no single observational tool offered them what they needed to enhance coaching practice. For example, some CDs reported the naïve use of behavioural tools: “we had behaviours that we look for, you know, questioning and letting players explore. I would count up how many times they used good behaviours and bad ones” (CD10).

Table 2. Historic use of observational strategies by CDs.

Given the experience of some of the CD group, a number reflected on the changing nature of their practice over time. As a result, some had used various approaches over the time that they had been a practising coach educator or developer. For example, CD3:

I’ve been around long enough to go through different processes. Initially, it was a tick box based around what the coach did and how they did it, the coach had to evidence being competent within each stage to get a tick. That process has changed over the last three four years to using a behaviour template.

3.2. Impact on “in situ” practice

For CDs, the 3Ps were utilised in three predominant ways: first, as a tool for structuring their observation of coaching practice; second, as a means to inform their feedback, or debrief processes; and third, as a frame through which to enhance their overall sensemaking of the coach's unique circumstances. These themes are presented in Table 3.

Table 3. Impact of 3Ps on “in situ” practice.

3.2.1. Structuring observation

In the case of acting as a tool for structured observation, CDs placed a high value on the use of the 3Ps:

It is really effective for structured observations and will give me most of the things that I needed from the session and the coach's environment … that might be why it has been so well received; it's made a big difference to our practice (CD4).

The 3Ps allowed CDs to deliberately change frames of reference in observation, for example, CD7 suggested that: “rather than trying to look at everything, it really helps me organise what I am looking at, it helps my mind be more organised, almost like drawers to open and think a different way”. A significant part of this focus allowed for the observer to consider how a coach's intentions and mental processes were influencing practice. The deliberate switching of lenses seemed to characterise CDs’ adoption of the tool, seeking to understand the same events, but from different perspectives. It also seemed to support CDs’ metacognition and monitoring of what they were attending to:

It allows you to think “why did I see that?” Why did the coach ask that question instead of doing it this way? And ultimately, I think this really helps me to impact the coach later (CD2).

Similarly, CD9 described how the structure influenced metacognitive monitoring of their own approach:

The biggest thing is that I’m zoomed out for longer. Before I’ve been inclined to pay attention to one thing. Now, committing less to an area when it pops up in the session, but also able to look at the whole. I guess, a zoomed in and zoomed out approach isn’t it. I look at the full detail and breadth of what's happening. It helps me get into a specific part of the session, or much broader, into the whole environment.

It was this attention to the wider coaching environment that was perceived to be of significant benefit, with the CD not only observing within a session, but observing more broadly and considering the wider context of the coaching environment: “the structure really helps me, not just in the session, but also the rest of the environment” (CD10). This process, in turn, seemingly promoted a holistic means of observing coaching practice.

3.2.2. Sensemaking

The breadth of the 3Ps structure allowed for a wide range of data to be collected from observations. Along with in-observation sensemaking, prompted by the nature of the categorisation process, post-observation sensemaking was supported. The tool's structure allowed for a review, both of notes taken and of video footage of coaching observations:

I revisit my notes and the video afterwards. They [3Ps] have allowed me to think again about what I saw, to really ask the question of why things are happening and I’m taking time to do that … I had this really difficult session with a coach, I didn’t know where to start because there were so many issues. The 3Ps enabled me to get the coach thinking about specific areas without me really having to go any deeper until I had the time to reflect and debrief for myself (CD2).

This reflection and sensemaking also took place in slower time, considering the needs of the coach and where the session fitted in the broader context. This was often described by CDs as a process of deciding which data to use with the coach given the volume of potential avenues for further investigation: “I have to narrow things down to make sure they get some sort of coherence from me” (CD8).

3.2.3. Informing post-observation learning strategies

The third use of the 3Ps tool was in crafting CDs' pedagogic approach for the coach in the immediate follow-up after an observation, with the framework directly informing their feedback or debrief processes. This allowed CDs to steer conversations with coaches, to keep a manageable number of focal points for reflection, and to debrief with good judgement: “previously, I was being a bit too judgmental, and giving my own opinion, whereas now, I’ve got options in the debrief” (CD2). In some cases, CDs chose to explain the tool to the coaches that they worked with:

I’ve used it to help structure feedback. Rather than bouncing all over the place, I have explained to them and asked them to consider their thought process, almost auditing what I saw. This bit of debrief gets them reflecting and we can then look at the concepts that I have observed (CD6).

Explaining the framework to their coaches enabled greater understanding and aided the coach's reflective processes.

After working with a coach for a while, I asked them to go through the 3Ps themselves when they were reviewing a session. They used the framework, watching it [the session] back and making notes for discussion (CD10).

3.3. Impact on professional practice

In addition to shaping their practice with the coach, using the 3Ps also seemed to influence CDs' broader practice, encouraging them to approach the development of coaches in subtly different ways. CDs described other impacts on their work, indeed shifts in their professional philosophy as a whole, and how the tool impacted their perceptions of their role. These themes are presented in Table 4.

Table 4. Impacts on professional practice.

3.3.1. PJDM of the coach

The first and most prominent area of change for CDs was the shift from observing only what a coach did, to observing with an interest in why the coach was taking the actions that they were:

It's [the 3Ps] way more contextually relevant, and more personally relevant to the coach. I’m really trying to understand why they’re doing what they’re doing, using my understanding of the coach and their circumstances (CD7).

This focus on the PJDM of the coach allowed the CD to support the development of a coach's knowledge in context, rather than in the abstract:

It has significantly impacted the value I put on hearing from the coach, treating them as a person and a professional. It is now about working with their embedded knowledge, helping them to reflect on their practice and their beliefs. It's changed my view away from just thinking about best practice (CD8).

This shift towards a more contextualised understanding of the coach and their work seemed to give the CDs a greater insight and a perception that they were better able to support individuals' learning and development.

3.3.2. Focus on longer term impact for the coach

The second theme concerning impact on the CDs' work was the view that using the 3Ps enhanced the ability of CDs to see beyond the confines of a specific session and identify areas of impact for the coach. This was partly a result of putting themselves in the mental world of the coach and using data captured to consider the broader impact on the coach's learning and development. For example, CD8:

It has helped me move from being very procedural in my practice, to a focus on the “why?”. The epistemic stuff. I was at a training session with a coach last night. It was very structured, very drill based. There's nothing inherently wrong with that, but it was clear the girls were just going through a practice without any purpose…. We had a really good conversation afterwards about his reasoning, it really helped me get into and understand his beliefs (CD8).

This necessitated a change from offering a personal perspective on what the CD would do if they were the coach, to an interest in longer-term impact, helping the coach to reflect on their work based on longer-term needs: “previously I observed and just gave my opinion. Using the 3Ps, it focuses me a wee bit more on the bigger picture” (CD2).

3.3.3. Informing PJDM of the coach developer

The final change in practice concerns the shift in the PJDM of the CD. In all cases, CDs described a sense of being better-equipped to support coaches and being more informed in the decisions that they took to support the coach. For example, in the case of CD4:

It informs me formulating logical steps. I am thinking two, three, four steps ahead. Those three, four steps ahead are in the back of my head, I'm thinking: “how is the coach going to react?”. Which bit do I generate feedback on first? (CD4).

CDs expanded on this view and described the feeling that it enabled them to meet the needs of more challenging coach cases:

With the tricky ones [coaches] I always used to get stuck thinking “where do you start?” This can take you as deep as you need to go. It really helps with my process … whilst the 3Ps might never be spoken about between me and the coach (CD4).

CDs described using greater degrees of flexibility and creativity in their practice, using the levels of data generated to meet the bespoke needs of the coach. This was especially the case for those CDs who had previously either relied on competency-based approaches, or naïve interpretations of behavioural observation: “it invites a more flexible approach from me. I can use different tools afterwards. It isn’t just: ‘you did this, you did that’, or ‘you didn’t tick this box’. It gives you options” (CD3). Part of this option-generation came from CDs using the 3Ps to influence their pedagogic process following the observation. This typically took the form of structuring feedback or debriefing processes immediately following observations:

I feel it helps with my approach as a CD. I really value practical, intimate coaching conversations with the coach. It is potentially my bias, but I feel this [3Ps] helps me do it. I can have a coherent flow to my work (CD6).

This, in turn, seemed to influence a consideration of options: “I’m much more confident providing or generating feedback or debriefing effectively. This [3Ps] gives me options and coherent routes to follow, I have really genuine options” (CD1). In essence, it seemed to offer possibilities for informing an appropriate pedagogic strategy, whether guiding a coach through a debrief, offering feedback, or generating feedback.

3.4. Perceived opportunity cost of the tool

As a feature of the pilot, CDs were asked what they considered the opportunity cost of the 3Ps to be, compared with other coach observation tools they had experience of (as outlined in Table 5). As identified earlier, the approaches that CDs were aware of included competency checklists and various forms of behavioural observation.

Table 5. Perceived opportunity cost of the 3Ps.

In comparison with a competency-based approach, there was a strong perception that the 3Ps tool offered significantly more for CD practice. The key advantages were described as the adaptability of the tool and its ability to meet the needs of the coach and their context. One negative point that was identified was the potential for governing bodies to prefer a universal standard against which to assess every coach, regardless of age or stage. Another negative was the perception that the 3Ps tool is not something that could be used without a level of expertise.

A more nuanced view of behavioural approaches was offered. Behavioural observation was discussed by CDs as a different tool, for a different purpose. In some cases, CDs described naïve applications of behavioural analysis as a contrast. In comparison with less naïve behavioural approaches, however, CDs welcomed the flexibility and holism that the 3Ps offered. The tool also appeared to be favoured for its ability to generate a broader range of data: across the environment, beyond what the coach did, and towards understanding why a coach chose an approach. CDs also described being able to move beyond observing a coach's behaviour alone, to considering their learning design and the technical/tactical components of their coaching: “It helps me go beyond coach behaviours. It gets into other areas of coaching, the technical, tactical and their practice design and what's going on with the athletes” (CD10). There was, however, a perceived disadvantage to the use of the 3Ps tool when working with coaches who lacked self-awareness in their approach and who were resistant to the input of the CD. In this regard, it was perceived that the more “objective” data generated by systematic behavioural analysis might be useful for offering feedback to a coach.

3.5. Innovative use of the tool

The final area of interest in piloting the tool was to understand if the 3Ps had been used in creative or innovative ways. Most CDs suggested no significant adjustments to the tool. Where CDs had innovated or foresaw changes in their use of the tool, these were additive, rather than adjusting the structure of the tool. As an example, CD1, who had experience of behavioural observation, discussed a blend with the 3Ps for coaches who lacked self-awareness or saw their practice paradigmatically differently to the CD:

If they are miles off in their self-awareness, I might start using the 3Ps with some behaviour tracking. I might run both at the same time, as long as I’ve got video, but I think I will always start with the 3Ps and use behaviours in the procedural bracket if I have video to refer back to (CD1).

This additive combination was also noted by CD9 who suggested: “if there is a clear need, I have used other frameworks like the Coach Planning and Reflection framework. But probably only when the focus has been narrowed down a little bit”. Others suggested a potential expansion of the tool, allowing for a more fine-grained analysis of observations. This was proposed as a means of either contributing to observation at the time, or, for greater depth, utilising video footage for slower analysis: “you could have subsections in each of the factors. So, you could have macro, meso, micro aspects of planning. For procedure, you could have subsections to analyse coaching behaviour” (CD6).

Other innovations included the narrow use of the tool to focus entirely on a particular area of practice based on previously identified needs, and how this might influence what a CD may observe: “I might just take one of the elements, for example planning to generate feedback. It changes the emphasis on what you might want to observe, like a planning meeting” (CD9). Others suggested that after working with a coach for a while, they might ask them to observe the session using the approach, both as a means for generating a different type of feedback and also as a check for coherence between the CD and the coach: “after working with a coach for a while, I can see myself asking the coach to go through the 3Ps themselves when they are reviewing the session. Watching it back and making notes for discussion” (CD10).

4. Discussion

The purpose of this study was to pilot a research-informed tool for the purpose of coach observation, testing its utility and acceptability in CD practice. Findings support the practical utility and acceptability of the tool in a small cohort of experienced CD practitioners. In all cases, the tool was perceived as highly acceptable by participant CDs, who discussed changes to their professional practice as a result of the trial. We have also identified significant strengths of the tool, with it seemingly offering a more holistic and flexible view of coaching practice. In addition, with a pragmatic orientation in mind, we aimed to understand the opportunity cost of the tool in applied practice, relative to other observation approaches.

4.1. Professional judgement and decision making (PJDM)

Aligned with the purpose of the tool, there was a strong view that it enhanced CDs’ ability to enter the mental world of the coach and make inferences about the cognition behind their action. This seemed to enable CDs to move beyond what they were seeing and support active sensemaking, forming tentative hypotheses for coach decision making (66). As a result, we would suggest that the tool allows CDs to observe and infer coach decision making on multiple levels. It appears that the 3Ps may enable a fuller understanding of a coach's needs. While it may be less useful for coaches with low self-awareness of their practice, or those who are less open (82), the perception of utility held for coaches across a spectrum of practice from beginner to expert. Where the emphasis for a beginner coach might sit more at the procedural end, developing basic competencies, the focus for more experienced coaches might be more on the intuitive or deliberative elements of coaching expertise (83). Therefore, in addition to observation, CDs felt that the tool offered a frame with which to make sense of the needs of the coach, along with informing the pedagogic strategies that they themselves might deploy following the observation (65). As such, the 3Ps tool could inform a flexible approach to CD practice, where professional judgments could be made regarding the data used to offer, or generate, feedback. In essence, appropriate use of the tool seemed to inform how the CD actively shaped the learning experiences of coaches (84). CDs also believed that the tool had a broader impact on their professional practice than “in situ” observation of coaches alone. There was a sense that use of the tool enhanced professional effectiveness by helping CDs to focus on reasoning strategies and underlying beliefs (85). This focus was perceived to help CDs engage coaches in more transformative reflective practice (cf. 86).

4.2. Holism

In addition to this perception of improved effectiveness, CDs also described a greater sense of holism, taking the view that the 3Ps encouraged a greater breadth of observation. This allowed for the capture of a coach's practice beyond discrete actions. The 3Ps tool may encourage a broader view, considering the meaning behind a coach's approach. There was also a sense that it enabled observation beyond the confines of a training session or competition, into wider coaching settings, for example classroom sessions (87) or “offline” work with other coaches and staff (17). It also allowed CDs to consider the nuanced interpersonal dimensions of coaching practice, for example, why a coach might be approaching a given interaction with a particular affective tone (88). That is, the tool offers the potential for the CD to infer the social intuition of the coach, the “rapid and automatic evaluation of another person's cognitive and/or affective state” (69, p. 308). In essence, this allows CDs to quasi-systematically observe a coach's learning design (89), their pedagogic approach (90), interpersonal dynamics (91), and to infer the experience of athletes (33). Parallel consideration of these interlinking constructs presents the ability to reflect on these observations through multiple lenses, including the pedagogic, various “ologies”, and, where appropriate, the technical and tactical (19). An example is the recent move to conceptualise feedback as a process, rather than something the coach does (cf. 92, 93).

This holism would seem to be a particular advantage for moving forward from the conceptualisation of coaching as something that the coach does, to a bi-directional, or indeed multi-directional, process. The 3Ps tool can offer a more holistic view of coaching, taking account of domains that previously might not have been structured elements of the CD's observation, and so offering an additional dimension to CD practice and research. Of all these elements, it is the technical/tactical dimension that has not traditionally formed part of CD practice. To be clear, we make no suggestion that the tool fundamentally changes the CD role frame. Instead, we suggest that the tool offers significant flexibility depending on the expertise of the CD and the nature of the CD–coach contract.

4.3. A tool for practice

The genesis of the 3Ps tool was a matter of practicality, aiming to offer a research-informed approach to the observation of practice primarily for the use of CDs. In considering the opportunity cost of using the tool, CDs, while emphasising its broader utility, noted the challenge of using the 3Ps with a coach who lacked any self-awareness in their practice (29). This may be an issue when the optimal impact for a coach comes from seeing a disparity between what they think they did and what they actually did. For the CD, it may be the case that using tools like the CAIS (25) and subsequent quantitative feedback may prove useful in moderating this tendency (29). Importantly, in no cases was the use of a competency-based approach identified as having advantages over the 3Ps, with the exception that it may provide a false comfort of apparent standardisation for national governing bodies.

For the CD, there is a need to see where and when various tools are most appropriate, and how they might inform observational practice. In both cases, it is important to note that CDs should use the 3Ps in a qualitatively different manner, and for different purposes, than other observational approaches. As noted by one of the participants, the 3Ps should be used as a window into the PJDM of the coach, informing the future direction of CD practice. As an observational tool, it can provide a range of data with which to tackle the cognitive demands faced by the CD (20). Given the context of many coaching settings, the 3Ps may also enhance the observation of whole coaching teams, rather than just the individual coach. The tool offers the flexibility of multiple lenses on the coaching process, allowing the observer to deliberately adopt different foci. Finally, there is the question of how much support CDs require in order to use the tool – as highlighted, it appears to require a significant depth and breadth of knowledge on the part of the observer (19). In practice, the 3Ps cannot be used as a formula (cf. 94); effective use will rely on the breadth of knowledge and expertise identified as a minimum standard for effective CD practice (2).

4.4. A tool for research

In addition to practical use, we suggest that the 3Ps tool also presents significant potential for evidence-generation in coaching. As a naturalistic research tool, it is not designed to provide the type of evidence emanating from tightly controlled experimental designs (68). As such, for the purpose of research, the approach should not be seen as a replacement for behavioural observation, which offers a more controlled and systematic, but less holistic, approach. The 3Ps tool actively recognises the interpretive role played by the observer in aiming to understand the mental world of the coach and the broader context. In this regard, if the aim is to understand the “why” of practice, there is a need to understand what coaches are thinking, their knowledge, and how they make sense of events (31). Building on the reflections of Gallimore and Tharp (95), who suggest coupling systematic observation with qualitative methods for a richer account of coaching practice, future CD and research practice may wish to couple the 3Ps with other methods as a means of triangulation and “thickening” data. For the CD, this may be as simple as engaging in reflective questioning with a coach (37, 40, 86). However, for the researcher, addressing the long-identified lack of evidence from real-world settings (35) could involve triangulation using cognitive task analysis tools (96). Specifically, tools such as the critical decision method (97) could be coupled with observation to deepen insight. Depending on the style of research, or CD work, it may be appropriate for various combinations and adaptations of CTA tools to be used – for example, a critical decision audit to understand the coach's perspective on observed practice and the knowledge sources that underpin their work (98). This points to the opportunities presented by the various tools that have been developed for knowledge elicitation and how they might be combined with observations of coaching practice (99). In short, there appear to be several potential future applications of the 3Ps tool for both research and practice, each offering a fuller capture of a coach's naturalistic cognition and their practice.

4.5. Limitations

It should of course be recognised that there are a number of limitations presented by this study. First, although there are several advantages conferred by established relationships in qualitative research (100), it is important to consider the potential for impression-management influencing data collection (101). Deliberate steps were taken to mitigate this with interviews timed to take place after the conclusion of the CD development course and the programme review. To further combat this, we sought the views of participant CDs on the relative utility of the approach, against other common approaches, in accordance with the notion of acceptability (44). It is also important to acknowledge that the participants using the tool engaged in professional development that enabled them to build the declarative knowledge necessary to use the tool in practice. As such, it is unlikely that practitioners or researchers who wish to simply pick up and use the tool without this declarative understanding will have the same experience. Therefore, although the universal support for the 3Ps offers clear evidence for the acceptability of the tool, it may be the case that further research and validation among other populations are necessary. Finally, the potential for confirmation bias on the behalf of the research team is clear – for this reason, thick participant data is used throughout the results section to enhance credibility (79).

5. Conclusion

This paper has presented a novel approach to the observation of coaching practice with an emphasis on the PJDM of the coach. Findings suggest that the 3Ps tool may provide an opportunity to expand our understanding of PJDM in practice (5, 6) and support a base of evidence in practice that accounts for the context of particular approaches. While not suggesting that coaching is solely a matter of individual cognition, the framework was designed as a pragmatic tool to support the observation of coaching practice and make inferences about the PJDM of the coach. The tool was generated through an evidence-informed approach, drawing together multiple strands of representational research and professional practice. From a cohort of experienced CD practitioners, who had previously engaged in a 12-month programme of professional development, the tool received universal support for its acceptability, with high levels of understanding, positive reflections on opportunity cost relative to other approaches, and perceptions of effectiveness. We therefore suggest that the 3Ps may become a useful feature of the CD's and researcher's work, offering a tool with which to observe practice and support the development of expertise (102).

Data availability statement

The datasets presented in this article are not readily available in order to protect the anonymity of participants. Requests to access the datasets should be directed to jamie.taylor@dcu.ie.

Ethics statement

The studies involving human participants were reviewed and approved by DCU Research Ethics Committee (REC) reference: DCUREC/2022/040. The patients/participants provided their written informed consent to participate in this study.

Author contributions

JT, ÁM, and DC contributed to conception and design of the study. JT and ÁM conducted the interviews. JT, ÁM, and DC contributed to data analysis as outlined in the method section. JT wrote the first draft of the manuscript. All authors contributed to the article and approved the submitted version.

Funding

Funding received for open access publication from Dublin City University.

Conflicts of interest

JT, ÁM, and DC were employed by Grey Matters Performance Ltd.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

Footnote

1For a more detailed exploration, we refer the reader to Linson et al. (53).

References

1. Allison W, Abraham A, Cale A. Advances in coach education and development: from research to practice. Oxon: Routledge (2016).

2. CIMSPA. Coach Developer Standard V1.0 (2021). Available from: https://www.cimspa.co.uk/standards-home/professional-standards-library?cid=18&d=485.

3. Leeder TM, Sawiuk R. Reviewing the sports coach mentoring literature: a look back to take a step forward. Sports Coach Rev. (2021) 10(2):129–52. doi: 10.1080/21640629.2020.1804170

4. Martindale A, Collins D. Professional judgment and decision making: the role of intention for impact. Sport Psychol. (2005) 19(3):303–17. doi: 10.1123/tsp.19.3.303

5. Abraham A, Collins D. Taking the next step: ways forward for coaching science. Quest. (2011) 63(4):366–84. doi: 10.1080/00336297.2011.10483687

6. Collins D, Taylor J, Ashford M, Collins L. It depends coaching – the most fundamental, simple and complex principle or a mere copout? Sports Coach Rev. (2022):1–21. doi: 10.1080/21640629.2022.2154189

7. Potrac P, Jones R. Power, conflict, and cooperation: toward a micropolitics of coaching. Quest. (2009) 61(2):223–36. doi: 10.1080/00336297.2009.10483612

8. Collins D, Kamin S. The performance coach. In: Murphy S, editor. Handbook of sport and performance psychology. Oxford: Oxford University Press (2012). p. 692–706.

9. North J. Sport coaching research and practice: ontology, interdisciplinarity and critical realism. London: Routledge (2017).

10. Collins D, Collins L, Carson HJ. “If it feels right, do it”: intuitive decision making in a sample of high-level sport coaches. Front Psychol. (2016) 7(504). doi: 10.3389/fpsyg.2016.00504

11. Collins L, Collins D. Integration of professional judgement and decision-making in high-level adventure sports coaching practice. J Sports Sci. (2015) 33(6):622–33. doi: 10.1080/02640414.2014.953980

12. Downes P, Collins D. Examining the roles and consequent decision-making processes of high-level strength and conditioning coaches. Societies. (2021) 11(3):76. doi: 10.3390/soc11030076

13. Lyle J, Muir B. Coaches’ decision making. In: Hackfort D, Schinke RJ, editors. The Routledge international encyclopedia of sport and exercise psychology: volume 2: applied and practical measures. 1st Edn. London: Routledge (2020). p. 135–53. doi: 10.4324/9781315187228

14. CIMSPA. Coaching in high performance sport V1.7 (2019). Available from: https://www.cimspa.co.uk/standards-home/professional-standards-library?cid=18&d=463.

15. Neelen M, Kirschner P. Evidence-informed learning design: use evidence to create training which improves performance. London: KoganPage (2020).

16. Nash C, Taylor J. “Just let them play”: complex dynamics in youth sport, why it isn’t so simple. Front Psychol. (2021) 12. doi: 10.3389/fpsyg.2021.700750

17. Taylor J, Collins D. The talent development curriculum. In: Nash C, editor. Practical sport coaching. 2nd Edn. Oxon: Routledge (2022). p. 77–91.

18. Martindale A, Collins D. Enhancing the evaluation of effectiveness with professional judgment and decision making. Sport Psychol. (2007) 21(4):458–74. doi: 10.1123/tsp.21.4.458

19. Nash C, Collins D. Tacit knowledge in expert coaching: science or art? Quest. (2006) 58(4):465–77. doi: 10.1080/00336297.2006.10491894

20. Abraham A. Task analysis of coach developers: applications to the FA youth coach educator role. In: Allison W, Abraham A, Cale A, editors. Advances in coach education and development: from research to practice. Oxon: Routledge (2016). p. 53–65.

21. Cushion CJ, Armour KM, Jones RL. Locating the coaching process in practice: models “for” and “of” coaching. Phys Educ Sport Pedagogy. (2006) 11(1):83–99. doi: 10.1080/17408980500466995

22. Tharp RG, Gallimore R. What a coach can teach a teacher. Psychol Today. (1976) 9:75–8.

23. Smith RE, Smoll FL, Hunt E. A system for the behavioral assessment of athletic coaches. Res Q Am Assoc Health Phys Educ. (1977) 48(2):401–7. doi: 10.1080/10671315.1977.10615438

24. Gilbert WD, Trudel P. Analysis of coaching science research published from 1970 to 2001. Res Q Exerc Sport. (2004) 75(4):388–99. doi: 10.1080/02701367.2004.10609172

25. Cushion C, Harvey S, Muir B, Nelson L. Developing the coach analysis and intervention system (CAIS): establishing validity and reliability of a computerised systematic observation instrument. J Sports Sci. (2012) 30(2):201–16. doi: 10.1080/02640414.2011.635310

26. Cope E, Partington M, Harvey S. A review of the use of a systematic observation method in coaching research between 1997 and 2016. J Sports Sci. (2017) 35(20):2042–50. doi: 10.1080/02640414.2016.1252463

27. Partington M, Cushion CJ, Cope E, Harvey S. The impact of video feedback on professional youth football coaches’ reflection and practice behaviour: a longitudinal investigation of behaviour change. Reflective Practice. (2015) 16(5):700–16. doi: 10.1080/14623943.2015.1071707

28. Harvey S, Cushion CJ, Cope E, Muir B. A season long investigation into coaching behaviours as a function of practice state: the case of three collegiate coaches. Sports Coach Rev. (2013) 2(1):13–32. doi: 10.1080/21640629.2013.837238

29. Partington M, Cushion C. An investigation of the practice activities and coaching behaviors of professional top-level youth soccer coaches. Scand J Med Sci Sports. (2013) 23(3):374–82. doi: 10.1111/j.1600-0838.2011.01383.x

30. Abraham A, Collins D. Examining and extending research in coach development. Quest. (1998) 50(1):59–79. doi: 10.1080/00336297.1998.10484264

31. Crandall BW, Hoffman RR. Cognitive task analysis. In: Lee JD, Kirlik A, editors. The Oxford handbook of cognitive engineering. Oxford library of psychology. New York, NY: Oxford University Press (2013). p. 229–39.

32. Cushion CJ. Coach behavior. In: Lyle J, Cushion CJ, editors. Sports coaching professionalization and practice. London: Elsevier (2010). p. 43–62.

33. Taylor J, Collins D. The highs and the lows – exploring the nature of optimally impactful development experiences on the talent pathway. Sport Psychol. (2020) 34(4):319–28. doi: 10.1123/tsp.2020-0034

34. Richards P, Collins D, Mascarenhas DRD. Developing team decision-making: a holistic framework integrating both on-field and off-field pedagogical coaching processes. Sports Coach Rev. (2017) 6(1):57–75. doi: 10.1080/21640629.2016.1200819

35. Lyle J, Cushion C. Sport coaching concepts: a framework for coaching practice. London, UK: Routledge (2017).

36. Morgan G, Muir B, Abraham A. Systematic observation. In: Nelson L, Groom R, Potrac P, editors. Research methods in sport coaching. New York: Routledge (2014). p. 122–31.

37. Cope E, Cushion CJ, Harvey S, Partington M. Re-visiting systematic observation: a pedagogical tool to support coach learning and development. Front Sports Act Living. (2022) 4:962690. doi: 10.3389/fspor.2022.962690

38. Muir B, Morgan G, Abraham A, Morley D. Developmentally appropriate approaches to coaching children. In: Stafford I, editor. Coaching children in sport. Oxon: Routledge (2011). p. 17–37.

39. Till K, Muir B, Abraham A, Piggott D, Tee J. A framework for decision-making within strength and conditioning coaching. Strength Cond J. (2019) 41(1):14–26. doi: 10.1519/SSC.0000000000000408

40. Collins D, Collins L. Developing coaches’ professional judgement and decision making: using the “big 5”. J Sports Sci. (2020) 39(1):115–9. doi: 10.1080/02640414.2020.1809053

41. Ashford M, Cope E, Abraham A, Poolton J. Coaching player decision making in rugby union: exploring coaches espoused theories and theories in use as an indicator of effective coaching practice. Phys Educ Sport Pedagogy. (2022):1–22. doi: 10.1080/17408989.2022.2153822

42. Crandall B, Klein G, Hoffman RR. Working minds: a practitioner’s guide to cognitive task analysis. Cambridge, MA: MIT Press (2006).

43. Martindale A, Collins D. But why does what works work? A response to Fifer, Henschen, Gould, and Ravizza, 2008. Sport Psychol. (2010) 24(1):113–6. doi: 10.1123/tsp.24.1.113

44. Sekhon M, Cartwright M, Francis JJ. Acceptability of healthcare interventions: an overview of reviews and development of a theoretical framework. BMC Health Serv Res. (2017) 17(1):1–13. doi: 10.1186/s12913-017-2031-8

45. Clark A. Predicting peace: the end of the representation wars. In: Metzinger TK, Windt JM, editors. Open MIND. Frankfurt am Main: MIND Group (2015). p. 1–7.

46. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. (1974) 185(4157):1124–31. doi: 10.1126/science.185.4157.1124

47. Hutton R. Macrocognitive models of expertise. In: Ward P, Schraagen JM, Gore J, Roth EM, editors. The Oxford handbook of expertise. Oxford: Oxford University Press (2019). p. 190–218.

48. Friston K, FitzGerald T, Rigoli F, Schwartenbeck P, Pezzulo G. Active inference: a process theory. Neural Comput. (2017) 29(1):1–49. doi: 10.1162/NECO_a_00912

49. Sloman SA. The empirical case for two systems of reasoning. Psychol Bull. (1996) 119(1):3–22. doi: 10.1037/0033-2909.119.1.3

50. Kahneman D, Klein G. Conditions for intuitive expertise: a failure to disagree. Am Psychol. (2009) 64(6):515–26. doi: 10.1037/a0016755

51. Klein G, Wright C. Macrocognition: from theory to toolbox. Front Psychol. (2016) 7:1–5. doi: 10.3389/fpsyg.2016.00054

52. Kahneman D. Thinking, fast and slow. London: Penguin Random House (2011).

53. Linson A, Clark A, Ramamoorthy S, Friston K. The active inference approach to ecological perception: general information dynamics for natural and artificial embodied cognition. Front Robot AI. (2018) 5:1–22. doi: 10.3389/frobt.2018.00021

54. Parr T, Pezzulo G, Friston KJ. Active inference: the free energy principle in mind, brain, and behavior. Cambridge, MA: MIT Press (2022). doi: 10.7551/mitpress/12441.001.0001

55. Pezzulo G, Rigoli F, Chersi F. The mixed instrumental controller: using value of information to combine habitual choice and mental simulation. Front Psychol. (2013) 4:1–15. doi: 10.3389/fpsyg.2013.00092

56. Pezzulo G, Cartoni E, Rigoli F, Pio-Lopez L, Friston K. Active inference, epistemic value, and vicarious trial and error. Learn Mem. (2016) 23(7):322–38. doi: 10.1101/lm.041780.116

57. Friston K, FitzGerald T, Rigoli F, Schwartenbeck P, O’Doherty J, Pezzulo G. Active inference and learning. Neurosci Biobehav Rev. (2016) 68:862–79. doi: 10.1016/j.neubiorev.2016.06.022

58. Maisto D, Friston K, Pezzulo G. Caching mechanisms for habit formation in active inference. Neurocomputing. (2019) 359:298–314. doi: 10.1016/j.neucom.2019.05.083

59. Bryant A. Grounded theory and pragmatism: the curious case of Anselm Strauss. Forum Qual Soc Res. (2009) 10(3). doi: 10.17169/fqs-10.3.1358

60. Ashford M, Abraham A, Poolton J. A communal language for decision making in team invasion sports. Int Sport Coach J. (2021) 8(1):122–9. doi: 10.1123/iscj.2019-0062

61. Mosier KL, Fischer U, Hoffman R, Klein G. Expert professional judgments and “naturalistic decision making”. In: Ericsson KA, Hoffman RR, Kozbelt A, Williams AM, editors. The Cambridge handbook of expertise and expert performance. 2nd Edn. New York, NY: Cambridge University Press (2018). p. 453–75.

62. Collins D, Burke V, Martindale A, Cruickshank A. The illusion of competency versus the desirability of expertise: seeking a common standard for support professions in sport. Sports Med. (2015) 45(1):1–7. doi: 10.1007/s40279-014-0251-1

63. Hatano G, Inagaki K. Two courses of expertise. In: Stevenson H, Azuma K, Hakuta K, editors. Child development and education in Japan. New York: Freeman (1986). p. 262–72.

64. Kahneman D. A perspective on judgment and choice: mapping bounded rationality. Am Psychol. (2003) 58(9):697–720. doi: 10.1037/0003-066X.58.9.697

65. Klein G, Moon B, Hoffman RR. Making sense of sensemaking 2: a macrocognitive model. IEEE Intell Syst. (2006) 21(5):88–92. doi: 10.1109/MIS.2006.100

66. Taylor J, Nash C. Sensemaking for the coach developer. In: Nash C, editor. Coach development. Routledge (In Press).

67. Klein G, Crandall B. The role of mental simulation in problem solving and decision making. In: Hancock PA, Flach JM, Caird J, Vicente KJ, editors. Local applications of the ecological approach to human-machine systems. 1st Edn. Abingdon: CRC Press (1995). p. 324–58. doi: 10.1201/9780203748749

68. Klein G, Ross KG, Moon BM, Klein DE, Hoffman RR, Hollnagel E. Macrocognition. IEEE Intell Syst. (2003) 18(3):81–5. doi: 10.1109/MIS.2003.1200735

69. Gore J, Sadler-Smith E. Unpacking intuition: a process and outcome framework. Rev Gen Psychol. (2011) 15(4):304–16. doi: 10.1037/a0025069

70. Pezzulo G, Rigoli F, Friston K. Active inference, homeostatic regulation and adaptive behavioural control. Prog Neurobiol. (2015) 134:17–35. doi: 10.1016/j.pneurobio.2015.09.001

71. Gilovich T, Griffin D, Kahneman D. Heuristics and biases: the psychology of intuitive judgment. Cambridge: Cambridge University Press (2002).

72. Klein G. Flexecution, part 2: understanding and supporting flexible execution. IEEE Intell Syst. (2007) 22(6):108–12. doi: 10.1109/MIS.2007.107

73. Klein GA. A recognition-primed decision (RPD) model of rapid decision making. In: Klein G, Orasanu J, Calderwood R, et al., editors. Decision making in action: models and methods. Westport, CT: Ablex Publishing (1993). p. 138–47.

74. Martens R. Science, knowledge, and sport psychology. Sport Psychol. (1987) 1(1):29–55. doi: 10.1123/tsp.1.1.29

75. Ward P, Gore J, Hutton R, Conway GE, Hoffman RR. Adaptive skill as the conditio sine qua non of expertise. J Appl Res Mem Cogn. (2018) 7(1):35–50. doi: 10.1016/j.jarmac.2018.01.009

76. Braun V, Clarke V. One size fits all? What counts as quality practice in (reflexive) thematic analysis? Qual Res Psychol. (2020) 18(3):328–52. doi: 10.1080/14780887.2020.1769238

77. Corbin J, Strauss A. Basics of qualitative research: techniques and procedures for developing grounded theory. 3rd Edn. London: Sage (2008).

78. Côté J, Salmela JH, Baria A, Russell SJ. Organizing and interpreting unstructured qualitative data. Sport Psychol. (1993) 7(2):127–37. doi: 10.1123/tsp.7.2.127

79. Nowell LS, Norris JM, White DE, Moules NJ. Thematic analysis: striving to meet the trustworthiness criteria. Int J Qual Methods. (2017) 16(1):1609406917733847. doi: 10.1177/1609406917733847

80. Faulkner G, Sparkes A. Exercise as therapy for schizophrenia: an ethnographic study. J Sport Exerc Psychol. (1999) 21(1):52–69. doi: 10.1123/jsep.21.1.52

81. Smith B, Sparkes AC. Narrative inquiry in psychology: exploring the tensions within. Qual Res Psychol. (2006) 3(3):169–92. doi: 10.1191/1478088706qrp068oa

82. Collins D, Abraham A, Collins R. On vampires and wolves—exposing and exploring reasons for the differential impact of coach education. Int J Sport Psychol. (2012) 43(3):255–71.

83. Cruickshank A, Martindale A, Collins D. Raising our game: the necessity and progression of expertise-based training in applied sport psychology. J Appl Sport Psychol. (2020) 32(3):237–55. doi: 10.1080/10413200.2018.1492471

84. Clark D. Learning experience design: how to create effective learning that works. London: Kogan Page (2021).

85. Crowther M, Collins D, Collins L, Grecic D, Carson HJ. Investigating academy coaches’ epistemological beliefs in red and white ball cricket. Sports Coach Rev. (2022):1–23. doi: 10.1080/21640629.2022.2101912

86. Downham L, Cushion C. Reflection and reflective practice in high-performance sport coaching: a heuristic device. Phys Educ Sport Pedagogy. (2022):1–20. doi: 10.1080/17408989.2022.2136369

87. Mason RJ, Farrow D, Hattie JAC. An exploratory investigation into the reception of verbal and video feedback provided to players in an Australian football league club. Int J Sports Sci Coach. (2020) 16(1):181–91. doi: 10.1177/1747954120951080

88. Taylor J, Ashford M, Collins D. Tough love: impactful, caring coaching in psychologically unsafe environments. MDPI Sports. (2022) 10(6):83. doi: 10.3390/sports10060083

89. Hodges NJ, Lohse KR. An extended challenge-based framework for practice design in sports coaching. J Sports Sci. (2022) 40(7):754–68. doi: 10.1080/02640414.2021.2015917

90. Pill S, SueSee B, Rankin J, Hewitt M. The spectrum of sport coaching styles. New York: Routledge (2021).

91. Jowett S. Coaching effectiveness: the coach–athlete relationship at its heart. Curr Opin Psychol. (2017) 16:154–8. doi: 10.1016/j.copsyc.2017.05.006

92. Henderson M, Ajjawi R, Boud D, Molloy E. Identifying feedback that has impact. In: Henderson M, Ajjawi R, Boud D, Molloy E, editors. The impact of feedback in higher education: improving assessment outcomes for learners. Cham: Springer International Publishing (2019). p. 15–34.

93. Taylor J, Collins D, Cruickshank A. Too many cooks, not enough gourmets: examining provision and use of feedback for the developing athlete. Sport Psychol. (2021) 36(2):89–100. doi: 10.1123/tsp.2021-0037

94. Abraham A, Collins D, Morgan G, Muir B. Developing expert coaches requires expert coach development: replacing serendipity with orchestration. In: Lorenzo A, Ibanez SJ, Ortega E, editors. Aportaciones teóricas y prácticas para el baloncesto del futuro. Sevilla: Wanceulen Editorial Deportiva (2009). p. 183–205.

95. Gallimore R, Tharp R. What a coach can teach a teacher, 1975–2004: reflections and reanalysis of John Wooden’s teaching practices. Sport Psychol. (2004) 18(2):119–37. doi: 10.1123/tsp.18.2.119

96. Collins L, Collins D. Managing the cognitive loads associated with judgment and decision-making in a group of adventure sports coaches: a mixed-method investigation. J Adventure Educ Outdoor Learn. (2021) 21(1):1–16. doi: 10.1080/14729679.2019.1686041

97. Klein GA, Calderwood R, MacGregor D. Critical decision method for eliciting knowledge. IEEE Trans Syst Man Cybern. (1989) 19(3):462–72. doi: 10.1109/21.31053

98. Borders J, Klein G. The critical decision audit: blending the critical decision method & the knowledge audit. Proceedings of the 13th bi-annual international conference on naturalistic decision making; University of Bath (2017). p. 33–9.

99. Militello LG, Anders S. Incident-based methods for studying expertise. In: Ward P, Maarten Schraagen J, Gore J, Roth EM, editors. The Oxford handbook of expertise. New York: Oxford University Press (2019). p. 429–50.

100. Sparkes AC, Smith B. Judging the quality of qualitative inquiry: criteriology and relativism in action. Psychol Sport Exerc. (2009) 10(5):491–7. doi: 10.1016/j.psychsport.2009.02.006

101. Leary MR, Kowalski RM. Impression management: a literature review and two-component model. Psychol Bull. (1990) 107:34–47. doi: 10.1037/0033-2909.107.1.34

102. Nash C, Martindale R, Collins D, Martindale A. Parameterising expertise in coaching: past, present and future. J Sports Sci. (2012) 30(10):985–94. doi: 10.1080/02640414.2012.682079

Keywords: sport coaching, observation, coach development, PJDM, expertise, macrocognition, coach behaviour

Citation: Taylor J, MacNamara Á and Collins D (2023) The 3Ps: A tool for coach observation. Front. Sports Act. Living 4:1066378. doi: 10.3389/fspor.2022.1066378

Received: 10 October 2022; Accepted: 19 December 2022;
Published: 20 January 2023.

Edited by:

Faye Didymus, Leeds Beckett University, United Kingdom

Reviewed by:

Tom Mitchell, Leeds Beckett University, United Kingdom
Andrew Abraham, Leeds Beckett University, United Kingdom

© 2023 Taylor, MacNamara and Collins. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Jamie Taylor jamie.taylor@dcu.ie

Specialty Section: This article was submitted to Sports Coaching: Performance and Development, a section of the journal Frontiers in Sports and Active Living
