ORIGINAL RESEARCH article

Front. Psychol., 23 January 2019
Sec. Movement Science
This article is part of the Research Topic Performance Analysis in Sport.

A New Reliable Performance Analysis Template for Quantifying Action Variables in Elite Men’s Wheelchair Basketball

  • 1School of Sport and Exercise Science, University of Worcester, Worcester, United Kingdom
  • 2Faculty of Engineering, Environment and Computing, Coventry University, Coventry, United Kingdom
  • 3School of Allied Health & Community, University of Worcester, Worcester, United Kingdom
  • 4Faculty of Health and Sport Sciences, University of Agder, Kristiansand, Norway

This study aimed to develop a valid and reliable performance analysis template for quantifying team action variables in elite men’s wheelchair basketball. First, action variables and operational definitions were identified by the authors and verified by an expert panel of wheelchair basketball coaching staff in order to establish expert validity. A total of 109 action variables were then placed into 17 agreed Categorical Predictor Variable categories. The action variables were then used to develop a computerized performance analysis template for post-event analysis. Each possession (n = 200) from an international men’s wheelchair basketball game was analyzed by the first author on two occasions for assessment of intra-observer reliability and by a coach and a performance analyst for inter-observer reliability. Percentage error and Weighted Kappa coefficients were calculated to compare the levels of error and agreement for each action variable. Intra-observer reliability demonstrated perfect or almost perfect agreement (K ≥ 0.980) and low percentage error values (<1.50%) for the 109 action variables within the 17 categories. Inter-observer reliability demonstrated perfect or almost perfect agreement (K ≥ 0.974) and low percentage error values (<3.00%) for the 109 action variables within the 17 categories. The template should be used in the future for obtaining valid and reliable data in elite men’s wheelchair basketball.

Introduction

Performance analysis aims to assist the decision making and learning of athletes, coaches and support staff (Sampaio et al., 2013; Hughes and Bartlett, 2015). Objective performance data are collected regarding the key actions and behavioral aspects of an individual’s and/or team’s performance (Sampaio et al., 2013) through specifically designed performance analysis templates and systems. The data are then utilized to provide feedback. Central to the quality of the feedback is the analyst’s ability to design an appropriate data collection template that will permit the collection of valid and reliable performance data.

If a sport performance analysis template can record a sports performance using precise definitions of actions and events and consistently produce similar or identical results each time it is used, it can be deemed both valid and reliable. However, previous performance analysis research has highlighted problems in the processes often undertaken to identify valid action variables and to develop a reliable performance analysis template (Watson et al., 2017; Jayal et al., 2018), particularly in relation to the validity of defining action variables, performance indicators, and operational definitions, and to the reliability test procedures themselves (James et al., 2005; Hughes et al., 2012; Thomson et al., 2013; Hughes, 2015).

Hughes (2015) argued that the presentation of reliability and validation procedures has increased immensely since Hughes et al. (2002) highlighted the need for the reliability of performance analysis templates to be clearly established within all studies. Prior to the 2007 special edition of the International Journal of Performance Analysis in Sport that focused on reliability issues in performance analysis, of the 77 empirical studies published in the journal, only 56% reported reliability procedures and only 42% detailed the validation procedures.

Within the special edition’s editorial, O’Donoghue (2007, p. i) stated that the discipline “takes reliability very seriously because many methods involve human operators where there are many sources of measurement error.” Subsequently, the number of articles within the journal presenting information regarding reliability procedures increased to 68% (312 out of 462 articles that included empirical data between 2007 and 2015), but the number of studies outlining the validation processes reduced to 40%. Despite these clear recommendations, the importance of establishing and presenting both the validity and reliability of performance analysis templates is still too often overlooked. More recently, Watson et al. (2017) have reiterated this point and attempted to address the issue regarding the validity and reliability of key performance indicators that discriminate between successful and unsuccessful rugby union teams. Less studied sports, such as wheelchair basketball, are no exception to this trend.

Wheelchair basketball is played by people with varying physical disabilities with the primary objective of scoring more baskets than their opponents (Frogley, 2010). To achieve this objective, the offensive team endeavors to progress the ball toward the basket by coordinating actions in an attempt to position themselves close to the basket, whilst the defensive team attempts to coordinate actions to restrict the offensive players’ space to shoot and to regain possession. The two teams consist of players with a range of disabilities, including amputations, birth defects, cerebral palsy, paralysis due to an accident, and spina bifida, who are unable to play the running form of basketball (Gil-Agudo et al., 2010). The growth of the sport, now played by over 105 nations (International Wheelchair Basketball Federation, 2016), has made bridging the performance gap between participation and qualification for a World Championship or Paralympic Games increasingly difficult. Nations have had to become more tactically and technically strategic in the way athletes and teams prepare for competitions by turning to performance analysis (de Bosscher et al., 2008). The discipline, therefore, seems to be an excellent approach for increasing the technical and tactical understanding of wheelchair basketball demands, providing coaches, athletes, classifiers and analysts with the ability to apply the findings in order to improve training plans and game management.

However, each of the seven published post-event wheelchair basketball performance analysis articles that have attempted to explore the technical and tactical demands of the sport using a form of performance analysis template (Vanlandewijck et al., 1995, 2003, 2004; Molik et al., 2009; Skucas et al., 2009; Gómez et al., 2014, 2015) has inherently questionable validity and reliability. These studies have relied on box score data, with no consideration of its validity or reliability, and on the (modified) comprehensive basketball grading system (CBGS) to provide an “objective” means of evaluating individual player performance. The CBGS was originally developed for use in running basketball and from a very small sample of games at a specific level of competition (Mullens, 1978), making it invalid for use in the wheelchair game. The CBGS records the frequency counts of shots, rebounds, and fouls drawn during a game, and studies using it have concluded that the classification system proportionally represents the functional potential of the players. However, these findings offer limited tactical and technical insights into the key determinants of success and thus provide little of the contextually rich data that coaches, players and staff can use to inform future practice. Furthermore, the post-event analysis completed in these studies, and in performance analysis research more generally, differs from applied practice, where the immediacy of the obtained results is the priority. Post-event analysis, however, allows for greater in-depth analysis and warrants greater accuracy because errors can be rectified (Arriaza and Zuniga, 2016).

Researchers have attempted to include wheelchair basketball specific variables in the modified-CBGS (Byrnes and Hedrick, 1994); however, the sport-specific variables were removed due to definitional errors identified as a result of the operators’ experience (Vanlandewijck et al., 2003, 2004). The CBGS and modified-CBGS data were also found to be highly correlated with one another. The reliability of these studies was assessed through inter-observer reliability procedures using a Pearson’s r correlation, an approach that has been criticized for presenting misleading results because it assesses relationship, not agreement (Liu et al., 2016). Despite this, researchers have elected to use this “evidence” to determine the quality of players’ games and to make comparisons between functional classification groups, identifying that higher classified players achieved higher CBGS scores.

Furthermore, researchers have also claimed that the findings from individual box score data can provide an insight into team performance. Neither version of the CBGS, however, captures contextually and situationally relevant data regarding team performance. Araújo and Davids (2016) argued that it is important to consider the interactive behaviors of players over time and to record these on a continuous or sequential basis. Researchers have identified the performance relationship between game status (e.g., Sampaio et al., 2010), line-up rotations (e.g., Clay and Clay, 2014) and the offensive-defensive dyads involved in sports (García et al., 2013), and thus by capturing these data it may be possible to provide meaningful objective augmented feedback (Araújo, 2017; Jayal et al., 2018). Passos (2008) also argued that the collection of discrete variables, as is the case with the (modified) CBGS, does not provide a true insight into an entire performance. Additionally, the seven studies did not mention how the action variables were established. Therefore, if the process of establishing the action variables is not outlined and the secondary box score data have been shown to be potentially incorrect, the data collected should not be used by coaches, players and support staff to inform decisions regarding team aspects of performance (Ziv et al., 2010). The (modified) CBGS is therefore not suitable for measuring team performance in elite men’s wheelchair basketball.

Considering the above concerns within the discipline and specifically in wheelchair basketball regarding reliability, there is a need for a new post-event valid and reliable sports performance analysis template to assess a team’s performance in wheelchair basketball. The template is required to correctly identify and record the actions that occur during a game in a consistent manner, thus providing coaches, players and support staff with meaningful performance data to inform future decision making following games. The variables that are analyzed in the study can contribute to the players’ learning, thus increasing the likelihood of wheelchair basketball teams achieving performance success. As such, an adequate methodological process for quantifying action variables in elite men’s wheelchair basketball was required. Therefore, the aims of this paper were to (i) develop a valid performance analysis template in elite men’s wheelchair basketball and (ii) assess its intra-observer and inter-observer reliability by the lead author, a wheelchair basketball coach and a performance analyst intern.

Materials and Methods

Following ethical approval from the University of Worcester’s Ethics and Research Governance Committee, the methodological approaches used by James et al. (2005) and Thomson et al. (2013) were followed as an initial framework. The framework was adapted and followed nine distinct stages: stages one to six related to the validation process, stage seven to developing the performance analysis template, and stages eight and nine to establishing reliability (Figure 1).

Figure 1. Diagram showing the systematic research process for developing a new performance analysis template [adapted from James et al. (2005) and Thomson et al. (2013)].

Validation Process

First, a list of 120 action variables was developed from previous wheelchair basketball literature and the knowledge of the authors. The action variables were initially grouped into 16 categories depending on the sub-phases that would occur during a single possession in the game. The action variables within each category formed an exhaustive list of all behaviors that could occur, helping to capture the sequential nature of a possession that could contribute toward scoring a basket.

Second, on receipt of written informed consent, developed in line with The British Association of Sport and Exercise Sciences code of conduct, from four elite wheelchair basketball staff, the list was circulated and the participants were given 1 week to scrutinize the information. The four staff members consisted of three elite wheelchair basketball coaches (Coach one: 20 years’ experience; Coach two: 19 years’ experience; Coach three: 19 years’ experience) and a member of support staff from an elite wheelchair basketball team (3 years’ experience). During the week, the staff were asked to review the list and provide their opinions as to whether the variables and categories would allow the collection of objective data regarding the sequential nature of a possession. The staff made notes on the list and returned it.

Third, adaptations were made to the action variables and categories during a focus group with all four staff and the lead researcher. Following the discussion, the adapted list comprised 109 action variables placed into 17 categories. The Offense – End and Defense – End categories were combined into a single End of Possession category, removing 18 action variables; the action variables within the Offense – Shot category were split into three categories (Shot Taken, Shot Point, and Shot Outcome), adding two action variables; and the Defensive System category added five action variables to provide additional context to the possession.

Fourth, operational definitions were developed for each of the 109 variables using various resources (Frogley, 2010; Federation International Basketball Association, 2014; International Wheelchair Basketball Federation, 2014). The list of action variables and operational definitions was then re-circulated to each of the wheelchair basketball staff members who were given another week to comment.

Fifth, the staff identified any suggested amendments to the definitions during a second focus group. The definitions for “Zone” and “Highline” Defensive System were discussed and amended to add further clarity.

Sixth, video clips with overlaying text were created illustrating each action variable. The clips were circulated to the wheelchair basketball staff using external hard drives. Each member was given 1 week to watch the clips and ensure the overlaying text represented the operational definitions for each action variable. One staff member requested a further clip to illustrate the different types of Defensive System when a team were playing a “Highline” defense. The clip was circulated to all staff members. After watching the additional clip, the staff members confirmed the second video clip represented the overlaying text more accurately. No additional clips or amendments to the operational definitions were required, resulting in the final list of 109 action variables placed into 17 categories (Table 1).

Table 1. Operational definitions for the action variables in each category.

Template Development

Following the validation process (stages one to six above), a performance analysis template was created in SportsCode Elite Version 10 during stage seven by the lead researcher, the four wheelchair basketball staff and the performance analysis intern. The template underwent two pilot tests on a randomly selected elite wheelchair basketball game from a pre-tournament held in 2015. As a result of these pilots, the buttons were resized and positioned in their category group (Figure 2).

Figure 2. Team performance analysis template for coding wheelchair basketball performance.

Reliability Process

Intra-Observer Reliability Assessment

During stage eight, one game of elite male international wheelchair basketball was selected at random from the 2015 European Wheelchair Basketball Championships. The footage was imported into SportsCode Elite Version 10 and converted into a “SportsCode Project,” which was analyzed post-game and viewed at normal playback speed (25 frames per second). If necessary, the playback speed was adjusted to ensure events were observed and recorded accurately. Multiple actions within a category could be recorded. For example, if a player was fouled in the act of scoring a successful basket then the End of Possession category would automatically record “Basket Scored” and “Foul For.” In addition, the home and away team numbers were checked against the official tournament website and the players’ classifications verified on the International Wheelchair Basketball Federation’s player database.
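
To illustrate the kind of data structure this coding produces, a minimal sketch in R is given below, assuming one row per possession and one categorical column per category; the column names and values are hypothetical and do not reproduce the actual 17 categories or 109 action variables.

```r
# Minimal sketch (hypothetical column names and values): one row per possession,
# one categorical column per category. Where two actions co-occur within a
# category (e.g., a foul on a made basket), both labels are retained in the cell.
possessions <- data.frame(
  possession          = c(1, 2),
  offense             = c("Home", "Away"),
  start_of_possession = c("Defensive Rebound", "Basket Conceded"),
  shot_outcome        = c("Scored", "Missed"),
  end_of_possession   = c("Basket Scored; Foul For", "Defensive Rebound Against"),
  stringsAsFactors    = FALSE
)
str(possessions)
```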

Levels of agreement with Weighted Kappa coefficients (Cohen, 1968) and percentage error values (Bland and Altman, 1999) were calculated for each category. The interpretation of Weighted Kappa coefficients within the field of performance analysis has been demonstrated by Lamas et al. (2015), with the following values being utilized: “<0 less than the chance agreement, 0.01–0.20 slight agreement, 0.21–0.40 fair agreement, 0.41–0.60 moderate agreement, 0.61–0.80 substantial agreement, and 0.81–0.99 almost perfect agreement” (Landis and Koch, 1977, p. 165). The level of reliability for each category when using the percentage error value was deemed acceptable when less than five per cent error was identified (Hughes et al., 2002).
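
A minimal sketch of the two statistics is given below. The study imported its data into R (R Core Team, 2015); the functions here are illustrative from-scratch implementations rather than the exact procedures used, and both the choice of linear or quadratic disagreement weights and the precise percentage error formula (summed absolute frequency differences relative to the mean total of recorded events, after Hughes et al., 2002) are assumptions.

```r
# Weighted Kappa (Cohen, 1968): illustrative implementation. Assumes the two sets
# of ratings share an ordered set of categories; disagreement weights grow with
# the distance between categories (the weighting scheme is an assumption).
weighted_kappa <- function(r1, r2, weighting = c("linear", "quadratic")) {
  weighting <- match.arg(weighting)
  stopifnot(length(r1) == length(r2))
  lev <- sort(unique(c(as.character(r1), as.character(r2))))
  k <- length(lev)
  if (k < 2) return(NA_real_)                    # agreement undefined for a single category
  obs  <- table(factor(r1, levels = lev), factor(r2, levels = lev)) / length(r1)
  exp_ <- outer(rowSums(obs), colSums(obs))      # chance agreement from the marginals
  d <- abs(outer(seq_len(k), seq_len(k), "-"))   # distance between category indices
  w <- if (weighting == "linear") d / (k - 1) else (d / (k - 1))^2
  1 - sum(w * obs) / sum(w * exp_)
}

# Percentage error: one common reading of Hughes et al. (2002) -- summed absolute
# differences between the two observers' frequency counts for a category,
# expressed relative to the mean total number of events recorded.
percentage_error <- function(r1, r2) {
  lev <- unique(c(as.character(r1), as.character(r2)))
  f1 <- table(factor(r1, levels = lev))
  f2 <- table(factor(r2, levels = lev))
  100 * sum(abs(f1 - f2)) / mean(c(sum(f1), sum(f2)))
}
```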

For intra-observer procedures, 100 Home Offense and 100 Away Offense possessions were analyzed on two occasions with a period of 4 weeks between the two observations. The two observations were exported as categorical variables from SportsCode using the “Sorter” function into Microsoft Excel. The 400 rows of data were transferred into a CSV file (Supplementary Data Sheet S1) and imported into R (R Core Team, 2015). Weighted Kappa coefficients and percentage error values were calculated for each category to determine intra-observer agreement levels between the two observations. Where categories did not demonstrate perfect agreement or a zero per cent error, the source of the discrepancy was identified and the specific possession was re-observed to create an agreed observation.
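
A usage sketch for this step is shown below, assuming a file layout of one row per possession and one column per category, with “_ob1”/“_ob2” suffixes distinguishing the two observations; the actual layout of Supplementary Data Sheet S1 may differ, the file name is invented, and the category names are illustrative.

```r
# Usage sketch (assumed file name, column layout and category names).
obs <- read.csv("reliability_data.csv", stringsAsFactors = FALSE)

categories <- c("start_of_possession", "shot_outcome", "end_of_possession")
intra <- data.frame(
  category  = categories,
  kappa     = sapply(categories, function(v)
                weighted_kappa(obs[[paste0(v, "_ob1")]], obs[[paste0(v, "_ob2")]])),
  pct_error = sapply(categories, function(v)
                percentage_error(obs[[paste0(v, "_ob1")]], obs[[paste0(v, "_ob2")]]))
)
# Flag categories falling short of perfect agreement so the relevant possessions
# can be re-observed and an agreed observation created.
intra[intra$kappa < 1 | intra$pct_error > 0, ]
```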

Inter-Observer Reliability Assessment

Following the establishment of an agreed observation, stage nine involved a wheelchair basketball coach and a performance analysis intern completing an observation of the same game, enabling the completion of an inter-observer reliability test. The wheelchair basketball coach, who had 19 years of sport-specific experience, was involved in the classification of action variables and had a year of experience using a similar performance analysis software program (Dartfish TeamPro, Switzerland). The performance analysis intern had 9 months of experience of performance analysis in wheelchair basketball and 3 years of experience as a performance analyst in rugby union using SportsCode Elite.

The coach and performance analyst intern accessed the action variables, operational definitions and the accompanying video clips 2 weeks prior to conducting the observations to help familiarize themselves with the specific behaviors they were required to record. In addition, the coach and the intern were allowed to code a pre-tournament game between the two competing nations to assist with learning the performance analysis template and the software. Familiarization varied in time for the two operators, with the coach completing four sessions of 2 h over a 5 day period and the intern undertaking an additional 2-h session before both individuals felt they were able to complete the reliability test (O’Donoghue, 2014). The testing was conducted 1 day after they had completed their final familiarization session. The coach and the intern focused on observing the entire game, which equated to 200 possessions. Weighted Kappa coefficients and percentage error values were calculated for each category to determine inter-observer agreement levels with the agreed observation being first compared against the coach’s observation and second against the performance analyst intern’s observation. Finally, the coach’s, performance analyst intern’s and the agreed observation were triangulated and expressed as Weighted Kappa coefficients and percentage error values.
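
The paper does not specify how the three observations were combined numerically, so the sketch below shows one plausible reading of the triangulation step: pairwise Weighted Kappa and percentage error values are computed for a category across the agreed observation (Ob3), the coach’s observation (Ob4) and the intern’s observation (Ob5), and then summarized (the summary convention is an assumption).

```r
# Triangulation sketch: pairwise agreement among three observations of the same
# category, summarized by the least favourable values (an assumed convention).
triangulate_category <- function(ob3, ob4, ob5) {
  kappas <- c(ob3_vs_ob4 = weighted_kappa(ob3, ob4),
              ob3_vs_ob5 = weighted_kappa(ob3, ob5),
              ob4_vs_ob5 = weighted_kappa(ob4, ob5))
  errors <- c(ob3_vs_ob4 = percentage_error(ob3, ob4),
              ob3_vs_ob5 = percentage_error(ob3, ob5),
              ob4_vs_ob5 = percentage_error(ob4, ob5))
  list(kappa = kappas, min_kappa = min(kappas),
       pct_error = errors, max_pct_error = max(errors))
}
```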

Results

Intra-Observer Reliability Test

Cohen’s Weighted Kappa demonstrated perfect agreement (K1.000) for 12 categories and almost perfect agreement (K0.987–0.994) for the remaining five categories between the first (Ob1) and second observation (Ob2) (Table 2). Percentage error values were zero for the same 12 categories and below the five per cent acceptable threshold for the remaining five categories.

Table 2. Intra-observer agreement reported using Cohen’s Weighted Kappa (K) and percentage error between the first observation (Ob1) and the second observation (Ob2).

Inter-Observer Reliability

Agreed Observation Versus Coach’s Observation

The test demonstrated perfect agreement (K1.000) and zero percentage error for ten categories, and almost perfect agreement (K0.974–0.993) within the acceptable percentage error threshold (0.50–1.50%) for the remaining seven categories (Table 3). The Man-Out Offense category recorded the lowest Weighted Kappa coefficient (K0.974) but a near-zero percentage error value (0.50%). By comparing the frequency counts for each action variable between the two observations within this category, it was identified that the coach had recorded no action for one possession, resulting in the discrepancy.

Table 3. Inter-observer agreement reported using Cohen’s Weighted Kappa (K) and percentage error between the agreed observation (Ob3), the coach’s observation (Ob4), and the performance analyst intern’s observation (Ob5).

Agreed Observation Versus Performance Analyst Intern’s Observation

The test demonstrated perfect agreement (K1.000) and zero percentage error for 12 categories, and almost perfect agreement (K0.981–0.993) within the five per cent error limit (0.50–1.50%) for five categories (Table 3). The Shot Clock Remaining category recorded the lowest Weighted Kappa coefficient (K0.981) and the highest error percentage (1.50%) as a result of three disagreements.

Triangulation of Coach’s, Performance Analyst Intern’s, and Agreed Observation

Across the Weighted Kappa coefficients and percentage error values of the 17 categories, 9 categories demonstrated perfect agreement and zero percentage error, and 8 categories produced almost perfect agreement (K0.974–0.996) within the five per cent error threshold (0.50–3.00%). Three categories, Shot Location, Start of Possession and Shot Clock Remaining, reported the largest number of discrepancies amongst the action variables within each category (Table 3). The triangulation results for the Shot Location category highlight that this category is the most susceptible to producing errors; however, the Weighted Kappa coefficient and percentage error values are still within the acceptable thresholds for agreement levels.

Discussion

This paper set out to develop a unique valid and reliable performance analysis template for wheelchair basketball. To achieve this aim, the methodological procedures used to develop templates by James et al. (2005) and Thomson et al. (2013) were adapted. This involved completing a nine-stage methodological process, which included a validation process, template development and reliability assessment. To address the limitations of the (modified) CBGS, it was necessary to employ the knowledge of sport-specific staff to assist in identifying contextually relevant action variables as well as drawing on the existing sport-specific literature. The four members of staff who took part in the study provided a qualitative contribution through focus groups to further enhance the final list of 109 action variables and operational definitions.

The template was developed to be used post-event, with the ability to extract data as total frequency counts or as successive, discrete possessions. The development of the template built on Cooper et al.’s (2007) idea of dividing an observation into specific time cells. It also agreed with Thomson et al.’s (2013) work that this process is a sufficient method for assessing test-retest analysis. However, rather than dividing the observed performance into 2-min or 10-s time cells, each possession, which could last up to 24 s, was used. As outlined above, within each possession, irrespective of its duration, each observer collected information pertaining to 17 categories.

Intra-observer and inter-observer reliability tests highlighted that the accuracy of all observations was excellent for the notation of all 109 action variables and 17 categories, with inter-observer reliability slightly lower than intra-observer reliability. The coach’s observation achieved the lowest Weighted Kappa coefficient for the Man-Out Offense category, whilst the performance analyst intern’s observation achieved the lowest Weighted Kappa coefficient for the Shot Clock Remaining category.

Previous research in boxing and rugby union has identified that it is not unexpected for the level of inter-observer reliability to be inferior to intra-observer reliability (Thomson et al., 2013: intra-observer agreement ranged from 80–100% whereas inter-observer agreement ranged from 33–100%; James et al., 2005: intra-observer error was 1.97 ± 3.14% whilst inter-observer error was 11.09 ± 8.61%), but all observations in this paper fell within the adequate levels of reliability. It is clear, however, that an adequate period of template piloting, familiarization and training was key to obtaining these excellent levels of reliability. The small disagreements identified between the observations could be due to the dynamic nature of the sport, whereby observers attempting to record action variables quickly may incorrectly click a closely related button rather than missing an action altogether. An example of this was identified when the coach coded a possession as starting with an “Offensive Rebound” whereas the agreed observation coded the possession as starting with a “Defensive Rebound.” It could also be argued that whilst the operational definitions should clearly distinguish between the two rebound types, they share a number of characteristics, which may explain the disagreement.

The use of two reliability statistical approaches, Weighted Kappa coefficients (Cohen, 1968) and percentage error values (Bland and Altman, 1999), provided a useful cross-checking method for determining the reliability of the template. The percentage error allowed direct comparisons of agreement to be made irrespective of the scaling between observers (Hopkins, 2000), enabling errors to be identified and a judgment made as to whether they were random (McHugh, 2012). The Weighted Kappa tests, meanwhile, acknowledged that in some instances no operator could be sure of the action to record (McHugh, 2012) and provided credit when two observers recorded adjacent values, for example, in the Shot Location category. The use of both percentage error and Weighted Kappa statistics to assess intra-observer and inter-observer reliability is therefore recommended in the development process of a performance analysis template.
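
The partial-credit behavior can be illustrated with a small hypothetical Shot Location example, using the sketch functions above; the zone labels and codings are invented for illustration and are not drawn from the study’s data.

```r
# Two observers disagree on two possessions, but only by one adjacent zone each
# time; the Weighted Kappa credits these near-misses, whereas the raw frequency
# counts used for percentage error register them in full.
zones <- c("Zone 1", "Zone 2", "Zone 3", "Zone 4")
observer_a <- factor(c("Zone 1", "Zone 2", "Zone 3", "Zone 4", "Zone 2"), levels = zones)
observer_b <- factor(c("Zone 1", "Zone 3", "Zone 3", "Zone 4", "Zone 1"), levels = zones)
weighted_kappa(observer_a, observer_b, weighting = "quadratic")
percentage_error(observer_a, observer_b)
```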

It is important to note, however, that this template was developed for post-event analysis, and thus changes would be required if the goal was to use the template in real-time analysis. The action variables included within the template were carefully considered to ensure meaningful and contextually relevant information was captured. Additional action variables could be added to the template to assist in strengthening the profile of an elite team’s performance regarding different tactical approaches, however, this would likely increase the time taken to analyze the wheelchair basketball performance and interpret the data. Subsequently, if additional modifications were made to the action variables, operational definitions, categories, or template, further reliability assessment would be required. Nevertheless, the current template provides the grounding for future attempts to identify the key tactical determinants of team success in elite wheelchair basketball and the processes undertaken to produce the template provide a framework for the development of future templates in all team sports.

Conclusion

The paper provides an improved methodological process for establishing a valid and reliable performance analysis template, which in this article has been used to produce accurate and reliable observations of key performance behaviors in a sequential nature within elite male wheelchair basketball. Additionally, the template has enabled the collection of most actions that occur in a wheelchair basketball possession whilst also recording the actions of the opposition, allowing for a context-specific insight to be gained. The current template should now be used by wheelchair basketball coaches, analysts and researchers to collect valid and reliable performance data at zonal qualification tournaments, world championships, and Paralympic Games to help identify the key tactical determinants of team success and subsequently to underpin both performance enhancing training and within-game practices.

Data Availability Statement

The datasets generated for this study can be found in the Worcester Research and Publications collection (https://eprints.worc.ac.uk/id/eprint/7334 and https://eprints.worc.ac.uk/id/eprint/7332).

Author Contributions

JF devised the structure of the manuscript, collected and analyzed the data, and drafted the manuscript. AO provided the guidance and support for statistical analyses and reporting, and commented on the final draft. DP devised the structure of the manuscript, oversaw the whole research process, commented on drafts and approved the final draft.

Funding

Funding for the Ph.D. studentship program of research was awarded to DP by the University of Worcester and British Wheelchair Basketball.

Conflict of Interest Statement

The authors declare that the research was conducted as part of a Ph.D. research project between the University of Worcester and British Wheelchair Basketball.

Acknowledgments

The authors acknowledge the valuable contribution of the study participants.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpsyg.2019.00016/full#supplementary-material

DATA SHEET S1 | Raw intra-observer and inter-observer reliability data.

References

Araújo, D. (2017). “Variables characterising performance and performance indicators in team sports,” in Performance Analysis in Team Sports, eds P. Passos, D. Araújo, and A. Volossovitch (Abingdon: Routledge), 38–52.

Araújo, D., and Davids, K. (2016). Team synergies in sport: theory and measures. Front. Psychol. 7:1449. doi: 10.3389/fpsyg.2016.01449

Arriaza, E., and Zuniga, M. (2016). Soccer as a study case for analytic trends in collective sports training: a survey. Int. J. Perform. Anal. Sport 16, 171–190. doi: 10.1080/24748668.2016.11868879

Bland, M., and Altman, D. (1999). Measuring agreement in method comparison studies. Stat. Methods Med. Res. 8, 135–160. doi: 10.1177/096228029900800204

Byrnes, D., and Hedrick, B. (1994). “Comprehensive basketball grading system,” in Wheelchair Basketball, eds D. Byrnes, B. Hedrick, and L. Shaver (Washington, DC: Paralyzed Veterans of America).

Clay, D. C., and Clay, K. E. (2014). Player rotation, on-court performance and game outcomes in NCAA men’s basketball. Int. J. Perform. Anal. Sport 14, 606–619. doi: 10.1080/24748668.2014.11868746

Cohen, J. (1968). Weighted kappa: nominal scale agreement provision for scaled disagreement or partial credit. Psychol. Bull. 70, 213–220. doi: 10.1037/h0026256

Cooper, S.-M., Hughes, M., O’Donoghue, P., and Nevill, A. (2007). A simple statistical method for assessing the reliability of data entered into sport performance analysis systems. Int. J. Perform. Anal. Sport 7, 87–109. doi: 10.1080/24748668.2007.11868390

de Bosscher, V., Bingham, J., Shibli, S., van Bottenburg, M., and de Knop, P. (2008). The Global Sporting Arms Race: an International Comparative Study on Sports Policy Factors Leading to International Sporting Success. Oxford: Meyer & Meyer Sport (UK) Ltd.

Federation International Basketball Association (2014). Official Basketball Rules. Mies: FIBA Central Board.

Frogley, M. (2010). “Wheelchair basketball,” in Wheelchair Sport: A Complete Guide for Athletes, Coaches, and Teachers, ed. V. Goosey-Tolfrey (Champaign, IL: Human Kinetics), 119–132.

García, J., Ibañez, S.-J., Cañadas, M., and Antúnez, A. (2013). Complex system theory in team sports. Example in 5 on 5 basketball contest. Revista de Psicologia Del Deporte 22, 209–213.

Gil-Agudo, A., Del Ama-Espinosa, A., and Crespo-Ruiz, B. (2010). Wheelchair basketball quantification. Phys. Med. Rehabil. Clin. North Am. 21, 141–156. doi: 10.1016/j.pmr.2009.07.002

Gómez, M. -Á, Molik, B., Morgulec-Adamowicz, N., and Szyman, R. (2015). Performance analysis of elite women’s wheelchair basketball players according to team-strength, playing-time and players’ classification. Int. J. Perform. Anal. Sport 15, 268–283. doi: 10.1080/24748668.2015.11868792

Gómez, M. -Á, Pérez, J., Molik, B., Szyman, R., and Sampaio, J. (2014). Performance analysis of elite men’s and women’s wheelchair basketball teams. J. Sports Sci. 32, 1066–1075. doi: 10.1080/02640414.2013.879334

Hopkins, W. (2000). Measures of reliability in sports medicine and science. Sport Med. 30, 1–15. doi: 10.2165/00007256-200030010-00001

Hughes, M. (2015). “Analysis of notation data: reliability,” in Essentials of Performance Analysis in Sport, 2nd Edn, eds M. Hughes and I. Franks (Abingdon: Routledge), 169–179. doi: 10.4324/9781315776743-11

Hughes, M., and Bartlett, R. (2015). “What is performance analysis?,” in Essentials of Performance Analysis in Sport, 2nd Edn, eds M. Hughes and I. Franks (Abingdon: Routledge), 18–28. doi: 10.4324/9781315776743-3

Hughes, M., Caudrelier, T., James, N., Redwood-Brown, A., Donnelly, I., Kirkbride, A., et al. (2012). Moneyball and soccer: an analysis of key performance indicators of elite male soccer players by position. J. Hum. Sport Exerc. 7, 402–412. doi: 10.4100/jhse.2012.72.06

Hughes, M., Cooper, S.-M., and Nevill, A. (2002). Analysis procedures for non-parametric data from performance analysis. Int. J. Perform. Anal. Sport 2, 6–20. doi: 10.1080/24748668.2002.11868257

International Wheelchair Basketball Federation (2014). Official Wheelchair Basketball Rules. Available at: https://iwbf.org/rules-of-wheelchair-basketball/

International Wheelchair Basketball Federation (2016). Who We Are. Available at: https://iwbf.org/the-game/history-wheelchair-basketball/

James, N., Mellalieu, S., and Jones, N. (2005). The development of position-specific performance indicators in professional rugby union. J. Sports Sci. 23, 63–72. doi: 10.1080/02640410410001730106

Jayal, A., McRobert, A., Oatley, G., and O’Donoghue, P. (2018). Sports Analytics Applications in Soccer. In Sports Analytics: Analysis, Visualisation and Decision Making in Sports Performance. Abingdon: Routledge, 220–244. doi: 10.4324/9781315222783-12

Lamas, L., Santana, F., Heiner, M., Ugrinowitsch, C., and Fellingham, G. (2015). Modeling the offensive-defensive interaction and resulting outcomes in basketball. PLoS One 10:e0144435. doi: 10.1371/journal.pone.0144435

Landis, J. R., and Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics 33, 159–174. doi: 10.2307/2529310

Liu, J., Tang, W., Chen, G., Lu, Y., Feng, C., and Tu, X. M. (2016). Correlation and agreement: overview and clarification of competing concepts and measures. Shanghai Arch. Psychiatry 28, 115–120. doi: 10.11919/j.issn.1002-0829.216045

McHugh, M. L. (2012). Interrater reliability: the kappa statistic. Biochem. Medica 22, 276–282. doi: 10.11613/BM.2012.031

Molik, B., Kosmol, A., Morgulec-Adamowicz, N., Laskin, J., Jezior, T., and Patrzalek, M. (2009). Game efficiency of elite female wheelchair basketball players during world championships (Gold Cup) 2006. Eur. J. Adap. Phys. Act. 2, 26–38. doi: 10.5507/euj.2009.007

Mullens, L. (1978). European Basketball Championships 1977: Reliability of the Observation Protocol. Attempt to Elaborate a Player Proficiency Protocol. MSc Thesis, Katholieke Universiteit Leuven, Leuven.

O’Donoghue, P. (2007). Editorial: special issue on reliability. Int. J. Perform. Anal. Sport 7, 20–27.

O’Donoghue, P. (2014). An Introduction to Performance Analysis of Sport. Abingdon: Routledge.

Passos, P. (2008). Dynamical Decision Making in Rugby: Identifying Interpersonal Coordination Patterns. Ph.D. thesis, Universidade de Lisboa, Lisbon.

R Core Team (2015). R: A Language and Environment for Statistical Computing. Vienna: R Foundation for Statistical Computing.

Sampaio, J., Lago, C., Casais, L., and Leite, N. (2010). Effects of starting score-line, game location, and quality of opposition in basketball quarter score. Eur. J. Sport Sci. 10, 391–396. doi: 10.1080/17461391003699104

Sampaio, J., McGarry, T., and O’Donoghue, P. (2013). “Introduction,” in Routledge Handbook of Sports Performance Analysis, eds T. McGarry, P. O’Donoghue, and J. Sampaio (Abingdon: Routledge), 1–2.

Skucas, K., Stonkus, S., Molik, B., and Skucas, V. (2009). Evaluation of wheelchair basketball skill performance of wheelchair basketball players in different game positions. Sportas 4, 65–70.

Thomson, E., Lamb, K., and Nicholas, C. (2013). The development of a reliable amateur boxing performance analysis template. J. Sports Sci. 31, 516–528. doi: 10.1080/02640414.2012.738922

Vanlandewijck, Y., Evaggelinou, C., Daly, D., Van Houtte, S., Verellen, J., Aspeslagh, V., et al. (2003). Proportionality in wheelchair basketball classification. Adapt. Phys. Activ. Q. 20, 369–380. doi: 10.1123/apaq.20.4.369

Vanlandewijck, Y., Evaggelinou, C., Daly, D., Verellen, J., van Houtte, S., Aspeslagh, V., et al. (2004). The relationship between functional potential and field performance in elite female wheelchair basketball players. J. Sports Sci. 22, 668–675. doi: 10.1080/02640410310001655750

Vanlandewijck, Y., Spaepen, A. J., and Lysens, R. J. (1995). Relationship between the level of physical impairment and sports performance in elite wheelchair basketball athletes. Adapt. Phys. Activ. Q. 12, 139–150. doi: 10.1123/apaq.12.2.139

Watson, N., Durbach, I., Hendricks, S., and Stewart, T. (2017). On the validity of team performance indicators in rugby union. Int. J. Perform. Anal. Sport 17, 609–621. doi: 10.1080/24748668.2017.1376998

Ziv, G., Lidor, R., and Arnon, M. (2010). Predicting team rankings in basketball: the questionable use of on-court performance statistics. Int. J. Perform. Anal. Sport 10, 103–114. doi: 10.1080/24748668.2010.11868506

Keywords: sport performance analysis, Paralympic, reliability, validity, elite sport

Citation: Francis J, Owen A and Peters DM (2019) A New Reliable Performance Analysis Template for Quantifying Action Variables in Elite Men’s Wheelchair Basketball. Front. Psychol. 10:16. doi: 10.3389/fpsyg.2019.00016

Received: 22 November 2018; Accepted: 07 January 2019;
Published: 23 January 2019.

Edited by:

Miguel-Angel Gomez-Ruano, Polytechnic University of Madrid, Spain

Reviewed by:

Wilbur Julio Kraak, Stellenbosch University, South Africa
Aitor Iturricastillo, Universidad del País Vasco, Spain
Jolanta Marszalek, Józef Piłsudski University of Physical Education in Warsaw, Poland

Copyright © 2019 Francis, Owen and Peters. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: John Francis, j.francis@worc.ac.uk

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.