An evidence-based approach to assessing the effectiveness of training regimen on athlete performance: Youth soccer as a case study (2024)

Abstract

Athletic performance data are modeled in an effort to better understand how improvements in performance relate to two predictors: hours spent training and a measurement of “commitment” to that training. Both increased training time and greater commitment were predicted to produce larger increases in performance improvement, and commitment was predicted to be the more important determinant of improvement. The performance of 108 soccer players (ages 9–18) was quantified over a 10-week training program. Hours spent training ranged from 16 to 90 during the course of the study, while commitment scores ranged from 0.55 to 2.00 on a scale from 0.00 to 2.40. A model selection approach was used to discriminate among models specifying relationships between training hours and improvement, and between commitment and improvement. Despite considerable variability in the data, results provided strong evidence for an increase in performance improvement with both training hours and commitment score. The best models for hours and commitment were directly compared by computing an evidence ratio of 5799, indicating much stronger evidence favoring the model based on commitment. Results of analyses such as these go beyond anecdotal experience in an effort to establish a formal evidentiary basis for athletic training programs.

Introduction

Coaches, trainers and athletes have searched for the optimal formula for skill acquisition for as long as sports have existed, simply because of the competitive advantages conferred on athletes who can develop abilities faster, or to a greater extent, than others [1, 2]. Historically, the selection of training methods has been guided primarily by anecdotal experience rather than by empirical evidence on effectiveness [3, 4]. There have been some efforts to develop evidence for the effects of training on sports and other endeavors [1, 5, 6]. However, based on varied experiences of the authors with elite club, high school, Division 1 collegiate, and professional athletics, it appears that athletes and their mentors seldom use strong empirical evidence to guide their training decisions. Instead, decisions about methods to promote skill development tend to be based on anecdotal experiences about what methods have, and have not, “worked”. In some cases, opinions of coaches and trainers can seem to converge on a conventional wisdom. However, the recent revolution in analytics provides strong evidence of the fallibility of such wisdom [3, 7].

Science is based on the key step of comparing hypothesis-based predictions against observations. When hypotheses are consistently good predictors, they inspire confidence and are used to make decisions. This consistency of predictions and observations is referred to as evidence. Greater reliance on evidence-based decisions has been recommended in such diverse fields as dentistry [8], medicine [9], physical education [4] and conservation [10]. In addition to decisions being “better” (more likely to attain objectives), evidence-based decisions are transparent, defensible, objective, and scientific, all desirable attributes of decisions that are subject to scrutiny.

In this paper, the initial steps of evidence-based learning are applied to a case study involving an online program of soccer exercises developed and administered by one of the authors (M.K.). The soccer exercises were unsupervised and administered remotely via video, such that different athletes enrolled in the program expended different amounts of time on the exercises (hours) and obtained different scores for the quality of training and task completion (commitment). Tasks included not only completing training exercises, but also watching motivational videos, updating skill scores, and honoring commitments to training goals. The utility of these two independent variables, hours and commitment, was assessed by the athletes’ performance during tests designed to measure their skill levels. Specifically, performance metrics were obtained by self-administered tests near the beginning of the remote 10-week training program and then at the end of this program, permitting the computation of an “improvement” metric for each athlete.

An information-theoretic model selection approach with generalized linear models was used to formally address three questions:

  1. Were more training hours associated with larger improvement metrics?

  2. Were larger commitment scores associated with larger improvement metrics?

  3. Was one of these variables a better predictor of improvement than the other?

Multi-model inference was used to estimate the relationship between skill improvement and these two variables, hours and commitment. These questions were motivated by hypotheses about the adequacy of practice alone [2, 11] as a predictor of skill development, versus the need to also include intrinsic psychological qualities of the athletes [12].

Hypotheses and predictions

The training program entailed a series of exercises designed to improve the athlete’s abilities in several soccer skills important to match play. Data on amount of time spent in program training, commitment and skill level were obtained periodically (e.g., weekly). The focus for this study was a 10-week period. This was the standard length of a training course in this program, selected based on the reported time required for humans to develop new habitual behavior [13]. The training program was flexible such that each athlete decided how much time to spend training, depending on such variables as personal dedication and amount of available time. The central questions are: do time spent in program training and individual commitment affect skill improvement? The basic predictions were that athletes who expended more time training and who exhibited more commitment would show a greater improvement in the targeted skill. Different plausible hypotheses about the nature and form of these relationships were considered.

The basic hypothesis for the training time analysis was that improvement would be greater for athletes who expended more time training [2, 11]. This hypothesis was incorporated into a model in which improvement increased linearly with training time. A null hypothesis model included no effect of training time on improvement. Development of any skill using any method will vary across individual athletes; hence the need for statistical inference. This analysis was based on a relatively homogeneous group of athletes; soccer players aged 9–18 years who competed at varying levels (recreational through elite). Variation was still expected among athletes in this group with respect to initial skill level. This variation was a primary reason for focusing on improvement in skill over the period of training, rather than on absolute skill level attained (see Analytic Methods). In addition, a general model was developed to incorporate the hypothesis that improvement would require more time for athletes beginning the program at a high level of skill. The premise was that athletes starting out at lower skill levels can increase proficiency rapidly. Another general model did not incorporate this hypothesis about starting skill level, but included a term that permitted an inflection in the relationship between improvement and time spent training. Specifically, a leveling off in the training-improvement relationship was predicted to occur at higher levels of training time.

The same basic model set was used for the independent variable, commitment score, as well. Commitment was viewed as a characteristic of each individual athlete and was measured by accumulated activities (see Training Methods and Metrics). One model incorporated the hypothesis that improvement was a linear function of commitment (athletes with greater commitment were expected to exhibit greater improvement). In contrast, a null model included no relationship between commitment and improvement. Another model incorporated the hypothesis that initial proficiency influenced the change in improvement with commitment. A final model allowed an inflection, such that the rate of improvement with commitment was lower at high levels of commitment.

Methods

Training methods and metrics

Training program

This training program utilized a web-based player development platform to which players gained access for 10 weeks when they signed up to participate. Athletes in the program were able to access their performance data, but could not edit the data. For this study, NA = 108 athletes supplied usable data from the Spring 2020 training program. Data were not included from individuals who began the training but did not complete the full program for any reason (e.g., injury). Players were able to sign up for one of three skill development tracks for this 10-week period. These tracks each focused on a unique set of skills: the first, dribbling; the second, first touch and passing; and the third, striking.

Ethics

Approval from an ethics committee was not sought because:

  1. the data were collected under the auspices of "Captain Elite, Soccer Research and Training Organization", which has no institutional ethics committee;

  2. the focal training segment was one component of an ongoing training program and not designed as a specific research study;

  3. this study is retrospective and was not anticipated when the data collection took place;

  4. all participants and their parents (in the case of minors) signed consent forms that their data could be used for research purposes;

  5. all data were anonymized and analyzed anonymously. Training and performance data had no linkages to individual identification information.

Consent was informed and documented via signature. All athletes and parents/guardians signed a waiver/release at every enrollment that included the following statements:

"I understand that the course involves sharing photos, videos and data of Participant during Participant’s training in order to help assess Participant’s level, to share with others in the course and to use for anonymous research purposes. I consent for the Company to use these photos, videos and data for these purposes. If I choose not to participate in sharing data, photos and/or videos of Participant’s progress, I understand that it is my obligation not to enroll in any of the Company’s online training courses. I HAVE CAREFULLY READ THIS RELEASE AND AGREE ON BEHALF OF MY MINOR CHILD."

The collector (M.K.) of the data used in this study is developer and director of a program focused on the training of athletes. This training requires periodic assessment tests to evaluate the rate of development of the athletes (see Skill improvement). These tests are periodically evaluated and results posted so that athletes can see their progress and relative standing. Athletes (and their parents/guardians) who participate in this program sign the above waiver/release specifying that they are aware of the evaluation process and the potential uses of their data, and they are the ones who submit test results and associated video evidence.

Commitment

The program required players to submit a weekly log by midnight every Sunday as a self-report of the work they completed in the training program that week. This log asked players:

  1. to record the total number of hours they trained in the training program in the specified week;

  2. to specify whether they watched a professional soccer game in the focal week and, if yes, what teams competed;

  3. to specify whether they completed each of the tasks required of them for that week (players had to check a box next to each individual task completed for the week).

Each player’s commitment score was based on her/his completion of each weekly task, in addition to whether or not s/he met or exceeded the pre-set goal in weekly hours trained. Additionally, players were given the opportunity to earn extra “points” for their commitment scores by completing optional tasks.

Overall commitment scores were computed as the sum of two components, one based on required tasks and the other based on optional tasks. The “required” component was computed as the fraction of required tasks that was completed; i.e., by dividing the number of points accrued by the maximum number of points for a player completing 100% of required tasks. The required component score thus took values between 0 (no required tasks completed) and 1.00 (all required tasks completed). Optional tasks were available as well, with up to a 1.40 score for an athlete scoring maximum optional points. Thus, commitment scores could range from 0 to 2.40 for each player. The commitment metric was intended to go beyond time expended on training, as it incorporated information that reflected an athlete’s commitment to completing all tasks required of them, as well as some related tasks that were not required.
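As a concrete illustration, the two-component score described above can be sketched as follows. This is a minimal sketch: the function name and the example point totals are hypothetical, not the program's actual point schedule.

```python
# Hypothetical sketch of the two-component commitment score described
# above; the function and its example inputs are illustrative only.

def commitment_score(required_completed, required_total,
                     optional_points, optional_max):
    """Required fraction (0-1.00) plus scaled optional component (0-1.40)."""
    required_component = required_completed / required_total     # 0 to 1.00
    optional_component = 1.40 * optional_points / optional_max   # 0 to 1.40
    return required_component + optional_component               # 0 to 2.40

# An athlete completing 80% of required tasks and half the optional
# points would score 0.80 + 0.70:
print(commitment_score(8, 10, 5, 10))
```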

Hours trained

In the weekly log, players self-reported the number of hours they trained on drills in their specific development tracks. Hours trained were computed for each player by summing the number of training hours reported across the 10-week period.

Skill improvement

The training platform included videos of 10–12 soccer drills that players watched and attempted to replicate on their own. A minimum of three times per 10-week program (beginning, mid-point, and end of the 10 weeks), and a maximum of once per week during the 10 weeks, players were required to self-report their scores on a designated three of these video-based exercises to measure their improvement. Each time players submitted new scores on these 3 drills, they were required to submit videos of themselves completing each drill as evidence that they accomplished the reported scores. Players were encouraged to update their skill scores more frequently, and each time they did so (in excess of the 3 required tests), they were rewarded by gaining more “commitment points”.

Each development track had a different scoring system for measurement of skill improvement. Players measured their skill scores in the dribbling track based on how long it took them to perform each of the 3 designated drills. For the first touch and passing track, players recorded for one drill how many correct repetitions they could complete in 30 seconds, and for the other 2 drills how many consecutive, correctly-executed repetitions they could achieve before making a mistake. For the striking track, players measured all 3 of the drills as the distance from the goal at which they could correctly complete each of the striking techniques. Each striking drill was scored separately as an average of the maximum left-footed and right-footed distances the player achieved. A new distance for either foot on any one of the drills could be achieved only by completing 5 correct strikes in a row with that technique.

A player’s scores from each of the separate development tracks were combined into one overall score using a “Skill Stage” scoring system reflecting objective standards of skill achievement. Scores for each development track fell on a scale beginning at 0 (0–1.99 = fundamental youth skill, 2–2.99 = elite youth skill, 3–3.99 = collegiate skill, 4+ = professional skill). For this study, skill improvement was measured as the change in average Skill Stage score across the 3 designated drills from the beginning of the 10 weeks until the end of the 10 weeks.
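A minimal sketch of this improvement metric; the drill scores below are invented for illustration, not taken from the study data.

```python
# Improvement = change in mean Skill Stage score across the 3 designated
# drills between the first and final assessments. Scores are illustrative.

def improvement(start_scores, end_scores):
    """Difference in mean Skill Stage score (each drill on the 0-4+ scale)."""
    return sum(end_scores) / len(end_scores) - sum(start_scores) / len(start_scores)

# An athlete moving from fundamental youth skill toward elite youth skill:
print(improvement([1.2, 0.9, 1.5], [2.1, 1.8, 2.4]))
```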

Analytic methods

Two sets of analyses were conducted for the 2 different independent variables. The first analysis was based on time spent training over the 10-week training period. The participating athletes showed substantial variation in this metric, ranging from about 16 hours to 90 hours. The second set of analyses was based on the composite metric, commitment, that was computed using a point system, where players were awarded a specific number of points according to which of the required and optional tasks they completed throughout the program (see above).

Training time

Competing models of skill improvement as a function of time spent training were developed and then fit to the data. Model selection was used to assess the level of support for each model. For each of the i = 1…NA athletes, the primary data used in the analysis were:

y_{it} = skill level of athlete i at beginning of week t as assessed by skill test,

x_{it} = time spent training by athlete i between weeks t and t+1.

The focal response variable of the analysis was a statistic reflecting improvement between the initiation of the training program, week t_0, and some endpoint assessment time (in this case 10 weeks later), t_T:

Δ_i = y_{i,t_T} − y_{i,t_0}.

The first independent variable of primary interest was accumulated training time between the beginning and end of the training period (denote the length of the training period as Δt = 10 weeks):

x_{i,Δt} = Σ_{t=t_0}^{t_T} x_{it}.

One other independent variable, y_{i,t_1}, was used in one of the general models for training time. This variable provided an assessment of skill level at the beginning of the training and assessment period. We did not use y_{i,t_0} as an assessment statistic, because this would induce a sampling covariance between this statistic and our response variable, Δ_i. Use of y_{i,t_1} as an independent variable represented an effort to investigate the hypothesis that initial skill level might influence the increases in skill level expected to accompany increases in training time.

The basic approach of this investigation was to develop 4 models that represented 4 competing hypotheses about the relationship between training time and skill improvement (described above). In all models a normal distribution was assumed with mean μ and variance σ² for the response variable, i.e.,

Δ_i ~ N(μ, σ²).

The models differed only in the structure imposed on the mean, i.e., only in the key determinants of skill improvement. The simplest such model is the null model 1, under which training time beyond 16 hours (the minimum training time among all participants) did not influence skill improvement:

μ_1 = β_0.

Note that model 1 can be viewed as null with respect to the effect of accumulated hours of training on proficiency. All athletes trained for at least 16 hours during the 10-week program, so μ_1 reflects a baseline level of proficiency that is likely influenced by the first 16 hours of training.

Model 2 was the basic linear model postulating an increase in proficiency with training (prediction: β̂_1 > 0, where the hat denotes an estimate):

μ_2 = β_0 + β_1 x_{i,Δt}.

Note that no data were available on the effects of the first few hours of training, and rapid increases in proficiency might be expected as drills are learned. Thus, models 2–4 should be appropriate for predicting changes in proficiency for ≥16 hours of training.

Model 3 included an intercept term, β_0, a linear term, β_1, describing the relationship between training time and proficiency, and a quadratic term, β_2, accounting for a possible threshold point after which time spent training begins to have either an increased or diminished effect:

μ_3 = β_0 + β_1 x_{i,Δt} + β_2 x²_{i,Δt}.

The prediction was that β̂_1 > 0, indicating an increase in skill improvement with increased training time, and β̂_2 < 0, indicative of greater difficulty in increasing proficiency as training time and proficiency increase.

Finally, another general model, 4, included a linear term describing the relationship between training time and performance, as well as an interaction term expressing the additional influence of starting proficiency level:

μ_4 = β_0 + β_1 x_{i,Δt} + β_3 y_{i,t_1} x_{i,Δt}.

The above model simply states that the average skill improvement can be written as a linear function of some overall baseline effect, β_0, an effect of additional time spent training, β_1, and an interaction effect between initial skill level and time spent training, β_3. The expectation was that β̂_1 > 0, indicating an increase in skill improvement with increased training time, and β̂_3 < 0, indicating a reduced effect of training for athletes with high initial skill levels. Note that unlike models 1–3, each of which postulates a single relationship for all individuals in the population, model 4 produces a different relationship for individuals with different values of y_{i,t_1}, yielding a family of relationships associated with the different starting proficiencies.

Each of these 4 models was fit to the data, and model selection statistics and parameter estimates were then used to judge which model “best” described the data. Models were fit using maximum likelihood, and Akaike’s Information Criterion (AIC) [14, 15] was used for model selection. Based on the assumed Gaussian model, the maximum likelihood estimator for each of the models 1–4 takes the form of the least-squares solution

β̂ = arg min_β Σ_i (Δ_i − μ_v(β))², v = 1, 2, 3, 4. (1)

The resulting parameter estimates provide information about the importance of each independent variable. Each effect estimate, β̂_l, was examined to determine whether its sampling distribution, the width of which is reflected by the associated standard error (see S1 Appendix), SE(β̂_l), was centered near 0 (indicating little support for the effect) or far from 0 (supporting the effect).
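Under the Gaussian assumption, maximizing the likelihood is equivalent to minimizing the sum of squares in Eq (1). The procedure can be sketched on simulated placeholder data; the data-generating values below, and the AIC form n·log(RSS/n) + 2K, are assumptions for illustration, not the study's data or exact code.

```python
import numpy as np

# Simulated placeholder data for 108 athletes (NOT the study data).
rng = np.random.default_rng(0)
hours = rng.uniform(16, 90, size=108)           # x_{i,dt}: accumulated hours
start = rng.uniform(0.5, 2.0, size=108)         # y_{i,t_1}: initial skill
delta = 0.6 + 0.01 * hours + rng.normal(0, 0.4, size=108)  # improvement

def fit_aic(X, y):
    """Least-squares fit (Gaussian MLE); AIC = n*log(RSS/n) + 2K, K incl. sigma^2."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(((y - X @ beta) ** 2).sum())
    n, k = len(y), X.shape[1] + 1
    return beta, n * np.log(rss / n) + 2 * k

ones = np.ones_like(hours)
designs = {
    1: np.column_stack([ones]),                        # mu_1 = b0
    2: np.column_stack([ones, hours]),                 # mu_2 = b0 + b1*x
    3: np.column_stack([ones, hours, hours ** 2]),     # mu_3 adds b2*x^2
    4: np.column_stack([ones, hours, start * hours]),  # mu_4 adds b3*y*x
}
aic = {m: fit_aic(X, delta)[1] for m, X in designs.items()}
best = min(aic, key=aic.get)
print("best model:", best)
print({m: round(a - aic[best], 2) for m, a in aic.items()})  # Delta-AIC
```

Because the simulated data are generated from a linear relationship, the null model should rank well below the models containing a training-time effect.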

Because all 4 models represented plausible hypotheses, and because no single model received overwhelming support relative to the other models, model-averaged estimates [15, 16] were computed for the β parameters that defined our different models (see S1 Appendix).
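For example, a model-averaged estimate of the training-time slope can be computed from the weights and β̂_1 estimates reported in Table 1, treating β_1 as 0 in the null model that omits it. Averaging a slope across models with different structures is a common, though debated, practice [15, 16]; this sketch shows only the weighted-sum arithmetic.

```python
import numpy as np

# AIC weights and beta_1 estimates for models 2, 3, 4, 1 (from Table 1);
# beta_1 is taken as 0 in the null model, which omits it.
weights = np.array([0.4659, 0.4603, 0.0735, 0.0003])
beta1 = np.array([0.0104, 0.0252, 0.0117, 0.0])
beta1_avg = float(np.dot(weights, beta1))
print(round(beta1_avg, 4))
```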

Commitment

It has been suggested that perhaps a better predictor of performance improvement is “grit” (popularized in [12]), a term intended to reflect not just the time spent training for an activity, but the quality of that training and the dedication of the individual athlete to it. This grit characteristic is referred to here as commitment and is quantified over the 10-week training period as described above. As with training time, an increase in improvement was predicted for those athletes with higher commitment scores. A stronger relationship was predicted for commitment than for training time, but because the scales of these 2 independent variables are not comparable, this expectation does not lead to predictions about relative magnitudes of the effect estimates, β̂_1. Training time and commitment could not both be included as independent variables in the same model(s), because training time is a component of the commitment measure and these variables were relatively highly correlated. In order to avoid potentially problematic issues with collinearity, model selection was used to compare the models developed for these 2 independent variables.

The basic modeling approach described above for time spent training was also used for the commitment variable, z_{it}:

z_{it} = training commitment accumulated by athlete i through t weeks of training.

The commitment variable corresponding to the end of the 10-week assessment period was denoted z_{i,Δt} for consistency with the time variable. The same basic set of models as used for time spent training was used to assess the relationship between accumulated commitment and improvement in proficiency. Thus, z_{i,Δt} was substituted for x_{i,Δt} in all of the above expressions in order to conduct this second set of analyses.

Training time vs. commitment

In addition to the separate model sets for training time and commitment, all models were combined to create a model set including models for each variable as well as a single null model (no covariate effect). AIC scores and weights were then used to assess the relative abilities of these combined models to parsimoniously describe the processes generating the data. Evidence ratios were computed for specific pairs of models as wi/wj, where wi and wj denote the AIC weights (S1 Appendix) of the 2 models being compared. The evidence ratio reflects the degree to which evidence for model i is greater than that for model j.
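The weight and evidence-ratio computations described above can be sketched as follows; the ΔAIC values in the example are illustrative, not taken from the study tables.

```python
import math

# AIC weight for model i is proportional to exp(-DeltaAIC_i / 2), so the
# evidence ratio w_i / w_j reduces to exp((DeltaAIC_j - DeltaAIC_i) / 2).
def aic_weights(delta_aics):
    raw = [math.exp(-d / 2) for d in delta_aics]
    total = sum(raw)
    return [r / total for r in raw]

def evidence_ratio(w_i, w_j):
    return w_i / w_j

# Illustrative Delta-AIC values for a 3-model set:
w = aic_weights([0.0, 2.0, 10.0])
print([round(x, 4) for x in w])
print(round(evidence_ratio(w[0], w[2]), 1))
```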

Results

Scores were recorded for 108 individuals over a 10-week period in the spring of 2020. The data consist of the proficiency levels recorded at the beginning and end of the period of performance for each athlete. Data were also reported during the interim weeks; the number of reported scores contributed to the independent “commitment” variable. Also recorded were the accumulated hours trained over this period of performance. Initially, separate model sets are presented for each of the 2 independent variables, focusing on the nature of these relationships. The model selection statistics for the 2 independent variables are then combined in order to assess which variable provided the better description of the data.

Training time

Model selection statistics for time spent training are presented in Table 1. The “best” model (ΔAIC = 0) was model 2, in which there was a linear increase in proficiency with training time. Model 3 was very competitive, showing evidence of an increase in proficiency with training time until about 80 hours, after which additional training did not appear to further increase proficiency (Fig 1). Model 4 received less support than models 2 and 3, but provided some evidence of greater increases in proficiency for athletes beginning training at lower proficiency levels. The lowest ranked model was model 1 (no improvement with training time). Estimated β parameters describing these relationships (Table 1) provided strong evidence of an increase in proficiency with increased hours of training time.

Table 1. Model parameter estimates and standard errors, along with the associated ΔAIC values indicating the relative support for each of the four models using “training time” as the independent variable.

Model | β̂_0 (σ̂_0) | β̂_1 (σ̂_1) | β̂_2 (σ̂_2) | β̂_3 (σ̂_3) | ΔAIC | wi
2 | 0.6336 (0.0965) | 0.0104 (0.0018) | – | – | 0.0000 | 0.4659
3 | 0.2551 (0.2156) | 0.0252 (0.0078) | -0.0001 (0.0001) | – | 0.0240 | 0.4603
4 | 0.6264 (0.0972) | 0.0117 (0.0025) | – | -0.0011 (0.0014) | 3.6946 | 0.0735
1 | 1.1577 (0.0422) | – | – | – | 14.6612 | 0.0003


Fig 1. Improvement in proficiency vs. accumulated hours of training: raw data for each athlete, fitted relationships under models 1–4, and the model-averaged relationship.

The raw data for each athlete in the training program, as well as the relationships between proficiency and training under all 4 models, are plotted in Fig 1. The raw data show substantial variation, emphasizing the need for statistical inference in order to draw conclusions about the effectiveness of training. Also shown in Fig 1 is the model-averaged relationship, which can be viewed as providing the best assessment of overall training effects, given model uncertainty. The model-averaged relationship shows increases in proficiency with hours of training, consistent with a priori hypotheses and the purpose of the training program.

Commitment

Following the same modeling and estimation approach as above, the analyses were repeated with “commitment” score as the independent variable. The quadratic (model 3) and linear (model 2) models were both competitive, showing increases in skill improvement with larger commitment scores (Table 2, Fig 2). Given model uncertainty, the model-averaged plot in Fig 2 provides the best description of this relationship.

Table 2. Model parameter estimates and standard errors, along with the associated ΔAIC values when using commitment as the independent variable.

Model | β̂_0 (σ̂_0) | β̂_1 (σ̂_1) | β̂_2 (σ̂_2) | β̂_3 (σ̂_3) | ΔAIC | wi
3 | -0.9281 (0.4387) | 2.9192 (0.7880) | -0.8003 (0.3319) | – | 0.0000 | 0.5800
2 | 0.0839 (0.1287) | 1.0409 (0.1197) | – | – | 0.9494 | 0.3608
4 | 0.0829 (0.1291) | 0.9775 (0.1407) | – | 0.0599 (0.0695) | 4.5673 | 0.0591
1 | 1.1577 (0.0422) | – | – | – | 31.9137 | 0.0000


Fig 2. Improvement in proficiency vs. commitment score: raw data for each athlete, fitted relationships under the candidate models, and the model-averaged relationship.

Combined analysis for training time and commitment

To address the question of which independent variable provided a better description of the data, the training time and commitment models were combined to form a single model set with new model weights. In this “joint” assessment, seven separate models are considered, and their AIC values are labeled AIC_CT1, AIC_C2, AIC_C3, AIC_C4, AIC_T2, AIC_T3, and AIC_T4, where the subscript denotes the independent variable (T = time, C = commitment) and the form of the model (1–4) for that independent variable. The first model (constant) is identical for both independent variables (time, commitment); hence only the single AIC_CT1 value need be considered.

The model weights corresponding to use of commitment as the independent variable are all considerably larger than those using training time as the independent variable (Table 3). This suggests that commitment is a better predictor of performance improvement than time spent training. The largest model weight corresponds to the model that predicts a quadratic relationship, showing an increase in performance with commitment score that levels off at higher levels of commitment. An evidence ratio was computed using the AIC weights of the best models for each covariate, models C3 and T2. The evidence ratio, w_C3/w_T2 = 5799, strongly supported the quadratic commitment model as a better model for performance improvement than the linear training time model.

Table 3. AIC, ΔAIC, and model weights obtained by considering the joint model set consisting of 6 models associated with using commitment (C2-C4) and training time (T2-T4) as the independent variables, as well as a constant (null) model, CT1.

The “constant” model is the same for the 2 independent variables.

Model | AIC | ΔAIC | wi
C3 | 172.3841 | 0.0000 | 0.5799
C2 | 173.3335 | 0.9494 | 0.3608
C4 | 176.9514 | 4.5673 | 0.0591
T2 | 189.6368 | 17.2527 | 0.0001
T3 | 189.6608 | 17.2767 | 0.0001
T4 | 193.3314 | 20.9473 | 0.0000
CT1 | 204.2978 | 31.9137 | 0.0000
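As a consistency check, the ΔAIC and weight columns of Table 3 can be reproduced from the AIC column alone.

```python
import numpy as np

# Recompute Delta-AIC and AIC weights from the AIC column of Table 3
# (model order: C3, C2, C4, T2, T3, T4, CT1).
names = ["C3", "C2", "C4", "T2", "T3", "T4", "CT1"]
aic = np.array([172.3841, 173.3335, 176.9514,
                189.6368, 189.6608, 193.3314, 204.2978])
delta = aic - aic.min()
w = np.exp(-delta / 2)
w /= w.sum()
for name, d, wt in zip(names, delta, w):
    print(f"{name:>3}  {d:7.4f}  {wt:.4f}")
```

The recomputed ΔAIC values and weights match Table 3 to the reported precision.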


Discussion

The general objective of this study was to evaluate whether participation in an athletic training program can increase proficiency in targeted athletic skills. The specific study objective focused on 2 metrics reflecting the degree of participation in the training: (1) hours spent in the program and (2) a measure of athlete commitment to the training. It was hypothesized that larger values of both participation metrics would lead to larger gains in proficiency, and that commitment might be the better of the 2 predictors of increased proficiency. Results provided strong evidence that both hours spent training and commitment score were positively associated with increased proficiency in the targeted skills. In addition, evidence strongly supported the hypothesis that commitment score was the better of the 2 predictors of increased proficiency.

The development of training programs designed to improve athletic performance is likely as old as competition itself. Athletes and trainers largely base training recommendations on their preconceptions and their personal experiences with training that has and has not appeared to “work”, either for themselves or athletes whom they have trained. Learning from experience is certainly sensible, but humans frequently fail to do this in an objective manner. For example, humans tend to focus on observations that support their prior beliefs, subconsciously ignoring and downplaying observations that fail to support those beliefs. This tendency has been labeled “confirmation bias” and is well-known to psychologists [17], athletic trainers [3], and songwriters: “But a man hears what he wants to hear and disregards the rest,” [18].

The training program investigated in this paper was developed by one of the authors (M. K.) who has many years of experience coaching and training young soccer players. The drills that comprised the tested training were all expected to lead to increases in proficiency of focal skills. However, the scatterplot of points resulting from this investigation showed substantial variation, illustrating the difficulties in drawing inferences based on data for human performance. For example, while the majority of individuals showed increases in proficiency with training, a few individuals showed negligible improvement despite nearly 60 hours of training.

One possible explanation for such outliers is that the quality of the training time for some athletes was sufficiently low to prevent substantial improvement. It was hypothesized that perhaps use of “commitment” as the independent variable would be one way to assess both the amount and quality of the training and hence provide a more reliable predictor of success. Indeed, the aforementioned outliers scored much lower in terms of “commitment” relative to the population than they did for “hours trained”. The joint model set including all models provided strong evidence that the relationship between commitment and performance improvement was much stronger than that obtained via training time alone (Table 3).

Methodologically, the selected approach to dealing with this variation and uncertainty was to use multimodel inference [15]. This approach deals not only with variation in observations conditional on a particular model being a good approximation to reality (e.g., the spread of points that do not fall exactly on a regression line), but also with the fact that humans never “know” which of several candidate models provides the best description of the processes that generated the data.
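The mechanics of this approach can be illustrated with a short sketch. The AIC values and per-model predictions below are invented purely for demonstration (they are not the study's fitted values); the computation of Akaike weights and a model-averaged prediction follows the general method of Burnham and Anderson [15].

```python
import math

# Hypothetical AIC values for three candidate models of skill improvement.
# These numbers are illustrative only, not results from the study.
aic = {"linear": 210.4, "quadratic": 208.1, "intercept_only": 225.9}

# Akaike weight of model i: exp(-0.5 * delta_i) normalized over all models,
# where delta_i is the AIC difference from the best (lowest-AIC) model.
best = min(aic.values())
rel_likelihood = {m: math.exp(-0.5 * (a - best)) for m, a in aic.items()}
total = sum(rel_likelihood.values())
weights = {m: r / total for m, r in rel_likelihood.items()}

# Model-averaged prediction: weight each model's prediction by its Akaike
# weight, so inference does not condition on any single model being "true".
predictions = {"linear": 1.45, "quadratic": 1.52, "intercept_only": 1.20}
averaged = sum(weights[m] * predictions[m] for m in aic)
```

With these invented numbers, the quadratic model receives the largest weight, but the linear model still contributes meaningfully to the averaged prediction; that refusal to condition on a single "known" model is the essence of multimodel inference.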

Based on this approach of multimodel inference, the best estimates of the relationship between hours expended training and increased proficiency are provided by model averaging and indicate roughly a 0.15-point increase in proficiency with every 10 hours of training, with smaller increases at higher training times. Given that the average of the initial skill scores is 1 point, this model can be viewed as predicting a 15% increase in proficiency per additional 10 hours trained, but with those increases beginning to level off after about 50 hours of training.

The relationship between commitment and skill improvement was best described by quadratic (C3) and linear (C2) models. The plot of model-averaged values provides the best estimate of this relationship, showing larger gains in proficiency for athletes with higher commitment scores.

Thus, the separate analyses with the 2 independent variables provided evidence of the importance of both training time and commitment to proficiency gains, as predicted. Model selection statistics based on the joint model set (Table 3) show that evidence is stronger for commitment. Results thus support the contention that both training time and individual commitment are key determinants of proficiency gains, with the addition to training time of individual characteristics related to the quality of training (commitment) increasing the explanatory ability of the models.
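The strength-of-evidence comparison can be made concrete: an evidence ratio between two models is the ratio of their Akaike weights, which for a two-model comparison reduces to exp(ΔAIC/2). The sketch below is illustrative rather than the study's actual computation; it simply shows that the reported ratio of 5799 corresponds to an AIC gap of roughly 17.3 units.

```python
import math

def evidence_ratio(aic_better, aic_worse):
    """Evidence ratio favoring the lower-AIC model over the higher-AIC one.

    Equivalent to the ratio of the two models' Akaike weights.
    """
    return math.exp(0.5 * (aic_worse - aic_better))

# Working backwards from the paper's reported evidence ratio of 5799:
delta_aic = 2 * math.log(5799)          # AIC gap implying that ratio (~17.33)
ratio = evidence_ratio(0.0, delta_aic)  # recovers 5799
```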

The basic approach used here can be used to answer a host of additional questions relating training to outcomes. One could, for example, relate the fraction of time spent on a particular skill in practice to improvement in game situations. Alternatively, one could expose multiple teams to different training regimens (e.g., vary ratio of speed and technical training) and compare the resulting values for β1. A number of outstanding questions [3] relating training to proficiency, and proficiency to game impact, can be addressed using this approach.

This study was motivated by the fact that much of the conventional wisdom in the fields of athletic training and performance is based largely on personal experience. This approach has served athletics reasonably well, but suffers from the possibility that some ideas attain acceptance simply because they are championed by vocal persons with strong personalities. More and more disciplines are adopting evidence-based approaches to develop best practices. Statistical approaches such as those used here provide a means of deriving strong inferences from data generated by existing training programs. Even stronger inferences can be obtained by accumulating evidence from multiple analyses such as those presented here [19] and by use of true experimental approaches. Such efforts to obtain stronger inferences are recommended especially for questions characterized by substantial existing uncertainty.

In conclusion, this study focused on a training program developed to increase proficiency in specific soccer skills. Both the number of hours devoted to this training and a measure of commitment to the training program were positively related to increases in proficiency, and evidence was particularly strong for commitment. Although the specific results of this study are limited to one particular training program, they are consistent with general hypotheses about the importance to athletic training of individual characteristics variously described as grit and commitment [12]. To the extent that commitment can be encouraged and developed by coaches, this evidence of its importance has the potential to be very useful. Specifically, coaches and trainers can emphasize to athletes the importance of not just allocating time to training exercises, but also focusing on the quality of the training. However, even if commitment is entirely intrinsic to each athlete and cannot be developed, the gains in proficiency associated with training time alone attest to the utility of this training program. Beyond these specific applications of study results, the more general application of methods similar to those used here should lead to a stronger evidentiary basis for athletic training.

Supporting information

S1 Appendix. Akaike’s Information Criterion (AIC), Akaike weights and multimodel inference.

(DOCX)


S1 Data

(DAT)


S2 Data

(DAT)


Data Availability

All relevant data are within Supplemental Files S2 and S3.

Funding Statement

The authors received no specific funding for this work.

References

  • 1. Macnamara BN, Hambrick DZ, Oswald FL. Deliberate practice and performance in music, games, sports, education, and professions: A meta-analysis. Psychol Sci. 2014; 25: 1608–1618. doi: 10.1177/0956797614535810
  • 2. Ericsson KA. Towards a science of the acquisition of expert performance in sports: Clarifying the differences between deliberate practice and other types of practice. J Sports Sci. 2020; 38: 159–176. doi: 10.1080/02640414.2019.1688618
  • 3. Biermann C. Football hackers: The science and art of a data revolution. London: Blink Publishing; 2019.
  • 4. Strong WB, Malina RM, Blimkie CJR, Daniels SR, Dishman RK, Gutin B, et al. Evidence based physical activity for school-age youth. J Pediatrics. 2005; 146: 732–737. doi: 10.1016/j.jpeds.2005.01.055
  • 5. Impellizzeri FM, Rampinini E, Marcora SL. Physiological assessment of aerobic training in soccer. J Sports Sci. 2005; 23: 583–592. doi: 10.1080/02640410400021278
  • 6. Reilly T. An ergonomic model of the soccer training process. J Sports Sci. 2005; 23: 561–572.
  • 7. Lewis M. Moneyball: The art of winning an unfair game. New York: W.W. Norton; 2003.
  • 8. Richards D, Lawrence A. Evidence-based dentistry. Br Dental J. 1995; 179: 270–273. doi: 10.1038/sj.bdj.4808896
  • 9. Davidoff F, Haynes B, Sackett D, Smith R. Evidence based medicine. BMJ. 1995; 310: 1085–1086. doi: 10.1136/bmj.310.6987.1085
  • 10. Sutherland WJ, Pullin AS, Dolman PM, Knight TM. The need for evidence-based conservation. Trends Ecol Evol. 2004; 19: 305–308. doi: 10.1016/j.tree.2004.03.018
  • 11. Ericsson KA, Pool R. Peak: Secrets from the new science of expertise. Boston: Eamon Dolan/Houghton Mifflin Harcourt; 2016.
  • 12. Duckworth A. Grit: The power of passion and perseverance. New York: Scribner; 2016.
  • 13. Lally P, van Jaarsveld CHM, Potts HWW, Wardle J. How are habits formed: Modelling habit formation in the real world. Eur J Social Psychol. 2009; 40: 998–1009.
  • 14. Akaike H. Information theory as an extension of the maximum likelihood principle. In: Petrov BN, Csaki F, editors. 2nd International Symposium on Information Theory. Budapest: Akademiai Kiado; 1973. pp. 267–281.
  • 15. Burnham KP, Anderson DR. Model selection and multimodel inference. 2nd ed. Berlin: Springer-Verlag; 2002.
  • 16. Buckland ST, Burnham KP, Augustin NH. Model selection: An integral part of inference. Biometrics. 1997; 38: 469–477.
  • 17. Nickerson RS. Confirmation bias: A ubiquitous phenomenon in many guises. Rev General Psychology. 1998; 2: 175–220. doi: 10.1037/1089-2680.2.2.175
  • 18. Simon P. The boxer. New York: Columbia Records; 1992.
  • 19. Nichols JD, Kendall WL, Boomer GS. Accumulating evidence in ecology: Once is not enough. Ecol Evol. 2019; 9: 13991–14004. doi: 10.1002/ece3.5836

Giancarlo Condello

29 Jun 2022

PONE-D-22-04804

An evidence-based approach to assessing the effectiveness of training regimen on athlete performance: youth soccer as a case study

PLOS ONE

Dear Dr. Nichols,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Considering the reviewers' comments and requirements, a major revision of your manuscript is suggested.

Please submit your revised manuscript by Aug 13 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Giancarlo Condello, Ph.D.

Academic Editor

PLOS ONE

Journal requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf.

2. Thank you for stating the following financial disclosure:

“Unfunded study”

At this time, please address the following queries:

a) Please clarify the sources of funding (financial or material support) for your study. List the grants or organizations that supported your study, including funding received from your institution.

b) State what role the funders took in the study. If the funders had no role in your study, please state: “The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.”

c) If any authors received a salary from any of your funders, please state which authors and which funders.

d) If you did not receive any funding for this study, please state: “The authors received no specific funding for this work.”

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

3. Thank you for stating the following in your Competing Interests section:

“No authors have competing interests”

Please complete your Competing Interests on the online submission form to state any Competing Interests. If you have no competing interests, please state "The authors have declared that no competing interests exist.", as detailed online in our guide for authors at http://journals.plos.org/plosone/s/submit-now

This information should be included in your cover letter; we will change the online submission form on your behalf.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know

Reviewer #2: I Don't Know

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Dear Authors,

I appreciate the opportunity to comment to the authors in their manuscript titled "An evidence-based approach to assessing the effectiveness of training regimen on athlete performance: youth soccer as a case study". The manuscript's issue is interesting and relevant in the current context of youth soccer training knowledge. I consider that the relevance of the study lies in the limited research available exploring the formal evidentiary-basis for athletic training programs of academy soccer players.

The purpose of the study is based on the hypothesis that both increased training time and greater commitment would produce larger increases in performance improvement, and that commitment would be the most important determinant of improvement. The authors conclude with strong evidence for an increase in performance improvement with both training hours and commitment score.

The study is well structured, well written and has an innovative approach, which is why I believe it is suitable for publication and so I have noted this to the Editorial Board

Congratulations on the good work done.

BW

João Paulo Brito

Reviewer #2: The present study is of interest to investigate the effectiveness of training regimen on athlete performance, using a multi-model approach to estimate the relationship between skill improvement, hours and commitment.

Despite the interesting work, I strongly suggest following the comments to improve the quality of the manuscript.

Abstract

1. Authors should add a conclusion section.

Introduction

2. Line 35-37. "Coaches, trainers and athletes have searched for the optimal formula for skill acquisition for as long as sports have existed, simply because athletes who can develop abilities faster, or to a greater extent, than others possess a competitive advantage."

Authors should add a valid reference.

3. Line 37-38. "Historically, the selection of training methods has been guided by anecdotal experience rather than by empirical evidence on effectiveness."

Authors should add a valid reference.

4. Line 40-42. "However, from our varied experiences with elite club, high school, Division 1 collegiate, and professional athletics, we see that athletes and their mentors do not use empirical evidence to guide their training decisions as frequently as might be hoped."

With all due respect, I ask the authors how we can follow this rationale.

5. Line 44-45. "In some cases, opinions of coaches and trainers can seem to converge on a conventional wisdom."

What or which cases? Be more specific and provide more details.

6. Line 78. "...and we focus on a 10-week period."

Please explain why, using valid scientific references.

7. Line 80. In my point of view, authors should avoid in scientific papers, using words such "etc". Please, mention all variables needed.

8. Line 85-86. "The basic hypothesis for the training time analysis was that improvement would be greater for athletes who expended more time training."

Please, explain why.

9. Line 90. "...aged 9-18 years who compete at varying levels (recreational through elite)."

How authors really classified the competitive and/or skill levels of the players? Please sustain your answer based on scientific evidence.

10. Line 92-93. "We deal with this variation in part by focusing on improvement in skill over the period of training, rather than on absolute skill level attained. " Please be more specific.

11. Line 95-96. "The basic idea was that athletes starting out at lower skill levels can increase proficiency rapidly. "

How this was really ensured?

12. Line 101-102. "...and was measured by accumulated activities."

Please, explain why.

Methods

13. Line 113. " ...we had NA=108 athletes..."

Please, explain why adding how athletes were really selected.

14. Line 117. "Ethics. We did not seek approval from an ethics committee because:"

Authors should confirm and add the approval/consent by the PlosOne officer if the paper could be published (if accepted) without ethical consent.

15. Line 127. "Consent was informed and documented via signature."

Authors should add, as a supplement file, one example of those informed consents.

16. Line 136-137. "This training requires periodic assessment tests to evaluate the rate of development of the athletes."

Please mention what type of test the authors are referring to.

17. Line 152-153. "Overall commitment scores were computed by dividing the number of points accrued by the maximum number of points for a player completing 100% of required tasks."

Authors should give more details/explanations how really this "equation" was created and calculated.

18.Line 153-154. "Optional tasks were available as well, such that commitment scores could range from 0 to 2.40 for each player."

Please, explain why.

19. Line 161. "Skill improvement" section.

Authors should add what type of skills were performed by the athletes. Were all the same for each age (9 to 18 years)?

20. Line 171-179. "Players measured their skill scores in the dribbling track based on how long it took them to perform each of the three designated drills. For the first touch and passing track, they recorded how many correct repetitions they could complete in 30 seconds, and for the other two drills they recorded how many consecutive, correctly-executed repetitions they could achieve before making a mistake (i.e., the ball drops). For the striking track, players measured all three of the drills as the distance from the goal that they could correctly complete each of the striking techniques, judging each striking drill separately as an average of the maximum left-footed and right footed distances they achieved. A new distance for either foot on any one of the drills could be achieved only by completing five correct strikes in a row with that technique."

All the previous information (i.e, each sentence), needs a valid references to better support all your rationale. Please, explain each sentence using valid references to better support how your skills were really selected.

21. Line 181. "...we developed a “Skill Stage” scoring system reflecting objective standards of skill achievement..."

I honestly ask if the authors already validated the mentioned score? If yes, please add the respective reference.

I truly recommend the authors to better explain how your main outcomes were selected and calculated from the online platform used (e.g., commitment and skill improvement). As it stands, in my point of view, the actual meaningfulness of your results can also be questioned. Moreover, more details are needed regarding how your variables were really treated and analysed before creating a data set.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: João Paulo Brito

Reviewer #2: Yes: Júlio Alejandro Henriques da Costa

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2022 Nov 1;17(11):e0276762. doi: 10.1371/journal.pone.0276762.r002

2 Aug 2022

Aug. 1, 2022

Giancarlo Condello, Ph.D.

Academic Editor

PLOS ONE

Re: PONE-D-22-04804

An evidence-based approach to assessing the effectiveness of training regimen on athlete performance: youth soccer as a case study

Dear Dr. Condello:

Thank you for your invitation to submit a revised version of our manuscript. As requested in your message of June 29, 2022, in this letter we respond to queries in your email and also detail our responses to the reviewers of this manuscript. We begin by copying your queries and responding in italics.

PLOS ONE requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf.

Response: We have reviewed the templates and believe that we have complied with style requirements.

2. Thank you for stating the following financial disclosure:

“Unfunded study”

At this time, please address the following queries:

a) Please clarify the sources of funding (financial or material support) for your study. List the grants or organizations that supported your study, including funding received from your institution.

Response: No funding was received for this work from any source.

b) State what role the funders took in the study. If the funders had no role in your study, please state: “The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.”

Response: There were no funders.

c) If any authors received a salary from any of your funders, please state which authors and which funders.

Response: There were no funders, and no author received any salary for this work.

d) If you did not receive any funding for this study, please state: “The authors received no specific funding for this work.”

Response: The authors received no specific funding for this work.

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

3. Thank you for stating the following in your Competing Interests section:

“No authors have competing interests”

Please complete your Competing Interests on the online submission form to state any Competing Interests. If you have no competing interests, please state "The authors have declared that no competing interests exist.", as detailed online in our guide for authors at http://journals.plos.org/plosone/s/submit-now

Response: The authors have declared that no competing interests exist.

This information should be included in your cover letter; we will change the online submission form on your behalf.

Responses to Reviewers’ Questions:

We simply copied the review comments and then placed our responses in italics below them. We very much appreciated the comments of reviewer 1. We tried to follow the recommendations of reviewer 2 as well. We appreciate the time and effort expended by you and both reviewers on this manuscript.

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know

Reviewer #2: I Don't Know

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Dear Authors,

I appreciate the opportunity to comment to the authors in their manuscript titled "An evidence-based approach to assessing the effectiveness of training regimen on athlete performance: youth soccer as a case study". The manuscript's issue is interesting and relevant in the current context of youth soccer training knowledge. I consider that the relevance of the study lies in the limited research available exploring the formal evidentiary-basis for athletic training programs of academy soccer players.

The purpose of the study is based on the hypothesis that both increased training time and greater commitment would produce larger increases in performance improvement, and that commitment would be the most important determinant of improvement. The authors conclude with strong evidence for an increase in performance improvement with both training hours and commitment score.

The study is well structured, well written and has an innovative approach, which is why I believe it is suitable for publication and so I have noted this to the Editorial Board

Congratulations on the good work done.

BW

João Paulo Brito

Response: We very much appreciate the comments and support of our work by reviewer 1.

Reviewer #2: The present study is of interest to investigate the effectiveness of training regimen on athlete performance, using a multi-model approach to estimate the relationship between skill improvement, hours and commitment.

Despite the interesting work, I strongly suggest following the comments to improve the quality of the manuscript.

Abstract

1. Authors should add a conclusion section.

Response: The abstract concludes with the following 3 sentences:

“Despite considerable variability in the data, we find strong evidence for an increase in performance improvement with both training hours and commitment score. We compared the best models for hours and commitment by computing an evidence ratio of 5799, indicating much stronger evidence favoring the model based on commitment. Results of analyses such as these go beyond anecdotal experience in an effort to establish a formal evidentiary basis for athletic training programs.”

We view these as the central conclusions of the paper, however we are happy to provide specific additions or clarifications if requested.

Introduction

2. Line 35-37. "Coaches, trainers and athletes have searched for the optimal formula for skill acquisition for as long as sports have existed, simply because athletes who can develop abilities faster, or to a greater extent, than others possess a competitive advantage."

Authors should add a valid reference.

Response: We have added 2 relevant citations as the reviewer recommended.

3. Line 37-38. "Historically, the selection of training methods has been guided by anecdotal experience rather than by empirical evidence on effectiveness."

Authors should add a valid reference.

Response: As recommended by the reviewer, we added 2 references to complement our personal observations.

4. Line 40-42. "However, from our varied experiences with elite club, high school, Division 1 collegiate, and professional athletics, we see that athletes and their mentors do not use empirical evidence to guide their training decisions as frequently as might be hoped."

With all due respect, I ask the authors how we can follow this rationale.

Response: We were simply trying to convey that we have often observed failures to use empirical evidence to guide training at every level of sport. This observation is consistent with the references that we have added in response to the reviewer’s previous recommendation. We believe these added citations properly address the reviewers’ concerns.

5. Line 44-45. "In some cases, opinions of coaches and trainers can seem to converge on a conventional wisdom."

What or which cases? Please be more specific and provide more details.

Response: The sentence that follows lines 44-45 (“However, the recent revolution in analytics provides strong evidence of the fallibility of such wisdom”) contains references to 2 entire books that are devoted to examples of both long-held conventional wisdom and empirical evidence (provided by the recent interest in sports “analytics”) overturning such wisdom. One example from reference 6 derives from professional baseball, where a hitter’s worth was judged by a few primary statistics that included “batting average”, the fraction of “at bats” at which a hitter got a base hit. This reliance on batting average was virtually universal across all levels of baseball play (amateur to professional). Analytics showed that ‘worth’ with respect to a team’s ability to win is more strongly predicted by a hitter’s “on base percentage”, a statistic that had been computed, but was little used to assess value or worth of a player. There are many such examples from many sports, and we believe the 2 cited books are sufficient to reinforce this point. Nonetheless, if the editor prefers, we are happy to provide material providing additional examples.

6. Line 78. "...and we focus on a 10-week period."

Please explain why, using valid scientific references.

Response: 10 weeks was the standard time of each training course, and was thus a natural duration for data collection. We specified this in the revised manuscript as recommended by the reviewer. We also cited a paper that was the basis for our selection of 10 weeks. This paper recommended 66 days as the time required for humans to develop new habits.

7. Line 80. In my point of view, authors should avoid using words such as "etc." in scientific papers. Please mention all variables needed.

Response: We omitted “etc.” as recommended by the reviewer, but chose not to attempt to enumerate every single factor that might influence how an athlete might choose to spend her time. Instead, we provided 2 examples of relevant factors.

8. Line 85-86. "The basic hypothesis for the training time analysis was that improvement would be greater for athletes who expended more time training."

Please, explain why.

Response: The importance of training and practice to achievement in any endeavor is encoded in conventional wisdom (e.g., sayings such as “practice makes perfect”) and is a belief widely shared by athletes and coaches alike. Nonetheless, we have added 2 citations (citations 2 and 11) that explain and provide evidence for the importance of training. Both references are from K.A. Ericsson, an authority on the duration and quality of practice as it relates to skill improvement.

9. Line 90. "...aged 9-18 years who compete at varying levels (recreational through elite)."

How did the authors classify the competitive and/or skill levels of the players? Please support your answer with scientific evidence.

Response: A useful aspect of our analysis is that it does not require classification of athletes by initial skill levels. Because we focus on the precise difference in test scores before and after training, we explicitly account for the starting (pre-training) ability level of each athlete, without having to classify them (see line 93-95). We also investigated one model that permitted an effect of before-training test scores on the rate of increase in ability with training. In short, our hypotheses and subsequent analyses permitted us to use the test scores directly without the need to place athletes into classes (see our response to comment 10 below for additional details).

10. Line 92-93. "We deal with this variation in part by focusing on improvement in skill over the period of training, rather than on absolute skill level attained." Please be more specific.

Response: We added a note to "see Analytic Methods", as the exact expression used to compute the difference is presented there (page 10, line 209; see also the inclusion of starting proficiency in model 4, line 246, page 12).
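As a concrete illustration of the difference-based response variable described above, the sketch below uses hypothetical pre/post test scores (the study's exact expression is given in its Analytic Methods section).

```python
# Hypothetical pre- and post-training test scores for four athletes
pre = [10.0, 25.0, 40.0, 55.0]
post = [22.0, 35.0, 47.0, 60.0]

# Improvement is the simple before/after difference, so each athlete's
# starting level is accounted for without classifying athletes into groups.
improvement = [b - a for a, b in zip(pre, post)]

# In the spirit of the study's model 4, let improvement depend linearly on
# the starting score (ordinary least-squares slope; illustrative only):
mx = sum(pre) / len(pre)
my = sum(improvement) / len(improvement)
slope = sum((x - mx) * (y - my) for x, y in zip(pre, improvement)) \
        / sum((x - mx) ** 2 for x in pre)

print(improvement)  # -> [12.0, 10.0, 7.0, 5.0]
print(slope)        # negative here: lower starters improved more in this toy data
```

A slope near zero would correspond to the study's finding of little support for model 4.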

11. Line 95-96. "The basic idea was that athletes starting out at lower skill levels can increase proficiency rapidly."

How was this really ensured?

Response: The hypothesis that athletes beginning at lower skill levels might increase proficiency more rapidly than athletes starting at higher levels is not ensured at all. Rather this was a hypothesis to be tested, so we incorporated this hypothesis into one of our models (model 4, line 246) and not into the others. We thus tested this idea and found little support for it (see e.g., table 1 or table 3).

12. Line 101-102. "...and was measured by accumulated activities."

Please, explain why.

Response: We wanted our measure of commitment to reflect activities occurring over the duration of the training period, just as training time was measured over the entire 10-week program. We added a parenthetical note to “see Training Methods and Metrics”, as details of our approach are described in that section.

Methods

13. Line 113. "...we had NA=108 athletes..."

Please explain why, and describe how the athletes were actually selected.

Response: Previously (lines 91-92), we wrote: “For this analysis we selected a relatively homogeneous group of athletes, focusing on soccer players aged 9-18 years who compete at varying levels (recreational through elite).” We specified that we selected athletes with “usable data from the Spring 2020 training program” (page 6, line 116). This specification of “usable data” simply meant that we did not include data from individuals who did not complete the program for any reason (e.g., injury), and we have added a statement to this effect in the manuscript (lines 116-117).

14. Line 117. "Ethics. We did not seek approval from an ethics committee because:"

Authors should confirm, and add approval/consent from the PLOS ONE office, that the paper can be published (if accepted) without ethical consent.

Response: We have described our consent process in accordance with the submission instructions, and PLOS ONE has found it adequate.

15. Line 127. "Consent was informed and documented via signature."

Authors should add, as a supplementary file, one example of those informed consents.

Response: The exact consent statement is provided directly in the manuscript itself (lines 133-138).

16. Line 136-137. "This training requires periodic assessment tests to evaluate the rate of development of the athletes."

Please mention what type of test the authors are referring to.

Response: The testing simply entails repeating various soccer drills and videotaping them to ensure that reported scores are accurate. This general process and the exact drills are described in the text under Skill improvement, and we have therefore added a parenthetical note citing this section in response to the reviewer comment.

17. Line 152-153. "Overall commitment scores were computed by dividing the number of points accrued by the maximum number of points for a player completing 100% of required tasks."

Authors should give more details/explanations of how this "equation" was actually created and calculated.

Response: We have added several sentences of explanation as recommended by the reviewer (appearing on lines 156-165).

18. Line 153-154. "Optional tasks were available as well, such that commitment scores could range from 0 to 2.40 for each player."

Please, explain why.

Response: The completion of optional tasks (i.e., going above and beyond what was required) provided a direct indication of the commitment shown by athletes to the training program and self-improvement. Our rationale is explained in the following added statement (lines 162-165): “The commitment metric was intended to go beyond time expended on training, as it incorporated information that reflected an athlete’s commitment to completing all tasks required of them, and even some tasks that were not required.”
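The commitment calculation described above can be sketched as follows. The point totals are hypothetical, chosen only to show how optional tasks push scores above 1.0.

```python
def commitment_score(points_accrued, max_required_points):
    """Commitment = points accrued / maximum points available from required
    tasks alone. Optional tasks add points beyond that maximum, so scores
    can exceed 1.0 (up to 2.40 in this scheme)."""
    return points_accrued / max_required_points

# Hypothetical player: completes all required tasks (100 points)
# plus optional tasks worth an extra 55 points.
score = commitment_score(100 + 55, 100)
print(score)  # -> 1.55
```

Under this scheme a player who skipped nearly half the required tasks and did no optional work would score near the study's observed minimum of 0.55.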

19. Line 161. "Skill improvement" section.

Authors should add what types of skills were performed by the athletes. Were they all the same for each age (9 to 18 years)?

Response: The skills and how they were measured are described in some detail in this section. We did not write of any age stratification because athletes of all ages worked on the same skills, albeit with different levels of proficiency. As a reminder, we dealt with this age (and other) variation in proficiency by focusing on improvement in proficiency, rather than absolute proficiency, in all analyses.

20. Line 171-179. "Players measured their skill scores in the dribbling track based on how long it took them to perform each of the three designated drills. For the first touch and passing track, they recorded how many correct repetitions they could complete in 30 seconds, and for the other two drills they recorded how many consecutive, correctly-executed repetitions they could achieve before making a mistake (i.e., the ball drops). For the striking track, players measured all three of the drills as the distance from the goal that they could correctly complete each of the striking techniques, judging each striking drill separately as an average of the maximum left-footed and right footed distances they achieved. A new distance for either foot on any one of the drills could be achieved only by completing five correct strikes in a row with that technique."

All the previous information (i.e., each sentence) needs valid references to better support your rationale. Please support each sentence with valid references explaining how the skills were selected.

Response: The reviewer recommends that we provide a reference for each statement describing our assessment drills. These drills were developed by author MK in order to assess skills that he thought to be important and sought to develop. These tests can be viewed as analogous to those designed by any university professor in order to assess learning in her or his specific class. We know of no standardized tests that are universally accepted in the soccer community, but even if such tests existed, it makes more sense to us to develop tests that are tailored to assessing the exact skills that we hoped to teach. As an aside, all coauthors have substantial experience with playing and coaching soccer, and all of us believe the drills to provide excellent assessments of proficiency in the selected skills.

21. Line 181. "...we developed a “Skill Stage” scoring system reflecting objective standards of skill achievement..."

I honestly ask if the authors already validated the mentioned score? If yes, please add the respective reference.

Response: The reviewer asks us to “validate” our scores, but we are not certain exactly what this means. One possibility is that this means the reviewer would like it if someone else had used the exact same approach to address the questions that we address. But as noted above, our preference is to develop scoring systems that are tailored to our objectives. We believe this to be far preferable to borrowing a scoring system developed by someone else and hoping that it corresponded closely enough to our objectives. Once again, we return to the analogy of the university professor deciding how to weight and combine the different assessment instruments (tests, essays, class projects, etc.) of the semester in order to develop an assessment score that corresponds to her/his objectives. The other possible meaning of validation is that the reviewer would like us to provide our own evidence that the tests measure proficiency in the selected skills. Our results of higher test scores with more training and commitment actually provide such evidence.

I truly recommend that the authors better explain how the main outcomes (e.g., commitment and skill improvement) were selected and calculated from the online platform used. As it stands, in my point of view, the meaningfulness of your results can be questioned. Moreover, more details are needed regarding how your variables were treated and analysed before creating the data set.

Response: This general comment about “how your main outcomes were selected and calculated” addresses 2 separate issues. (1) We agree with the reviewer that how the outcomes (test statistics) were calculated is very important. We have followed his suggestion about including additional information on our computation of a commitment score (see again the added paragraph now appearing on lines 156-165). In addition, we note that the explicit descriptions of how “variables were treated and really analysed” appear in the section Analytic methods. We believe that this section is very detailed, as it includes the explicit models fit to the data. But we can present even more detail if we know exactly what else is needed. (2) The second issue raised by the reviewer involves selection of test drills, namely that because we developed our own assessment drills (reviewer’s “main outcomes”) our results are somehow suspect. As noted above, we developed these drills specifically with our training program objectives in mind, and selected tests that we thought best evaluated whether our training program was successful in increasing proficiency in selected skills useful to soccer players. We have emphasized that this approach to testing is pervasive throughout academia, as well as the sports world. As a posteriori support that our drills and scoring system were appropriate for assessing our training program, we note that achievement based on this system showed improvement with increased hours of training and commitment to the program, as predicted. If we had done a poor job of assessment scoring, we would expect little evidence of a relationship between such scores and training/commitment. Finally, as we observed previously, we believe that an experienced soccer coach or player would consider these drills to provide very reasonable assessments of the specified soccer skills.

We hope that the above responses seem reasonable to you. We made one additional change in the revised manuscript that was not in response to reviewer comments. The senior author was married recently, so we now use her new, married name. We look forward to hearing from you, and we will try to address any remaining concerns you may have. Thank you for considering our manuscript.

Sincerely,

James D. Nichols

Attachment

Submitted filename: Response to Reviewers.docx


Rafael Franco Soares Oliveira

19 Sep 2022

PONE-D-22-04804R1

An evidence-based approach to assessing the effectiveness of training regimen on athlete performance: youth soccer as a case study

PLOS ONE

Dear Dr. Nichols,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

ACADEMIC EDITOR: Dear authors,

Considering the work done by the previous editors and the revisions made by both reviewers, I also agree that this work meets the conditions to be published.

However, there are minor details that can be improved. First, an English revision/proofreading should be performed, because several expressions and sentences are not well written. Moreover, the work was written in the first person, but it should be changed to the third. The Discussion should start with the aims of the study and the main results. In addition, there should be limitation, clear practical application, and conclusion sections for this paper.

I believe these details will improve organization and clarity of the work.

Best regards

Please submit your revised manuscript by Nov 03 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Rafael Franco Soares Oliveira

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.



Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know

Reviewer #2: I Don't Know

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Dear Authors,

As I commented in the first review, I consider the manuscript to be suitable for publication, and that was the recommendation I gave to the editor.

Congratulations on the quality of the manuscript.

Reviewer #2: I am happy with the current version of the manuscript.

The authors did a good job of revising the manuscript and addressing all the requested revisions.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: João Paulo Brito

Reviewer #2: Yes: Júlio Alejandro Henriques da Costa

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2022 Nov 1;17(11):e0276762. doi: 10.1371/journal.pone.0276762.r004

7 Oct 2022

Dear Dr. Oliveira:

Thank you for your invitation to submit a revised version of our manuscript. As requested in your message of September 19, 2022, in this letter we respond to your comments and recommendations. Below, we will copy your main comments and respond in italics.

“First an English revision/proofreading should be performed because there are several expressions and sentences not well written. Moreover, the work was written in the first person, but it should be changed to the third.”

We have tried to improve and clarify the writing throughout, and we have rewritten the entire manuscript and appendix in the third person. As you can see in the Track Changes copy of our revision, virtually all paragraphs contain changes associated with these recommendations.

“Discussion should start with the aims of the study and the main results. In addition, there should be a limitation, a clear practical application and conclusion sections for this paper.”

We have rewritten portions of the Discussion in order to follow your suggestions. We added a new paragraph to begin the Discussion (lines 664-672) in which we stated study aims and results. The primary limitation of the study is that inferences are based on a single training program (see lines 761-763). Practical applications are both specific (coaches/trainers encouraging greater commitment; lines 763-768) and general (use of similar methodological approaches to promote evidence-based training; lines 768-770; more detail and recommendations in lines 642-656). A conclusions paragraph was added (lines 725-757).

We tried to follow all of your recommendations and hope that our responses seem reasonable to you. We look forward to hearing from you, and we will try to address any remaining concerns you may have. Thank you for considering our manuscript.

Attachment

Submitted filename: Response to Editor (9-23-2022).docx


Rafael Franco Soares Oliveira

13 Oct 2022

An evidence-based approach to assessing the effectiveness of training regimen on athlete performance: youth soccer as a case study

PONE-D-22-04804R2

Dear Dr. Nichols,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Rafael Franco Soares Oliveira

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Dear authors,

Congratulations on the improvements made on your work. My recommendation is to accept your work for publication.

Best regards

Reviewers' comments:

Rafael Franco Soares Oliveira

17 Oct 2022

PONE-D-22-04804R2

An Evidence-Based Approach to Assessing the Effectiveness of Training Regimen on Athlete Performance: Youth Soccer as a Case Study

Dear Dr. Nichols:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Rafael Franco Soares Oliveira

Academic Editor

PLOS ONE

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Appendix. Akaike’s Information Criterion (AIC), Akaike weights and multimodel inference.

    (DOCX)


    S1 Data

    (DAT)


    S2 Data

    (DAT)



    Data Availability Statement

    All relevant data are within Supplemental Files S2 and S3
