This post was provided by Complete Track and Field
By Scott Christensen
Why do distance runners need to train all summer, effectively doubling the length of their fall season, in order to be competitive in the championship cross country meets at the end of the year? The answer lies in the rate at which physiological changes occur in humans. Today we will discuss stimulus, adaptation, fitness, and timeframes as they relate to the training of the cross country athlete.
Fitness gains in the aerobic energy system require enough time for structural changes in tissue to occur. In the same light, strength improvements in the muscular system’s Type 1 fibers require enough time for the cross-sectional diameter of the myosin filaments to increase; again, a change in tissue structure. Improvements in both the aerobic energy system and Type 1 muscle fibers lead to increased aerobic fitness, which is the most important factor not only in the cross country race itself, but in the ability to effectively maintain day-to-day training.
The other side of the cross country training coin is anaerobic fitness. Improvements to the anaerobic energy system are chiefly biochemical in nature rather than structural, so considerably less training time is needed to show positive changes. Improvements in Type 2 muscle fiber activity do involve some small structural changes, but the adaptation is mainly neural in nature.
Neuromuscular improvement lies in better synchronization and recruitment of the Type 2 muscle fibers themselves, rather than changes in their size; again, this requires fewer weeks of training to evoke changes.
Cross country training involves a mix of aerobic and anaerobic stimuli and is driven by both volume and intensity. Table 1 lists many of the adaptations necessary to improve a cross country runner’s fitness, set against seven columns of increasing heart rate (HR) training intensity (defined in Table 2). While a variety of training markers can be used to measure intensity, Table 1 uses heart rate (HR), and it also assigns each adaptation a 1-5 score for the degree of expected stimulus at each level.
There are many valid means of measuring training intensity, some more accurate than others. These range from lactate analyzers to heart rate monitors to athletes simply “measuring and scaling” how they feel with a modified Borg scale of perceived exertion.
Table 2 uses heart rate and vVO2max mathematics to discriminate training intensities, along with the corresponding perceived exertion on the ten-point scale found in Table 3. Heart rate is one of the easier physiological characteristics to measure, and most cross country runners fall into similar quantitative training ranges.
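To make the vVO2max arithmetic concrete, here is a minimal sketch of how a velocity at VO2max can be converted into a training pace at a chosen fraction of that velocity. The function names and the example runner (vVO2max of 5.0 m/s, training at 80%) are illustrative assumptions, not values from Table 2.

```python
# Hypothetical sketch: converting vVO2max (velocity at VO2max, in m/s)
# into a per-kilometer training pace at a given fraction of that velocity.

def pace_per_km(vvo2max_ms, fraction):
    """Seconds per kilometer when running at `fraction` of vVO2max."""
    return 1000.0 / (vvo2max_ms * fraction)

def fmt(seconds):
    """Format seconds as m:ss."""
    m, s = divmod(round(seconds), 60)
    return f"{m}:{s:02d}"

# Assumed example: a runner with a vVO2max of 5.0 m/s (3:20 per km)
# doing a workout at 80% of vVO2max.
print(fmt(pace_per_km(5.0, 0.80)))  # 4:10 per km
```

The same function works for any of the seven intensity levels once a fraction of vVO2max is chosen for each.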
Table 2 is set up for a runner with a maximum heart rate of 215 bpm, which would be common for a person 16-20 years old. Perceived exertion is more variable, with athletes qualitatively assessing somewhat different intensities for the same effort. (The seven levels in Table 2 correspond to the same seven columns found in Table 1.)
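The zone arithmetic behind a table like this can be sketched briefly. The percentage bands below are illustrative assumptions only (the article's actual Table 2 values are not reproduced here); what the sketch shows is how seven HR zones fall out of a single maximum heart rate, here the 215 bpm figure mentioned above.

```python
# Hypothetical sketch: deriving seven HR training zones from a maximum
# heart rate, in the spirit of Table 2. The fractional bands are assumed
# illustrative values, not the article's published ones.

MAX_HR = 215  # bpm, common for a 16-20 year-old per the article

# (zone label, lower fraction of max HR, upper fraction of max HR)
ZONE_FRACTIONS = [
    ("1 recovery",     0.60, 0.70),
    ("2 easy aerobic", 0.70, 0.75),
    ("3 steady",       0.75, 0.80),
    ("4 threshold",    0.80, 0.85),
    ("5 critical",     0.85, 0.90),
    ("6 interval",     0.90, 0.95),
    ("7 repetition",   0.95, 1.00),
]

def hr_range(lo_frac, hi_frac, max_hr=MAX_HR):
    """Convert fractional bounds into whole-bpm bounds."""
    return round(lo_frac * max_hr), round(hi_frac * max_hr)

for name, lo, hi in ZONE_FRACTIONS:
    lo_bpm, hi_bpm = hr_range(lo, hi)
    print(f"Zone {name}: {lo_bpm}-{hi_bpm} bpm")
```

A coach would substitute the athlete's measured or estimated maximum heart rate and the intensity bands actually prescribed by the training plan.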
Besides considering the training intensity stimuli needed for cross country fitness adaptations, there are also timeframes to account for when designing a training program. Using the same crucial physiological adaptations found in Table 1, a timeframe for the development of each can be charted.
The timeframes can be found in Table 4. Remember, the data are set up to reflect maximum development of each adaptation. A shorter timeframe simply yields a somewhat lesser, though hard-to-quantify, degree of adaptation.
Allow ample time for training cross country runners. Many adaptations are significant to fitness, but all take considerable time to develop. Match training intensities to the desired adaptation outcomes in the training plan.