I found this article lately which explains a lot: https://www.sportsperformancebulletin.com/endurance-training/training-structure-and-planning/discontinuous-training-master-stroke-endurance-athletes/
What causes HR to drift? (N-Zone topic)
March 29, 2021 at 8:02 am | #52491 | russes011 (Participant)
When running for 30-60 minutes at an easy aerobic pace (e.g., at or below AeT), why does one’s HR drift upward when the pace is kept constant? (Or, alternatively, why does pace drift downward when HR is kept constant?) What is the physiology behind this?
More importantly, why does more aerobic training cause the degree of drift to decrease, or at least to occur at somewhat higher HRs?
(For simplicity, let’s presume the person is not overtrained and not dehydrated, which are decent assumptions for the average athlete performing a 1-hour HR drift test.)
Thanks for any thoughts. I think there are at least two main reasons to explain the drift effect, both perhaps trainable.
— Steve
I’m not sure a variation in stroke volume, per se, explains HR drift at or below AeT.
A certain pace, once warmed up, demands a certain amount of oxygen (VO2), which is individual-specific. VO2 and VO2max are determined by cardiac output, muscle vascularity, and mitochondrial efficiency/density. Cardiac output (CO) is stroke volume (SV) times heart rate (HR), so for a certain VO2 demand, as one’s intrinsic SV increases (with training), a lower HR is required to produce the same CO. Once SV is maximized, however, which occurs at about 50% of VO2max, further increases in CO are only possible by increasing HR.
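To make that arithmetic concrete, here is a minimal sketch (the CO and SV values are assumed round numbers, not measurements from the thread) of how a larger stroke volume translates into a lower HR for the same cardiac output:

```python
# Illustrative sketch of CO = SV x HR at a fixed oxygen (hence CO) demand.
# All numbers are assumed, round values for illustration only.

def required_hr(cardiac_output_l_min: float, stroke_volume_ml: float) -> float:
    """HR (beats/min) needed to deliver a given cardiac output at a given stroke volume."""
    return cardiac_output_l_min * 1000.0 / stroke_volume_ml

# Suppose an easy aerobic pace demands a cardiac output of ~15 L/min (assumed).
co_demand = 15.0  # L/min

for label, sv in [("untrained, SV = 100 mL", 100.0), ("trained, SV = 120 mL", 120.0)]:
    print(f"{label}: HR ≈ {required_hr(co_demand, sv):.0f} bpm")

# The same pace costs ~150 bpm at SV 100 mL but only ~125 bpm at SV 120 mL,
# which is why a larger intrinsic SV shows up as a lower HR at a given pace.
```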
The theory proposed in the article is that after a certain amount of time at AeT (say 15 minutes) one’s SV naturally starts to decline, which is compensated for by HR drifting up to maintain a constant VO2. The article goes on to review some preliminary data about how best to train at maximal SV, because training at maximal SV may keep your SV from declining. The main confounder, in my opinion, is that an increase in HR reduces SV (when exercising at a CO at or above maximal SV). Because of this, it’s a chicken-or-egg conundrum: did a decrease in SV (due to some unknown fatigue in unconditioned athletes down-regulating SV) cause the HR to drift up, or did the HR drift up (due, for example, to an increase in the VO2 required to maintain a steady pace)?
SV is determined by heart muscle contractility, as well as by preload (how much blood is in the heart before it pumps); both of these factors increase with training. This, in part, explains why one’s resting heart rate decreases with training. Nevertheless, SV appears to max out quite early during exercise, even in conditioned athletes (whether this is actually true is the crux). Data vary, but most think it maxes out at about 50% of VO2max (for simplicity, the HR at VO2max is about 10 beats below maxHR). My point is that for most Z2 exercise, at or below AeT, one is mostly functioning well above the intensity at which SV is maximal. For example, maximal SV occurs at a heart rate of roughly 90-100 for someone with a max HR of 190. In summary, I don’t think SV can be increased to the extent that training to stay below maximal SV can explain the absence of HR drift in conditioned athletes. Furthermore, as mentioned, increasing HR decreases SV. This is because a fast HR does not give the heart enough time to fill (reducing preload and therefore CO). I am not sure at what point HR starts to hinder SV, but I assume it’s gradual after a certain threshold. This explains why, after a certain point, an increase in HR does not increase VO2 (or CO) to the same degree as it did at lower HRs, or even at all when one exercises above VO2max.
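As a toy illustration of the filling-time point (the ~0.3 s systole duration and the simple bookkeeping are assumptions, not measured physiology), one can sketch how much of each cardiac cycle is left for diastolic filling as HR rises:

```python
# Toy model (assumptions only) of why filling time shrinks as HR rises.
# Assume systole takes a roughly fixed ~0.3 s per beat; the rest of the
# cycle is available for diastolic filling.

SYSTOLE_S = 0.3  # assumed, treated as constant across heart rates

def filling_time_s(hr_bpm: float) -> float:
    """Seconds per beat left for diastolic filling at a given heart rate."""
    return 60.0 / hr_bpm - SYSTOLE_S

for hr in (60, 100, 140, 180):
    per_beat = filling_time_s(hr)
    per_minute = per_beat * hr
    print(f"HR {hr:>3} bpm: {per_beat:.2f} s filling per beat, "
          f"{per_minute:.1f} s filling per minute")

# Per-beat filling time falls from ~0.7 s at 60 bpm to ~0.03 s at 180 bpm, and
# even the total filling time per minute shrinks, which is one way to picture
# why SV (and eventually the CO gain per extra beat) tapers off at high HRs.
```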
SV optimization to reduce HR drift is an interesting concept, and who knows, I may be wrong; it may be where the money is. But other factors that cause one’s oxygen requirement to increase over time, or the CO to increase over time, despite a constant pace, may be the true causes of HR drift, or, in effect, of an SV down-drift (as an epiphenomenon).
Thank you for posting that article, I found it very interesting. For me it raises as many questions as it answers.
Been thinking more about the SV issue. Even when operating at maximal SV, if this is in fact the case for most Z2 work, there still may be a fatigue factor involved that may in part explain HR drift. I think the heart is unique in that relaxation between contractions is actually ATP (energy) dependent; it has to actively ‘de-contract’ to open back up. Perhaps this process fatigues so that SV decreases by ~5%, causing HR to increase to maintain the same CO, and with training one’s ability to maintain de-contraction improves, thereby reducing HR drift?
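Taking that hypothetical ~5% SV decline at face value, a quick back-of-envelope sketch (starting HR and SV are assumed values) shows how much HR would have to drift to hold CO constant:

```python
# Back-of-envelope for the "~5% SV decline" idea above (assumed numbers).
sv_initial = 110.0          # mL/beat, assumed
hr_initial = 140.0          # bpm, assumed steady state early in the run
co = sv_initial * hr_initial / 1000.0   # L/min, held constant by assumption

sv_fatigued = sv_initial * 0.95          # the hypothesized 5% SV decline
hr_required = co * 1000.0 / sv_fatigued  # HR needed to keep CO unchanged

print(f"CO held at {co:.1f} L/min")
print(f"HR must rise from {hr_initial:.0f} to {hr_required:.0f} bpm "
      f"({(hr_required / hr_initial - 1) * 100:.1f}% drift)")
# A 5% drop in SV forces roughly a 5.3% upward HR drift (140 -> ~147 bpm).
```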
Other factors, aside from dehydration and overtraining issues, that may explain HR drift include temperature regulation, adrenaline, and carb vs fat metabolism.
I’m not fully convinced temperature regulation isn’t still in play; it has such a powerful effect on CO and therefore HR. Basically, to cool down you shunt blood to your skin, thereby increasing CO and HR. I feel like minor things like wind, layering, and a few degrees’ difference in temperature can really affect this response, even if one controls for things like having a good, long warm-up and being fully acclimated to one’s general environment for a few weeks (e.g., going to Kona a few weeks before your Ironman). Something as simple as adding a windbreaker can cause you to overheat and your HR to drift.
Perhaps adrenaline slowly builds over time as you exercise in Z2, and this causes HR to drift since contraction of the heart has already maxed out. And maybe with conditioning and training, this creep in adrenaline is smaller? (This adrenaline effect may be related to sleep, overtraining, etc.)
Finally, there is fuel source. Curiously, fat is about 3-5% less efficient (per unit of oxygen) than carbs for aerobic respiration. Perhaps as you run in Z2 your body slowly burns a higher percentage of fat relative to carbs over, say, your 1-hour AeT HR drift test, causing your HR or pace to drift 3-5%? Perhaps conditioning and training teach the body to switch to fat sooner, i.e., during the warm-up period before the test itself, thereby keeping the change to less than 5% during the test. Alternatively, perhaps one becomes more efficient at using carbs and delays the conversion to fat metabolism for a longer period of time, again keeping the change under 5% during a 1-hour test. It’s interesting that a MET could probably answer this question; maybe it already has.
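For what it’s worth, here is a rough sketch of that fuel-mix arithmetic, using an assumed 4% efficiency gap and an assumed shift in fat fraction over the hour (both numbers are illustrative, not measured):

```python
# Rough sketch of the fuel-mix idea (all numbers assumed for illustration).
# Premise from the post: fat yields a few percent less energy per liter of O2
# than carbohydrate; here we assume 4%.

CARB_ENERGY_PER_L_O2 = 1.00   # normalized
FAT_ENERGY_PER_L_O2 = 0.96    # assumed 4% less efficient per unit O2

def relative_vo2(fat_fraction: float) -> float:
    """O2 needed for a fixed energy output, relative to burning pure carbohydrate."""
    blended = (1 - fat_fraction) * CARB_ENERGY_PER_L_O2 + fat_fraction * FAT_ENERGY_PER_L_O2
    return 1.0 / blended

start_mix, end_mix = 0.30, 0.60   # assumed fat fractions at the start and end of the hour
drift = relative_vo2(end_mix) / relative_vo2(start_mix) - 1
print(f"VO2 (and roughly HR) rises ~{drift * 100:.1f}% from the fuel shift alone")
# With these assumed numbers the shift accounts for only ~1% of drift, so the
# fuel-source effect would need a larger swing in mix (or a bigger efficiency
# gap) to account for a full 3-5% drift on its own.
```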
Like most things, it’s probably a combination of many of the above factors. All very interesting.
I have attached a graph comparing stroke volume and HR. It may reflect untrained individuals. Nevertheless, I don’t believe your average trained individual has more than a 10-20% increase in stroke volume compared to untrained individuals. Elite athletes, or children who trained heavily through their youth, may be exceptions to this.
When running for 30-60 minutes at an easy aerobic pace (e.g., at or below AeT), why does one’s HR drift upward when the pace is kept constant? (Or, alternatively, why does pace drift downward when HR is kept constant?) What is the physiology behind this?
Stress. HR measures stress, not (just) intensity. The longer the exercise duration, the greater the mounting stress (muscular, metabolic, hydration, etc.), so HR will climb regardless.
More importantly, why does more aerobic training cause the degree of drift to decrease, or at least to occur at somewhat higher HRs?
With increasing fitness, the cardiovascular system can handle more exercise stress, so HR will still climb, but at slower and slower rates. The drift test is just one way to measure this, over an arbitrary (60-minute) duration.
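For reference, a minimal sketch of how such a 60-minute drift test is commonly scored: compare average HR over the second half of the test with the first half, at constant pace. The sample data and the 5% guideline below are assumptions for illustration, not an official protocol:

```python
# Minimal sketch of scoring a 60-minute HR drift test run at constant pace.
# The 5% guideline and the sample data are assumptions for illustration.

from statistics import mean

def hr_drift_percent(hr_samples: list[float]) -> float:
    """Percent rise in average HR from the first half to the second half of the test."""
    half = len(hr_samples) // 2
    first, second = mean(hr_samples[:half]), mean(hr_samples[half:])
    return (second / first - 1) * 100

# Hypothetical per-minute HR averages for a 60-minute run at constant pace.
hr = [140 + 0.1 * minute for minute in range(60)]   # gentle upward drift

drift = hr_drift_percent(hr)
print(f"HR drift: {drift:.1f}%")
print("likely below AeT at this pace" if drift < 5 else "likely at/above AeT at this pace")
```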