Whether you’re cutting alfalfa in southern Vermont or northern California, one commonality across the miles is that first-cut alfalfa is different from all the rest.
It’s this uniqueness that likely has led the forage fraternity, both farmer and researcher, to debate the first-cut harvest strategy with unmatched rigor, at least as far as numbered alfalfa harvests go.
Why is the initial spring harvest so different and, might I add, so important? Here is my collection of factors.
1. The growing environment is like no other during the growing season. Temperatures during most of the growth cycle are likely going to be cool. That’s good from a forage quality standpoint. As harvest time nears, all bets are off as it could be cool, hot, wet, dry, or some combination. These extremes as alfalfa approaches harvest wreak havoc on the ability to predict forage quality. At mid-bud stage, one year you might have rocket fuel, the next year it might be cordwood.
Never underestimate the impact of growing environment on forage quality, and realize that the range of possible outcomes is nearly endless for spring growth.
2. Fiber digestibility can easily go from first to worst. First-cut neutral detergent fiber (NDF) digestibility can be, and often is, the highest of the season because of cooler temperatures. But all good things must come to an end, and if hot weather sets in or the harvest is delayed by extended wet weather, the rate of first-crop fiber digestibility decline is unmatched. Hence, a timely first cut is essential if high forage quality is the primary objective. To achieve a target forage quality, the spring harvest window is often narrower than for subsequent growth cycles.
3. Initial attempts to gauge first-cut forage quality based solely on calendar date or maturity stage failed miserably because of annual fluctuations in growing environment. For this reason, researchers have developed methods to estimate spring alfalfa quality “on the hoof.”
Several approaches are available: predictive equations for alfalfa quality (PEAQ), growing degree day (GDD) accumulation, and simply taking fresh cuttings from the field and submitting them to a forage lab for analysis. None of these methods is perfect, but they do help to prevent forage quality train wrecks.
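For readers who want to see how the growing degree day approach works in practice, here is a minimal sketch in Python. It assumes the 41°F base temperature commonly used for alfalfa; the 750-GDD harvest trigger shown is purely illustrative and would need to be calibrated against local conditions and quality goals.

```python
# Minimal sketch of growing degree day (GDD) accumulation for spring alfalfa.
# Assumes a 41 F base temperature (commonly used for alfalfa); the 750-GDD
# harvest target below is illustrative only -- calibrate against local data.

BASE_TEMP_F = 41.0
TARGET_GDD = 750.0  # hypothetical first-cut trigger; varies by region and goal


def daily_gdd(t_max_f: float, t_min_f: float, base: float = BASE_TEMP_F) -> float:
    """Simple averaging method: mean daily temperature minus base, floored at zero."""
    return max(0.0, (t_max_f + t_min_f) / 2.0 - base)


def accumulate_gdd(daily_highs_lows) -> float:
    """Sum daily GDD from a sequence of (high, low) pairs starting at green-up."""
    return sum(daily_gdd(t_max, t_min) for t_max, t_min in daily_highs_lows)


# Example: a short run of spring days (high, low in degrees F)
spring_temps = [(58, 40), (63, 44), (70, 48), (55, 39), (66, 45)]
total = accumulate_gdd(spring_temps)
print(f"Accumulated GDD: {total:.0f} of {TARGET_GDD:.0f} target")
```

In practice, the same running total would be fed by daily weather station readings from green-up onward, with the field scouted as the accumulated value approaches the chosen trigger.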
4. First cutting often provides the greatest share of total-season dry matter yield of any harvest. Just as first-cut forage quality declines faster than in subsequent cuttings, dry matter also accumulates faster; some research estimates put the gain at 100 pounds of dry matter per acre per day during the late-vegetative to late-bud stages. Cut on time and you have a lot of high-quality forage. Miss the mark and you’ll pay the price with mountains of coarse, low-quality fodder. The yield-quality trade-off is never more relevant than with first cut.
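To put that accumulation rate in perspective, the short Python sketch below runs the numbers on delaying harvest. The 100 pounds of dry matter per acre per day comes from the research estimate cited above; the starting yield, starting relative forage quality (RFQ), and the quality decline rate are all hypothetical values chosen only for illustration.

```python
# Back-of-the-envelope look at the first-cut yield-quality trade-off.
# The 100 lb DM/acre/day gain is the research estimate cited above; the
# 4 RFQ points/day decline and the starting values are hypothetical.

DM_GAIN_PER_DAY = 100.0  # lb dry matter/acre/day (late vegetative to late bud)
RFQ_LOSS_PER_DAY = 4.0   # hypothetical quality decline, RFQ points/day


def delay_outcome(days_delayed: int,
                  start_yield_lb: float = 3000.0,  # hypothetical starting yield
                  start_rfq: float = 180.0):       # hypothetical starting RFQ
    """Return (yield, RFQ) after delaying harvest by the given number of days."""
    yield_lb = start_yield_lb + DM_GAIN_PER_DAY * days_delayed
    rfq = start_rfq - RFQ_LOSS_PER_DAY * days_delayed
    return yield_lb, rfq


for delay in (0, 5, 10):
    y, q = delay_outcome(delay)
    print(f"{delay:2d}-day delay: {y:,.0f} lb DM/acre at RFQ {q:.0f}")
```

Even with made-up rates, the shape of the trade-off is clear: every day of delay buys tonnage at the direct expense of quality, which is why the timing decision carries so much weight.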
5. First cutting sets the pace for the rest of the growing season. It often dictates how many future cuttings will be taken, the interval between cuttings, and how late into the fall the last cutting will be harvested. It’s the only cutting of the year that isn’t scheduled by counting days from a previous harvest. The decision of when to cut first crop is wide open, but its consequences ripple through the remainder of the season.
Let the harvest games begin.
This article appeared in the April/May 2016 issue of Hay & Forage Grower on page 4.