Upon reading the title, your thoughts probably first moved to something along the lines of “You’re not kidding; it always rains right about the time I get the chopper (or baler) greased and I have most of my hay down.”
No question, precipitation around harvest time has the potential to make or break a crop, especially if such weather persists for several days or longer. Twenty-five years ago, this would have been my first thought as well, but not anymore.
As the forage enterprise focus has shifted largely from yield to quality, my appreciation for how much weather, or more specifically environment, impacts forage quality has taken quantum leaps. It’s not just about rain showers after the hay is cut or about to be, but also temperature, moisture, and sunlight conditions up to that point. This is why calendar date and even stage of maturity often don’t work well as our harvest time guidance counselors . . . especially for first cutting.
There is plenty of anecdotal evidence to support this notion. Many times through the years I have heard farmers say, “Boy, I cut at early-bud stage, but the forage analysis came back looking horrible. The cows just aren’t milking.” Or, the reverse, “I got delayed a week, but the forage test results still came back really good.”
So, what gives?
Here’s what gives: temperature, moisture, sunlight, and the interaction thereof. After taking weekly spring cuttings of alfalfa for forage quality analysis over the course of 25 years, I got to the point where I could just about guess the extent of change from knowing the weather conditions during the previous week. Since moisture was usually not a limiting factor in the spring, most of the change was driven by temperature and days of sunshine.
In addition to anecdotal examples, there is plenty of science-based evidence. For example, alfalfa research confirms that as temperatures rise, plant maturity accelerates, lignification ramps up, fiber digestibility drops, and leaf-to-stem ratio declines. In contrast, a moisture deficit tends to delay plant maturity (if it occurs early in the growth cycle), reduce plant height, enhance leaf-to-stem ratio, and lower plant neutral detergent fiber (NDF). Similar relationships have been documented for grasses.
Key environmental factors like temperature and soil moisture status cannot be disregarded when trying to explain or predict forage quality. Predicting forage quality based solely on morphological plant stage or calendar date is often erroneous when confounding environmental conditions exist. These factors also interact: the positive forage quality impact of dry conditions is negated by high temperatures during a hot drought, when quality drops fast and maturity accelerates.
The environmental conditions that exist in spring are unlike those of any other cutting. It can be cold, hot, wet, or dry to every extreme; sometimes it’s all of the above, each having an impact on developing forage quality metrics. For this reason, there are several research-based methods available for taking some of the guesswork out of first-cut forage quality.
Most alfalfa growers are familiar with the widely used Predictive Equations for Alfalfa Quality (PEAQ). Cornell has even developed a system for using PEAQ with alfalfa-grass mixtures. There is also a method of prediction by tracking base 41°F growing degree-days. Some farmers and consultants simply clip samples of alfalfa “on the hoof” and send them to a lab for analysis.
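For readers curious what tracking base 41°F growing degree-days actually involves, here is a minimal sketch using the common simple-average GDD formula. The function names and the sample week of temperatures are illustrative assumptions, not part of any official tool; accumulation start dates and harvest-trigger thresholds vary by region, so check local extension guidance before acting on numbers like these.

```python
def daily_gdd(t_max_f: float, t_min_f: float, base_f: float = 41.0) -> float:
    """One day's growing degree-days: mean of the daily high and low
    minus the base temperature, floored at zero (no negative days)."""
    return max(0.0, (t_max_f + t_min_f) / 2.0 - base_f)

def accumulate_gdd(daily_highs_lows, base_f=41.0):
    """Sum daily GDD over a season's list of (high, low) readings in °F."""
    return sum(daily_gdd(hi, lo, base_f) for hi, lo in daily_highs_lows)

# Hypothetical cool spring week of (high, low) readings in °F
week = [(55, 38), (60, 42), (63, 45), (58, 40), (70, 48), (72, 50), (66, 44)]
print(accumulate_gdd(week))  # prints 88.5
```

In practice, you would accumulate from green-up (or a fixed spring date) and compare the running total against a locally calibrated target for your desired forage quality, which is exactly where the regional variation comes in.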
No method is perfect, but here’s a case where doing something is better than nothing. These predictive systems generally eliminate forage quality train wrecks caused by underestimating the impact of temperature, moisture, and sunlight on harvested forage quality.

Because of some unique environmental interactions, first-cut alfalfa or grass has the potential to be the worst or best quality forage that you make all season. Understand these interactions and use the available predictive tools to help ensure it’s the latter.