By Mike Rankin
At a very early age, most children master the question of “Why?”
“Why did Barney (the dog) die?”
“Because he was old, and his parts wore out.”
“Because that’s how it is with most living things in this world.”
“Because God kind of set things up that way.”
And on it goes for as long as the parent’s stamina holds out.
We’d like to think that we grow out of our childlike curiosity, but we never really do. Life and farming provide us with a never-ending list of “Why?” questions.
I was recently at a conference where the speaker gave a presentation titled “Why have alfalfa yields flatlined?” This was just the latest in a long line of similar talks I’ve heard (and given) over the years based on the same premise. Another angle to this question is, “Why are farm alfalfa yields lower than the 8 to 10 tons per acre achieved in small research plots?”
These questions are troublesome for alfalfa growers and certainly frustrating for those who work in the alfalfa industry. Alfalfa breeders understandably don’t like the insinuation that after a lifetime of work, no progress has been made.
Alfalfa feeding on dairy farms is down largely because corn silage provides more tonnage per acre and lower alfalfa yields raise per-ton harvesting costs. To be fair, however, we do need to recognize that one-half of the corn silage yield is grain, not forage.
Let’s tackle the low-hanging fruit first. Ten-ton alfalfa yields achieved in nonirrigated research plots are often on prime land, experience no heavy field traffic, and are greenchopped, so there is minimal harvest loss. Those three factors alone can bump measured yields 30% or more compared to what we see on a typical farm. Also recognize that many alfalfa variety performance trials don’t average anywhere close to 10 tons per acre.
So, what’s holding back alfalfa from achieving the full yield potential that we know is possible from those 10-ton research yields?
One reason may simply be that your ancestors decided to drop the Conestoga wagon’s anchor at a location with less than optimum soil conditions for growing alfalfa. That’s just tough luck.
Another reason, and a big reason in my opinion, is water. Irrigated alfalfa growers in the Northwest, with essentially the same growing season as the Midwest and East, are able to add 2 or more tons per acre to their total-season yields through the ability to control water. Nonirrigated alfalfa can be hurt significantly by too much water, poorly timed water, or not enough water.
I have always maintained that alfalfa has multiple growing seasons within a year. Those occur from one cutting to the next, and even a two-week stretch without rain can be pretty devastating if you’re cutting every 28 to 30 days. That’s production you never get back. On the other hand, too much rain can be equally damaging from a disease, wheel-track, and harvest timing perspective.
Of course, there are factors such as soil pH, fertility, variety selection, establishment practices, disease control, and harvest timing and techniques that impact final yields. But these factors are often known and controlled by the top producers who still may not hit elite yield levels, though they are generally well above average. Research-proven best management practices are certainly where all producers need to start in their effort to attain higher yields.
Finally, what about the question of “flatline” yield improvement? Yes, by any USDA metric, alfalfa yields have appeared flat over many years. It presents a picture that runs counter to common sense and observation. I’m really not entirely sure why.
I, for one, know that today’s alfalfa varieties are much better than those of 30 years ago. I also know that when I’ve plotted university variety performance data over time, I often see significant improvement.
Why are nonirrigated alfalfa yields often lower than their potential? Why are historical USDA yields flatlined? It’s complicated. That’s why.
This editorial appeared in the March 2020 issue of Hay & Forage Grower on page 4.