Farming Magazine - January, 2013


Forages: Fertilizing Forages in 2013

Changing with the times
By Everett D. Thomas

Some crop nutrients, including nitrogen, leach through the soil profile and must be replenished each year, whether by animal manure, plowed-down vegetation ("green manure"), nitrogen fertilizer or a combination of these. Other nutrients, such as phosphorus, mostly stay where they're put, moving very little through the soil profile even under heavy rainfall. If you keep the soil in place, you'll keep phosphorus there too. The exception is when soil phosphorus levels get extremely high, but judicious use of manure and fertilizer can prevent this.

These simple principles haven't changed since humans started cultivating crops. The facts may not have changed, but farming practices and events occurring many hundreds of miles from your farm raise the possibility that your fertilizer program may need to change - if not in 2013, then sometime in the not-too-distant future.

Phosphorus: A slow buildup on livestock farms

Recent efforts at several land-grant colleges, particularly Cornell University, have shown that on many dairy farms, soil phosphorus levels have increased to the extent that crops can be grown with little or no phosphorus fertilizer. Many livestock farms are applying enough manure to cropland to supply most or all the phosphorus needed for high yields. Despite this, many farms have continued to use phosphorus in the starter fertilizer, primarily during corn production, but often during forage seeding as well. This has pushed soil test phosphorus levels still higher, in many cases to "high" or "very high" levels.

Repeated research trials over several years with corn, alfalfa and other crops have shown that once soil test phosphorus is at an elevated level, using phosphorus fertilizer results in little or no yield increase. When fertilizer use in a field reaches the point of diminishing returns, any additional application is a waste of money and of an increasingly limited resource.

The money angle is an old one, since it's always been poor economics to use fertilizer where none is needed, but a more serious long-term issue is that most global deposits of phosphates have already been discovered, and when these are exhausted, what then? Nobody really knows, though there are widely varying estimates of when this will happen.


Some recovery of phosphorus will be possible by recycling manure and food residues, but many of these are materials not easily incorporated into today's high-efficiency farming systems.

There's also the issue of location: Much of the grain production in the U.S. isn't where the greatest concentration of livestock farms is, and long-distance transportation of livestock manures is an inefficient and expensive undertaking.

Last March I was in Hereford, Texas, promoted as the "Beef Capital of the World" (Texans are great at self-promotion), and while there was quite a bit of corn in the area, most of it was harvested for silage, not grain. Remember, location isn't just important in retail and real estate.

"Use it or lose it" might be good advice for personal fitness, but using phosphorus where it isn't needed is simply bad business, in both the short and long term.

The growing need for sulfur fertilizers

Acid rain was once a hot topic in the northeastern U.S. You don't hear or read much about it these days, thanks to changes in technology and environmental awareness.

Sulfur is considered a secondary nutrient. The primary nutrients are nitrogen, phosphorus and potassium, while the secondary nutrients are sulfur, calcium and magnesium. The term "secondary nutrients" is somewhat of a misnomer, since all six are actually macronutrients, needed in fairly large amounts by plants, while micronutrients are needed in small quantities, often a pound or less per acre. The difference between primary and secondary nutrients is that the secondary nutrients are more often found in adequate supply in soils, so sulfur, calcium and magnesium don't need to be applied as often as nitrogen and potassium.

Sulfur deficiencies in forage crops were once rare, in part because air pollution resulted in the deposition of considerable amounts of sulfur in the rain and snow that fell on cropland. However, efforts at reducing air pollution have been successful, and as a result, farmers in much of the central and eastern U.S. receive only a fraction of the "free sulfur" via precipitation they did 10 or 15 years ago.

Sulfur is one of the nutrients that will leach in soils, so regular (perhaps annual) supplementation may be necessary. Land-grant college agronomists have been aware of these changes and have done field trials to determine if there's a yield response to sulfur fertilization. Alfalfa is one of the crops most responsive to sulfur, and therefore has often been the crop evaluated. I was involved in a Cornell University trial about 30 years ago where we applied two sources of fertilizer sulfur to alfalfa, with no yield response from the use of this nutrient. However, recent research is finding economical yield responses to the use of sulfur, most recently in Wisconsin.

The visual symptoms of sulfur deficiency aren't hard to detect, appearing as a lighter color in the alfalfa foliage, and in some cases as a yellowing called chlorosis. Other problems can result in off-colored alfalfa, so a suspected sulfur deficiency should always be confirmed by tissue analysis. The results of a forage analysis can serve as an early warning signal, but a tissue analysis uses only the top 6 inches of the alfalfa plant instead of all of the forage, so it's more sensitive in picking up a deficiency.

University of Wisconsin agronomists measured the yield of normal-appearing alfalfa versus alfalfa that showed a visual sulfur deficiency. Sulfur fertilization had no yield impact where the alfalfa was normal in appearance. However, in 2010 and 2011 field trials, application of sulfur to the chlorotic alfalfa increased alfalfa yields by 68 percent and 53 percent, respectively. Alfalfa testing below 0.25 percent tissue sulfur showed a significant yield response from sulfur, while alfalfa at or above 0.25 percent sulfur showed small or no yield increases.
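The Wisconsin threshold lends itself to a simple decision rule. The sketch below is illustrative only: the 0.25 percent cutoff comes from the trials described above, but the function name and structure are my own, not part of any university recommendation.

```python
# Illustrative sketch of the tissue-sulfur decision rule described above.
# The 0.25 percent cutoff is from the Wisconsin trials; the function itself
# is a hypothetical example, not an official recommendation.

SULFUR_THRESHOLD_PCT = 0.25  # tissue sulfur, top 6 inches of the alfalfa plant

def likely_to_respond(tissue_sulfur_pct: float) -> bool:
    """Return True if alfalfa tissue sulfur is below the response threshold."""
    return tissue_sulfur_pct < SULFUR_THRESHOLD_PCT

# Example: a sample testing 0.18 percent sulfur flags a likely yield response.
print(likely_to_respond(0.18))  # True
print(likely_to_respond(0.30))  # False
```

As always with a single-number cutoff, samples right at the threshold deserve a retest rather than a snap decision.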

Fortunately, sulfur isn't an expensive nutrient and can be supplied by several products, including calcium sulfate, ammonium sulfate and sulfate of potash magnesia (often called sul-po-mag). In the Wisconsin research, it only took 24 pounds per acre of sulfur to achieve alfalfa yield responses of 1.3 and 1.7 tons per acre. In dollar terms, with sulfur costing about 60 cents per pound, this is more than a 10-to-1 return on the cost of the fertilizer, and probably closer to a 20-to-1 return. If you have a sulfur deficiency in your alfalfa, use of sulfur fertilizer is a no-brainer. But as you've heard many times before: test, don't guess.
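Those return figures are easy to check with back-of-envelope arithmetic. The hay prices in the sketch below are my own assumptions, since the article gives only the sulfur rate, sulfur price and yield responses; roughly $120 to $170 per ton of alfalfa reproduces the 10-to-1 to 20-to-1 range quoted above.

```python
# Back-of-envelope check of the sulfur economics described above.
# Hay prices are assumed for illustration; the article supplies only the
# sulfur rate (24 lb/acre), sulfur price ($0.60/lb) and yield responses.

sulfur_rate_lb = 24    # lb of sulfur per acre, from the Wisconsin work
sulfur_price = 0.60    # dollars per lb of sulfur
cost = sulfur_rate_lb * sulfur_price  # fertilizer cost per acre ($14.40)

for extra_tons, hay_price in [(1.3, 120), (1.7, 170)]:  # assumed $/ton
    value = extra_tons * hay_price
    print(f"${cost:.2f} spent returns ${value:.2f}, "
          f"about {value / cost:.0f}-to-1")
```

At the lower assumed price the return works out to roughly 11-to-1, and at the higher price about 20-to-1, consistent with the range in the text.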

Ev Thomas has worked as an agronomist in New York for 45 years, first with Cornell University Cooperative Extension, then with the William H. Miner Agricultural Research Institute in Chazy, N.Y., including managing its 680-acre crop operation. He continues to work part-time for Miner Institute and is now an agronomist at Oak Point Agronomics. He has written our Forages column for 15 years and has been an expert contributor on a number of other topics.