The Evolution of Feed Efficiency Evaluation

By: John Genho

The start of my geneticist career came at a critical time for commercial cattle producers, as changes in the industry created a greater need to enhance profitability and efficiency. My customers at the time were large commercial ranches that were data-centric and had profit as their motive. These commercial producers were looking for genetic evaluations, web databases, and, as genomics grew in popularity, ways to assess genomic value. As feed efficiency data became available, they quickly realized the value of this data in an integrated system, since feed costs make up a significant share of the cost side of their businesses.

I am a quantitative geneticist, and as such the majority of my work is done with a statistical method called best linear unbiased prediction (BLUP). BLUP is what creates the EPDs and associated accuracies that cattle producers use today. Its power has been a catalyst for the increase in quality grade in the American cowherd over the past decade, as well as milk yield in Holsteins over the past 50 years, and various other similar changes in different species. There is a key word in the acronym BLUP that is often overlooked – linear. A basic assumption of all data analyzed with BLUP is that the data are linear. As a geneticist, I’m trained to think linearly and often have a hard time thinking otherwise.

As the cattle industry became interested in feed-efficient animals, I analyzed my first GrowSafe Systems® datasets. Initially, I started with the assumption that growth and intake were linear across time. This of course is not the case, and it’s a good thing for each of us who have flattened out on our growth curves. Growth and intake are not linear and in fact have quite a bit of wobble through the growing part of an animal’s life, which is the part we are most interested in. Usually we can wave our hands and say that contemporary groups and collecting data at similar times remove these problems. However, I realized we needed to evaluate the data differently when I began comparing intake and growth collected separately over short and long periods of time. In the early stages of discovery, we evaluated intake and growth over a short period of time in addition to growth over a long period of time. I resolved the resulting inaccuracies by pairing intake and gain over the same collection period, so that the non-linear curves were compared to each other appropriately. I think the best way to evaluate intake and growth is with residual feed intake (RFI), but residual average daily gain (RADG) and feed conversion ratios are alternatives. The important thing, in my experience, is to pair growth and intake over the same period.

At times, our industry becomes too obsessed with defining RFI. It is simply average daily intake with adjustments made for certain factors. Nearly all traits are adjusted for covariates before analysis. The trait producers are likely most familiar with is weaning weight, which is adjusted for the age of the calf at weaning and the age of the dam. Ultrasound, birth weight, scrotal circumference, and nearly all other traits are adjusted for relevant factors, or covariates as statisticians would call them. RFI is simply intake adjusted for some relevant covariates.
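To make "intake adjusted for covariates" concrete, here is a minimal sketch of the classic RFI calculation: regress dry matter intake on average daily gain and metabolic mid-test weight, and take the residuals. The animal records below are invented for illustration, and the choice of covariates here is the common textbook one, not necessarily the exact set GrowSafe standardized on.

```python
import numpy as np

# Hypothetical per-animal test data: dry matter intake (kg/day),
# average daily gain (kg/day), and mid-test body weight (kg).
dmi = np.array([9.8, 11.2, 10.5, 12.0, 9.1, 10.9])
adg = np.array([1.4, 1.7, 1.5, 1.8, 1.3, 1.6])
mid_wt = np.array([420.0, 455.0, 440.0, 470.0, 410.0, 450.0])

# Metabolic mid-weight (body weight ^ 0.75) is the usual size covariate.
mmw = mid_wt ** 0.75

# Design matrix: intercept, ADG, and metabolic mid-weight.
X = np.column_stack([np.ones_like(dmi), adg, mmw])

# Least-squares regression of intake on the covariates.
coef, *_ = np.linalg.lstsq(X, dmi, rcond=None)

# RFI is the residual: observed intake minus expected intake.
# Negative values flag animals eating less than their growth and
# size predict (the efficient ones).
rfi = dmi - X @ coef
```

Because the regression includes an intercept, the residuals average to zero within the group, so RFI ranks animals against their contemporaries rather than on raw intake.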

One problem that arises is defining exactly which covariates should be included in the RFI calculation. When I first began analyzing feed efficiency data, I had RFI adjusted in various ways with different covariates. This made it incredibly difficult to fairly compare animals. My job became much easier when GrowSafe developed a standard way of calculating RFI. This removed a significant amount of the noise that I saw in early data sets.

Figure 1: Correlation between EPDs and average progeny performance for sires with more than five progeny.

The current GrowSafe EPDs include over 35,000 individual feed efficiency records and over 235,000 pedigree animals in the analysis. Sires with more than five progeny in the dataset were analyzed for the correlation between their EPDs and the average of their progenies’ performance for each trait (Figure 1). It is clear from these correlations that RFI is a highly predictive trait and may be more closely correlated with actual progeny performance than either average daily gain (ADG) or dry matter intake (DMI).
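The validation described above can be sketched in a few lines: average each sire's progeny phenotypes, then correlate those averages with the sires' EPDs. The sire names, EPD values, and progeny records below are entirely hypothetical, and for brevity each sire has only three progeny rather than the more-than-five cutoff used in the actual analysis.

```python
import numpy as np

# Hypothetical progeny RFI phenotypes, keyed by sire.
progeny = {
    "A": [-0.42, -0.31, -0.55],
    "B": [0.12, 0.25, 0.08],
    "C": [0.60, 0.48, 0.71],
}

# Hypothetical RFI EPDs for the same sires.
epd = {"A": -0.40, "B": 0.10, "C": 0.55}

# Average progeny performance per sire, in a fixed sire order.
sires = sorted(progeny)
progeny_mean = np.array([np.mean(progeny[s]) for s in sires])
epd_vals = np.array([epd[s] for s in sires])

# Pearson correlation between sire EPDs and average progeny performance.
r = np.corrcoef(epd_vals, progeny_mean)[0, 1]
```

A correlation near 1 means the EPDs rank sires in nearly the same order as their progeny actually performed, which is exactly the evidence Figure 1 summarizes for RFI, ADG, and DMI.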
