
Outlook | 11/1/2023

How Are Advancements Shaping the Future of AI in Agriculture?


Bots that select the best individual plants for breeding. Digital cameras that recognize and remotely monitor individual livestock. Biomass assessment that predicts crop yields for entire regions. Artificial intelligence (AI) technologies like these are coming soon to a farm near you. Possibly your own.

AI in agriculture isn’t new. As early as 1983, researchers were using software that simulated the effects of irrigation, fertilization, weed control, and other environmental factors on growth to optimize cotton yields. But recent advances in foundational AI (FAI), systems that can learn and reason like humans, suggest that, like medicine, finance, education, industry, security, and the creative arts, agriculture is about to be massively disrupted.

Ag robot in field

Foundationally evolved

At the leading edge of this revolution is AIFARMS (Artificial Intelligence for Future Agricultural Resilience, Management, and Sustainability), a five-year research initiative created in 2019. It draws on the brainpower and resources of high-caliber organizations to accelerate the evolution of FAI by deploying it in dense, occluded environments like fields and barns. These organizations are:

  • University of Illinois at Urbana-Champaign
  • University of Chicago
  • Donald Danforth Plant Science Center
  • Michigan State University
  • Tuskegee University
  • USDA Agricultural Research Service
  • Argonne National Laboratory

“By solving for the challenges presented by crop and livestock management,” says Jessica Wedow, the executive director of AIFARMS, “we’d strengthen AI for application in other highly variable environments, like self-driving cars.”

AIFARMS is focused on four areas (fully autonomous farming, remote livestock management, environmental resilience, and predictive yield assessment) that preview the ways in which FAI will solve some of the thorniest problems today’s farmers face. These include a shrinking workforce, rising input costs, and ever more extreme weather.

Intelligent, not just autonomous

At I-FARM, an 80-acre test-bed site in Urbana, Illinois, robots the size of carry-on luggage tend the fields. Amid dense stands of corn, they sow cereal rye cover crops. Under a canopy of soybeans, they measure plant traits and collect seed samples. Between rows of delicate new growth, they weed.

Aside from its scale, this fleet of robots would appear to be just more of the same automation that large-scale producers already rely on. But in terms of data collection, data analysis, and task performance, it represents a whole new breed.

With every data point their array of digital sensors and cameras takes in, I-FARM bots are learning about and adapting to their field environment in order to make decisions and take a variety of actions on their own, no programming necessary. Like ChatGPT and other large language models, their intelligence appears eerily human because the machine-learning algorithms animating them have trained on massive datasets. These include images rather than text, with labeled as well as unlabeled content. Occasionally they get stuck and require human intervention. But eventually their agency will be near-absolute.
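The decide-on-your-own-but-ask-for-help-when-stuck behavior described above can be sketched as a sense-decide-act loop with a confidence fallback. Everything here is illustrative (the names, threshold, and toy "policy" are assumptions, not AIFARMS code):

```python
# Minimal sketch of a field bot's decision loop with a human-intervention
# fallback. The decide() function is a stand-in for a learned policy.
CONFIDENCE_THRESHOLD = 0.6

def decide(observation):
    """Return (action, confidence) for a sensor observation.

    A toy policy: observations it was "trained" on get high confidence;
    anything unfamiliar gets low confidence.
    """
    known_actions = {"weed": 0.9, "sample": 0.8, "sow": 0.85}
    return observation, known_actions.get(observation, 0.1)

def step(observation):
    action, confidence = decide(observation)
    if confidence < CONFIDENCE_THRESHOLD:
        return "request_human_help"  # the bot is "stuck"
    return action

print(step("weed"))           # -> weed
print(step("fallen_branch"))  # -> request_human_help
```

The threshold is the design knob: lower it and the fleet acts more autonomously; raise it and humans are consulted more often.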

Delivering greater yields with fewer inputs

That degree of agency empowers I-FARM bots to perform two essential tasks: phenotyping and cover cropping.

Trained to recognize top-performing plant specimens, the bots use computer vision (a system in which images captured by low-cost cameras are interpreted by AI) to locate these outliers in crop rows and digitally document their traits. Actuators controlled by AI then collect their seeds. This method of phenotyping, which is accurate, efficient, and cost-effective at enormous scale, accelerates the development of high-yield crops by providing plant breeders with promising genetic material.
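The "locate the outliers" step can be illustrated with a simple statistical sketch: given per-plant trait scores (stand-ins for whatever the vision system actually measures), flag plants well above the row mean. The z-score rule and the numbers are assumptions for illustration only:

```python
# Hypothetical sketch of outlier selection in a phenotyping pass:
# flag plants more than two standard deviations above the row mean.
from statistics import mean, stdev

def top_performers(trait_scores, z_cutoff=2.0):
    mu, sigma = mean(trait_scores), stdev(trait_scores)
    return [i for i, s in enumerate(trait_scores)
            if sigma > 0 and (s - mu) / sigma > z_cutoff]

row = [10.1, 9.8, 10.3, 9.9, 10.0, 14.2, 10.2, 9.7]
print(top_performers(row))  # -> [5]: the one plant that stands out
```

In practice the scores would come from a trained model and the selection rule would be tuned per crop, but the shape of the task, score every plant and keep the statistical outliers, is the same.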

Advanced computer vision also helps I-FARM bots navigate the dense, occluded environment of a standing crop. This enables farmers to sow cover crops earlier and more cost-effectively than other methods, increasing their potential for adoption. “If smart bots made cover cropping easy and cheap, we could produce more with less, which is what climate change demands we do,” says Wedow.

Deploying a tireless, highly skilled workforce

Cow licking

Indeed, the affordability of AIFARMS solutions is as newsworthy as their intelligence because it helps solve a chronic and worsening problem: too few farmhands.

The supply of farm labor has been dwindling for decades, pushing up the cost of food production. Smart bots such as those found at I-FARM could lower grocery bills for consumers by augmenting if not replacing the humans needed to tend fields, pick crops, and manage livestock.

AIFARMS prototypes promise an affordable alternative to humans in part because they use low-cost hardware, such as off-the-shelf digital cameras.

Soon, however, advances in edge computing (data processing and storage conducted by hardware at the edge of a network) could also lower the cost of interpreting the vast amount of data these cameras collect. Right now, Wedow explains, image data collected by field bot cameras goes to the cloud to be processed by supercomputers before being shared with the farmer. “We need that capability on the rovers,” she says. “We have enough advances in AI to make edge computing a cheaper and more energy-efficient alternative to supercomputers.”
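A back-of-the-envelope calculation shows why moving inference onto the rover pays off: instead of uploading every raw frame to the cloud, the bot uploads only the model's outputs. The byte counts below are illustrative assumptions, not AIFARMS measurements:

```python
# Compare data movement: ship every raw image to the cloud vs. ship
# only the on-device model's detections.
IMAGE_BYTES = 4_000_000  # one ~4 MB raw image (assumed)
DETECTION_BYTES = 200    # a few labeled bounding boxes (assumed)

def cloud_upload(n_images):
    return n_images * IMAGE_BYTES       # raw frames to the cloud

def edge_upload(n_images):
    return n_images * DETECTION_BYTES   # only inference results

frames = 10_000  # one day's captures from a single rover (assumed)
print(cloud_upload(frames) // edge_upload(frames))  # -> 20000
```

Under these assumptions, on-rover inference cuts uplink traffic by a factor of 20,000, which is where the cost and energy savings Wedow describes would come from.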

Such cost reductions are likely to accelerate the adoption of a digital workforce, especially when combined with AI that optimizes task performance. Already, smart bots are helping human grape harvesters by ferrying the produce they pick to collection points. Soon, as demonstrated by prototypes at AIFARMS partner institutes in California and Washington, they will be navigating rows of strawberry plants, discerning ripe berries, and plucking them without damaging the fruit or the plants. (A robot unveiled by Root AI in 2019 can already do this with grape tomatoes, but only in greenhouse environments.) Once commercialized, these smart bots could dramatically lower the price consumers pay for soft fruit like berries, tomatoes, and grapes.

The digital workforce promises also to lower the cost of animal protein by reducing human-animal interactions, improving both human and animal welfare in the process. Currently, farmers try to prevent costly disease outbreaks by monitoring the health of individual animals. But the larger the herd, the harder and more dangerous this job becomes. A system of cameras, microphones, and facial-recognition software from AIFARMS works in crowded barns even on muddy-faced pigs. As a result, small teams can remotely monitor thousands of animals without putting themselves at risk of injury.
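Identifying an individual animal with facial recognition typically comes down to embedding matching: a network turns each face image into a vector, and a new sighting is matched to the nearest known animal. The sketch below uses toy three-dimensional vectors and hypothetical animal IDs in place of real network embeddings:

```python
# Illustrative sketch of individual-animal identification by
# nearest-neighbor matching against a gallery of known embeddings.
import math

gallery = {  # hypothetical IDs mapped to toy face embeddings
    "pig_041": (0.9, 0.1, 0.2),
    "pig_087": (0.1, 0.8, 0.3),
    "pig_112": (0.2, 0.2, 0.9),
}

def identify(embedding):
    """Return the gallery ID whose embedding is closest (Euclidean)."""
    return min(gallery, key=lambda pid: math.dist(gallery[pid], embedding))

print(identify((0.85, 0.15, 0.25)))  # -> pig_041
```

Real systems use high-dimensional embeddings and a rejection threshold for unknown animals, but the match-against-a-gallery structure is the same.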

Expediting environmental resilience

Edge server

Researchers at AIFARMS have found that, by training a neural net on datasets collected by both soil and satellite sensors, they can track cover-cropping yields across entire growing regions. By interpreting images that span the entire electromagnetic spectrum, the model also enables remote monitoring of soil organic carbon (SOC) and nitrogen levels.

Accurate assessment of above- and belowground nutrients reveals, in turn, how effectively growers are containing nitrogen runoff, conserving soil carbon, and managing water usage.

“If we’re able to remotely track things like SOC, we can determine what’s contributing to better outcomes and replicate them at scale,” Wedow says.
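The modeling idea behind remote SOC tracking, learning a mapping from spectral readings to a soil measurement, can be sketched in miniature. Here a tiny least-squares fit stands in for the neural net, and the band values and SOC labels are entirely synthetic:

```python
# Hedged sketch: fit a mapping from spectral-band readings to soil
# organic carbon (SOC) on synthetic data, then check recovery.
import numpy as np

rng = np.random.default_rng(0)
bands = rng.uniform(0, 1, size=(50, 4))          # 4 spectral bands per plot
true_w = np.array([2.0, -1.0, 0.5, 3.0])         # ground-truth relationship
soc = bands @ true_w + rng.normal(0, 0.01, 50)   # synthetic SOC labels

# Least squares recovers the band-to-SOC weights from the noisy labels.
w, *_ = np.linalg.lstsq(bands, soc, rcond=None)
print(np.allclose(w, true_w, atol=0.05))  # -> True
```

A real system replaces the linear fit with a neural net and the synthetic bands with calibrated soil- and satellite-sensor data, but the supervised-regression skeleton is the same.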

Prototype today, production tomorrow

Ultimately, the success of the AIFARMS project will be determined not by the solutions it pioneers but by farmers’ willingness to embrace them. AIFARMS devotes fully a fifth of its funding to education and outreach. At the same time, industry partners like Microsoft, John Deere, and EarthSense are working closely with AIFARMS, contributing funding and providing feedback in anticipation of commercializing its solutions.

Wedow believes that EarthSense鈥檚 cover-cropping bots are likely to be the first to make the leap to mass production.

“Will they revolutionize agriculture in twelve months?” she muses. “Hard to say. The technology is getting cheaper, so there’s more interest. The stakes are getting higher, so there’s more pressure. With testing at the industry level, we’ll soon find out.”

Cover crop from above
