Oak Ridge National Laboratory

Q&A with Larry York: APPL lab uses robotics, AI to advance plant science

October 15, 2024
Larry York is a senior staff scientist who works closely with the Advanced Plant Phenotyping Laboratory at ORNL. His work focuses on understanding how plant shoots and roots work together to optimize soil resource use. Credit: Carlos Jones/ORNL, U.S. Dept. of Energy

The Advanced Plant Phenotyping Laboratory, or APPL, uses robotics, multi-modal imaging and artificial intelligence at the Department of Energy's Oak Ridge National Laboratory to accelerate our understanding of plant genetics, performance and plant-microbe interactions. The lab helps scientists link genes to traits of interest for breakthroughs in bioenergy, agriculture and climate resilience. Larry York, an ORNL senior staff scientist who works closely with APPL, discusses the lab's capabilities, what kind of data it generates to accelerate plant science and what a new digital underground imaging system can reveal as researchers work on better biomass plant feedstocks for bioenergy and natural carbon storage pathways.

Q: How does APPL work?

A: The Advanced Plant Phenotyping Laboratory is a robotic greenhouse system that uses automated conveyors to move plants through imaging stations that measure plant properties and watering stations that add precise amounts of water and nutrients. We have 520 trays that can hold a single pot or as many as 20 pots per tray, meaning we can image as many as 10,400 plants during an experiment. The plants move through as many as five imaging stations where sensors and cameras collect digital images of the plants, and all of this happens 24/7. Those images are processed using computer vision and AI. APPL stations conduct dynamic chlorophyll fluorescence imaging, which measures photosynthetic efficiency, as well as thermal imaging, near-infrared imaging, 3D laser scanning and hyperspectral imaging, making up what we believe is among the largest arrays of imaging modalities for plants in an automated greenhouse in the world. These images are analyzed using custom code that converts them to numeric traits, such as height, area and color, as well as more advanced spectral analyses. The result is critical data on photosynthetic activity and efficiency, biomass accumulation, leaf growth and function, water content and distribution, stomatal dynamics controlling water use, stress response and biochemical composition. These numeric data are then analyzed using common biostatistical frameworks and new methods that accommodate the dense, multi-variable time-series data.
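To illustrate what converting images to numeric traits can look like in practice, here is a minimal Python sketch, not APPL's actual pipeline code: it segments a plant from a side-view color image using an excess-green index and derives simple traits such as projected area, height in pixels and mean greenness. The file name, threshold and trait choices are illustrative assumptions, and real pipelines also calibrate pixel units to physical units.

```python
# Illustrative sketch only -- not APPL's production pipeline.
# Segments a plant from a color image and derives simple numeric traits.
import cv2
import numpy as np

def extract_traits(image_path: str) -> dict:
    img = cv2.imread(image_path)                    # BGR color image
    b, g, r = cv2.split(img.astype(np.float32))
    exg = 2 * g - r - b                             # excess-green index separates plant from background
    mask = exg > 20                                 # threshold is scene-specific (assumed here)
    area_px = int(mask.sum())                       # projected plant area in pixels
    if area_px == 0:
        return {"area_px": 0, "height_px": 0, "mean_green": 0.0}
    ys, _ = np.nonzero(mask)
    return {
        "area_px": area_px,
        "height_px": int(ys.max() - ys.min()),      # plant height in pixels
        "mean_green": float(g[mask].mean()),        # crude greenness proxy
    }

print(extract_traits("plant_side_view.png"))        # hypothetical file name
```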

APPL provides high-resolution data to accelerate fundamental science investigations - connecting genotypes to phenotypes with applications for bioenergy, agriculture and climate resilience. Credit: Carlos Jones/ORNL, U.S. Dept. of Energy

Q: What does APPL enable that we can't do now?

A: The current state of the art for measuring plants can be as simple as growing a plant for 30 days, cutting off the shoot, drying it and weighing it, which yields one measurement at one time point. APPL, by contrast, can provide hundreds of different measurements of living plants over that same period, yielding multiple variables simultaneously. Images are processed to produce numeric traits that can be interpreted, and some of those traits have direct manual analogs, like height. Other measurements, like leaf temperature, are thought to relate to water use, so they can serve as proxies, but they require extensive validation against ground-truth measurements. At APPL, we take great care to ensure measurements are properly validated.

In addition to obvious but important traits such as plant height, area and volume, we can calculate less common measurements like plant shape and mass distribution. Using color imaging, we can estimate plant greenness as it relates to health and photosynthesis, and using hyperspectral imaging of 1,000 wavelengths, or false colors, we can compute even more detailed spectral indices related to plant growth, water content, nitrogen content and chlorophyll content. In the future, we expect to use our imagery to estimate more chemical properties, such as lignin and cellulose. Using imagery of fluoresced light, we can estimate aspects of the plant's ability to turn sunlight into food.
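As a concrete example of a spectral index, the widely used normalized difference vegetation index (NDVI) contrasts red and near-infrared reflectance to track greenness and chlorophyll. The sketch below, with assumed band positions and synthetic data, shows how such an index can be computed from a hyperspectral cube; it is an illustration, not APPL's code.

```python
# Illustrative sketch: a normalized difference vegetation index (NDVI)
# computed from a hyperspectral cube. Band positions are assumptions.
import numpy as np

def ndvi(cube: np.ndarray, wavelengths: np.ndarray, red_nm=670, nir_nm=800):
    """cube: (rows, cols, bands) reflectance; wavelengths: band centers in nm."""
    red = cube[:, :, np.argmin(np.abs(wavelengths - red_nm))]
    nir = cube[:, :, np.argmin(np.abs(wavelengths - nir_nm))]
    return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids divide-by-zero

# Synthetic example: 100 x 100 pixels, 1,000 bands spanning 400-1000 nm
wl = np.linspace(400, 1000, 1000)
cube = np.random.rand(100, 100, 1000).astype(np.float32)
print(ndvi(cube, wl).mean())
```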

We are also working to streamline image and data analysis to provide near-real-time data and insights to our collaborators. This includes deploying a new analytics server with 256 processor cores, more than 1 TB of RAM and two powerful NVIDIA graphics cards, providing the computational power to quickly analyze thousands of images in parallel as needed. A single hyperspectral image can be 500 MB, so the ability to process quickly is important.
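Analyzing thousands of images in parallel typically means fanning a per-image routine out across many cores. Here is a minimal Python sketch of that pattern; the directory name and placeholder analysis function are assumptions, not APPL's software.

```python
# Illustrative sketch: running a per-image analysis across many CPU cores.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def process_image(path: Path):
    # Placeholder for real analysis (segmentation, trait extraction, ...)
    return path.name, path.stat().st_size

if __name__ == "__main__":
    paths = sorted(Path("images").glob("*.png"))       # hypothetical image folder
    with ProcessPoolExecutor(max_workers=32) as pool:  # scale workers to available cores
        for name, size in pool.map(process_image, paths):
            print(name, size)
```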

At the same time, we are working to provide FAIR (findable, accessible, interoperable and reusable) data to customers using new cloud computing capabilities outside APPL. In one project, we are working to automatically feed APPL data into simulation models of leaf photosynthesis and into automated genetic analysis and genomic prediction. That project is part of ORNL's INTERSECT initiative, which advances autonomous experiments that leverage advanced computing, scientific instruments and facilities.

Q: What have been some interesting applications of APPL so far?

A: APPL was used to confirm the discovery of a gene that significantly increases biomass in poplar trees; APPL measured leaf area index using 3D laser scans, comparing poplars containing the gene with controls. In another project, scientists were able to discern changes in leaf size and shape in plants colonized with a bacterium believed to confer heat stress tolerance. In poplar, genetic analysis has uncovered hundreds of genetic regions associated with traits measured in APPL over time, leading to a new way to think about the influence of plant genetics over the course of a plant's development. Additional work has explored how soil fungi colonize plant roots and influence their growth over time.

APPL has acquired more than 350,000 images across its multiple modalities over the past couple of years. These images are being used by ORNL associate staff scientist John Lagergren to develop AI tools such as vision transformers that automatically detect and quantify individual leaves. This will allow a better understanding of how plant function depends on leaf location and size, for example. Just as AI is transforming many aspects of society, its influence on plant science is expected to be immense.

Plants move through an imaging chamber in APPL. Credit: Carlos Jones/ORNL, U.S. Dept. of Energy

Q: What's ahead for APPL?

A: Several upgrades are coming to APPL through laboratory investments, but the one we're most excited about is the digital underground - a one-of-a-kind capability for gathering data on plant root systems and soil that will be installed in the first half of 2025. The underground imaging method uses a plant growth container called a rhizobox. The box is a thin, transparent acrylic container held at an angle so that roots grow downward along a clear window hidden by a sliding cover that blocks light. When the rhizobox moves into APPL's imaging chambers, a robotic arm will slide the cover off, exposing the roots and soil for imaging with a high-resolution color camera as well as a near-infrared camera using two wavelengths of infrared light.

APPL can be used in a variety of other ways to support DOE's mission. Currently, we are investigating using APPL to further screen plants that have been genetically transformed. Often these transformed plants have genes turned off or turned up to change how the plant behaves. While these plants are often created with a single trait in mind, APPL can screen them for how the modification influences multiple traits as well as how the plants perform under various stress conditions, such as water or nutrient stress. The imaging and analysis methods could also be transferred to a new robotic system to screen newly modified plants even earlier in the process, selecting those best suited for downstream applications.

Q: What can we learn using the digital underground?

A: The color images we capture will be used with ORNL software called RhizoVision Explorer to extract a suite of root traits, such as length and diameter, that will help us understand how roots forage in soil for nutrients and water. RhizoVision Explorer has been downloaded more than 11,000 times and is in use around the world, and new versions of the software will continue to benefit the root biology community as well as the science mission at ORNL. The new infrared imaging capability is expected to enable estimating soil water content at a millimeter scale, letting us ask how the abundance of roots influences local soil water depletion.
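The core idea behind skeleton-based root measurement can be shown in a few lines. The sketch below, using a synthetic mask, estimates total root length by skeletonizing a segmented root image and counting skeleton pixels; it illustrates the general approach, not RhizoVision Explorer's actual implementation, which also handles diameter, branching and much more.

```python
# Illustrative sketch of skeleton-based root length estimation --
# the general idea, not RhizoVision Explorer's actual code.
import numpy as np
from skimage.morphology import skeletonize

def root_length_px(root_mask: np.ndarray) -> int:
    """Approximate total root length in pixels by reducing the segmented
    root mask to a one-pixel-wide skeleton and counting its pixels."""
    return int(skeletonize(root_mask.astype(bool)).sum())

# Synthetic example: a straight vertical 'root' 200 pixels long
mask = np.zeros((300, 300), dtype=bool)
mask[50:250, 150] = True
print(root_length_px(mask))   # ~200; pixels convert to mm via image calibration
```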

The digital underground will give scientists data on whether plants are investing more in roots or shoots. That's a basic question in plant science, and with the underground system we'll be able to look at that ratio across different genotypes and treatments to see how plants respond to drought, for example, by increasing their allocation to roots. We also want to learn more about whether plants are building thicker roots or finer roots and how they branch. Finer roots are typically better able to explore the soil, and that's often what we want to see: plants that can, on average, explore or "forage" in their soil environment more evenly to make the most of it.

Q: How can potential collaborators learn more about APPL?

A: They can visit the APPL website on ORNL.gov to learn more, take a virtual tour of the facility and send an email to the contact listed. We can help you narrow down your experiment based on the type of data you're interested in and guide you through the collaborative agreement process.

UT-Battelle manages ORNL for DOE's Office of Science. The single largest supporter of basic research in the physical sciences in the United States, the Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

Media Contact

Stephanie G Seay
865.576.9894