Abstract

Efficient and sustainable food production and management are growing concerns in the context of an ever-increasing global population. Against this backdrop, the integration of remote sensing with advanced imaging technologies presents transformative opportunities for data-driven decision-making in agriculture. This study explores the use of unmanned aerial systems (UAS) equipped with multispectral, hyperspectral, and LiDAR sensors for the non-destructive monitoring of table beet (Beta vulgaris) crop traits, with a specific focus on root yield estimation and foliar disease assessment. Table beet, a root crop of increasing commercial and nutritional importance, poses unique challenges for above-canopy sensing because its storage organ develops below ground. UAS campaigns were conducted over two growing seasons (2021 and 2022) at Cornell AgriTech in Geneva, NY, capturing multispectral (475, 560, 668, 717, and 840 nm), hyperspectral (400–1000 nm), and LiDAR data across multiple growth stages. Initial analyses employed hyperspectral imagery to identify key narrow-band wavelengths predictive of root yield, with leave-one-out cross-validation R² values of 0.85–0.90 and RMSE values of 10.81–12.93%. The 760–920 nm spectral range was most indicative of yield performance. Subsequent modeling efforts focused on developing growth stage- and season-independent yield prediction models. A Gaussian Process Regression model using only multispectral data achieved a test-set R² of 0.81 and MAPE of 15.7%, while the fusion of hyperspectral and LiDAR data produced a test-set R² of 0.79 and MAPE of 17.4%. This comparison also revealed the added value of structural information. For disease monitoring, we developed various machine learning models using features derived from five-band multispectral imagery, including vegetation indices and gray-level co-occurrence matrix (GLCM)-based texture metrics. The top-performing model achieved a test-set R² of 0.90 and RMSE of 7.18%, while hyperspectral-based models reached a test-set R² of 0.87 and RMSE of 10.1%. Disease severity monitoring was thus shown to be feasible at relatively coarse spatial resolution. Overall, this work demonstrates the efficacy of UAS-based multimodal sensing for high-throughput phenotyping, while also addressing key questions in sensor selection, feature engineering, and model generalization. The methodologies developed here offer scalable, data-driven solutions for yield forecasting and disease assessment, and may be adapted for broader use across other crops and sensing platforms.
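
To make the modeling approach concrete, the sketch below illustrates a Gaussian Process Regression yield model evaluated with leave-one-out cross-validation, of the kind summarized above. It assumes scikit-learn and uses synthetic plot-level features standing in for the five multispectral bands; the kernel choice, variable names, and data are illustrative placeholders, not the dissertation's actual pipeline.

```python
# Minimal sketch (assumptions: scikit-learn, synthetic plot-level data).
# Illustrates a Gaussian Process Regression yield model evaluated with
# leave-one-out cross-validation, as summarized in the abstract.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Hypothetical plot-level predictors: mean reflectance in the five
# multispectral bands (475, 560, 668, 717, 840 nm) per plot.
X = rng.uniform(0.02, 0.6, size=(60, 5))
# Hypothetical root-yield response (arbitrary units), for illustration only.
y = 5.0 * X[:, 4] - 3.0 * X[:, 2] + rng.normal(0, 0.05, size=60)

# RBF kernel plus a white-noise term; hyperparameters are fit by
# maximizing the marginal likelihood within each training fold.
gpr = GaussianProcessRegressor(
    kernel=RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-3),
    normalize_y=True,
)

# Leave-one-out cross-validation: each plot is predicted by a model
# trained on all remaining plots.
y_pred = cross_val_predict(gpr, X, y, cv=LeaveOneOut())
print(f"LOOCV R^2: {r2_score(y, y_pred):.3f}")
```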
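
The vegetation-index features used for disease modeling can likewise be sketched. The example below computes NDVI from the red (668 nm) and near-infrared (840 nm) bands; the function name and zero-denominator guard are illustrative choices, not the dissertation's code.

```python
# Minimal sketch of a vegetation-index feature: NDVI from the
# 668 nm (red) and 840 nm (NIR) multispectral bands.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - red) / (NIR + red), with zero-denominator pixels set to 0."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Example on synthetic reflectance arrays standing in for the two bands.
rng = np.random.default_rng(0)
red = rng.uniform(0.02, 0.2, size=(32, 32))
nir = rng.uniform(0.3, 0.6, size=(32, 32))
print(f"mean NDVI: {ndvi(red, nir).mean():.3f}")
```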
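
Finally, the gray-level co-occurrence matrix (GLCM) texture metrics mentioned above can be illustrated with a short sketch assuming scikit-image. The offsets, angles, and choice of statistics below are illustrative, not necessarily those used in the dissertation.

```python
# Minimal sketch (assumption: scikit-image) of GLCM-based texture
# features computed on a synthetic single-band canopy patch.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(1)

# Hypothetical 8-bit patch from one multispectral band.
patch = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

# Co-occurrence of gray levels at a 1-pixel offset in four directions.
glcm = graycomatrix(
    patch,
    distances=[1],
    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
    levels=256,
    symmetric=True,
    normed=True,
)

# Common GLCM statistics used as texture predictors, averaged over angles.
for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, graycoprops(glcm, prop).mean())
```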

Library of Congress Subject Headings

Beets--Yields--Forecasting; Drone aircraft in remote sensing; Machine learning

Publication Date

8-2025

Document Type

Dissertation

Student Type

Graduate

Degree Name

Imaging Science (Ph.D.)

Department, Program, or Center

Chester F. Carlson Center for Imaging Science

College

College of Science

Advisor

Callie Babbitt

Advisor/Committee Member

Anthony Vodacek

Advisor/Committee Member

Carl Salvaggio

Comments

This dissertation has been embargoed. The full text will be available on or around 2/22/2026.

Campus

RIT – Main Campus

Plan Codes

IMGS-PHD
