Automated Health Estimation of Capsicum annuum L. Crops by Means of Deep Learning and RGB Aerial Images
Article
Overview
abstract
Recently, small UAVs have been increasingly adopted by agricultural producers for monitoring agricultural land in order to improve crop yields. However, correctly interpreting the collected imagery data is still a challenging task. In this study, an automated pipeline for monitoring C. annuum crops based on a deep learning model is implemented. The system is capable of inferring the health status of individual plants and of determining their locations and shapes in a georeferenced orthomosaic. The accuracy achieved on the classification task was 94.5%. AP values among classes were in the range of (Formula presented.) for plant location boxes and (Formula presented.) for foliar area predictions. The methodology requires only RGB images, so it can be replicated for monitoring other types of crops using only consumer-grade UAVs. A comparison with random forest and large-scale mean-shift segmentation methods, which use predetermined features, is presented. NDVI results obtained with multispectral equipment are also included. © 2022 by the authors.
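The record itself contains no implementation code; the sketch below only illustrates the kind of per-tile Mask R-CNN inference the abstract describes, assuming a PyTorch/torchvision model fine-tuned on RGB tiles cropped from the orthomosaic. The framework, weight file, class labels, and score threshold are assumptions for illustration, not details reported by the paper.

```python
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Hypothetical labels; the paper classifies the health status of individual
# C. annuum plants, but the exact class names are not given in this record.
CLASS_NAMES = ["background", "healthy_plant", "unhealthy_plant"]


def load_model(weights_path: str, num_classes: int = len(CLASS_NAMES)):
    """Build a Mask R-CNN with a custom head and load fine-tuned weights."""
    model = maskrcnn_resnet50_fpn(weights=None, num_classes=num_classes)
    model.load_state_dict(torch.load(weights_path, map_location="cpu"))
    model.eval()
    return model


@torch.no_grad()
def infer_tile(model, tile_path: str, score_threshold: float = 0.5):
    """Run inference on one RGB tile cropped from the georeferenced orthomosaic."""
    image = to_tensor(Image.open(tile_path).convert("RGB"))
    output = model([image])[0]
    keep = output["scores"] >= score_threshold
    return {
        "boxes": output["boxes"][keep],    # plant location boxes (tile pixel coords)
        "masks": output["masks"][keep],    # soft masks approximating foliar area
        "labels": [CLASS_NAMES[int(i)] for i in output["labels"][keep]],
        "scores": output["scores"][keep],
    }
```

Pixel-space boxes and masks would still need to be mapped back to geographic coordinates through the orthomosaic's affine transform to reproduce the georeferenced output described in the abstract. The NDVI baseline mentioned at the end of the abstract is the standard index NDVI = (NIR − Red) / (NIR + Red), which requires the multispectral bands that the RGB-only pipeline deliberately avoids.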
publication date
funding provided via
published in
Research
keywords
deep learning; Mask RCNN; precision agriculture; UAVs; Antennas; Decision trees; Plants (botany); Aerial images; Agricultural land; Capsicum annuum L.; Crop yield; Imagery data; Land areas; Small UAV; Crops
Identity
Digital Object Identifier (DOI)
PubMed ID
Additional Document Info
start page
end page
volume
issue