Cotton stand counting from unmanned aerial system imagery using MobileNet and CenterNet deep learning models

Zhe Lin, Wenxuan Guo

Research output: Contribution to journal › Article › peer-review


Abstract

An accurate stand count is a prerequisite for determining the emergence rate, assessing seedling vigor, and facilitating site-specific management for optimal crop production. Traditional manual counting methods for stand assessment are labor-intensive and time-consuming for large-scale breeding programs or production field operations. This study aimed to apply two deep learning models, MobileNet and CenterNet, to detect and count cotton plants at the seedling stage in unmanned aerial system (UAS) images. The models were trained on two datasets containing 400 and 900 images with variations in plant size and soil background brightness. Model performance was assessed with two testing datasets of different dimensions: testing dataset 1 with 300 × 400 pixel images and testing dataset 2 with 250 × 1200 pixel images. Validation results showed a mean average precision (mAP) of 79% and an average recall (AR) of 73% for the CenterNet model, and 86% and 72%, respectively, for the MobileNet model with 900 training images. The accuracy of cotton plant detection and counting was higher on testing dataset 1 for both the CenterNet and MobileNet models. Overall, the CenterNet model performed better for cotton plant detection and counting with 900 training images. The results also indicated that more training images are required when applying object detection models to images whose dimensions differ from those of the training dataset. With testing dataset 1, the CenterNet model trained on 900 images achieved a mean absolute percentage error (MAPE) of 0.07%, a coefficient of determination (R²) of 0.98, and a root mean squared error (RMSE) of 0.37 for cotton plant counting. Both the MobileNet and CenterNet models have the potential to detect and count cotton plants accurately and in a timely manner from high-resolution UAS images at the seedling stage. This study provides valuable information for selecting the right deep learning tools and the appropriate number of training images for object detection projects in agricultural applications.
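The abstract reports MAPE, R², and RMSE for comparing predicted plant counts against observed counts. As an illustration only (this is not the authors' code), a minimal sketch of these three standard metrics, assuming nonzero observed counts and paired observed/predicted lists:

```python
import math

def count_metrics(observed, predicted):
    """Return (MAPE in %, R^2, RMSE) for paired count data.

    Illustrative helper, not from the paper; assumes observed values
    are nonzero and both sequences have equal length.
    """
    n = len(observed)
    # Mean absolute percentage error, expressed as a percentage
    mape = sum(abs((o - p) / o) for o, p in zip(observed, predicted)) / n * 100
    # Coefficient of determination: 1 - SS_residual / SS_total
    mean_obs = sum(observed) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    r2 = 1 - ss_res / ss_tot
    # Root mean squared error
    rmse = math.sqrt(ss_res / n)
    return mape, r2, rmse
```

For example, `count_metrics([10, 20], [11, 19])` yields a MAPE of 7.5%, an R² of 0.96, and an RMSE of 1.0.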

Original language: English
Article number: 2822
Journal: Remote Sensing
Volume: 13
Issue number: 14
DOIs
State: Published - Jul 2, 2021

Keywords

  • CenterNet
  • Cotton stand count
  • Deep learning
  • MobileNet
  • Python
  • Remote sensing
  • Tensorflow
  • Unmanned aerial systems

