Show simple item record

dc.contributor.author	Noshiri, Nooshin
dc.contributor.author	Beck, Michael A.
dc.contributor.author	Bidinosti, Christopher P.
dc.contributor.author	Henry, Christopher J.
dc.date.accessioned	2023-09-25T20:16:03Z
dc.date.available	2023-09-25T20:16:03Z
dc.date.issued	2023-09-14
dc.identifier.citation	Noshiri, Nooshin, Michael A. Beck, Christopher P. Bidinosti, and Christopher J. Henry. "A comprehensive review of 3D convolutional neural network-based classification techniques of diseased and defective crops using non-UAV-based hyperspectral images." Smart Agricultural Technology 5 (October 2023): article no. 100316. DOI: 10.1016/j.atech.2023.100316.	en_US
dc.identifier.issn	2772-3755
dc.identifier.uri	https://hdl.handle.net/10680/2114
dc.description.abstract	Hyperspectral imaging (HSI) is a non-destructive and contactless technology that provides valuable information about the structure and composition of an object. It can capture detailed information about the chemical and physical properties of agricultural crops. Due to its wide spectral range, compared with multispectral- or RGB-based imaging methods, HSI can be a more effective tool for monitoring crop health and productivity. With the advent of this imaging tool in agrotechnology, researchers can more accurately address issues related to the detection of diseased and defective crops in the agriculture industry. This allows farmers to implement the most suitable and accurate farming solutions, such as irrigation and fertilization, before crops enter a damaged and difficult-to-recover phase of growth in the field. While HSI provides valuable insights into the object under investigation, the limited number of HSI datasets for crop evaluation presently poses a bottleneck. Dealing with the curse of dimensionality presents another challenge due to the abundance of spectral and spatial information in each hyperspectral cube. State-of-the-art methods based on 1D and 2D convolutional neural networks (CNNs) struggle to extract spectral and spatial information efficiently. On the other hand, 3D-CNN-based models have shown significant promise in achieving better classification and detection results by leveraging spectral and spatial features simultaneously. Despite the apparent benefits of 3D-CNN-based models, their usage for classification purposes in this area of research has remained limited. This paper seeks to address this gap by reviewing 3D-CNN-based architectures and the typical deep learning pipeline, including preprocessing and visualization of results, for the classification of hyperspectral images of diseased and defective crops. Furthermore, we discuss open research areas and challenges when utilizing 3D-CNNs with HSI data.	en_US
dc.description.sponsorship"This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors."en_US
dc.description.urihttps://www.sciencedirect.com/science/article/pii/S2772375523001454en_US
dc.language.isoenen_US
dc.publisherElsevier B.V.en_US
dc.rightsinfo:eu-repo/semantics/openAccessen_US
dc.subjectHyperspectral imagingen_US
dc.subjectAgricultureen_US
dc.subjectConvolutional neural networken_US
dc.subjectCrop disease and defect detectionen_US
dc.subjectCrop evaluationen_US
dc.subjectDeep learningen_US
dc.titleA comprehensive review of 3D convolutional neural network-based classification techniques of diseased and defective crops using non-UAV-based hyperspectral imagesen_US
dc.typeArticleen_US
dc.rights.licenseCC BYen_US
dc.identifier.doi10.1016/j.atech.2023.100316en_US
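
The abstract above describes 3D-CNNs that learn spectral and spatial features from hyperspectral cubes jointly, in contrast to 1D/2D CNNs that treat them separately. The following Python (PyTorch) sketch illustrates that general idea only; it is a minimal, hypothetical example, not an architecture from the reviewed paper, and the layer sizes, the 16-band 32x32 patch shape, and the two-class (healthy vs. diseased) setup are illustrative assumptions.

# Minimal, hypothetical sketch of the kind of 3D-CNN classifier the review surveys;
# not the authors' architecture. All layer sizes and the two-class setup are
# illustrative assumptions only.
import torch
import torch.nn as nn

class HyperspectralCube3DCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # 3D kernels slide over (bands, height, width), so spectral and spatial
        # features are learned jointly rather than in separate 1D/2D stages.
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=2),
            nn.Conv3d(8, 16, kernel_size=(5, 3, 3), padding=(2, 1, 1)),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # collapse to one value per channel
        )
        self.classifier = nn.Linear(16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, bands, height, width) -- one hyperspectral patch per sample
        return self.classifier(self.features(x).flatten(1))

if __name__ == "__main__":
    model = HyperspectralCube3DCNN(num_classes=2)
    cubes = torch.randn(4, 1, 16, 32, 32)  # 4 random 16-band 32x32 patches
    print(model(cubes).shape)              # expected: torch.Size([4, 2])

The adaptive pooling before the linear head keeps this sketch independent of the exact number of bands and the patch size, which is one common way to accommodate sensors with different spectral resolutions.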

