TY - JOUR
T1 - Automated peanut defect detection using hyperspectral imaging and deep learning
T2 - A real-time approach for smart agriculture
AU - Chen, Shih Yu
AU - Wu, Yu Cheng
AU - Kuo, Yung Ming
AU - Zhang, Rui Hong
AU - Cheng, Tsai Yi
AU - Chen, Yu Chien
AU - Chu, Po Yu
AU - Kang, Li Wei
AU - Lin, Chinsu
N1 - Publisher Copyright:
© 2025
PY - 2025/8
Y1 - 2025/8
N2 - Manual visual inspection remains the prevailing approach for peanut quality classification; however, it is labor-intensive, prone to fatigue-induced errors, and often results in inconsistent outcomes. Peanut samples are typically categorized into four classes: healthy, underdeveloped, insect-damaged, and ruptured. This paper proposes an automated classification framework that integrates push-broom and snapshot hyperspectral imaging techniques with deep learning models for accurate and efficient peanut defect detection. A push-broom hyperspectral imaging system was employed to acquire a dataset of 1557 peanut samples, divided into a training set (477 samples: 237 healthy, 240 defective) and a test set (1080 samples). Spectral band selection was applied to reduce data dimensionality, followed by the development and evaluation of 1D, 2D, and 3D Convolutional Neural Network (CNN) models. Among them, the 3D-CNN architecture achieved the highest classification accuracy of 98 %. In addition, the snapshot imaging system enabled the construction of a lightweight CNN model for real-time defect detection. Principal Component Analysis (PCA) was utilized to identify five informative spectral bands, enabling efficient classification with an overall accuracy of 98.5 % and a Kappa coefficient of 97.3 %. The novelty of this study lies in the dual integration of push-broom and snapshot hyperspectral imaging with hybrid CNN architectures, enabling both high-accuracy offline analysis and lightweight real-time detection. The combination of spectral dimensionality reduction and attention-based modeling presents a scalable and computationally efficient solution for quality assessment. These findings represent a significant advancement in automated peanut grading, offering a robust, cost-effective, and scalable approach for deployment in smart agriculture and automated food quality control systems.
AB - Manual visual inspection remains the prevailing approach for peanut quality classification; however, it is labor-intensive, prone to fatigue-induced errors, and often results in inconsistent outcomes. Peanut samples are typically categorized into four classes: healthy, underdeveloped, insect-damaged, and ruptured. This paper proposes an automated classification framework that integrates push-broom and snapshot hyperspectral imaging techniques with deep learning models for accurate and efficient peanut defect detection. A push-broom hyperspectral imaging system was employed to acquire a dataset of 1557 peanut samples, divided into a training set (477 samples: 237 healthy, 240 defective) and a test set (1080 samples). Spectral band selection was applied to reduce data dimensionality, followed by the development and evaluation of 1D, 2D, and 3D Convolutional Neural Network (CNN) models. Among them, the 3D-CNN architecture achieved the highest classification accuracy of 98 %. In addition, the snapshot imaging system enabled the construction of a lightweight CNN model for real-time defect detection. Principal Component Analysis (PCA) was utilized to identify five informative spectral bands, enabling efficient classification with an overall accuracy of 98.5 % and a Kappa coefficient of 97.3 %. The novelty of this study lies in the dual integration of push-broom and snapshot hyperspectral imaging with hybrid CNN architectures, enabling both high-accuracy offline analysis and lightweight real-time detection. The combination of spectral dimensionality reduction and attention-based modeling presents a scalable and computationally efficient solution for quality assessment. These findings represent a significant advancement in automated peanut grading, offering a robust, cost-effective, and scalable approach for deployment in smart agriculture and automated food quality control systems.
KW - Band selection
KW - CNN
KW - Deep learning
KW - Hyperspectral
KW - Peanuts
UR - https://www.scopus.com/pages/publications/105002290845
UR - https://www.scopus.com/inward/citedby.url?scp=105002290845&partnerID=8YFLogxK
U2 - 10.1016/j.atech.2025.100943
DO - 10.1016/j.atech.2025.100943
M3 - Article
AN - SCOPUS:105002290845
SN - 2772-3755
VL - 11
JO - Smart Agricultural Technology
JF - Smart Agricultural Technology
M1 - 100943
ER -