A Lightweight Instance Segmentation Model for Simultaneous Detection of Citrus Fruit Ripeness and Red Scale (Aonidiella aurantii) Pest Damage


Ünal İ., Eceoğlu O.

APPLIED SCIENCES, vol.15, no.17, pp.9742, 2025 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 15 Issue: 17
  • Publication Date: 2025
  • DOI: 10.3390/app15179742
  • Journal Name: APPLIED SCIENCES
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Aerospace Database, Agricultural & Environmental Science Database, Applied Science & Technology Source, Communication Abstracts, INSPEC, Metadex, Directory of Open Access Journals, Civil Engineering Abstracts
  • Page Numbers: pp.9742
  • Akdeniz University Affiliated: Yes

Abstract

Early detection of pest damage and accurate assessment of fruit ripeness are essential for improving the quality, productivity, and sustainability of citrus production. Precise ripeness assessment, in particular, is crucial for determining the optimal harvest time, preserving fruit quality, and increasing yield, and the simultaneous early detection of pest damage and ripeness greatly enhances the efficacy of contemporary agricultural decision support systems. This study presents a lightweight deep learning model, based on an optimized YOLO12n-Seg architecture, for the simultaneous detection of ripeness stages (unripe and fully ripe) and of pest damage caused by Red Scale (Aonidiella aurantii). The backbone and head layers of YOLO12n-Seg were retained, while the neck was modified with a GhostConv block to reduce the parameter count and improve computational efficiency. In addition, a Global Attention Mechanism (GAM) was incorporated to strengthen the model's focus on target-relevant features and suppress background noise. Together, these modifications improve both the extraction of accurate spatial information across multiple scales and the attention-guided focus on target object regions. Experimental results demonstrated high accuracy on the test data, with mAP@0.5 = 0.980, mAP@0.95 = 0.960, precision = 0.961, and recall = 0.943, achieved with only 2.7 million parameters and a training time of 2 h 42 min. The model thus offers a reliable and efficient solution for real-time, integrated pest detection and fruit classification in precision agriculture.
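To illustrate why replacing a standard convolution with a GhostConv block reduces the parameter count, the following back-of-envelope sketch compares weight counts for the two. It assumes the common GhostConv formulation (a primary convolution producing half the output channels, plus a cheap depthwise convolution generating the remaining "ghost" features, as in GhostNet and the Ultralytics implementation); the channel sizes (128) and the 5×5 cheap-operation kernel are illustrative assumptions, not figures from the paper.

```python
def conv_params(c_in, c_out, k, groups=1):
    # Weight parameters of a k x k convolution layer (bias terms omitted).
    # With grouped convolution, each filter only sees c_in // groups channels.
    return (c_in // groups) * c_out * k * k

def ghost_conv_params(c_in, c_out, k=3, cheap_k=5):
    # GhostConv sketch: a primary conv produces half the output channels;
    # a cheap depthwise conv (groups == channels) generates the other half,
    # and the two halves are concatenated along the channel axis.
    half = c_out // 2
    primary = conv_params(c_in, half, k)
    cheap = conv_params(half, half, cheap_k, groups=half)  # depthwise
    return primary + cheap

# Hypothetical neck layer with 128 input and 128 output channels, 3x3 kernel.
standard = conv_params(128, 128, 3)        # plain 3x3 convolution
ghost = ghost_conv_params(128, 128, 3)
print(standard, ghost, round(ghost / standard, 3))  # → 147456 75328 0.511
```

The depthwise cheap operation is what drives the saving: it costs only a few thousand weights, so the block approaches half the parameters of the standard convolution it replaces, consistent with the abstract's goal of a smaller, faster neck.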