Summary
This paper presents an integrated artificial intelligence system for early pest detection in plant production, combining computer vision (CNNs and YOLOv5) with multi-modal environmental sensors (temperature, humidity, soil moisture). The system achieved 94.2% detection accuracy on a dataset of 10,000 pest images and outperformed conventional human-observer-based surveillance by 15%, whilst reducing false alarms and enabling earlier intervention. The work demonstrates the potential of sensor fusion and deep learning to enhance the efficiency and precision of pest monitoring in agricultural systems.
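The sensor-fusion step described above can be sketched in a minimal form. The weighting function, thresholds, and sensor fields below are illustrative assumptions for exposition, not details taken from the paper; the study's actual fusion of YOLOv5 detections with environmental data is not specified in the abstract.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    """Environmental inputs matching the modalities named in the paper."""
    temperature_c: float
    humidity_pct: float
    soil_moisture_pct: float

def pest_risk_score(s: SensorReading) -> float:
    """Heuristic environmental risk in [0, 1]. Many pests favour warm,
    humid conditions; the specific thresholds here are assumptions."""
    clamp = lambda x: min(max(x, 0.0), 1.0)
    temp_factor = clamp((s.temperature_c - 10.0) / 20.0)
    humid_factor = clamp((s.humidity_pct - 40.0) / 50.0)
    moist_factor = clamp(s.soil_moisture_pct / 100.0)
    return (temp_factor + humid_factor + moist_factor) / 3.0

def fused_alert(detection_conf: float, s: SensorReading,
                threshold: float = 0.5) -> bool:
    """Raise an alert only when the vision model's confidence, weighted
    by environmental risk, clears the threshold. Down-weighting
    detections made under low-risk conditions is one plausible way a
    fused system could reduce false alarms."""
    return detection_conf * (0.5 + 0.5 * pest_risk_score(s)) >= threshold

# A high-confidence detection in warm, humid conditions triggers an
# alert; a borderline detection in cold, dry conditions does not.
warm = SensorReading(temperature_c=25, humidity_pct=80, soil_moisture_pct=60)
cold = SensorReading(temperature_c=5, humidity_pct=30, soil_moisture_pct=10)
```

In this sketch the environmental score only attenuates or restores the detector's confidence (scaling it between 0.5x and 1x), so the camera remains the primary evidence source and sensors act as a plausibility check.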
Regional applicability
The study geography cannot be determined from the abstract provided. The technology and methodology are broadly applicable to United Kingdom horticulture and broader crop production systems, though validation would be required under UK climatic and crop conditions to confirm real-time deployment feasibility and cost-effectiveness relative to existing pest management practices.
Key measures
Detection accuracy (94.2%); improvement over traditional pest recognition methods (15%); false alarm reduction (magnitude not quantified in the abstract); dataset size (10,000 pest images, paired with real-time environmental sensor data)
Outcomes reported
The study reports pest detection accuracy of 94.2% achieved using a system combining convolutional neural networks, YOLOv5 image processing, and environmental sensor data (temperature, humidity, soil moisture). The proposed method outperformed traditional pest recognition approaches by 15% and reduced false alarms whilst enabling earlier pest detection.