AUTHOR=He Shuqian, Jin Biao, Sun Xuechao, Jiang Wenjuan, Gu Jiaxing, Gu Fenglin TITLE=Few-shot object detection for pest insects via features aggregation and contrastive learning JOURNAL=Frontiers in Plant Science VOLUME=Volume 16 - 2025 YEAR=2025 URL=https://www.frontiersin.org/journals/plant-science/articles/10.3389/fpls.2025.1522510 DOI=10.3389/fpls.2025.1522510 ISSN=1664-462X ABSTRACT=Accurate detection of pest insects is critical for agricultural pest management and crop yield protection, yet traditional detection methods struggle due to the vast diversity of pest species, significant individual differences, and limited labeled data. These challenges are compounded by the typically small size of pest targets and complex environmental conditions. To address these limitations, this study proposes a novel few-shot object detection (FSOD) method leveraging feature aggregation and supervised contrastive learning (SCL) within the Faster R-CNN framework. Our methodology involves multi-scale feature extraction using a Feature Pyramid Network (FPN), enabling the capture of rich semantic information across various scales. A Feature Aggregation Module (FAM) with an attention mechanism is designed to effectively fuse contextual features from support and query images, enhancing representation capabilities for multi-scale and few-sample pest targets. Additionally, supervised contrastive learning is employed to strengthen intra-class similarity and inter-class dissimilarity, thereby improving discriminative power. To manage class imbalance and enhance the focus on challenging samples, focal loss and class weights are integrated into the model's comprehensive loss function. Experimental validation on the PestDet20 dataset, consisting of diverse tropical pest insects, demonstrates that the proposed method significantly outperforms existing approaches, including YOLO, TFA, VFA, and FSCE.
Specifically, our model achieves superior mean Average Precision (mAP) results across different few-shot scenarios (3-shot, 5-shot, and 10-shot), demonstrating robustness and stability. Ablation studies confirm that each component of our method substantially contributes to performance improvement. This research provides a practical and efficient solution for pest detection under challenging conditions, reducing dependency on large annotated datasets and improving detection accuracy for minority pest classes. While computational complexity remains higher than real-time frameworks like YOLO, the significant gains in detection accuracy justify the trade-off for critical pest management applications.
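The abstract names two loss components: a supervised contrastive loss to tighten intra-class and separate inter-class features, and a class-weighted focal loss for imbalance. The paper's exact formulations and hyperparameters are not given here; the following is a minimal NumPy sketch assuming the standard definitions (Lin et al.'s focal loss with per-class alpha weights, and the common SupCon formulation), with function names and example values of my own choosing.

```python
import numpy as np

def focal_loss(probs, labels, class_weights, gamma=2.0):
    """Class-weighted focal loss on softmax outputs.

    probs: (N, C) predicted class probabilities; labels: (N,) int class ids;
    class_weights: (C,) per-class alpha weights (values here are illustrative,
    not the paper's). Down-weights easy examples via the (1 - p_t)^gamma term.
    """
    p_t = probs[np.arange(len(labels)), labels]   # probability of true class
    alpha_t = class_weights[labels]               # per-sample class weight
    return np.mean(-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t))

def supcon_loss(z, labels, temperature=0.07):
    """Supervised contrastive loss over embeddings z (N, D).

    For each anchor, positives are all other samples sharing its label; the
    denominator sums over all samples except the anchor itself.
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)      # L2-normalise
    sim = z @ z.T / temperature                           # pairwise similarity
    logits = sim - np.max(sim, axis=1, keepdims=True)     # numerical stability
    exp = np.exp(logits)
    np.fill_diagonal(exp, 0.0)                            # drop self-pairs
    log_prob = logits - np.log(exp.sum(axis=1, keepdims=True))
    pos = (labels[:, None] == labels[None, :]) & ~np.eye(len(labels), dtype=bool)
    n_pos = pos.sum(axis=1)
    valid = n_pos > 0                                     # anchors with >=1 positive
    return -np.mean((log_prob * pos).sum(axis=1)[valid] / n_pos[valid])
```

In a few-shot detector these terms would typically be combined with the standard Faster R-CNN classification and regression losses; embeddings with matching labels score a lower contrastive loss than mismatched ones, which is the intra-class similarity effect the abstract describes.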