ORIGINAL RESEARCH article

Front. Plant Sci.

Sec. Technical Advances in Plant Science

Volume 16 - 2025 | doi: 10.3389/fpls.2025.1607582

This article is part of the Research Topic: Machine Vision and Machine Learning for Plant Phenotyping and Precision Agriculture, Volume II

ROSE-MAMBA-YOLO: An Enhanced Framework for Efficient and Accurate Greenhouse Rose Monitoring

Provisionally accepted
Sicheng You1, Boheng Li2, Yijia Chen3, Zhiyan Ren3, Yongying Liu3, Qingyang Wu4, Jianghan Tao5, Zhijie Zhang5, Chenyu Zhang6, Feng Xue7, Yulun Chen8, Guochen Zhang3, Jundong Chen9, Jiaqi Wang3*, Fan Zhao3*
  • 1City University of Macau, Macao SAR, China
  • 2Department of Applied Informatics, Hosei University, Tokyo, Japan
  • 3Graduate School of Frontier Sciences, The University of Tokyo, Kashiwa, Chiba, Japan
  • 4University of California, Los Angeles, Los Angeles, California, United States
  • 5Graduate School of Global Environmental Studies, Sophia University, Tokyo, Japan
  • 6Graduate School of Information, Production and Systems, Waseda University, Tokyo, Japan
  • 7Department of Mathematics and Applied Mathematics, China University of Petroleum-Beijing, Karamay, China
  • 8Department of Environmental Science, Southwest Forestry University, Yunnan, China
  • 9Data Science and AI Innovation Research Promotion Center, Shiga University, Hikone, Japan

The final, formatted version of the article will be published soon.

Accurately detecting roses in UAV-captured greenhouse imagery presents significant challenges due to occlusions, scale variability, and complex environmental conditions. This study introduces ROSE-MAMBA-YOLO, a hybrid detection framework that combines the efficiency of YOLOv11 with Mamba-inspired state-space modeling to enhance feature extraction, multi-scale fusion, and contextual representation. The model achieves an mAP@50 of 87.5%, a precision of 90.4%, and a recall of 83.1%, surpassing state-of-the-art object detection models. With its lightweight design and real-time capability, ROSE-MAMBA-YOLO provides a scalable and efficient solution for UAV-based rose monitoring. Extensive evaluations confirm its robustness to degraded input data and its adaptability across diverse datasets, demonstrating its applicability in complex agricultural scenarios. The framework offers a practical approach for precision floriculture and sets the stage for integrating advanced detection technologies into real-time crop monitoring systems, advancing intelligent, data-driven agriculture.
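The abstract describes a hybrid of YOLOv11-style convolutional detection with Mamba-inspired state-space modeling. As an illustration only, the minimal PyTorch sketch below shows how a simplified state-space block could be interleaved with a convolutional stage to add global context to local features. The module names (SSBlock, HybridStage), the naive scan, and all hyperparameters are hypothetical assumptions, not the authors' implementation of ROSE-MAMBA-YOLO.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SSBlock(nn.Module):
    """Simplified state-space block: a gated linear recurrence over the
    flattened spatial sequence, adding global context to CNN feature maps."""

    def __init__(self, channels: int, state_dim: int = 16):
        super().__init__()
        self.norm = nn.LayerNorm(channels)
        self.in_proj = nn.Linear(channels, channels * 2)
        # Per-channel state-space parameters (hypothetical, simplified).
        self.A = nn.Parameter(torch.randn(channels, state_dim) * 0.01)
        self.B = nn.Parameter(torch.randn(channels, state_dim) * 0.01)
        self.C = nn.Parameter(torch.randn(channels, state_dim) * 0.01)
        self.out_proj = nn.Linear(channels, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) feature map from a convolutional stage.
        b, c, h, w = x.shape
        seq = x.flatten(2).transpose(1, 2)              # (B, H*W, C)
        residual = seq
        u, gate = self.in_proj(self.norm(seq)).chunk(2, dim=-1)

        # Naive linear recurrence over the spatial sequence; real Mamba
        # layers use a hardware-efficient selective scan instead.
        decay = torch.sigmoid(self.A)                   # keep the recurrence stable
        state = x.new_zeros(b, c, self.A.shape[1])
        outputs = []
        for t in range(seq.shape[1]):
            state = state * decay + u[:, t].unsqueeze(-1) * self.B
            outputs.append((state * self.C).sum(-1))    # (B, C)
        y = torch.stack(outputs, dim=1) * F.silu(gate)  # gated output, (B, H*W, C)

        y = self.out_proj(y) + residual
        return y.transpose(1, 2).reshape(b, c, h, w)

class HybridStage(nn.Module):
    """Convolution (local features) followed by the state-space block (global
    context), mimicking how a hybrid YOLO/Mamba backbone stage could be built."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, stride=2, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.SiLU(),
        )
        self.ss = SSBlock(out_ch)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.ss(self.conv(x))

if __name__ == "__main__":
    stage = HybridStage(3, 32)
    out = stage(torch.randn(1, 3, 32, 32))
    print(out.shape)  # torch.Size([1, 32, 16, 16])

In an actual detector, blocks of this kind would typically sit in the backbone or neck ahead of multi-scale fusion; the naive Python loop here trades speed for readability.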

Keywords: YOLOv11, Mamba, precision agriculture, Rose Detection, UAV-based monitoring

Received: 07 Apr 2025; Accepted: 03 Jun 2025.

Copyright: © 2025 You, Li, Chen, Ren, Liu, Wu, Tao, Zhang, Zhang, Xue, Chen, Zhang, Chen, Wang and Zhao. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence:
Jiaqi Wang, Graduate School of Frontier Sciences, The University of Tokyo, Kashiwa, Chiba, Japan
Fan Zhao, Graduate School of Frontier Sciences, The University of Tokyo, Kashiwa, Chiba, Japan

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.