ORIGINAL RESEARCH article

Front. Plant Sci.

Sec. Technical Advances in Plant Science

Volume 16 - 2025 | doi: 10.3389/fpls.2025.1676977

This article is part of the Research Topic: Plant Phenotyping for Agriculture.

Machine Vision-Based Detection of Browning Maturity in Shiitake Cultivation Sticks

Provisionally accepted
Zeting Liu1, Jiuxiao Zhao2,3, Wengang Zheng3,4, Qiuxiao Song4, Xin Zhang3,5, Wei Liu2, Feifei Shan2,3, Ruixue Xu2, Zuolin Li2,6, Jing Dong2,6, Pengfei Zhao2,6, Yajun Wang1*, Mingfei Wang2,3*
  • 1Dalian Polytechnic University School of Mechanical Engineering and Automation, Dalian, China
  • 2NongXin Science & Technology (Beijing), Beijing, China
  • 3Intelligent Equipment Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing, China
  • 4Shandong Qihe Bio Technology, Shandong, China
  • 5Beijing Academy of Agriculture and Forestry Sciences, Beijing, China
  • 6Beijing Academy of Agriculture and Forestry Sciences Information Technology Research Center, Beijing, China

The final, formatted version of the article will be published soon.

Accurate monitoring of pigmentation changes during the browning stage of shiitake cultivation sticks is essential for assessing substrate maturity, forecasting mushroom emergence, and improving cultivation quality. However, current commercial detection methods lack objective, real-time, and quantifiable indicators for assessing the degree of browning. To address this, this study proposes a detection method based on two-stage image segmentation. First, the proposed VG-Stick-YOLOv11, built upon YOLOv11n-seg with VanillaNetBlock and GhostConv modules, achieved the best mIoU of 95.80% while markedly reducing parameters and computation, offering an efficient solution for real-time contour extraction and browning-stage classification of shiitake sticks. Based on the features extracted at this stage, machine learning techniques were employed for rapid, semi-automatic annotation of browning regions, from which a segmentation dataset was constructed. Finally, the ResNet-Stick-UNet (RS-UNet) model was designed for browning-region segmentation and area-ratio calculation. Its encoder adopts ResNet50 with multi-branch inputs and stacked small kernels to enhance feature extraction, while its decoder incorporates a hybrid structure of grouped and depthwise separable convolutions for efficient channel fusion and detail preservation. In addition, a spatial attention mechanism was embedded in the skip connections to emphasize large-scale browning regions. Experimental results showed that RS-UNet achieved a segmentation accuracy of 94.35% and an IoU of 88.56%, outperforming comparison models such as DeepLabv3+ and Swin-UNet while reducing parameters by 36.31% compared to ResNet50-U-Net. This collaborative model provides a quantitative basis for maturity detection of shiitake cultivation sticks during the browning stage, promoting the intelligent and standardized development of shiitake substrate cultivation.
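The abstract names GhostConv as one of the lightweight building blocks of VG-Stick-YOLOv11. The authors' code is not shown here; the following is a minimal PyTorch sketch of the GhostNet-style module that the name conventionally refers to, in which a standard convolution produces half the output channels and a cheap depthwise convolution synthesizes the rest (channel split, kernel sizes, and activation are assumptions, not details from the paper):

```python
import torch
import torch.nn as nn

class GhostConv(nn.Module):
    """GhostNet-style convolution: a primary conv generates half the
    output channels; a cheap depthwise conv generates the other half."""
    def __init__(self, in_ch: int, out_ch: int, k: int = 1, s: int = 1):
        super().__init__()
        mid = out_ch // 2
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, mid, k, s, k // 2, bias=False),
            nn.BatchNorm2d(mid),
            nn.SiLU(),
        )
        # "Cheap operation": a depthwise 5x5 conv over the primary features.
        self.cheap = nn.Sequential(
            nn.Conv2d(mid, mid, 5, 1, 2, groups=mid, bias=False),
            nn.BatchNorm2d(mid),
            nn.SiLU(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.primary(x)
        return torch.cat([y, self.cheap(y)], dim=1)
```

Because the depthwise branch reuses the primary features, this halves the dense convolution work relative to a plain conv layer, which is consistent with the abstract's claim of reduced parameters and computation.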
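The spatial attention embedded in RS-UNet's skip connections is described only at a high level. One common formulation, sketched below purely as an illustration and not as the authors' design, is CBAM-style spatial attention: channel-wise average and max pooling followed by a wide convolution produce a per-pixel gate that reweights the encoder features before they are concatenated into the decoder:

```python
import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    """CBAM-style spatial attention: a 2-channel map (channel-wise mean
    and max) passes through a 7x7 conv to yield a per-pixel gate in [0, 1]."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size,
                              padding=kernel_size // 2, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=1, keepdim=True)        # (B, 1, H, W)
        mx, _ = x.max(dim=1, keepdim=True)       # (B, 1, H, W)
        gate = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * gate                          # reweighted skip features
```

In a U-Net-style skip connection this would be applied to the encoder feature map before concatenation, e.g. `torch.cat([upsampled_decoder, attn(encoder_feat)], dim=1)`, letting large coherent browning regions dominate the gate.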
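Once the stage-1 stick contour and the stage-2 browning segmentation are available, the browning area ratio reduces to a pixel count. A minimal sketch, assuming both masks are binary NumPy arrays of the same shape (the function name and interface are illustrative):

```python
import numpy as np

def browning_area_ratio(stick_mask: np.ndarray,
                        browning_mask: np.ndarray) -> float:
    """Fraction of the stick surface classified as browned.

    stick_mask    : binary mask of the cultivation stick (stage-1 contour).
    browning_mask : binary mask of browned pixels (stage-2 segmentation).
    """
    stick = stick_mask.astype(bool)
    # Count browning only inside the stick contour.
    browned = browning_mask.astype(bool) & stick
    stick_px = stick.sum()
    return float(browned.sum() / stick_px) if stick_px else 0.0
```

This ratio is the quantitative maturity indicator the abstract refers to; tracking it over time would support the forecasting of mushroom emergence described above.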

Keywords: Shiitake cultivation sticks, browning detection, image segmentation, YOLOv11, ResNet

Received: 31 Jul 2025; Accepted: 20 Oct 2025.

Copyright: © 2025 Liu, Zhao, Zheng, Song, Zhang, Liu, Shan, Xu, Li, Dong, Zhao, Wang and Wang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence:
Yajun Wang, wangyj@dlpu.edu.cn
Mingfei Wang, wangmf@nercita.org.cn

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.