ORIGINAL RESEARCH article
Front. Plant Sci.
Sec. Technical Advances in Plant Science
Lightweight Plant Phenotypic Feature Extraction via Transferable Attention Head Pruning in Vision Transformers
Provisionally accepted
Guangxi Science and Technology Normal University, Laibin, China
We propose a lightweight Multi-Head Self-Attention (MHSA) mechanism for plant phenotypic feature extraction that integrates cross-species transfer learning with dynamic head pruning to improve efficiency without sacrificing accuracy. The primary challenge is to minimize redundant computation while preserving the model's capacity to generalize across plant species, a problem intensified by the high dimensionality of attention mechanisms in Vision Transformers. Our solution, the Transferable Attention Head Alignment (TAHA) framework, operates in three stages: pre-training on a source species, cross-species alignment via a Domain Alignment Loss (DAL), and head pruning based on a transferability score. The framework retains only the attention heads with the highest transferability, reducing model complexity while preserving the ability to distinguish phenotypic traits. The pruned MHSA module integrates seamlessly with standard Transformer backbones, enabling efficient deployment on edge devices. Experiments on real edge hardware (Raspberry Pi 4, NVIDIA Jetson Nano) and GPU platforms show that our approach matches the accuracy of full-head models while cutting computational cost by up to 40% (14.1 ms inference latency on Raspberry Pi 4, 519 M parameters). The method is especially relevant for scalable plant phenotyping, where computational capacity is often constrained yet cross-species generalization is essential. Moreover, the iterative alignment-and-pruning procedure permits incremental adaptation to novel species without full retraining, increasing practicality for real-world agricultural applications.
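The abstract names a Domain Alignment Loss but does not give its formulation. As a minimal sketch of what a cross-species alignment penalty could look like, the following uses a CORAL-style term (mean-feature distance plus scaled covariance distance); the function name and this particular formulation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def domain_alignment_loss(src_feats, tgt_feats):
    """Illustrative domain-alignment penalty between source- and
    target-species feature batches of shape (n_samples, dim):
    squared distance between feature means plus a scaled Frobenius
    distance between covariances (a CORAL-style proxy; the paper's
    exact DAL is not specified in the abstract)."""
    mean_term = np.sum((src_feats.mean(axis=0) - tgt_feats.mean(axis=0)) ** 2)
    cov_src = np.cov(src_feats, rowvar=False)
    cov_tgt = np.cov(tgt_feats, rowvar=False)
    d = src_feats.shape[1]
    cov_term = np.sum((cov_src - cov_tgt) ** 2) / (4.0 * d * d)
    return mean_term + cov_term

# Synthetic check: a lightly perturbed copy of the source batch should
# incur a much smaller penalty than a mean-shifted target batch.
rng = np.random.default_rng(1)
src = rng.normal(size=(200, 32))
aligned = src + rng.normal(scale=0.05, size=(200, 32))
shifted = rng.normal(loc=2.0, size=(200, 32))

loss_aligned = domain_alignment_loss(src, aligned)
loss_shifted = domain_alignment_loss(src, shifted)
```

Minimizing such a penalty during the alignment stage encourages source- and target-species feature distributions to overlap before any heads are scored for pruning.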
Supplementary experiments on phylogenetically distant species (Arabidopsis → pine) probe the framework's generalization limits, showing a 7.2% F1-score drop relative to close-species transfer (Arabidopsis → maize) and highlighting the need for trait-specific head adaptation in distant transfers. By combining transfer learning with attention-head optimization, the proposed method improves lightweight feature extraction and strikes a balanced trade-off between performance and efficiency.
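The transferability-score-based head selection described above can be sketched as follows. The scoring function here (cosine similarity between mean per-head activations on source and target species) and the keep ratio are illustrative assumptions — the abstract does not disclose the paper's actual score — but the top-k retention step mirrors the stated idea of keeping only the most transferable heads.

```python
import numpy as np

def head_transferability_scores(src_feats, tgt_feats):
    """Per-head transferability proxy: cosine similarity between mean
    head activations on source and target species. Inputs have shape
    (n_samples, n_heads, head_dim). (Illustrative assumption; the
    paper's exact score is not given in the abstract.)"""
    src_mean = src_feats.mean(axis=0)  # (n_heads, head_dim)
    tgt_mean = tgt_feats.mean(axis=0)
    num = (src_mean * tgt_mean).sum(axis=1)
    den = np.linalg.norm(src_mean, axis=1) * np.linalg.norm(tgt_mean, axis=1)
    return num / (den + 1e-8)

def prune_heads(scores, keep_ratio=0.6):
    """Return the (sorted) indices of the highest-scoring heads to keep."""
    k = max(1, int(round(keep_ratio * len(scores))))
    return np.sort(np.argsort(scores)[::-1][:k])

# Synthetic demo: 12 heads, of which the first 4 respond very differently
# on the target species and should be pruned first.
rng = np.random.default_rng(0)
n, heads, dim = 100, 12, 64
src = rng.normal(size=(n, heads, dim))
tgt = src + rng.normal(scale=0.1, size=(n, heads, dim))  # well-aligned heads
tgt[:, :4] = rng.normal(size=(n, 4, dim))                # 4 poorly-aligned heads

scores = head_transferability_scores(src, tgt)
kept = prune_heads(scores, keep_ratio=0.6)
```

With a 0.6 keep ratio, 7 of the 12 heads survive, and the poorly aligned heads are the first to go — the mechanism by which TAHA trades head count for efficiency while preserving cross-species discriminability.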
Keywords: cross-species transfer learning, Dynamic Head Pruning, Lightweight Multi-Head Self-Attention (MHSA), Plant Phenotypic Feature Extraction, Transferable Attention Head Alignment (TAHA)
Received: 03 Dec 2025; Accepted: 21 Jan 2026.
Copyright: © 2026 Xie, Zeng, Wang and Li. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Yongsheng Xie
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
