ORIGINAL RESEARCH article
Front. Environ. Sci.
Sec. Big Data, AI, and the Environment
Volume 13 - 2025 | doi: 10.3389/fenvs.2025.1566224
Deep Learning-Based Object Detection for Environmental Monitoring Using Big Data
Provisionally accepted
Wenbo Lin, Gansu Industrial Vocational and Technical College, Tianshui, China
Recent advances in artificial intelligence have transformed the way complex environmental data are analyzed. However, high dimensionality, spatiotemporal variability, and heterogeneous data sources continue to pose major challenges. In this work, we introduce the Environmental Graph-Aware Neural Network (EGAN), a novel framework designed to model and analyze large-scale, multi-modal environmental datasets. EGAN constructs a spatiotemporal graph representation that integrates physical proximity, ecological similarity, and temporal dynamics, and applies graph convolutional encoders to learn expressive spatial features. These spatial features are fused with temporal representations using attention mechanisms, enabling the model to dynamically capture relevant patterns across modalities. The framework is further enhanced by domain-informed learning strategies that incorporate physics-based constraints, meta-learning for regional adaptation, and uncertainty-aware predictions. Extensive experiments on four benchmark datasets demonstrate that our approach achieves state-of-the-art performance in environmental object detection, segmentation, and scene understanding, making it a robust and interpretable tool for real-world environmental monitoring applications.
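To make the pipeline in the abstract concrete, the sketch below shows the two core operations it describes: a graph-convolutional encoding of per-site features over a proximity graph, followed by attention-weighted fusion across time steps. This is a minimal illustrative reconstruction in NumPy, not the authors' implementation; all shapes, the random proximity graph, and the query vector `q` are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy spatiotemporal input: N monitoring sites, T time steps, F features, H hidden units.
N, T, F, H = 5, 4, 3, 8
X = rng.normal(size=(T, N, F))                 # feature sequence, one graph per time step
A = (rng.random((N, N)) < 0.4).astype(float)   # hypothetical physical-proximity graph
A = np.maximum(A, A.T)                         # symmetrize
np.fill_diagonal(A, 1.0)                       # add self-loops

# Symmetric normalization: A_hat = D^{-1/2} A D^{-1/2}
d = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(d, d))

W = rng.normal(scale=0.1, size=(F, H))         # graph-conv weight matrix

def gcn_layer(x):
    """One graph-convolution step: aggregate neighbors, project, ReLU."""
    return np.maximum(A_hat @ x @ W, 0.0)

# Spatial encoding applied independently at each time step -> (T, N, H)
Z = np.stack([gcn_layer(X[t]) for t in range(T)])

# Attention fusion over time: score each step against a query, softmax, weighted sum.
q = rng.normal(scale=0.1, size=(H,))           # query vector (learned in practice)
scores = Z.mean(axis=1) @ q                    # (T,) one relevance score per time step
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()                           # attention weights sum to 1
fused = np.tensordot(alpha, Z, axes=1)         # (N, H) fused spatiotemporal embedding

print(fused.shape)  # (5, 8)
```

In a full system the graph-conv and attention parameters would be trained end to end, and the edge weights would combine proximity with the ecological-similarity and temporal terms the abstract mentions.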
Keywords: Environmental Monitoring, Spatiotemporal modeling, Graph neural networks, meta-learning, uncertainty quantification
Received: 24 Jan 2025; Accepted: 14 May 2025.
Copyright: © 2025 Lin. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
* Correspondence: Wenbo Lin, Gansu Industrial Vocational and Technical College, Tianshui, China
Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.