
About this Research Topic

Manuscript Submission Deadline 29 February 2024

In recent years, 3D perception in Bird's Eye View (BEV) has emerged as a key technology for intelligent driving and robotics. BEV-based algorithms are now deployed to support perception for complex tasks such as 3D vehicle and pedestrian recognition, semantic segmentation, behavior prediction, and motion planning. The rationale behind BEV-based perception is that BEV representations of the world, and of traffic scenes in particular, carry rich semantic information, precise localization, and absolute scale, which downstream real-world applications can consume directly. Moreover, BEV offers a physically interpretable way to fuse information from different views, modalities, time steps, and agents. In addition, other widely used sensors such as LiDAR and Radar capture data in 3D space, which can be readily transformed to BEV and fused with camera data.
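
As a concrete illustration of that last point, the minimal sketch below rasterizes a LiDAR point cloud into a 2D BEV occupancy grid. The function name, grid extents, and cell size are illustrative assumptions, not part of any specific dataset, benchmark, or toolkit.

    import numpy as np

    def lidar_to_bev_occupancy(points, x_range=(0.0, 50.0), y_range=(-25.0, 25.0), cell=0.25):
        """Hypothetical helper: rasterize an (N, 3) LiDAR point cloud into a BEV occupancy grid.

        points  : array of (x, y, z) coordinates in the ego/vehicle frame.
        x_range : forward extent of the grid in metres (assumed values).
        y_range : lateral extent of the grid in metres (assumed values).
        cell    : edge length of one BEV cell in metres (assumed value).
        """
        xs, ys = points[:, 0], points[:, 1]

        # Keep only points that fall inside the chosen BEV window.
        mask = (xs >= x_range[0]) & (xs < x_range[1]) & (ys >= y_range[0]) & (ys < y_range[1])
        xs, ys = xs[mask], ys[mask]

        # Convert metric coordinates to integer grid indices.
        cols = ((xs - x_range[0]) / cell).astype(int)
        rows = ((ys - y_range[0]) / cell).astype(int)

        h = int((y_range[1] - y_range[0]) / cell)
        w = int((x_range[1] - x_range[0]) / cell)
        grid = np.zeros((h, w), dtype=np.uint8)
        grid[rows, cols] = 1  # mark cells containing at least one LiDAR return
        return grid

    # Example with synthetic points in front of the vehicle.
    pts = np.random.uniform([0, -25, -2], [50, 25, 2], size=(10000, 3))
    bev = lidar_to_bev_occupancy(pts)
    print(bev.shape)  # (200, 200)

Camera features projected onto the same grid can then be fused cell by cell with such a LiDAR raster, which is the sense in which BEV serves as a common fusion space.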

In this Research Topic, we aim to thoroughly investigate recent advances that address the efficiency and robustness requirements of BEV-based perception. In particular, we seek novel solutions for perspective-view transformation, data collection, annotation and compression, efficient fusion, semantic perception, visualization, and interaction. For instance, rather than generic image- or LiDAR-based 3D detection, we expect solutions that achieve greater efficiency and robustness by detecting directly in BEV space. We welcome submissions from both academia and industry that address the fundamental challenges and opportunities of BEV-based perception.

Topics of interest include, but are not limited to:

• Occupancy networks
• Paradigm of perspective view to bird's eye view (PV2BEV)
• Sensor installation, calibration, fusion and transformation for BEV
• BEV data collection, annotation and compression
• 3D detection in BEV space
• Map, lane and semantic segmentation in BEV space
• Hybrid semantics and networks in BEV space
• Multi-task learning under BEV
• Metrics and assessments of BEV-related algorithms
• Visualization and human-computer interaction under BEV

Keywords: Bird's Eye View, Intelligent Driving, Artificial Intelligence, 3D Perception, Machine Vision


Important Note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.


