The 4th industrial revolution started in 2016 and refers to a new phase of industrial development. One of the most significant technological advances of the 4th industrial revolution is Augmented Reality (AR), which superimposes interactive virtual objects and images onto real environments. Because of its interactivity and see-through characteristics, AR is better suited to engineering applications than Virtual Reality (VR). Applying AR to civil infrastructure can reduce human error, improve efficiency, and lower costs. This article reviews AR applications in civil infrastructure, focusing on research studies from the latest five years (2016–2020) and their milestone developments. More than half of the AR research and implementation studies in the last five years have focused on the construction domain. Researchers deploy AR technologies in on-site construction to assist in discrepancy checking, collaborative communication, and safety inspection. AR is also combined with building information models (BIMs) to visualize detailed 3D structural information. In addition, AR has been studied for structural health monitoring (SHM), routine inspection and damage detection, energy performance assessment, crack inspection, excavation, and underground utility maintenance. Finally, AR has been applied to architectural design, city planning, and disaster prediction as an essential part of smart city services. This article discusses the challenges of AR implementation in civil infrastructure and recommends future applications.
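As a minimal illustration of the superimposition step mentioned above, the sketch below projects a BIM-derived 3D point into the pixel coordinates of a tracked camera using a pinhole model. This is not taken from any of the reviewed studies; the function name, intrinsic values, and pose are illustrative assumptions only.

```python
# Minimal sketch (not from the reviewed studies): how an AR pipeline can
# superimpose a BIM-derived 3D point onto the live camera image using a
# tracked camera pose and a pinhole camera model. All values are
# illustrative placeholders.
import numpy as np

def project_to_image(point_world, R_wc, t_wc, K):
    """Project a 3D world point into pixel coordinates.

    R_wc, t_wc: rotation/translation expressing the world frame in the
    camera frame (obtained from the AR device's tracking system).
    K: 3x3 pinhole intrinsic matrix of the camera.
    """
    p_cam = R_wc @ point_world + t_wc          # world -> camera frame
    if p_cam[2] <= 0:
        return None                            # point is behind the camera
    uvw = K @ p_cam                            # camera frame -> image plane
    return uvw[:2] / uvw[2]                    # pixel coordinates (u, v)

# Example: a column corner taken from a BIM model, roughly 3 m ahead.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
R_wc = np.eye(3)                               # identity pose for the sketch
t_wc = np.zeros(3)
bim_point = np.array([0.5, -0.2, 3.0])
print(project_to_image(bim_point, R_wc, t_wc, K))
```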
With the current deterioration rate of existing infrastructure, the importance of intervention and preservation efforts such as on-site visual inspections, non-destructive evaluation, structural health monitoring (SHM), and building pathology is on the rise. A critical aspect of these intervention and preservation methods is the visualization and accessibility of large, heterogeneous data sets. To enable diverse stakeholders to make informed choices, data and metadata for the built environment need to be integrated directly into a user's viewing environment. Addressing this challenge requires a human-machine interface that organizes these types of data and provides actionable information. The main aim of this work is to develop a preliminary framework for documenting and visualizing data about the built environment both on and off site using a combination of image-based documentation and augmented reality (AR). While this work illustrates preliminary annotation mechanisms such as drawing, its main contribution is the concept of projecting data between the image-based environment and the AR environment. The method was applied to test objects as well as case studies in SHM and building pathology.
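The following is a hedged sketch of the projection concept, assuming a pinhole camera model and known camera poses: an annotation drawn on a documentation image is back-projected into world coordinates and then re-projected into an AR user's current view. The function names, intrinsics, and depth value are illustrative assumptions, not the authors' implementation.

```python
# Sketch of projecting an annotation between the image-based environment
# and the AR environment. Assumes the documentation image's camera pose,
# the AR device's pose, and a depth estimate along the annotation's
# viewing ray are available; all values here are placeholders.
import numpy as np

def backproject(pixel_uv, depth, K, R_cw, t_cw):
    """Lift a pixel annotation into world coordinates.

    R_cw, t_cw: pose of the documentation camera in the world frame.
    depth: distance along the viewing ray (e.g., from a mesh intersection).
    """
    uv1 = np.array([pixel_uv[0], pixel_uv[1], 1.0])
    ray_cam = np.linalg.inv(K) @ uv1           # viewing ray in camera frame
    p_cam = ray_cam * depth                    # scale the ray to the hit depth
    return R_cw @ p_cam + t_cw                 # camera -> world frame

def reproject(point_world, K, R_wc, t_wc):
    """Project the 3D annotation into the AR device's current camera view."""
    p_cam = R_wc @ point_world + t_wc          # world -> camera frame
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]                    # pixel coordinates (u, v)

# Round trip: an annotation drawn at pixel (700, 400), 2.5 m from the camera.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)                  # identity poses for the sketch
anchor_world = backproject((700, 400), 2.5, K, R, t)
print(reproject(anchor_world, K, R.T, -R.T @ t))  # recovers ~ (700, 400)
```

In practice the two cameras would have different poses, so the re-projected pixel location differs between views while the underlying 3D anchor stays fixed, which is what allows the same annotation to appear consistently both in the image-based environment and on site in AR.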