Civil Engineering & Construction: Faculty Publications
Document Type
Article
Publication Date
1-13-2026
Publication Title
Sensors
DOI
10.3390/s26020530
Abstract
The deterioration of civil infrastructure poses a significant threat to public safety, yet conventional manual inspections remain subjective, labor-intensive, and constrained by accessibility. To address these challenges, this paper presents a real-time robotic inspection system that integrates deep learning perception and autonomous navigation. The proposed framework employs a two-stage neural network: a U-Net for initial segmentation followed by a Pix2Pix conditional generative adversarial network (GAN) that utilizes adversarial residual learning to refine boundary accuracy and suppress false positives. When deployed on an Unmanned Ground Vehicle (UGV) equipped with an RGB-D camera and LiDAR, this framework enables simultaneous automated crack detection and collision-free autonomous navigation. Evaluated on the CrackSeg9k dataset, the two-stage model achieved a mean Intersection over Union (mIoU) of 73.9 ± 0.6% and an F1-score of 76.4 ± 0.3%. Beyond benchmark testing, the robotic system was further validated through simulation, laboratory experiments, and real-world campus hallway tests, successfully detecting micro-cracks as narrow as 0.3 mm. Collectively, these results demonstrate the system’s potential for robust, autonomous, and field-deployable infrastructure inspection.
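The abstract's two-stage design (a coarse segmentation network followed by a refinement network that predicts a residual correction to sharpen boundaries and suppress false positives) can be illustrated with a toy sketch. This is not the authors' implementation: the real system uses a U-Net and a Pix2Pix conditional GAN, which are replaced below by hand-written stand-in functions, and all names and numbers are illustrative assumptions only.

```python
# Toy illustration of refine-by-residual two-stage segmentation.
# Stage 1 (stand-in for the U-Net) outputs coarse per-pixel crack
# probabilities; stage 2 (stand-in for the Pix2Pix refinement GAN)
# outputs a signed residual that is added to the coarse map before
# clamping to [0, 1] and thresholding into a binary crack mask.

def coarse_segmentation(image):
    """Stand-in for stage 1: map pixel intensities to coarse probabilities."""
    return [[0.6 if px > 0.5 else 0.3 for px in row] for row in image]

def residual_refinement(probs):
    """Stand-in for stage 2: a signed correction pushing pixels apart."""
    return [[+0.3 if p >= 0.5 else -0.2 for p in row] for row in probs]

def refine(image, threshold=0.5):
    """Two-stage pipeline: coarse probs + residual -> clamped -> binary mask."""
    probs = coarse_segmentation(image)
    residual = residual_refinement(probs)
    refined = [[min(1.0, max(0.0, p + r)) for p, r in zip(pr, rr)]
               for pr, rr in zip(probs, residual)]
    return [[1 if p >= threshold else 0 for p in row] for row in refined]

mask = refine([[0.9, 0.2], [0.1, 0.8]])  # -> [[1, 0], [0, 1]]
```

The design point being sketched: the second stage does not re-segment from scratch but learns a correction on top of the first stage's output, so confident regions are reinforced while weak (likely false-positive) responses are suppressed.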
Recommended Citation
Ogun, Emmanuella, Yong Ann Voeurn, and Doyun Lee. 2026. "A Real-Time Mobile Robotic System for Crack Detection in Construction Using Two-Stage Deep Learning." Sensors 26 (2). MDPI. doi: 10.3390/s26020530
https://digitalcommons.georgiasouthern.edu/civil-eng-facpubs/123
Creative Commons License

This work is licensed under a Creative Commons Attribution 4.0 License.
Comments
Georgia Southern University faculty member Doyun Lee co-authored "A Real-Time Mobile Robotic System for Crack Detection in Construction Using Two-Stage Deep Learning."