Laboratory Trial of a System for Vision Based Road Profile Analysis Detection using Stereo Vision

D. Lydon, C. O'Higgins, M. Lydon, J. Early, S. E. Taylor

Research output: Contribution to a journal › Conference article › peer-review

Abstract

Inspection of bridge structures is an important consideration for governmental bodies, as early detection of structural damage can facilitate early repairs, theoretically reducing maintenance costs for underfunded infrastructure management departments. This paper investigates the use of a computer vision-based system for road profile analysis, which can then be used as part of a larger damage detection solution. A brief introduction to traditional methods for this task is presented, followed by a state-of-the-art review of computer vision-based approaches. A stereo vision-based method developed at Queen's University Belfast is then detailed, together with a laboratory trial carried out to validate the developed concept and to verify the validation methods. A discussion of the obtained results, followed by recommendations for future work, concludes the paper.

Original language: English
Pages (from-to): 773-778
Number of pages: 6
Journal: International Conference on Structural Health Monitoring of Intelligent Infrastructure: Transferring Research into Practice, SHMII
Volume: 2021-June
Publication status: Published - 2021
Externally published: Yes
Event: 10th International Conference on Structural Health Monitoring of Intelligent Infrastructure, SHMII 2021 - Porto, Portugal
Duration: 30 Jun 2021 - 2 Jul 2021

UN SDGs

This output contributes to the following UN Sustainable Development Goals (SDGs)

  1. SDG 9 - Industry, Innovation, and Infrastructure

Keywords

  • Computer Vision
  • Damage Detection
  • Road Profile
