Affiliation:
1. Center for Risk Analysis, Reliability Engineering and Environmental Modeling (CEERMA), Universidade Federal de Pernambuco, Recife, Brazil
2. Department of Production Engineering, Universidade Federal de Pernambuco, Recife, Brazil
Abstract
Systems subjected to continuous operation are exposed to different failure mechanisms, such as fatigue, corrosion, and temperature-related defects, which makes inspecting and monitoring their health paramount to preventing severe damage. However, visual inspection strongly depends on an inspector's experience, so its accuracy is influenced by the inspector's physical and cognitive state. Civil infrastructure, in particular, must be inspected periodically, which is costly, time-consuming, labor-intensive, hazardous, and prone to bias. Advances in Computer Vision (CV) techniques provide the means to develop automated, accurate, non-contact, and non-destructive inspection methods. Hence, this paper compares two different approaches to detecting cracks in images automatically. The first is based on a traditional CV technique, using texture analysis and machine learning methods (TA + ML-based); the second is based on deep learning (DL), using Convolutional Neural Network (CNN) models. We analyze both approaches, comparing several ML models and CNN architectures on a real crack database at six distinct dataset sizes. The results show that for small datasets, for example up to 100 images, the DL-based approach achieved a balanced accuracy (BA) of ∼74%, while the TA + ML-based approach obtained a BA > 95%. For larger datasets, both approaches achieve comparable performance. For images classified as containing crack(s), we also evaluate three metrics that measure crack severity from a segmented version of the original image, providing an additional signal to trigger the appropriate maintenance response.
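Balanced accuracy, the metric the abstract uses to compare the two approaches, is the mean of per-class recall, which avoids inflated scores on imbalanced crack/no-crack datasets. A minimal sketch (illustrative names; not the paper's code):

```python
def balanced_accuracy(y_true, y_pred):
    """Mean of recall over the classes present in y_true."""
    classes = sorted(set(y_true))
    recalls = []
    for c in classes:
        # true positives for class c / number of samples of class c
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        support = sum(1 for t in y_true if t == c)
        recalls.append(tp / support)
    return sum(recalls) / len(classes)

# Example: 8 no-crack (0) and 2 crack (1) images; one crack is missed.
y_true = [0] * 8 + [1] * 2
y_pred = [0] * 8 + [1, 0]
print(balanced_accuracy(y_true, y_pred))  # 0.75 = (8/8 + 1/2) / 2
```

Note that plain accuracy on this example would be 0.9, masking the fact that half the cracks were missed; this is why BA is preferred for safety-critical, imbalanced classification.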
Subject
Safety, Risk, Reliability and Quality
Cited by
4 articles.