Overview
| Project No. | 843 |
| --- | --- |
| Contract No. | 693JK31950005CAAP |
| Research Award Recipient | University of Missouri (The Curators - Rolla), 300 W 12th Street, 202 Centennial Hall, Rolla, MO 65409-1330 |
| AOR | Joshua Johnson; Zhongquan Zhou |
| Researcher Contact Info | Dr. Genda Chen, Professor and Abbett Distinguished Chair in Civil Engineering; Director, INSPIRE University Transportation Center; 112 Engineering Research Laboratory, 500 W. 16th Street, Rolla, MO 65409-0810; Tel: (573) 341-4462; Fax: (573) 341-4729; Email: gchen@mst.edu |
| Project Status | Closed |
| Start Fiscal Year | 2019 (09/30/2019) |
| End Fiscal Year | 2022 (09/29/2022) |
| PHMSA Funds Budgeted | $250,000.00 |
Main Objective
The project has the following objectives:
- Develop and integrate a robust, stable, semi- or fully automated unmanned aerial system with multiple sensors for multi-purpose pipeline safety data collection,
- Explore and develop novel signal and image processing techniques for data analytics, damage assessment, and condition classification, and
- Evaluate and validate field performance of the integrated unmanned aerial system for pipeline safety inspection.
Public Abstract
The U.S. has approximately 5 million miles of pipelines for the distribution and gathering of liquids and gas. Between 1999 and 2018, 11,991 pipeline incidents occurred, resulting in over $8 billion in property losses and significant injuries. Currently, the most widely used inspection method in the pipeline industry is ground patrol. Such patrols are often difficult or even dangerous due to field conditions, potential risks, and natural hazards.

The goal of this study is to enable routine and maintenance inspection of pipelines for critical data collection, processing, and application toward condition and risk assessments for pipeline operators. This goal will be achieved by developing and validating an integrated unmanned aerial system of visible light, infrared, and hyperspectral cameras with signal processing and data analytics. Three objectives and corresponding tasks are presented in logical order: from hardware integration of the system with a dual-sensor infrared camera and a co-aligned VNIR-SWIR hyperspectral camera, through novel processing of still images taken from the cameras, to identification of regions of interest (still images) from video footage in pipeline applications.

One unique contribution of this study lies in the hardware integration of the infrared camera for both thermal and visible light imaging and the hyperspectral camera for broadband spectral imaging, both of which have become commercially available in recent years. The other unique contribution is in software innovations: a new adaptive wavelet transform of thermal and visible light images, a new spatial feature extraction of hyperspectral images that enables direct fusion of visible light, thermal, and hyperspectral images for increased probability of detection, and a physically interpretable deep-learning neural network for frame extraction from long hours of video footage.
The hardware and software will be integrated and demonstrated under the operating conditions of a pipeline through close collaboration with pipeline operators.
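The abstract's first software innovation is a wavelet transform that fuses thermal and visible light imagery. As an illustration only, not the project's adaptive transform, the sketch below fuses two co-registered grayscale images with a plain one-level Haar wavelet: approximation coefficients are averaged and each detail coefficient is taken from whichever image has the larger magnitude, a common baseline for wavelet-domain image fusion. The function names and the even-sized-image assumption are ours.

```python
import numpy as np

def haar2(x):
    """One-level 2D Haar transform of an even-sized grayscale image.
    Returns (LL, LH, HL, HH) subbands, each half the size of x."""
    lo = (x[0::2, :] + x[1::2, :]) / 2      # row-wise averages
    hi = (x[0::2, :] - x[1::2, :]) / 2      # row-wise differences
    ll = (lo[:, 0::2] + lo[:, 1::2]) / 2    # column-wise averages of lows
    lh = (lo[:, 0::2] - lo[:, 1::2]) / 2
    hl = (hi[:, 0::2] + hi[:, 1::2]) / 2
    hh = (hi[:, 0::2] - hi[:, 1::2]) / 2
    return ll, lh, hl, hh

def ihaar2(ll, lh, hl, hh):
    """Exact inverse of haar2 (perfect reconstruction)."""
    lo = np.empty((ll.shape[0], ll.shape[1] * 2))
    lo[:, 0::2], lo[:, 1::2] = ll + lh, ll - lh
    hi = np.empty_like(lo)
    hi[:, 0::2], hi[:, 1::2] = hl + hh, hl - hh
    x = np.empty((lo.shape[0] * 2, lo.shape[1]))
    x[0::2, :], x[1::2, :] = lo + hi, lo - hi
    return x

def fuse(visible, thermal):
    """Wavelet-domain fusion: average the approximation subbands and
    keep the larger-magnitude detail coefficient from either image."""
    a, b = haar2(visible), haar2(thermal)
    ll = (a[0] + b[0]) / 2
    details = [np.where(np.abs(u) >= np.abs(v), u, v)
               for u, v in zip(a[1:], b[1:])]
    return ihaar2(ll, *details)
```

Because the Haar pair reconstructs perfectly, fusing an image with itself returns it unchanged, which makes the pipeline easy to sanity-check; real multi-sensor fusion would first co-register the visible, thermal, and hyperspectral frames.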
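The project proposes a physically interpretable deep-learning network to extract frames of interest from long hours of video footage. As a deliberately simple stand-in that conveys only the idea of keyframe selection, the sketch below scores each frame by its mean absolute pixel difference from the previous frame and returns the indices of the highest-scoring frames; the function name, the top-k selection rule, and the grayscale-array input format are all assumptions, not the project's method.

```python
import numpy as np

def select_keyframes(frames, top_k=3):
    """Pick the top_k frames whose content changed most since the
    previous frame (mean absolute pixel difference). `frames` is a
    sequence of equal-shaped grayscale arrays; the selected indices
    are returned in temporal order."""
    scores = [0.0]  # the first frame has no predecessor
    for prev, cur in zip(frames, frames[1:]):
        scores.append(float(np.mean(np.abs(cur - prev))))
    ranked = sorted(range(len(frames)), key=scores.__getitem__, reverse=True)
    return sorted(ranked[:top_k])

# Example: a static scene that changes abruptly at frame 5
frames = [np.zeros((4, 4))] * 5 + [np.ones((4, 4))] * 5
print(select_keyframes(frames, top_k=1))  # → [5]
```

A learned network would replace the hand-crafted difference score with features tied to pipeline damage indicators, but the selection step, ranking frames and keeping the most informative ones, is the same.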
Relevant Files & Links
- Final Report

Other Files
- De-brief presentation: De-brief_presentation_-_693JK31950005CAAP_with_Genda_Chen.pdf