Doctoral Dissertation Abstract



No. 129564
Author (kanji): パイノンガル ビシュワンバラン ウニキリシナン
Author (romanized): PAINUMGAL VISWAMBHARAN UNNIKRISHNAN
Author (kana): パイノンガル ビシュワンバラン ウニキリシナン
Title (Japanese): 利用中の水パイプラインを非接触内部検査する自律型水中ロボットの研究開発
Title (English): Research and Development of Autonomous Underwater Vehicle for Non-contact Internal Inspection of In-service Water Pipelines
Report number: 129564
Report number: 甲29564
Date of degree conferral: 2013.03.25
Degree type: Course doctorate
Degree: Doctor of Environmental Studies
Diploma number: 博創域第909号
Graduate school: Graduate School of Frontier Sciences
Department: Department of Ocean Technology, Policy, and Environment
Thesis examination committee:
 Chief examiner: Tamaki Ura, Professor, The University of Tokyo
 Hideyuki Suzuki, Professor, The University of Tokyo
 Shinichi Takagawa, Project Professor, The University of Tokyo
 Toshihiro Maki, Associate Professor, The University of Tokyo
 Hayato Kondo, Associate Professor, Tokyo University of Marine Science and Technology
Abstract

Introduction

Well-functioning water networks are essential to the sustainability of a community. Large transmission and distribution water mains are often the most sensitive components of these networks, since their failure can be catastrophic; hence periodic inspection and maintenance of the pipelines is required. Because pipelines are enclosed and difficult to access, robots are ideal candidates for their inspection. During this PhD research, a new pipeline-inspection AUV was proposed that can visually inspect the interior of water pipelines without a tether cable and without contact with the pipe walls. The PICTAN-2 pipe inspection robot was developed as a prototype vehicle to demonstrate the proposed non-contact in-service pipe inspection technique.

A conical-laser-based navigation system was developed that uses a feature-matching technique to estimate the position of a vehicle inside a pipe in four degrees of freedom [2]. The first version of the vehicle, PICTAN-1, developed by Yamada, used this conical laser vision-based navigation technique.

PICTAN-2 Hardware

The developed AUV has four tunnel thrusters for lateral positioning inside the pipe. Tunnel thrusters are compact and do not protrude from the body, preventing damage during launch and recovery through the pipeline's small ports. Forward motion is achieved entirely by the water flowing in the pipe. The vehicle looks identical from the front and the rear, with identical fisheye cameras, LED lights and conical lasers at either end. Lithium-ion batteries provide power, and two embedded computers perform real-time acquisition and processing of images. Compact and efficient image processing and control algorithms on the embedded system enable autonomous control of the thrusters and positioning of the vehicle inside the pipe.

Several enhancements were made to the existing vehicle to achieve robust tether-less control and accurate positioning. The thruster design was carefully studied and several problems were identified. Magnetic couplings were replaced with lighter non-magnetic ceramic ball bearings, reducing the load on the motor. A new, more efficient propeller was designed with the OpenProp propeller design software and fabricated on a 3D prototyping machine. The enhanced thrusters generated 0.24 N of thrust, 20% higher than the design value, and the reduced friction and inertia in the mechanical system raised the rotational speed above the design value of 2400 rpm.

An accurate and visible cone laser that produces a single thin laser sheet at the desired fan angle, with no scattered rays, is essential for the proposed positioning and control system. In the new design, a point laser shines directly onto the cone mirror to reduce mechanical alignment errors. The outer cone mirror was precision-machined to a surface RMS roughness below 16 nm.

The vehicle uses two low-power Marvell XScale PXA270 processor boards as the main CPUs. A compact, high-speed computer vision program, PICOV (Pipe Inspection Computer Vision), was developed; it acquires images, processes them, and performs real-time control of the vehicle. A simplified flow chart of the program is given in Figure 4.
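
The internal structure of PICOV is not given in this abstract beyond the flow chart in Figure 4. The following Python sketch only illustrates that acquire-process-control cycle; every name in it (acquire_front, extract_ring, estimate_pose, set_thrusters, period) is an assumed placeholder supplied by the caller, not part of the actual PICOV code.

```python
# Minimal sketch of the PICOV acquire-process-control cycle shown in Figure 4.
# All callables below are assumed placeholders passed in by the caller.

import time

def picov_loop(acquire_front, acquire_rear, extract_ring, estimate_pose,
               set_thrusters, period=0.1, running=lambda: True):
    while running():
        t0 = time.time()

        # 1. Acquire one frame from each fisheye camera.
        img_f, img_r = acquire_front(), acquire_rear()

        # 2. Extract the cone-laser ring from each image.
        ring_f, ring_r = extract_ring(img_f), extract_ring(img_r)

        # 3. Estimate sway, heave, pitch and yaw from the two rings.
        sway, heave, pitch, yaw = estimate_pose(ring_f, ring_r)

        # 4. Drive the four tunnel thrusters towards the pipe centre
        #    (set point: sway = heave = 0, body parallel to the pipe axis).
        set_thrusters(sway, heave, pitch, yaw)

        # 5. Hold an approximately constant cycle time.
        time.sleep(max(0.0, period - (time.time() - t0)))
```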

Photogrammetric Position Estimation Algorithm

Position information is essential for generating photo mosaics, 3D pipe wire frames, ovality measurements, etc. A novel photogrammetric position estimation algorithm was developed to calculate the position of the vehicle in the pipe in four degrees of freedom, namely sway, heave, pitch and yaw. Position is calculated using the camera model and trigonometric formulae.

A cone laser of known fan angle is placed behind a fisheye camera at a fixed baseline distance, with the primary axis of the laser and the projection axis of the camera made collinear. The slant distance of the vehicle from the pipe wall can then be calculated by triangulation. Once the slant distance is known, pitch and yaw are calculated by comparing the relative position shift between the front and rear ends in their respective planes: the yaw angle is used for the x plane and the pitch angle for the y plane. The actual radial distance is then computed by correcting the slant distance for this angular offset. The position of the centre of the vehicle is finally obtained from a straight-line equation in parametric form.
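
The formulae themselves are not reproduced in this abstract. As a minimal sketch of the triangulation, assume the laser apex lies a baseline b behind the camera on their common axis, the cone half-angle measured from that axis is α, and the fisheye camera model converts the pixel radius of a lit wall point into its viewing angle θ; the symbols b, α, θ, z, r and s below are introduced purely for illustration.

```latex
% Illustrative cone-laser triangulation; b, alpha, theta, z, r, s are assumed symbols.
% Laser apex at the origin, camera at z = b, both on the common axis.
\[
  z = \frac{r}{\tan\alpha}, \qquad
  \tan\theta = \frac{r}{z - b}
  \;\Longrightarrow\;
  r = \frac{b\,\tan\alpha\,\tan\theta}{\tan\theta - \tan\alpha}, \qquad
  s = \frac{r}{\sin\theta}
\]
% r : radial distance of the lit wall point from the axis
% z : axial distance of that point from the laser apex
% s : slant distance from the camera to the wall point
```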

A linear motion calibration bench with a rotary turntable was used to obtain ground-truth image data. Heave displacements were applied to the vehicle using a lead screw. Positions calculated from these images with the estimation algorithm were compared against the ground truth to assess the position estimation accuracy. The accuracy was later improved by pre-processing the input images with morphological operations and by interpolating sections of the image where the laser is not visible because of distance from the camera, the shadow of the mirror frame, etc.
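
The specific morphological operations and interpolation scheme are not named in the abstract. The sketch below, assuming OpenCV and NumPy, shows one plausible treatment: threshold and close the laser image to suppress gaps and speckle, then linearly interpolate the ring radius over the azimuth bins where no laser pixels were found.

```python
# Hypothetical cleanup of a cone-laser image before position estimation.
# The actual morphology and interpolation used in the thesis are not given
# in the abstract; this only illustrates the general idea.

import cv2
import numpy as np

def clean_laser_ring(gray, centre, n_bins=360):
    """Return the ring radius (pixels) per azimuth bin, with gaps interpolated."""
    # Threshold the bright laser line and close small gaps / remove speckle.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # Mean radius of laser pixels in each azimuth bin around the image centre.
    ys, xs = np.nonzero(mask)
    dx, dy = xs - centre[0], ys - centre[1]
    radius = np.hypot(dx, dy)
    azimuth = (np.degrees(np.arctan2(dy, dx)) + 360.0) % 360.0
    bins = np.floor(azimuth * n_bins / 360.0).astype(int)

    ring = np.full(n_bins, np.nan)
    for b in range(n_bins):
        sel = bins == b
        if np.any(sel):
            ring[b] = radius[sel].mean()

    # Interpolate bins where the laser was not visible (shadow, range, etc.).
    valid = ~np.isnan(ring)
    ring[~valid] = np.interp(np.flatnonzero(~valid), np.flatnonzero(valid),
                             ring[valid], period=n_bins)
    return ring
```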

Real-time Tether-less Positioning Tests

Real-time positioning experiments were conducted in the freshwater pool test facility at the Ura Laboratory. A 3 m long, 0.76 m diameter polyethylene pipe with a 0.4 m long window in the middle was fully submerged in water. The vehicle was made slightly buoyant and placed into the pipe through the window. Once control was switched on, the vehicle drove its thrusters to reach the set point, the centre of the pipe. The cone laser images collected by the front and rear cameras were used with the position estimation algorithm to estimate the path the vehicle followed during the control action. The control system successfully moved the vehicle towards the set point, taking about 60 seconds to align with the centre of the pipe.
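
The abstract states only that the thrusters were driven towards the set point at the pipe centre and does not give the control law. The sketch below therefore assumes a simple proportional mapping from the estimated sway, heave, pitch and yaw to the four tunnel thrusters; the gains and thruster layout are illustrative, not those of PICTAN-2.

```python
# Hypothetical set-point controller for the four tunnel thrusters.
# The control law actually used by PICTAN-2 is not stated in the abstract;
# the gains kp, ka and the thruster layout below are assumed.

def clip(u, lo=-1.0, hi=1.0):
    return max(lo, min(hi, u))

def thruster_commands(sway, heave, pitch, yaw, kp=2.0, ka=0.5):
    """Map the 4-DOF estimate (m, rad) to four thruster commands in [-1, 1].

    Set point: pipe centre (sway = heave = 0) with the body parallel to the
    pipe axis (pitch = yaw = 0). Horizontal thrusters correct sway and yaw,
    vertical thrusters correct heave and pitch (assumed layout).
    """
    u_sway, u_heave = -kp * sway, -kp * heave
    front_h = clip(u_sway - ka * yaw)
    rear_h = clip(u_sway + ka * yaw)
    front_v = clip(u_heave - ka * pitch)
    rear_v = clip(u_heave + ka * pitch)
    return front_h, rear_h, front_v, rear_v
```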

Real-time Positioning Test during Motion

A simple experiment was conducted to test the control capability during motion. The vehicle was made to move slowly from one end of a fully submerged 4 m pipe to the other by pulling it with a 0.5 mm thread at a speed of 0.03 m/s using a speed-controllable motor and pulley.

The vehicle controlled itself to reach the centre of the pipe in less than 50 s and then dynamically held its position within +/-20 mm of the centre until the end of the pipe.

Real-time Positioning Test with Flow

The water channel experiment facility of the Public Works Research Institute (Dobokukenkyujo) in Tsukuba was chosen for the experiments with flow. The water channel is 1 m wide, 1.3 m high and 15 m long. Flow speeds of 0.05 m/s to 0.3 m/s were generated. The vehicle was placed at one end of the pipe, and the flow provided its forward motion. During the motion, the vehicle's control system steered it to the centre of the pipe. As in the still-water experiments, the vehicle position was estimated to verify the path of the vehicle.

Imaging Tests and Test in a Bent Pipeline

An experiment was conducted to demonstrate the multitasking capability of the vehicle's camera, and the images captured were used to generate a photomosaic. Experiments were also conducted successfully in a 90-degree bent section of pipe. The vertical 90-degree pitching capability of the vehicle was verified to show that the vehicle can operate in vertical bends or vertical straight pipes.
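
The mosaicking procedure is not described in this abstract (visual mapping of pipe walls is treated in reference [4]). Purely as an illustration, the sketch below unwraps each fisheye frame into a polar scan line with OpenCV and stacks the lines by the frame's estimated axial position; the image centre, ring radius and pixel scale are assumed values.

```python
# Illustrative photomosaic construction, not the method of the thesis or of
# reference [4]: one circumferential scan line per frame, placed along the
# pipe axis by the frame's estimated position.

import cv2
import numpy as np

def build_mosaic(frames, positions_m, centre=(320.0, 240.0), r_ring=180,
                 max_radius=220, px_per_m=1000):
    """frames: list of BGR images; positions_m: axial position of each frame (m)."""
    z0 = min(positions_m)
    n_rows = int(px_per_m * (max(positions_m) - z0)) + 1
    mosaic = np.zeros((n_rows, 720, 3), np.uint8)  # rows: pipe axis, cols: azimuth

    for img, z in zip(frames, positions_m):
        # Unwrap the view around the optical axis: rows sweep azimuth
        # (720 bins), columns sweep radius (0..max_radius pixels).
        polar = cv2.warpPolar(img, (max_radius, 720), centre, max_radius,
                              cv2.WARP_POLAR_LINEAR)
        # The scan line at the assumed wall radius becomes one mosaic row.
        mosaic[int(px_per_m * (z - z0))] = polar[:, r_ring]
    return mosaic
```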

Conclusion

A novel technique to inspect an in-service water pipeline without contacting the pipe wall was developed through this research. A compact and efficient real-time image processing program named PICOV was developed. A new photogrammetric position estimation algorithm was developed for estimating the vehicle position in the pipe in four degrees of freedom through offline data analysis.

Experiments were conducted in still and flowing water to demonstrate the tether-less, non-contact positioning capability of the vehicle and to test the position estimation algorithm.

[1] Se-gon Roh and Hyouk Ryeol Choi, "Differential-drive in-pipe robot for moving inside urban gas pipelines," IEEE Transactions on Robotics, Vol. 21, No. 1, pp. 1-17, Feb. 2005.
[2] Unnikrishnan P. V., Blair Thornton, Tamaki Ura and Yoshiaki Nose, "A Conical Laser Light-Sectioning Method for Navigation of Autonomous Underwater Vehicles for Internal Inspection of Pipelines," OCEANS '09 IEEE Bremen, Germany, May 2009.
[3] Takuyoshi Yamada, Blair Thornton, Unnikrishnan P. V., Adrian Bodenmann and Tamaki Ura, "Development of a small untethered underwater robot for internal inspection of flowing pipes," Workshop for Asian and Pacific Universities' Underwater Roboticians, Tokyo, Japan, March 2010.
[4] Adrian Bodenmann, Blair Thornton, Tamaki Ura and Unnikrishnan V. Painumgal, "Visual Mapping of Internal Pipe Walls using Sparse Features for Application on board Autonomous Underwater Vehicles," OCEANS '09 IEEE Bremen, Germany, May 2009.
[5] Specification of the Mobisense MBS270 XScale-based embedded system, http://www.mobisensesystems.com/pages_en/xscale_boards.html

Figure 1: Schematic of the PICTAN-2 AUV

Figure 2: Cone laser design

Figure 3: Handmade array of 162 LEDs

Figure 4: Steps involved in image acquisition and control of the vehicle

Figure 5: Algorithm

Figure 6: Schematic of the motion experiment

Figure 7: Position estimate during forward motion

Figure 8: Experimental setup at the Dobokukenkyujo water channel

Summary of the Dissertation Review

This dissertation consists of ten chapters. Chapter 1 describes the current state of pipeline inspection and its problems, and presents the structure of the dissertation. Chapter 2 discusses the background of inspection by autonomous underwater vehicles and the deployment of the developed robot. A new robot concept is presented that makes no contact with the pipe interior, has no protrusions, and enables long-distance inspection. For longitudinal travel the vehicle carries no propulsion thruster and instead drifts with the flow inside the pipe, while lateral motion is achieved by four newly developed tunnel thrusters that control the pitch angle and lateral position, yielding a compact and practical robot mechanism. In addition, a method is proposed for relative position measurement inside the pipe using two conical laser devices mounted at the front and rear. Chapter 3 identifies the problems of the tunnel thrusters and develops a compact tunnel thruster, including the propeller geometry and drive system. Chapter 4 develops a low-power laser observation system: a conical laser sheet is projected from within the robot and the light reflected from the inner pipe wall is measured. Compact lighting and camera units for imaging the pipe interior are also developed and mounted on the front and rear of the robot, providing the in-pipe images that robot users require; because pipe joints can be observed in these images, the longitudinal position of the robot can also be measured. Chapter 5 develops the compact electronics that control these devices and verifies their performance through implementation tests in air. Chapter 6 develops the image processing system needed to measure position from the reflected laser images and to build a control system that returns the robot to the pipe axis. Chapter 7 assembles the system that measures the relative position of the robot in the pipe from the images and examines its accuracy; the result is a completed small underwater robot 108 mm in diameter and 675 mm in length. Chapter 8 reports relative position control tests in a pipe submerged in a pool, with good results. Further, with the cooperation of the Public Works Research Institute, the pipe was installed in a water channel with flowing water, and control tests in a pipe with internal flow gave good results in advancing the robot along the pipe axis; these integrated tests demonstrate that the robot functions planned at the outset were realized. Chapter 9 experimentally verifies behaviour not in the straight pipes treated so far but in a bent pipe, and develops an algorithm that can control the robot onto the axis of a bent pipe as well. Chapter 10 summarizes the results and concludes that the developed robot can be used effectively for inspecting pipes with flowing water.

Through this dissertation, new observation equipment for inspecting the interior of water pipelines such as water mains has been developed; it not only adds new knowledge to underwater robotics but also indicates a new direction for in-pipe observation technology.

Accordingly, the committee finds that the degree of Doctor of Environmental Studies may be conferred.
