A Positioning Method of Monocular Visual Guided Light Array in UUV Docking System
-
Abstract: In automatic docking of unmanned undersea vehicles (UUVs), the accuracy and stability of close-range docking operations remain key technical challenges. Against the background of fold-carrying-pole UUV recovery, a positioning method based on a monocular visual guided light array is proposed. First, according to the mechanical characteristics of UUV docking and the visual docking requirements, a simple and feasible UUV visual docking device is designed. Second, the Otsu method is improved to solve the feature-extraction problem for the guided light array target under conditions of overall low brightness, low contrast, and blurred detail. Third, binary large object (BLOB) analysis is used to describe the features, and an improved multi-scale kernelized correlation filter (KCF) tracking algorithm is adopted, avoiding the tracking failures caused by pseudo light sources during visual tracking. Finally, a pool test combining ultra-short baseline (USBL) and manually calibrated data verifies that, for short-distance positioning, this method is more effective and stable than USBL positioning, with a positioning deviation within 0.1 m after error compensation.
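The abstract's second step builds on classical Otsu thresholding, which picks the gray level that maximizes between-class variance of the histogram. The sketch below is a minimal NumPy implementation of the standard method only, not the paper's improved variant; the `otsu_threshold` name and the toy "light source" image are illustrative assumptions.

```python
import numpy as np

def otsu_threshold(gray):
    """Classical Otsu: choose the 8-bit level that maximizes
    between-class variance of the grayscale histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                   # cumulative class-0 probability
    mu = np.cumsum(prob * np.arange(256))     # cumulative class-0 mass * mean
    mu_t = mu[-1]                             # global mean gray level
    denom = omega * (1.0 - omega)
    denom[denom == 0] = np.nan                # ignore degenerate one-class splits
    sigma_b = (mu_t * omega - mu) ** 2 / denom
    return int(np.nanargmax(sigma_b))

# Toy frame: dark water background with one bright guide-lamp patch.
img = np.full((32, 32), 20, dtype=np.uint8)
img[10:16, 10:16] = 200
t = otsu_threshold(img)
mask = img > t                                # binary segmentation of the lamp
```

An "improved" Otsu method for dark, low-contrast underwater frames would typically modify how this variance criterion or the histogram is formed; the standard criterion above is the common starting point.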
-
Key words:
- unmanned undersea vehicle (UUV)
- automatic docking
- visual tracking
- feature extraction
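The BLOB-analysis step described in the abstract can likewise be sketched: label connected components of the binary mask and compute per-blob descriptors (area, centroid), which is the kind of check that lets small pseudo light sources such as speckle or reflections be rejected. The `blob_centroids` helper, the `min_area` rule, and the toy mask below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from collections import deque

def blob_centroids(mask, min_area=4):
    """4-connected component labeling on a boolean mask.
    Returns (area, centroid_y, centroid_x) for each blob whose
    area reaches min_area; smaller blobs are treated as noise."""
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    blobs = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not seen[y, x]:
                q = deque([(y, x)])           # BFS over this component
                seen[y, x] = True
                pix = []
                while q:
                    cy, cx = q.popleft()
                    pix.append((cy, cx))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(pix) >= min_area:
                    ys, xs = zip(*pix)
                    blobs.append((len(pix), sum(ys) / len(pix), sum(xs) / len(pix)))
    return blobs

# Two guide lamps plus a one-pixel pseudo source (rejected by min_area).
m = np.zeros((16, 16), dtype=bool)
m[2:5, 2:5] = True        # 9-px blob, centroid (3, 3)
m[10:13, 8:11] = True     # 9-px blob, centroid (11, 9)
m[0, 15] = True           # 1-px speckle
found = blob_centroids(m)
```

The resulting centroids would be the image-plane measurements handed to the tracker; a KCF-style tracker then maintains the target association between frames, which is beyond the scope of this sketch.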