Intelligent Deep-Sea Vision System

Overview

The vision system consists of hardware and software components that support intelligent autonomous functions on board a Remotely Operated Vehicle (ROV) operating in the deep sea. It is also well suited for use on an Autonomous Underwater Vehicle (AUV), but the context of its development is the EU project Effective Dexterous ROV Operations in Presence of Communications Latencies (DexROV) [9], which investigated “driver-assistance” functions and long-distance onshore control of ROVs operating in the deep sea, e.g., for Oil and Gas Production (OGP) [1][2][3].

The vision system supports multiple cameras that can be combined, e.g., two cameras for a classical stereo-vision set-up or three cameras for multiple stereo views with different baselines or additional vergence. The cameras are housed in individual pressure bottles, which are daisy-chained to a computer bottle [15]. The camera bottles have flat-pane windows made of high-end sapphire glass and can be calibrated in air for underwater usage based on the Pinax camera model [7][8]. The system provides substantial computation power [15] for on-board stereo, respectively multi-view, processing as well as for further computer vision methods that support autonomous intelligent functions, e.g., object recognition, navigation, mapping, inspection, and intervention [4][5][6][10][11][12][13][14][16].
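Once the refraction through the flat port is compensated with an in-air calibration in the spirit of the Pinax model [7][8], the images can be processed with standard pinhole stereo methods. The following Python sketch is a rough illustration only, not the actual on-board pipeline: it computes a depth map from an already rectified image pair, and the file names, baseline, focal length, and matcher parameters are placeholder assumptions.

    # Minimal sketch of stereo depth estimation on refraction-corrected images.
    # Assumption: the left/right images have already been undistorted and
    # rectified (e.g., with a Pinax-style in-air calibration applied to the
    # flat-port cameras), so standard pinhole stereo processing applies.
    # File names, baseline, and focal length below are placeholders.

    import cv2
    import numpy as np

    FOCAL_PX = 1400.0   # focal length in pixels (placeholder value)
    BASELINE_M = 0.20   # stereo baseline in meters (placeholder value)

    left = cv2.imread("left_rectified.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right_rectified.png", cv2.IMREAD_GRAYSCALE)

    # Semi-global block matching; parameters are illustrative only.
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,      # must be divisible by 16
        blockSize=7,
        P1=8 * 7 * 7,
        P2=32 * 7 * 7,
        uniquenessRatio=10,
    )

    # SGBM returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0

    # Depth from disparity: z = f * B / d (valid where d > 0).
    valid = disparity > 0
    depth = np.zeros_like(disparity)
    depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]

Note that with multiple cameras, the choice of baseline trades off range accuracy against the minimum observable distance, which is one motivation for the multi-baseline set-up mentioned above.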

The system is based on a formalized approach to component selection, i.e., an algorithmic analysis over commercial off-the-shelf lens and imager options given, in particular, accuracy and baseline constraints [15]. The validation and optimization of the pressure bottles for the cameras was done with the Finite Element Method (FEM) [15], complemented by validations of the stereo performance in air, robustness tests of the bottles in pressure tanks, and field trials of the complete system off the coast of Marseille on a commercial ROV [9].
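As a rough illustration of the kind of analysis behind such a model-based selection (the actual procedure and values are those reported in [15], not the ones below), the following Python sketch filters hypothetical commercial off-the-shelf imager/lens/baseline combinations using the standard stereo depth-error relation dz ≈ z² · Δd / (f_px · B), with Δd the disparity matching error in pixels and f_px the focal length in pixels.

    # Minimal sketch of a design-space check for model-based component
    # selection: for candidate imager/lens/baseline combinations, estimate
    # the stereo depth error at a target range and keep only those meeting
    # an accuracy constraint. All numbers are illustrative placeholders,
    # not the values used in the actual analysis [15].

    from itertools import product

    TARGET_RANGE_M = 3.0       # working distance of interest
    MAX_DEPTH_ERROR_M = 0.01   # required accuracy at that distance
    DISPARITY_ERROR_PX = 0.25  # assumed sub-pixel matching error

    imagers = [                # (name, pixel pitch in mm)
        ("imager_a", 0.00345),
        ("imager_b", 0.0055),
    ]
    lenses_mm = [4.0, 6.0, 8.0]      # focal lengths in mm
    baselines_m = [0.10, 0.20, 0.30]  # baselines in m

    def depth_error(f_mm, pitch_mm, baseline_m, z_m, d_err_px):
        """Standard stereo error model: dz ~ z^2 * d_err / (f_px * B)."""
        f_px = f_mm / pitch_mm
        return (z_m ** 2) * d_err_px / (f_px * baseline_m)

    for (name, pitch), f_mm, b in product(imagers, lenses_mm, baselines_m):
        err = depth_error(f_mm, pitch, b, TARGET_RANGE_M, DISPARITY_ERROR_PX)
        status = "ok" if err <= MAX_DEPTH_ERROR_M else "rejected"
        print(f"{name} f={f_mm}mm B={b}m -> depth error {err:.4f} m ({status})")
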

Publications

The following publications describe the hardware design of the vision system and its on-board software for underwater machine perception, as well as its use cases:

[1] J. Gancet, D. Urbina, P. Letier, M. Ilzkovitz, A. Birk, M. Pfingsthorn, P. Weiss, F. Gauch, B. Chemisky, S. Calinon, G. Antonelli, G. Casalino, G. Indiveri, A. Turetta, C. Walen, and L. Guilpain, “DexROV: Enabling Effective Dexterous ROV Operations in Presence of Communication Latency,” in IEEE Oceans, Genoa, Italy, 2015. https://doi.org/10.1109/OCEANS-Genova.2015.7271691 [Preprint PDF]

[2] J. Gancet, P. Weiss, G. Antonelli, A. Birk, S. Calinon, A. Turetta, C. Walen, M. Ilzkovitz, D. Urbina, P. Letier, F. Gauch, G. Indiveri, G. Casalino, M. Pfingsthorn, A. Tanwani, and L. Guilpain, “DexROV: Dexterous Undersea Inspection and Maintenance in Presence of Communication Latencies,” in IFAC Workshop on Navigation, Guidance and Control of Underwater Vehicles (NGCUV), 2015. https://doi.org/10.1016/j.ifacol.2015.06.036 [Preprint PDF]

[3] J. Gancet, P. Weiss, G. Antonelli, M. F. Pfingsthorn, S. Calinon, A. Turetta, C. Walen, D. Urbina, S. Govindaraj, C. A. Müller, X. Martinez, T. Fromm, B. Chemisky, G. Indiveri, G. Casalino, P. A. D. Lillo, E. Simetti, D. D. Palma, A. Birk, A. Tanwani, I. Havoutis, A. Caffaz, L. Guilpain, and P. Letier, “Dexterous Undersea Interventions with Far Distance Onshore Supervision: the DexROV Project,” in 10th IFAC Conference on Control Applications in Marine Systems (CAMS), Trondheim, Norway, 2016. https://doi.org/10.1016/j.ifacol.2016.10.439 [Preprint PDF]

[4] T. Fromm, C. A. Mueller, M. Pfingsthorn, A. Birk, and P. D. Lillo, “Efficient Continuous System Integration and Validation for Deep-Sea Robotics Applications,” in IEEE Oceans, Aberdeen, UK, 2017. https://doi.org/10.1109/OCEANSE.2017.8084663 [Preprint PDF]

[5] T. Luczynski and A. Birk, “Underwater Image Haze Removal with an Underwater-ready Dark Channel Prior,” in IEEE Oceans, Anchorage, USA, 2017. [Preprint PDF]

[6] T. Luczynski, T. Fromm, S. Govindaraj, C. A. Mueller, and A. Birk, “3D Grid Map Transmission for Underwater Mapping and Visualization under Bandwidth Constraints,” in IEEE Oceans, Anchorage, USA, 2017. [Preprint PDF]

[7] T. Luczynski, M. Pfingsthorn, and A. Birk, “The Pinax-Model for Accurate and Efficient Refraction Correction of Underwater Cameras in Flat-Pane Housings,” Ocean Engineering, vol. 133, pp. 9-22, 2017. https://doi.org/10.1016/j.oceaneng.2017.01.029 [Open Access]

[8] T. Luczynski, M. Pfingsthorn, and A. Birk, “Image Rectification with the Pinax Camera Model in Underwater Stereo Systems with Verged Cameras,” in IEEE Oceans, Anchorage, USA, 2017. [Preprint PDF]

[9] A. Birk, T. Doernbach, C. A. Mueller, T. Luczynski, A. G. Chavez, D. Köhntopp, A. Kupcsik, S. Calinon, A. K. Tanwani, G. Antonelli, P. d. Lillo, E. Simetti, G. Casalino, G. Indiveri, L. Ostuni, A. Turetta, A. Caffaz, P. Weiss, T. Gobert, B. Chemisky, J. Gancet, T. Siedel, S. Govindaraj, X. Martinez, and P. Letier, “Dexterous Underwater Manipulation from Onshore Locations: Streamlining Efficiencies for Remotely Operated Underwater Vehicles,” IEEE Robotics and Automation Magazine (RAM), vol. 25, pp. 24-33, 2018. https://doi.org/10.1109/MRA.2018.2869523 [Preprint PDF]

[10] T. Doernbach, A. G. Chavez, C. A. Mueller, and A. Birk, “High-Fidelity Deep-Sea Perception Using Simulation in the Loop,” in IFAC Conference on Control Applications in Marine Systems (CAMS), 2018. https://doi.org/10.1016/j.ifacol.2018.09.465 [Preprint PDF]

[11] C. A. Mueller, T. Doernbach, A. G. Chavez, D. Köhntopp, and A. Birk, “Robust Continuous System Integration for Critical Deep-Sea Robot Operations Using Knowledge-Enabled Simulation in the Loop,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018. https://doi.org/10.1109/IROS.2018.8594392 [Preprint PDF]

[12] A. G. Chavez, C. A. Mueller, T. Doernbach, and A. Birk, “Underwater Navigation using Visual Markers in the Context of Intervention Missions,” International Journal of Advanced Robotic Systems (IJARS), 2019. https://doi.org/10.1177/1729881419838967 [Open Access]

[13] A. G. Chavez, Q. Xu, C. A. Mueller, S. Schwertfeger, and A. Birk, “Towards Accurate Deep-Sea Localization in Structured Environments based on Perception Quality Cues,” in Proceedings of the 18th International Conference on Autonomous Agents and MultiAgent Systems (AAMAS), Montreal QC, Canada, 2019, pp. 1988-1990. https://doi.org/10.5555/3306127.3331986 [Preprint PDF]

[14] A. G. Chavez, Q. Xu, C. A. Mueller, S. Schwertfeger, and A. Birk, “Adaptive Navigation Scheme for Optimal Deep-Sea Localization Using Multimodal Perception Cues,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 2019. https://doi.org/10.1109/IROS40897.2019.8967888 [Preprint PDF]

[15] T. Luczynski, P. Luczynski, L. Pehle, M. Wirsum, and A. Birk, “Model based design of a stereo vision system for intelligent deep-sea operations,” Measurement, vol. 144, pp. 298-310, 2019. https://doi.org/10.1016/j.measurement.2019.05.004 [Preprint PDF]

[16] C. A. Mueller, A. Gomez Chavez, T. Doernbach, D. Köhntopp, and A. Birk, “Continuous system integration and validation for underwater perception in offshore inspection and intervention tasks,” in Fundamental Design and Automation Technologies in Offshore Robotics, Elsevier, 2020, pp. 9-75. https://doi.org/10.1016/B978-0-12-820271-5.00007-9 [Preprint PDF]