The “Lead Zeppelin” AUV

Background

The “Lead Zeppelin” is an Autonomous Underwater Vehicle (AUV) developed by the Robotics Group in cooperation with ATLAS Elektronik. It is designed for basic research and education purposes. The “Lead Zeppelin” performed its first major test by participating in the Student Autonomous Underwater Challenge – Europe (SAUC-E) from the 3rd to the 7th of August, 2006. SAUC-E was held at Pinewood Studios near London, where a large-scale underwater stage, normally used for movie productions such as James Bond films, was available for the competition. A short video [high quality, 63MB] / [DVD quality, 243MB] demonstrates some of the autonomous behaviors. The team was a real outsider among the seven participating teams, as all the others already had experience with underwater robotics. Nevertheless, the team came in second in performance.

The “Lead Zeppelin” is to a large extent a converted land robot, as a major part of its hardware and software was developed based on software and control electronics previously developed by the group for search and rescue (SAR) robots. The most basic underwater parts, namely the hull with the motors and propellers as well as some of the sensors, were provided by ATLAS Elektronik. These parts are from a so-called Seafox submarine, which is a remotely operated vehicle (ROV), i.e., a device steered by a human. The “Lead Zeppelin” got completely new electronics developed in the robotics group, a powerful on-board computer, additional sensors including cameras, and of course a large amount of software to be able to execute missions on its own. It is, for example, programmed to move through an environment and to avoid obstacles, to recognize targets by computer vision, and to operate at specific desired depths.

 

Hardware Components of Lead Zeppelin

As mentioned before, some of the most basic hardware parts, namely the hull, the motors, and some of the sensors, are based on parts from a so-called Seafox ROV by ATLAS Elektronik. For autonomous operation, completely new electronics, additional sensors, a high-performance on-board computer, and of course a lot of software were added. The “Lead Zeppelin” AUV uses two computation units, which is a common approach for so-called CubeSystem applications. The basic hardware control is done by a RoboCube. The higher-level AI software runs on an embedded PC. It collects all sensor data from the RoboCube as well as from the sensors directly attached to it. An external operator PC, connected via wireless network, runs a GUI that can be used for teleoperation and for debugging the autonomy. There is the option to use an antenna buoy to maintain wireless communication. This is of course not needed if full autonomy is activated.

The submarine is composed of a main hull, a nose, and four long tubes. These parts house the power system, the propulsion system, a low-level controller, a high-level controller, and a rich sensor payload.

The nose section contains:

  • pressure sensor for depth measurement
  • heading sensor including pitch and roll sensor
  • scanning sonar head
  • front camera

The middle section contains:

  • echo sounder
  • vertical thruster
  • embedded PC
  • RoboCube
  • DC/DC converters
  • wireless access point

Outside, below the middle section, the following parts are attached
to the vehicle:

  • down-looking ground camera
  • marker disposal system

Furthermore, four battery tubes are attached to the middle section, each
containing:

  • propulsion motor
  • propeller with guards
  • power amplifier for the motor
  • battery

The submarine is trimmed slightly positively buoyant, i.e., it rises to the surface in case of an error. Thus the vertical thruster has to be used to force the vehicle under water. The maximum power of the four propulsion motors is 350 W, of which up to 280 W are usable due to PWM duty cycle limitations. The 3-blade propellers are protected by guards for safety.

 

Sensor Equipment

A. Heading Sensor
An MTi IMU from Xsens is used. It is a low-cost miniature inertial measurement unit with an integrated 3D compass. It has an embedded processor capable of calculating roll, pitch, and yaw in real time, as well as outputting calibrated 3D linear acceleration, rate of turn (gyro), and (earth) magnetic field data. The MTi uses a right-handed Cartesian coordinate system which is body-fixed to the device and defined as the sensor coordinate system. Since the gyro is placed near the center of the submarine, the orientations provided by the device can be used directly as vehicle orientations. The heading sensor is used to measure and control the roll, pitch, and yaw of the AUV.

B. Scanning Sonar
The submarine has a scanning sonar which enables it to scan its surroundings for obstacles and measure their distance by emitting sound signals of 550 kHz and measuring the time it takes until the sound’s echo comes back. The sonar covers up to 360 degrees, as it is turned by a motor in steps of one degree or more, and it has a resolution of 1.5 deg. The range of the sonar is selectable up to 80 meters. The sensor can also scan vertically with a coverage of ±20 deg. The horizontal scan rate is about 60 deg per second at distances of up to 10 m.

C. Pressure Sensor
The pressure sensor measures the depth of the submarine below the water surface. It has an operating pressure range from 0 to 31 bar, thus being operational down to a depth of 300 meters. It can be directly interfaced to the A/D converters of the RoboCube.

D. Echo Sounder
The echo sounder measures the distance from the submarine to the ground by emitting sound pulses with a frequency of 500 kHz and a width of 100 µs. The echo of this sound signal is received, and the measured travel time is then used to calculate the distance to the ground. The beam width is ±3 deg, the accuracy ±5 cm, and the depth range 0.5 to 9.8 meters.

E. USB Cameras
Two Creative NX Ultra USB cameras are used in the vehicle. They support a resolution of up to 640 x 480 pixels. These USB 1.1 devices have a wide-angle lens which enables a field of view of 78 deg. The camera at the bottom of the submarine looks at the ground. It can, for example, be used to locate targets and to guide the submarine directly over them. The front camera can be used to search for bottom as well as mid-water targets. Both cameras are enclosed behind a waterproof protective lid of transparent plastic.

 

The AUV Software

The software for the “Lead Zeppelin” AUV is structured, as in other CubeSystem robots, in two parts. The basic low-level control is done on the RoboCube. It uses the CubeOS and the RobLib to generate the proper PWM signals, to decode the encoder signals from the motors, and to access the analog pressure and echo sounder sensors. The higher-level AI software runs on the robot PC, which uses SuSE 10 as operating system. It collects all sensor data from the cube, especially depth and speed, as well as from the sensors directly attached to the PC’s interfaces. It uses the high-resolution sonar head for obstacle avoidance and for localizing other objects in the basin. The front USB camera can be used to find, for example, mid-water targets. The down-looking camera can, for example, locate targets situated on the bottom. The software reuses many functions and libraries developed for other robotics projects, especially the rescue robots.

The AI software is embedded in a robot server, which is a multi-threaded program written in C++. All system-wide constants like port numbers, resolutions, etc., are read at startup from a configuration file. A client GUI running on a PC or a laptop connects to this server in order to manually drive the robot to a start position using a gamepad, to start the autonomy, and to observe the submarine’s status during the mission as long as a wireless connection is still available. The NIST RCS framework is used to handle communication between the robot server and the operator GUI. This framework allows data to be transferred between processes running on the same or different machines using Neutral Message Language (NML) memory buffers. All buffers are located on the robot computer and can be accessed by the operator GUI process asynchronously. The GUI uses Trolltech’s cross-platform Qt class libraries. Several threads run in parallel to handle the massive amount of data. Each camera view has its own thread.

During a mission, autonomous underwater vehicles have to perform different tasks. In order to allow fast and reliable mission planning, we use a finite state automaton which is defined in a human-readable way as a text file. For debugging and during a run, this automaton is shown as a graph within the GUI, indicating the current state and the last transition in red. The finite state automaton consists of states, transitions, conditions, and actions. At each iteration, all actions of the current state are executed in the order they were specified. Actions can actively manipulate the submarine by, for example, changing the desired speeds of the thrusters, by opening the marker box, or by starting a timer. Only after all actions have been called are the new motor speeds applied, which allows processes to be implemented in the spirit of behavior-based robotics. After that, the conditions of the transitions are checked. The first transition whose condition is true is used to change the state of the automaton. If no condition is true, the state of the automaton remains unchanged. The automaton is executed at a rate of 10 Hz. This requires that the actions as well as the conditions are implemented such that they block the execution only for a very short time. Actions and conditions have a unique string identifier that corresponds to the actual implementation of that action or condition in the code. The only exception are complex conditions, which are composed of other conditions in the automaton file itself.

 

The Team

The 2006 SAUC-E team consisted of Andreas Birk, Sören Schwertfeger, Diana Albu, Petar Dobrev, Farah Gammoh, Andrei Giurgiu, Sergiu-Cristian Mihut, Bogdan Minzu, Razvan Pascanu, Alexandru Stan, and Stefan Videv from the robotics group, and Jörg Kalwa from ATLAS Elektronik.

Publications

D. Albu, A. Birk, P. Dobrev, F. Gammoh, A. Giurgiu, S.-C. Mihut, B. Minzu, R. Pascanu, S. Schwertfeger, A. Stan, and S. Videv, “Fast Prototyping of an Autonomous Underwater Vehicle (AUV) with the CubeSystem,” in 6th International Symposium on Intelligent Autonomous Vehicles (IAV 2007), 2007, pp. 78-83. https://doi.org/10.3182/20070903-3-FR-2921.00016 [Preprint PDF]

A. Birk, “Fast Robot Prototyping with the CubeSystem,” in International Conference on Robotics and Automation (ICRA), 2004. https://doi.org/10.1109/ROBOT.2004.1302539 [Preprint PDF]

A. Birk and H. Kenn, “A Rescue Robot Control Architecture for a Rescue Robot ensuring Safe Semi-Autonomous Operation,” in RoboCup 2002: Robot Soccer World Cup VI. vol. 2752, G. Kaminka, P. Lima, and R. Rojas, Eds., ed: Springer, 2003, pp. 254-262. https://doi.org/10.1007/978-3-540-45135-8_19 [Preprint PDF]

A. Birk and H. Kenn, “Efficient Scheduling of Behavior-Processes on Different Time-Scales,” in Proceedings of the International Conference on Robotics and Automation, ICRA’2001, ed: IEEE Press, 2001. https://doi.org/10.1109/ROBOT.2001.932569 [Preprint PDF]

A. Birk, H. Kenn, and T. Walle, “On-board Control in the RoboCup Small Robots League,” Advanced Robotics Journal, vol. 14, pp. 27-36, 2000. https://doi.org/10.1163/156855300741410 [Preprint PDF]

A. Birk, H. Kenn, and T. Walle, “RoboCube: an “universal” “special-purpose” Hardware for the RoboCup small robots league,” in 4th International Symposium on Distributed Autonomous Robotic Systems, 1998. https://doi.org/10.1007/978-3-642-72198-4_32