The “Lead Zeppelin” AUV

Background

The “Lead Zeppelin” is an Autonomous Underwater Vehicle (AUV) developed by the IUB Robotics Group and the Robotics Club in cooperation with ATLAS Elektronik. It is designed for basic research and education purposes. The “Lead Zeppelin” has already performed its first major test by participating in the Student Autonomous Underwater Challenge – Europe (SAUC-E) from the 3rd to the 7th of August, 2006. SAUC-E was held at Pinewood Studios near London, where a large-scale underwater stage, normally used for movie productions such as the James Bond films, was available for the competition. A short video [high quality, 63MB] / [DVD quality, 243MB] demonstrates some of the autonomous behaviors. The IUB team was a real outsider among the seven participating teams, all of which, except IUB, had prior experience with underwater robotics. Nevertheless, the IUB team came in second in performance.

The IUB “Lead Zeppelin” is to a large extent a converted land robot, as a major part of its hardware and software was developed for the latest generation of the IUB rescue robots. The most basic underwater parts, namely the hull with the motors and propellers as well as some of the sensors, were provided by ATLAS Elektronik. These parts come from a so-called Seafox submarine, which is a remotely operated vehicle (ROV), i.e., a device steered by a human operator. The “Lead Zeppelin” received completely new electronics developed at IUB, a powerful on-board computer, additional sensors including cameras, and of course a large amount of software to be able to execute missions on its own. It is, for example, programmed to move through an environment while avoiding obstacles, to recognize targets by computer vision, and to operate at specific desired depths.

 

Hardware Components of Lead Zeppelin

As mentioned before, some of the most basic hardware parts, namely the hull, the motors, and some of the sensors, are based on parts from a so-called Seafox ROV by ATLAS Elektronik. For autonomous operation, completely new electronics, additional sensors, a high-performance on-board computer, and of course a lot of software were added. The "Lead Zeppelin" AUV uses two computation units, which is a common approach for so-called CubeSystem applications. The basic hardware control is done by a RoboCube. The higher-level AI software runs on an embedded PC, which collects all sensor data from the RoboCube as well as from the sensors directly attached to the PC. On an external operator PC, which is connected via wireless network, a GUI is used for teleoperation and for debugging the autonomy. There is the option to use an antenna buoy to maintain wireless communication. This is of course not needed when full autonomy is activated.

The submarine is composed of a main hull, a nose, and four long tubes. These parts house the power system, the propulsion system, a low-level controller, a high-level controller, and a rich sensor payload.

The nose section contains:

  • pressure sensor for depth measurement
  • heading sensor including pitch and roll sensor
  • scanning sonar head
  • front camera

The middle section contains:

  • echo sounder
  • vertical thruster
  • embedded PC
  • RoboCube
  • DC/DC converters
  • wireless access point

Outside, below the middle section, the following parts are attached
to the vehicle:

  • down-looking ground camera
  • marker disposal system

Furthermore, four battery tubes are attached to the middle section, each
containing:

  • propulsion motor
  • propeller with guards
  • power amplifier for the motor
  • battery

The submarine is trimmed slightly positively buoyant, i.e., it rises to the surface in case of a failure. The vertical thruster in the middle section therefore has to be used to force the vehicle under water. The maximum power of the four propulsion motors is 350 W, of which up to 280 W are usable due to PWM duty-cycle limitations. The 3-blade propellers are protected by guards for safety.
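
As a rough illustration of the duty-cycle limitation, the following minimal sketch computes the usable power; the 80% cap and the assumption that average power scales linearly with the duty cycle are simplifications for illustration, not the actual motor controller code:

  #include <algorithm>
  #include <cstdio>

  // Simplified model: average motor power is assumed to scale roughly
  // linearly with the PWM duty cycle, which is capped at 80%.
  int main() {
      const double max_power_w  = 350.0;  // maximum power of the propulsion motors
      const double max_duty     = 0.8;    // assumed duty-cycle cap
      double requested_duty     = 1.0;    // e.g. full thrust requested

      double duty  = std::min(requested_duty, max_duty);
      double power = duty * max_power_w;  // ~280 W at the 80% cap
      std::printf("duty cycle %.0f%% -> ~%.0f W\n", duty * 100.0, power);
      return 0;
  }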

 

Sensor Equipment

A. Heading Sensor
An MTi IMU from Xsens is used. It is a low-cost miniature inertial measurement unit with an integrated 3D compass. It has an embedded processor capable of calculating roll, pitch, and yaw in real time, as well as outputting calibrated 3D linear acceleration, rate of turn (gyro), and (earth) magnetic field data. The MTi uses a right-handed Cartesian coordinate system which is body-fixed to the device and defined as the sensor coordinate system. Since the IMU is placed near the center of the submarine, the orientations provided by the device can be used directly as vehicle orientations. The heading sensor is used to measure and control the roll, pitch, and yaw of the AUV.
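
As an illustration of how the yaw angle from the IMU can be used, the following is a minimal sketch of a proportional heading controller; the gain, the command range, and the function name are hypothetical and only serve to show the idea:

  #include <cmath>
  #include <cstdio>

  // Minimal proportional heading controller (illustrative only).
  // Angles are in degrees; the return value is a differential thrust
  // command in [-1, 1] that would be mapped onto the side thrusters.
  double heading_control(double yaw_deg, double desired_yaw_deg) {
      const double kp = 0.02;                    // hypothetical gain
      double error = desired_yaw_deg - yaw_deg;  // heading error
      // wrap the error into [-180, 180) so the sub turns the short way
      error = std::fmod(error + 180.0, 360.0);
      if (error < 0) error += 360.0;
      error -= 180.0;
      double cmd = kp * error;
      if (cmd > 1.0) cmd = 1.0;
      if (cmd < -1.0) cmd = -1.0;
      return cmd;
  }

  int main() {
      // current heading 350 deg, desired 10 deg: turn +20 deg the short way
      std::printf("%.2f\n", heading_control(350.0, 10.0));
      return 0;
  }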

B. Scanning Sonar
The submarine has a scanning sonar which enables it to scan the surroundings for obstacles and their distances by emitting sound signals at 550 kHz and measuring the time it takes until the echo returns. The sonar covers up to 360 degrees, since it is turned by a motor in steps of one degree or larger, and it has a resolution of 1.5 deg. The range of the sonar is selectable up to 80 meters. The sensor can also scan vertically with a coverage of ±20 deg. The horizontal scan rate is about 60 deg per second at distances of up to 10 m.
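
The distance follows from the echo travel time via the usual time-of-flight relation, as in the following minimal sketch; the speed of sound used here is a typical freshwater value, not a calibrated one:

  #include <cstdio>

  // Time-of-flight ranging: the sound travels to the obstacle and back,
  // so the one-way distance is half of speed_of_sound * travel_time.
  double echo_distance_m(double travel_time_s,
                         double speed_of_sound_mps = 1480.0) {
      return 0.5 * speed_of_sound_mps * travel_time_s;
  }

  int main() {
      // an echo arriving after 13.5 ms corresponds to roughly 10 m
      std::printf("%.1f m\n", echo_distance_m(0.0135));
      return 0;
  }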

C. Pressure sensor
The pressure sensor measures the depth of the submarine below the water surface. It has an operating pressure range from 0 to 31 bar and is thus operational down to a depth of 300 meters. It can be directly interfaced to the A/D converters of the RoboCube.
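
The depth follows from the measured pressure via the hydrostatic relation, as in the following minimal sketch; the freshwater density and the assumption that the sensor delivers an absolute pressure in bar are simplifications for illustration:

  #include <cstdio>

  // Hydrostatic depth estimate: depth = (p_abs - p_atmosphere) / (rho * g).
  double depth_m(double pressure_bar) {
      const double p_atm_bar = 1.0;     // atmospheric pressure at the surface
      const double rho       = 1000.0;  // kg/m^3, fresh water (assumption)
      const double g         = 9.81;    // m/s^2
      return (pressure_bar - p_atm_bar) * 100000.0 / (rho * g);  // 1 bar = 100 kPa
  }

  int main() {
      // 31 bar absolute corresponds to roughly 300 m of water column
      std::printf("%.0f m\n", depth_m(31.0));
      return 0;
  }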

D. Echo sounder
The echo sounder measures the distance from the submarine to the ground by emitting sound pulses with a frequency of 500 kHz and a width of 100 µs. The echo of this sound signal is received, and the measured travel time is used to calculate the distance. The beam width is ±3 deg, the accuracy ±5 cm, and the depth range 0.5 to 9.8 meters.

E. USB cameras
Two Creative NX Ultra USB cameras are used in the vehicle. They support a resolution of up to 640 x 480 pixels. These USB 1.1 devices have a wide-angle lens with a field of view of 78 deg. The camera at the bottom of the submarine looks at the ground. It can, for example, be used to locate targets and to guide the submarine directly over them. The front camera can be used to search for bottom as well as mid-water targets. Both cameras are housed behind a waterproof protective lid of transparent plastic.
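
To give an idea of how the down-looking camera can guide the submarine over a target, the following minimal sketch converts the pixel offset of a detected target into an angular bearing using the 78 deg field of view; the pinhole model and the function name are illustrative assumptions, not the actual vision code:

  #include <cmath>
  #include <cstdio>

  // Convert a horizontal pixel offset from the image center into an angle,
  // using a simple pinhole model derived from the 78 deg field of view.
  double pixel_to_angle_deg(int pixel_x, int image_width = 640,
                            double fov_deg = 78.0) {
      const double pi = 3.14159265358979;
      // focal length in pixels from the horizontal field of view
      double f = (image_width / 2.0) / std::tan(fov_deg / 2.0 * pi / 180.0);
      double offset = pixel_x - image_width / 2.0;   // offset from center
      return std::atan2(offset, f) * 180.0 / pi;
  }

  int main() {
      // a target detected 160 pixels to the right of the image center
      std::printf("bearing: %.1f deg\n", pixel_to_angle_deg(480));
      return 0;
  }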

 

The AUV Software

The software for the "Lead Zeppelin" AUV is structured like in other CubeSystem robots in two parts. The basic low level control is done on the RoboCube. It uses the CubeOS and the RobLib to generate the proper PWM signals, to decode the encoder signals from the motors and to access the analog pressure and echo sound sensors. The higher level AI software is running on the robot PC, which uses SuSE 10 as operating system. It collects all sensor data from the cube, especially depth and speed, as well as from the sensors directly attached to the PC’s interfaces. It uses the high-resolution sonar head to do obstacle avoidance
and localize other objects in the basin. The front USB camera can be used to find for example midwater targets. The down-looking camera can for example locate targets situated on the bottom. The software reuses quite some functions and libraries developed for other IUB Robotics projects, especially the rescue robots.
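
As an illustration of this split between the two levels, the following sketch shows the kind of data that could be exchanged between the PC and the RoboCube; the structs and field names are hypothetical, the actual protocol is not reproduced here:

  #include <cstdint>

  // Hypothetical command packet sent from the PC to the RoboCube:
  // desired speeds for the four propulsion motors and the vertical thruster.
  struct CubeCommand {
      int16_t thruster_speed[4];   // propulsion motors, signed PWM setpoints
      int16_t vertical_thruster;   // middle thruster, used for depth control
      uint8_t marker_release;      // 1 = open the marker disposal system
  };

  // Hypothetical telemetry packet sent back from the RoboCube to the PC.
  struct CubeTelemetry {
      float   depth_m;             // from the analog pressure sensor
      float   altitude_m;          // from the echo sounder
      int32_t encoder_ticks[4];    // decoded motor encoder counts
  };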

The AI software is embedded in a robot-server, which is a multi-threaded program written in C++. All system-wide constants like port numbers, resolutions, etc., are read at startup from a configuration file. A client GUI running on a PC or a laptop connects to this server in order to manually drive the robot to a start position using a gamepad, to start the autonomy, and to observe the submarine's status during the mission as long as the wireless connection is still available. The NIST RCS framework is used to handle communication between the robot-server and the operator GUI. This framework allows data to be transferred between processes running on the same or different machines using Neutral Message Language (NML) memory buffers. All buffers are located on the robot computer and can be accessed by the operator GUI process asynchronously. The GUI uses Trolltech's cross-platform Qt class libraries. Several threads run in parallel to handle the massive amount of data; each camera view has its own thread.
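
The thread-per-camera pattern mentioned above can be sketched as follows, using standard C++ threads rather than the actual robot-server code; the frame-grabbing function is a placeholder:

  #include <atomic>
  #include <thread>
  #include <vector>

  // Placeholder for grabbing and processing one frame of a given camera;
  // in the robot-server this would read from the USB camera and run the
  // vision code for that view.
  void process_frame(int camera_id) { (void)camera_id; }

  int main() {
      std::atomic<bool> running{true};
      std::vector<std::thread> camera_threads;

      // one worker thread per camera view (front and down-looking camera)
      for (int camera_id = 0; camera_id < 2; ++camera_id) {
          camera_threads.emplace_back([camera_id, &running]() {
              while (running) {
                  process_frame(camera_id);
              }
          });
      }

      // ... mission code runs here; on shutdown the threads are joined
      running = false;
      for (auto& t : camera_threads) t.join();
      return 0;
  }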

During a mission, autonomous underwater vehicles have to perform different tasks. In order to allow fast and reliable mission planning, we use a finite state automaton which is defined in a human-readable way as a text file. For debugging and during a run, this automaton is shown as a graph within the GUI, indicating the current state and the last transition in red. The finite state automaton consists of states, transitions, conditions, and actions. At each iteration, all actions of the current state are executed in the order in which they were specified. Actions can actively manipulate the submarine by, for example, changing the desired speeds of the thrusters, opening the marker box, or starting a timer. Only after all actions have been called are the new motor speeds applied, allowing for a clean implementation of processes in the spirit of behavior-based robotics. After that, the conditions of the transitions are checked. The first transition whose condition is true is then used to change the state of the automaton. If no condition is true, the state of the automaton is not changed. The automaton is executed at a rate of 10 Hz. This means that the actions as well as the conditions have to be implemented such that they block the execution only for a very short time. Actions and conditions have a unique string identifier that corresponds to the actual implementation of that action or condition in the code. The only exception are complex conditions, which are composed of other conditions in the automaton file itself.
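
The following minimal sketch shows such a string-keyed automaton being executed at 10 Hz; the state, action, and condition names are invented for illustration, and in the real system the automaton is parsed from the mission text file rather than written in code:

  #include <chrono>
  #include <functional>
  #include <map>
  #include <string>
  #include <thread>
  #include <vector>

  // Actions and conditions are looked up by their string identifiers,
  // mirroring the mapping between the automaton file and the code.
  std::map<std::string, std::function<void()>> actions = {
      {"dive",        [] { /* set vertical thruster to descend */ }},
      {"drop_marker", [] { /* open the marker disposal system  */ }},
  };
  std::map<std::string, std::function<bool()>> conditions = {
      {"at_depth",    [] { return true;  /* desired depth reached? */ }},
      {"target_seen", [] { return false; /* vision found a target? */ }},
  };

  struct Transition { std::string condition, next_state; };
  struct State      { std::vector<std::string> action_names;
                      std::vector<Transition>  transitions; };

  int main() {
      // a tiny hand-written automaton standing in for the parsed mission file
      std::map<std::string, State> automaton = {
          {"Dive",   {{"dive"},        {{"at_depth",    "Search"}}}},
          {"Search", {{},              {{"target_seen", "Drop"}}}},
          {"Drop",   {{"drop_marker"}, {}}},
      };
      std::string current = "Dive";

      for (int i = 0; i < 3; ++i) {            // a few iterations for the demo
          State& s = automaton[current];
          for (auto& name : s.action_names)    // run all actions of the state
              actions[name]();
          for (auto& t : s.transitions)        // first true condition fires
              if (conditions[t.condition]()) { current = t.next_state; break; }
          std::this_thread::sleep_for(std::chrono::milliseconds(100));  // 10 Hz
      }
      return 0;
  }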

 

The Team

The 2006 SAUC-E team consisted of Andreas Birk, Sören Schwertfeger, Diana Albu, Petar Dobrev, Farah Gammoh, Andrei Giurgiu, Sergiu-Cristian Mihut, Bogdan Minzu, Razvan Pascanu, Alexandru Stan, and Stefan Videv from IUB, and Jörg Kalwa from ATLAS Elektronik.