Mobile robot Hercules – project details
Basic information about the robot
Funding, genesis
This mobile robot was created at the Department of Robotics by several Ph.D. students and employees of the department.
The robot was financed mostly from the project MPO TANDEM FT-TA3/014/2006-2009, titled "Research and development of a special multipurpose rescue and intervention vehicle with a system of operational modification of parameters, for application in human rescue and the saving of tangible property during disasters, fires, floods, expeditions, etc." Contractor: FITE, a.s.; project participant: VŠB-TU Ostrava, prof. Ing. Jiří Skařupa, CSc.
Photographs
More pictures can be seen in the Department of Robotics web gallery, especially in the following albums:
- Mobile robot Hercules
- NATO Days in Ostrava, 2012
- NATO Days in Ostrava, 2011
- NATO Days in Ostrava, 2010
- Chemistry on the Silesian Ostrava Castle, 2012
Technical parameters
| Chassis | four-wheeled, sprung; steerable rear axle; driven front axle with a differential |
|---|---|
| Motors | driving: DC motor; steering: servo; arm: 3 Maxon EC motors; gripper: DC motor |
| Sensors | distance-measuring laser sensor on the arm, incremental position sensors in the arm motors, resistance thermometer, Dräger X-am 5000 gas detector |
| Cameras | stereovision camera head on the arm, rear camera on the chassis |
| Control system | remote wireless control of all functions from a PC or notebook |
| Data transmission | driving: Radiocrafts RC1280HP (868 MHz); camera pictures, arm control and sensor data: Wi-Fi (UDP) |
| Dimensions | chassis: 990 × 710 × 675 mm; arm link lengths: 450 mm and 860 mm (reach 1390 mm) |
| Weight | 165 kg (140 kg chassis + 25 kg arm) |
| Max. load | chassis: 115 kg; arm: 1.1 kg |
Description and history
The chassis of this mobile robot comes from an InvaCare electric wheelchair, mechanically modified at the department to allow the placement of additional electronics and the attachment of the manipulator arm. Thanks to the use of a highly optimised commercial chassis, the locomotion subsystem of the robot has excellent parameters (battery capacity, load, range of speeds, power, manoeuvrability). It turned out to be far from easy to control the chassis functions (power, speed switching, driving, steering) remotely from a computer: the original control system of the wheelchair is based on a proprietary, closed CAN-type bus with an unknown protocol. The final solution was to replace the driving electronics completely, so all driving functions can now be controlled remotely from a custom control application.
The manipulator arm was completely designed and realised by the department. One of the primary requirements, given the circumstances, was the lowest possible manufacturing complexity (to cut both manufacturing time and costs). That is why, for example, all three drive units are identical, regardless of the lower load on some joints. The simplicity of the construction brought other benefits as well, especially almost zero mechanical backlash, which makes very accurate and fine movements possible. The manipulator has 3 degrees of freedom; all joints are powered by Maxon ES90F 60 W disc electromotors with harmonic gearboxes. The low-level control is provided by Maxon EPOS units interconnected via a CAN bus.
The manipulator is currently equipped with a simple two-jaw gripper with adjustable gripping force (7 levels in each direction).
The robot carries a stereovision camera head located on the last link of the arm. This location allows the camera either to watch the object of manipulation or to look around while driving and observing the environment. Between the cameras is a laser distance sensor with a 25 m range. There is also a third camera at the rear of the chassis, used for navigation when driving backwards.
The robot is controlled remotely from an operator's station (a suitcase) containing all the necessary electronics, an accumulator, transmission devices and a notebook with a touch screen. All robot functions are integrated into one user-friendly application and can be controlled using a wireless gamepad. Important information is displayed directly over the camera image; additional functions and data are shown on the right-hand panel. The operator can also use 3D glasses for stereovision.
There are two minicomputers on the robot (a Via EPIA mini-PC and an Asus EEE netbook); one of them runs a server application controlling the arm movements and processing sensor signals, and the other runs an application acquiring and compressing camera pictures. Both applications communicate with the client application (the operator's station) via Wi-Fi. Driving signals are transmitted by radio because of its better reliability inside buildings.
The robot also contains a thermometer measuring the temperature around the robot and a Dräger X-am 5000 gas detector, which can be equipped with up to 5 sensors for different gas types. Data from the detector and the thermometer are, of course, also transmitted to the operator's station and displayed on the screen.
Since 2010, the robot could be seen every year at NATO Days in Ostrava, the largest air, army and security show in central Europe, where it has been one of the most popular attractions at the stand of the Faculty of Mechanical Engineering (Department of Robotics).
My participation in the project
My work consisted mainly of developing the higher-level control system that allows comfortable remote control of all robot functions by a human operator.
Structure of the control system
The system consists of a part running on a computer in the robot (server) and a part running on a computer in the operator's station (client). Because of the low computational power of the Via EPIA mini-PC in the mobile robot, it was necessary to add another computer (an Asus EEE netbook) dedicated solely to camera image processing. So, in fact, the server part of the system comprises two completely independent applications.
Video server application
It is a console application programmed in Visual C++. During start-up, the application scans for connected cameras and then waits for an incoming connection from the client application, using a custom communication protocol based on UDP. While the connection is active, the application takes pictures from the cameras, compresses them using Intel® IPP and sends them to the client. The communication is bidirectional: the client application sends instructions about which cameras should be transmitted and in what quality.
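Because a compressed camera frame is usually larger than one UDP datagram, a protocol like this typically splits each frame into chunks that the client reassembles. The robot's actual datagram format is not documented here; the header layout below (frame id, chunk index, chunk count) is purely illustrative:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical per-datagram header; the real protocol may differ.
struct ChunkHeader {
    uint32_t frameId;     // identifies the camera frame
    uint16_t chunkIndex;  // position of this chunk within the frame
    uint16_t chunkCount;  // total number of chunks in the frame
};

// Split one compressed frame into UDP-sized payloads, each prefixed with a
// ChunkHeader so the client can reassemble out-of-order datagrams.
std::vector<std::vector<uint8_t>> splitFrame(const std::vector<uint8_t>& frame,
                                             uint32_t frameId,
                                             size_t maxPayload = 1400) {
    const size_t dataPerChunk = maxPayload - sizeof(ChunkHeader);
    const uint16_t chunkCount =
        static_cast<uint16_t>((frame.size() + dataPerChunk - 1) / dataPerChunk);
    std::vector<std::vector<uint8_t>> datagrams;
    for (uint16_t i = 0; i < chunkCount; ++i) {
        size_t offset = static_cast<size_t>(i) * dataPerChunk;
        size_t len = std::min(dataPerChunk, frame.size() - offset);
        ChunkHeader h{frameId, i, chunkCount};
        std::vector<uint8_t> d(sizeof(h) + len);
        std::memcpy(d.data(), &h, sizeof(h));
        std::memcpy(d.data() + sizeof(h), frame.data() + offset, len);
        datagrams.push_back(std::move(d));
    }
    return datagrams;
}
```

The 1400-byte limit keeps each datagram under a typical Ethernet MTU, avoiding IP fragmentation.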
Main server application
This is a Windows application programmed in Visual C++ using a dialog window (a dialog box resource). The controls on the dialog window serve only informational and service purposes; their use is not required under standard conditions, and the robot operator normally does not see the window at all.
The application mainly handles the arm movement logic: inverse kinematics, acceleration and deceleration ramps, the anti-collision system (see the picture on the right), etc., including communication with the EPOS units (Maxon motor control units). Its second main task is reading and processing sensor data.
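The inverse kinematics can be illustrated with the textbook closed-form solution for a planar two-link arm, using the link lengths from the parameters table. This is a generic sketch under simplifying assumptions (the real arm has 3 degrees of freedom), not the robot's actual implementation:

```cpp
#include <cmath>
#include <optional>

// Link lengths of the Hercules arm, taken from the parameters table (mm).
constexpr double L1 = 450.0, L2 = 860.0;

struct JointAngles { double theta1, theta2; };  // radians

// Closed-form inverse kinematics for a planar two-link arm ("elbow-up"
// branch). Returns no value when the target is out of reach.
std::optional<JointAngles> inverseKinematics(double x, double y) {
    double r2 = x * x + y * y;
    double c2 = (r2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2);
    if (c2 < -1.0 || c2 > 1.0) return std::nullopt;  // target unreachable
    double theta2 = std::acos(c2);
    double theta1 = std::atan2(y, x) -
                    std::atan2(L2 * std::sin(theta2), L1 + L2 * std::cos(theta2));
    return JointAngles{theta1, theta2};
}

// Forward kinematics, useful for verifying a solution.
void forwardKinematics(const JointAngles& q, double& x, double& y) {
    x = L1 * std::cos(q.theta1) + L2 * std::cos(q.theta1 + q.theta2);
    y = L1 * std::sin(q.theta1) + L2 * std::sin(q.theta1 + q.theta2);
}
```

A second solution branch ("elbow-down", with negative theta2) exists for every reachable target; a real controller picks the branch that avoids joint limits and collisions.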
Bidirectional communication with the client application is again done via custom UDP datagrams containing commands and feedback.
The anti-collision system prevents the arm from damaging itself or other parts of the robot. The algorithm extrapolates the current angular velocities of all arm joints to predict the joint positions after a specific time; in these predicted positions, it checks for intersections between defined pairs of oriented bounding boxes that approximately envelope the critical parts of the robot (see the picture on the right). The intersection checks use the separating axis theorem.
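A minimal sketch of the separating axis test, reduced to 2D for brevity (the robot works with 3D boxes, where up to 15 candidate axes must be tested per box pair):

```cpp
#include <cmath>

// 2D oriented bounding box: centre, half-extents along local axes, rotation.
struct Obb2 {
    double cx, cy;   // centre
    double hx, hy;   // half-extents
    double angle;    // rotation (radians)
};

// Radius of the box's projection onto a unit axis, measured from the
// projection of its centre.
static double projectionRadius(const Obb2& b, double ax, double ay) {
    double c = std::cos(b.angle), s = std::sin(b.angle);
    // local axes of the box expressed in world coordinates
    return b.hx * std::fabs(ax * c + ay * s) +
           b.hy * std::fabs(-ax * s + ay * c);
}

// Separating axis theorem for two 2D OBBs: the boxes are disjoint iff the
// projections onto some face normal of either box do not overlap.
bool obbIntersect(const Obb2& a, const Obb2& b) {
    double axes[4][2] = {
        {  std::cos(a.angle), std::sin(a.angle) },
        { -std::sin(a.angle), std::cos(a.angle) },
        {  std::cos(b.angle), std::sin(b.angle) },
        { -std::sin(b.angle), std::cos(b.angle) },
    };
    for (auto& ax : axes) {
        double centreDist =
            std::fabs((b.cx - a.cx) * ax[0] + (b.cy - a.cy) * ax[1]);
        if (centreDist > projectionRadius(a, ax[0], ax[1]) +
                         projectionRadius(b, ax[0], ax[1]))
            return false;  // separating axis found: no intersection
    }
    return true;  // no separating axis: the boxes overlap
}
```

For 3D boxes, the candidate axes are the 3 + 3 face normals plus the 9 pairwise cross products of the boxes' edge directions.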
Client application
This is the only application directly interacting with the operator, so it features an elaborate GUI (graphical user interface). The application was programmed in Visual C++ and contains a custom graphics engine based on Direct3D, which is also used to render control elements such as buttons.
The main purpose of this application is interaction with the operator: presenting important information and receiving and processing input commands. The operator controls the application primarily with a gamepad (using XInput or DirectInput, depending on the gamepad type). Less frequently used functions are controlled by keyboard, mouse, touchpad or touch screen.
The application runs in full-screen mode. The main part of the screen is dedicated to the image from the primary camera. The left side contains three retractable panels with controls for additional functions of the arm (see the picture on the right), the cameras and the communication. Error icons are displayed in the bottom part; text messages in the upper part.
The strip on the right side of the screen contains a smaller image from the secondary camera (usually the rear camera), which is enlarged when clicked. There is also an interactive 3D model of the robot showing the actual position of the arm and wheels in real time. The model can be viewed from any angle, and its viewport can also be enlarged. The middle part of the strip consists of basic controls for manipulation and driving (various modes); the bottom shows sensor data.
Communication with the two server applications mentioned above is maintained automatically and independently for each of them (the arm can be fully controlled without a connection to the cameras, and vice versa). Communication with the driving control system is done via RS232, through a radio module.