Introduction: Autonomous Response for Emergency Scenarios

In Uttarakhand, Nepal and most recently in Kerala, floods have caused tremendous devastation and loss of life and property. The heaps of debris spread over large areas after the floods made it difficult to carry out search and rescue operations or to navigate manually, and many lives were lost because emergency support could not reach people in time. In another recent incident in Thailand, 13 people were trapped in a cave. With natural and man-made calamities occurring more frequently, we need to re-examine our ability to mount an emergency response without delay.

With the latest technology, it is now possible to reduce the delay in providing help and support in an emergency scenario such as a disaster, calamity or accident, and to drastically reduce the workforce required for highly sophisticated tasks. Using a connected autonomous system consisting of an aerial vehicle (drone) and a ground vehicle (Donkey Car), a scene can be monitored in real time and immediate support can be provided.

This project deals with the development of a system of intelligent and connected autonomous vehicles to support emergency scenarios and provide an immediate response to the situation.

Supplies

DIY Drone Kit (https://www.amazon.com/QWinOut-Quadcopter-Battery-Unassembly-Brushless/dp/B082KXZJXS/)

Pixhawk Autopilot (https://www.amazon.com/Pixhawk-FMUv5-Autopilot-NEO-M8N-Management/dp/B07DQDXQSH/)

Donkey Car Kit (https://www.robocarstore.com/products/donkey-car-starter-kit)

Raspberry Pi 3B+ or above (https://www.amazon.com/Raspberry-Pi-MS-004-00000024-Model-Board/dp/B01LPLPBS8/)

Logitech Webcam (https://www.amazon.com/Logitech-Desktop-Widescreen-Calling-Recording/dp/B004FHO5Y6/)

Step 1: Project Goals

  • Autonomous aerial and ground operation
  • Aerial and terrain mapping
  • Image segmentation and analysis for safe zones
  • Path planning and autonomous navigation
  • Obstacle detection and avoidance
  • Emergency assist and payload transportation
  • Connected systems with multiple units
  • Automatic solar charging
  • Live monitoring and control
  • Mobile application support

Step 2: Workflow

This guide explains the basic steps and procedures performed towards attaining the final goals of the project. I will keep it as simple as possible so that anyone referring to it can follow the concepts, and as compact as possible for the sake of readability, since this blog is a light version of the full project. The project is divided into two main sections:

1. Aerial drone
2. Ground vehicle

Step 3: Workflow: Drone

The drone is tasked to follow certain way-points, scan the area and, if it finds anything of concern, report back to the home base. In our case, we provide the way-point information. The drone traverses each of these way-points automatically and uses image recognition on the feed from the onboard camera to check whether any humans are present in the area. If it finds a person, their location is reported back to the base.

The drone is equipped with a Raspberry Pi running Ubuntu MATE as a companion computer, in addition to the Pixhawk 4 flight controller. We used an S500 frame to build the drone, with 920 KV BLDC motors, each fitted with a 9045 propeller. Details on building a multi-rotor are widely available online, so I will not repeat them here. The connection between the Raspberry Pi and the Pixhawk and how to configure them is explained in detail here (https://ardupilot.org/dev/docs/raspberry-pi-via-mavlink.html). For creating automatic way-points, a variety of software such as QGroundControl and Mission Planner is available. DroneKit was used for the simulations done to finalize the mission capabilities of the drone; there is a great set of tutorials by Tiziano Fiorenzani on setting up and using DroneKit in various drone applications. We used FlytOS and its APIs to define and execute way-point missions on the real hardware, as FlytOS is based on ROS. The Raspberry Pi manages and executes these tasks, detects humans and reports their locations back to the home base. A Logitech C270 HD camera is used for image recognition; Raspberry Pi cameras can also be used. A u-blox NEO-M8N GPS module with compass is used for localization and navigation.
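For the detection step, here is a minimal sketch using OpenCV's built-in HOG people detector on the camera feed. The frame size and camera index are placeholder choices, and our actual detection pipeline may differ in detail.

# Minimal person-detection sketch using OpenCV's HOG + SVM people detector.
# Assumes OpenCV is installed on the Pi and the camera is /dev/video0.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)  # Logitech C270, or a Pi camera via V4L2
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.resize(frame, (640, 480))  # smaller frames keep the Pi responsive
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
    if len(boxes) > 0:
        # Hook point: report the current GPS fix back to the home base
        print("Person detected:", len(boxes))
cap.release()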

The software implementation for the drone is purely based on Python, ROS and FlytOS. FlytOS is a software framework and platform that can be used to develop custom APIs for controlling a variety of drones; its backbone relies on the ROS, MAVROS and MAVLink modules. Using the FlytOS APIs, we can call functions that carry out specific tasks such as drone takeoff, landing, position control and way-point execution. Please refer here for the software code implementation (https://github.com/crisdeodates/Deodates-ARIES/blo...)
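As a reference for the way-point logic, below is a minimal DroneKit sketch of the kind we ran in simulation. The connection string, altitude and coordinates are placeholders; the FlytOS implementation on the real drone uses its own APIs.

# Minimal DroneKit way-point sketch (as used in our SITL simulations).
# The connection string and way-point coordinates are placeholders.
import time
from dronekit import connect, VehicleMode, LocationGlobalRelative

vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)  # SITL endpoint

vehicle.mode = VehicleMode('GUIDED')
vehicle.armed = True
while not vehicle.armed:
    time.sleep(1)

vehicle.simple_takeoff(10)  # target altitude in metres
while vehicle.location.global_relative_frame.alt < 9.5:
    time.sleep(1)

# Visit each way-point in turn (placeholder coordinates)
for lat, lon in [(8.5241, 76.9366), (8.5245, 76.9370)]:
    vehicle.simple_goto(LocationGlobalRelative(lat, lon, 10))
    time.sleep(30)  # crude wait; a real mission checks distance to target

vehicle.mode = VehicleMode('RTL')  # return to launch
vehicle.close()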

Step 4: Workflow: Ground Vehicle

The ground vehicle, or car, is a miniature terrain vehicle capable of traversing either manually or autonomously. It can be any typical RC ground vehicle, preferably with a brushed DC motor controlled by an ESC. The best example of this type is a Donkey Car; its advantage is that it comes with autonomous capability via a Raspberry Pi 3, achieved by training the car. Two prototypes were developed: the first used a Rock Crawler RC car kit and the second used a Donkey Car kit (HSP 94186 brushed RC car).

The PCA9685 PWM driver that comes with the Donkey Car kit is capable of driving all the servos and motors included in the kit. But if you plan to attach more sensors or actuators, I suggest using a Raspberry Pi shield like the one we used, available here (https://www.banggood.in/Full-Function-Robot-Expans...).
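As an illustration, here is a minimal sketch of driving the steering servo and ESC through the PCA9685 using the Adafruit_PCA9685 library. The channel numbers and pulse widths are assumptions that depend on your wiring and ESC.

# Driving the steering servo and ESC through the PCA9685 (sketch).
# Channel numbers and pulse widths are assumptions; check your wiring.
import Adafruit_PCA9685

pwm = Adafruit_PCA9685.PCA9685()  # default I2C address 0x40
pwm.set_pwm_freq(60)              # 60 Hz is typical for servos and ESCs

def set_pulse(channel, pulse_us):
    # Convert a pulse width in microseconds to a 12-bit tick count at 60 Hz
    ticks = int(pulse_us * 4096 * 60 / 1000000)
    pwm.set_pwm(channel, 0, ticks)

set_pulse(0, 1500)  # steering servo centred
set_pulse(1, 1600)  # ESC slightly above neutral: slow forward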

A u-blox NEO-6M GPS module is attached to provide the local position of the car. Interfacing the GPS with the Raspberry Pi 3 is covered in this documentation (https://circuitdigest.com/microcontroller-projects...). We attached a couple more sensors: an HD camera for image processing, and IR and sonar sensors for obstacle avoidance. Driving multiple sonar sensors is more tedious than driving IR sensors, so IR is recommended for short-range detection.
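Below is a minimal sketch of reading the NEO-6M position over the Pi's UART with pyserial and pynmea2. The device path may be /dev/ttyAMA0 instead, depending on your Pi's serial configuration.

# Reading the NEO-6M position over the Pi's UART (sketch).
# Assumes the GPS TX pin is wired to the Pi's serial RX at 9600 baud.
import serial
import pynmea2

with serial.Serial('/dev/ttyS0', 9600, timeout=1) as gps:
    while True:
        line = gps.readline().decode('ascii', errors='ignore')
        if line.startswith('$GPGGA'):  # GGA sentences carry the position fix
            msg = pynmea2.parse(line)
            print('lat:', msg.latitude, 'lon:', msg.longitude)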

A separate power supply unit must be provided for the Raspberry Pi and the motor assembly, to keep motor noise out of the Pi's supply, which could otherwise cause instability. Both supplies should still share a common ground. The ground vehicle is also equipped with optional solar panels and an automatic solar battery charging circuit for seamless, extended periods of operation, enabling recharging on the go.

A Python program was developed to give custom commands to the ground vehicle. It runs on the onboard Raspberry Pi and controls the vehicle through the motor drivers. It also provides control of a pan-tilt servo assembly on which the camera can be mounted. A sketch of this command handling is shown below.
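In this sketch the command names and the pan/tilt servo channels (2 and 3) are hypothetical, and set_pulse() is the PCA9685 helper defined earlier.

# Command dispatcher sketch for the ground vehicle.
# Command names and servo channels are hypothetical;
# set_pulse() is the PCA9685 helper from the earlier sketch.
COMMANDS = {
    'forward': lambda: set_pulse(1, 1600),
    'reverse': lambda: set_pulse(1, 1400),
    'stop':    lambda: set_pulse(1, 1500),
    'left':    lambda: set_pulse(0, 1200),
    'right':   lambda: set_pulse(0, 1800),
    'pan_centre':  lambda: set_pulse(2, 1500),
    'tilt_centre': lambda: set_pulse(3, 1500),
}

def handle(command):
    action = COMMANDS.get(command)
    if action is None:
        print('unknown command:', command)
    else:
        action()

handle('forward')  # example usage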

Step 5: Workflow: Mobile Application

Manual control of the ground vehicle is implemented through a custom-developed mobile application, which also provides a live feed from the on-board HD camera. The app will be available to everyone once it is published to the Play Store in a few days. It allows both manual and automatic operation of the ground vehicle: manual control is performed via a virtual joystick in the app, and automatic operation uses obstacle avoidance driven by the various onboard IR sensors via the Python program.
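The app-to-vehicle protocol is not detailed in this light version of the write-up. As one plausible arrangement, a small TCP server on the Pi could receive newline-terminated command strings from the app and pass them to the dispatcher above; the port and message format here are assumptions.

# Hypothetical TCP command server on the Pi; the real app protocol may differ.
import socket

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(('0.0.0.0', 5555))  # port is an assumption
server.listen(1)

conn, addr = server.accept()
print('app connected from', addr)
buf = b''
while True:
    data = conn.recv(1024)
    if not data:
        break
    buf += data
    while b'\n' in buf:
        line, buf = buf.split(b'\n', 1)
        handle(line.decode().strip())  # dispatcher from the previous sketch
conn.close()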

Step 6: Testing

This system and its area of operation can be expanded further by adding multiple aerial and ground units, all monitored via a single mission-management console at the ground control station. For the testing, please refer to the testing visuals.

Step 7: Planned Future Modifications

Drone:

An Nvidia Jetson TX2 / Avnet Ultra96 will be used as the companion computer, providing faster AI processing and reliable stereo-processing capability.

Inclusion of thermal camera for analysis of unknown terrain and structures.

Weatherproofing of electronics.

Ground Vehicle:

Amphibious upgrade: upgrading the Donkey Car to operate on both land and water, increasing mission effectiveness in cases where water search and rescue is required, as in the Thailand cave incident.

The Donkey Car will be fitted with detachable screw-barrel tires to enable easy movement on sand, water and snow.

Weatherproofing of electronics using waterproof counterparts (ESC, motors, etc.), plastic encasing, water-repellent paste and hot-glued contacts.

Others:

Setting up a distributed system of drones and cars which covers a wide area of operation. A large area will be divided into sectors (depending on the endurance of operation) and each sector will have a control center equipped with a drone and a car. If required, backup drones and cars can be pulled from nearby sectors.

A mobile application with SOS functionality that immediately sends the user's location to the centralized monitoring system, alerting the team about an emergency at that location. The drone will be deployed immediately to monitor the location, and the car will be kept on standby for the response.

Setting up emergency battery-replacement stations for drones and cars to increase operating endurance (a concept already under our development).

Participated in the Raspberry Pi Contest 2020