Introduction: Embedded Vision-based Tracking System for Autonomous Robot Navigation

The project is being developed in the Student Scientific Club "AVADER" at the University of Science and Technology in Cracow. It will be presented at the Digilent Design Contest 2017. As the project is under development, this description will be regularly updated and new features will be shown. The final version should be available in mid-May.

Step 1: Embedded Vision System Concept

The aim of the project was the implementation of a hardware-software system for autonomous robot navigation. The system consists of two parts: a vision system and a set of sensors that monitor the platform's environment and behavior. The main task of the vision system is to track a selected object (it does not need to be defined a priori); the Mean-Shift algorithm is used for tracking. The second part of the system fuses the data from the sensors mounted on the mobile platform. Moreover, wireless communication between the robot and a PC was implemented, which gives the PC application permanent access to all measurements. Additionally, a manual mode is available through the remote control panel; in this mode the user has full control over the robot.
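To make the tracking step concrete, here is a minimal software model of the Mean-Shift window update, assuming the per-pixel weights (e.g., a back-projection of the tracked object's color histogram) have already been computed. The names (Window, meanShiftStep, track) are illustrative only; the actual implementation runs in hardware.

```cpp
// Minimal software model of Mean-Shift tracking over a weight image.
// `weights` is assumed to hold a per-pixel similarity score to the target.
#include <algorithm>
#include <cmath>
#include <vector>

struct Window { int cx, cy, halfW, halfH; };  // center and half-extents

// One Mean-Shift iteration: move the window center to the centroid of the
// pixel weights inside the current window. Returns the shift magnitude.
double meanShiftStep(const std::vector<float>& weights, int imgW, int imgH,
                     Window& win) {
    double sumW = 0.0, sumX = 0.0, sumY = 0.0;
    for (int y = std::max(0, win.cy - win.halfH);
         y <= std::min(imgH - 1, win.cy + win.halfH); ++y) {
        for (int x = std::max(0, win.cx - win.halfW);
             x <= std::min(imgW - 1, win.cx + win.halfW); ++x) {
            double w = weights[y * imgW + x];
            sumW += w; sumX += w * x; sumY += w * y;
        }
    }
    if (sumW <= 0.0) return 0.0;               // nothing to track in the window
    int newCx = static_cast<int>(sumX / sumW + 0.5);
    int newCy = static_cast<int>(sumY / sumW + 0.5);
    double shift = std::hypot(newCx - win.cx, newCy - win.cy);
    win.cx = newCx; win.cy = newCy;
    return shift;
}

// Iterate until the window stops moving (or an iteration cap is hit).
void track(const std::vector<float>& weights, int imgW, int imgH, Window& win) {
    for (int i = 0; i < 20; ++i)
        if (meanShiftStep(weights, imgW, imgH, win) < 1.0) break;
}
```

Each iteration moves the search window to the centroid of the weights inside it; the loop stops once the window shift falls below one pixel.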

The system is implemented on the heterogeneous Zynq system-on-chip platform (Zybo development board). The vision system is implemented in the programmable logic, while the sensor and wireless communication drivers run on the ARM processor.

Step 2: Equipment and Software

As mentioned above, the system is made up of two parts. To run the vision tracking system you need:

- Zybo development board

- camera with HDMI output

- monitor with VGA input

- robot platform

- Digilent PMOD GYRO

- RPLIDAR A2

- 12V battery

- Bluetooth module HC-05

Software:

- Vivado 2016.2

- PetaLinux 2016.2

- Xilinx SDK 2016.2 (with a GCC cross-compiler supporting the C++14 standard)

Step 3: The Hardware Implementation

The vision system was implemented in the Programmable Logic (PL). It consists of HDMI signal decoding, RGB to HSV conversion, Mean-Shift tracking, drawing a blue rectangle around the tracked object, a PWM motor controller, and encoding to the VGA standard.
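As a reference for the color conversion stage, below is a software model of RGB to HSV conversion in the common 8-bit convention (H scaled to 0-255). The real stage is implemented as hardware logic, so this helper is only illustrative.

```cpp
// Software reference model of the RGB-to-HSV pipeline stage.
// Output ranges: H in [0,255] (scaled), S and V in [0,255].
#include <algorithm>
#include <cstdint>

struct HSV { uint8_t h, s, v; };

HSV rgbToHsv(uint8_t r, uint8_t g, uint8_t b) {
    uint8_t maxc  = std::max({r, g, b});
    uint8_t minc  = std::min({r, g, b});
    uint8_t delta = maxc - minc;

    HSV out{0, 0, maxc};                      // V = max(R,G,B)
    if (delta == 0) return out;               // grey pixel: H and S stay 0

    out.s = static_cast<uint8_t>(255 * delta / maxc);

    // Hue: the dominant channel selects the sextant; scale 360 deg to 0..255.
    int h;
    if (maxc == r)      h = 43 * (g - b) / delta;        // between magenta and yellow
    else if (maxc == g) h = 85 + 43 * (b - r) / delta;   // between yellow and cyan
    else                h = 171 + 43 * (r - g) / delta;  // between cyan and magenta
    out.h = static_cast<uint8_t>(h & 0xFF);              // wrap negatives into range
    return out;
}
```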

A bitstream with the working Mean-Shift algorithm is included, so you can test it by following the steps below.

1. Download the bitstream.

2. Run Vivado (recommended version: 2016.2).

3. Connect the Zybo board via a USB cable.

4. Open Hardware Manager.

5. Open Target -> Auto Connect.

6. Program device -> select the correct bitstream -> Program.

Step 4: Wiring

1. Connect the power supply and the USB cable to the computer.

2. Connect your video camera to the Zybo board with an HDMI cable.

3. Set the camera resolution to 1280x720 (720p).

4. Connect the Zybo to the monitor using a VGA cable.

5. Plug the PCB into the PMOD connectors in the following order (see the sketch after this list):

- Left motors -> PMOD JE1

- Right motors -> PMOD JE2

- GND -> PMOD GND
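The PWM motor controller itself lives in the PL, so the sketch below is only a guess at how duty cycles could be set from Linux on the ARM core. The AXI base address, register layout, and 8-bit duty scale are all assumptions for illustration, not the project's actual memory map.

```cpp
// Hypothetical sketch: setting the left/right motor PWM duty cycles by
// memory-mapping an assumed AXI register block from Linux on the ARM core.
#include <cstdint>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

constexpr off_t  PWM_BASE  = 0x43C00000;  // assumed AXI base of the PWM IP
constexpr size_t PWM_SPAN  = 0x1000;
constexpr int    REG_LEFT  = 0;           // assumed: duty register, left motors
constexpr int    REG_RIGHT = 1;           // assumed: duty register, right motors

int main() {
    int fd = open("/dev/mem", O_RDWR | O_SYNC);
    if (fd < 0) return 1;
    volatile uint32_t* pwm = static_cast<volatile uint32_t*>(
        mmap(nullptr, PWM_SPAN, PROT_READ | PROT_WRITE, MAP_SHARED, fd, PWM_BASE));
    if (pwm == MAP_FAILED) return 1;

    pwm[REG_LEFT]  = 128;   // ~50% duty on an assumed 8-bit scale
    pwm[REG_RIGHT] = 128;

    munmap(const_cast<uint32_t*>(pwm), PWM_SPAN);
    close(fd);
    return 0;
}
```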

Step 5: System Initialization

When PetaLinux has booted and all the necessary wires are connected, the image from the camera should be visible on the screen. The blue rectangle in the middle of the screen marks the tracked area. To switch between the RGB and HSV color spaces, use switch 2. To initialize tracking, put the chosen object inside the rectangle and turn on switch 0. Tracking is now active, and the blue rectangle follows the object. If you want to start the robot, disconnect the VGA cable after initialization, taking care not to jerk the robot too much. When the cable is disconnected, turn on switch 1, which starts the motors. Move the selected object, and the robot should follow it.
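The control law behind the follow behavior is not detailed here; the sketch below is a hypothetical reconstruction of the idea: steer by splitting a base speed between the left and right motors in proportion to the tracked window's horizontal offset from the image center. The gain and duty scale are assumptions.

```cpp
// Illustrative proportional steering: the farther the tracked object sits
// from the image center, the harder the robot turns toward it.
#include <algorithm>

struct MotorCmd { int left, right; };  // duty values on an assumed 0..255 scale

MotorCmd follow(int objectCx, int imgWidth) {
    const int base = 120;                      // assumed cruising duty
    const int err  = objectCx - imgWidth / 2;  // >0: object is to the right
    const int turn = err / 4;                  // assumed proportional gain

    MotorCmd cmd{base + turn, base - turn};    // speed up one side to turn
    cmd.left  = std::max(0, std::min(255, cmd.left));
    cmd.right = std::max(0, std::min(255, cmd.right));
    return cmd;
}
```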

Desktop Application:

Run the .exe application and the user interface should appear. The Bluetooth modules pair automatically, but the first time you connect you will have to enter the passcode; the default HC-05 passcode is 1234. You then have access to the view of the robot's parameters, and the control panel is also available. When you send the first command, the robot switches to manual mode.
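For illustration, here is a minimal sketch of sending a command to the robot over the HC-05 serial link, shown with POSIX APIs for brevity (the actual desktop application is a Windows .exe). The device path and the "MOV" command format are assumptions, not the project's real protocol; 9600 baud is the common HC-05 default for data mode.

```cpp
// Minimal sketch: send one manual-mode command over the HC-05 serial link.
#include <cstring>
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int main() {
    // The paired HC-05 appears as a serial port (e.g., /dev/rfcomm0 on Linux).
    int fd = open("/dev/rfcomm0", O_RDWR | O_NOCTTY);
    if (fd < 0) return 1;

    termios tty{};
    tcgetattr(fd, &tty);
    cfsetispeed(&tty, B9600);                    // HC-05 data-mode default baud
    cfsetospeed(&tty, B9600);
    tty.c_cflag = (tty.c_cflag & ~CSIZE) | CS8;  // 8 data bits
    tty.c_cflag |= CLOCAL | CREAD;               // ignore modem lines, enable RX
    tty.c_lflag = 0;                             // raw mode, no echo
    tcsetattr(fd, TCSANOW, &tty);

    // Sending any command switches the robot into manual mode.
    const char* cmd = "MOV 50 50\n";             // hypothetical command format
    write(fd, cmd, std::strlen(cmd));

    close(fd);
    return 0;
}
```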