The project is being developed in the Student Scientific Club "AVADER" at the University of Science and Technology in Cracow. The project will be presented at the Digilent Design Contest 2017. As the project is under development, this description will be regularly updated and new features will be shown. The final version should be available in mid-May.

Step 1: Embedded Vision System Concept

The aim of the project is the implementation of a hardware-software system for autonomous robot navigation. The system consists of two parts: a vision system and a multi-sensor network. The main task of the vision system is to track a selected object, which does not need to be defined a priori; the Mean-Shift algorithm is used for this (see the sketch below). The second part of the system fuses data from multiple sensors mounted on the mobile platform, such as a GPS receiver, a gyroscope, and a magnetic compass. Additionally, encoders make it possible to implement odometry algorithms and control the robot more accurately. Furthermore, a proximity sensor (LIDAR) will be added in the near future.
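To make the tracking idea concrete, here is a minimal software sketch of one Mean-Shift window update. It is only an illustration of the algorithm, not the project's actual FPGA pipeline; the names, the fixed window interface, and the assumption that `weights` holds a back-projection of the target's hue histogram are all ours.

```cpp
// Minimal sketch of Mean-Shift window tracking over a precomputed
// per-pixel weight map (assumed: back-projection of the target's
// hue histogram). Illustration only -- not the project's PL design.
#include <cstdint>

struct Window { int cx, cy, w, h; };  // tracked region (center + size)

// Shift the window to the weighted centroid of the pixels it covers;
// repeat until the window stops moving or an iteration cap is hit.
Window meanShiftStep(const float* weights, int imgW, int imgH, Window win) {
    for (int iter = 0; iter < 20; ++iter) {
        double sumW = 0.0, sumX = 0.0, sumY = 0.0;
        for (int y = win.cy - win.h / 2; y < win.cy + win.h / 2; ++y) {
            if (y < 0 || y >= imgH) continue;
            for (int x = win.cx - win.w / 2; x < win.cx + win.w / 2; ++x) {
                if (x < 0 || x >= imgW) continue;
                double w = weights[y * imgW + x];
                sumW += w; sumX += w * x; sumY += w * y;
            }
        }
        if (sumW <= 0.0) break;                  // nothing to track in window
        int nx = static_cast<int>(sumX / sumW + 0.5);
        int ny = static_cast<int>(sumY / sumW + 0.5);
        if (nx == win.cx && ny == win.cy) break; // converged
        win.cx = nx; win.cy = ny;                // move window to centroid
    }
    return win;
}
```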

Moreover, full-duplex wireless communication between the robot and a PC will be developed, giving a PC application permanent access to all measurements. The desktop application will have a user-friendly interface for continuously monitoring robot parameters such as position, orientation, and speed, and for switching the robot into manual mode. In manual mode, the user will be able to control the robot from the PC using a simple control panel.
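Since this protocol is still being designed, the following is only a hypothetical packet layout for such a link; every field name and size below is an assumption, chosen to fit within a single 32-byte nRF24L01 payload.

```cpp
// Hypothetical wire format for the robot <-> PC link; the real protocol
// is under development, so all fields here are assumptions.
// Fixed-width integers keep the layout identical on robot and PC.
#include <cstdint>

#pragma pack(push, 1)
struct TelemetryPacket {        // robot -> PC, sent periodically
    uint32_t timestampMs;       // time since boot
    int32_t  latitudeE7;        // GPS latitude  * 1e7
    int32_t  longitudeE7;       // GPS longitude * 1e7
    int16_t  headingDeciDeg;    // fused heading in 0.1 degree units
    int16_t  speedMmPerS;       // odometry speed in mm/s
};

struct CommandPacket {          // PC -> robot, manual-mode control
    uint8_t  manualMode;        // 0 = autonomous tracking, 1 = manual
    int8_t   leftMotor;         // -100 .. 100 (% of full PWM)
    int8_t   rightMotor;        // -100 .. 100
};
#pragma pack(pop)

static_assert(sizeof(TelemetryPacket) == 16, "fits one nRF24L01 payload");
static_assert(sizeof(CommandPacket) == 3,  "fits one nRF24L01 payload");
```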

The system is implemented on the heterogeneous Zynq system-on-chip platform (Zybo development board). The vision system is implemented in the programmable logic, while the sensor and wireless communication drivers run on the ARM processor under PetaLinux.

Step 2: Equipment and Software

As mentioned above, the system is made up of two parts. To run the vision tracking system you need:

- Zybo

- camera with HDMI output

- monitor with VGA input

- robot platform

- PCB with motor drivers

Software:

- Vivado 2016.2

The second part (in progress) is the fusion of data from the sensors mounted on the mobile platform and radio communication with the PC (a sketch of a simple sensor-fusion filter follows the software list below). You need:

- Digilent PMOD GPS

- Digilent PMOD GYRO

- Digilent PMOD CMPS

- RPLIDAR A2

- nRF24L01 x2

- Raspberry Pi B2+

Software:

- PetaLinux 2016.2

- Xilinx SDK 2016.2 (with a gcc cross-compiler supporting the C++14 standard)
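One common way to fuse the gyroscope and compass listed above is a complementary filter: integrate the gyro rate for a smooth short-term heading, then slowly correct it with the drift-free compass reading. Whether the project uses exactly this filter is an assumption; the function below is a minimal sketch.

```cpp
// Minimal complementary-filter sketch for heading fusion (assumption:
// this is one plausible filter, not necessarily the project's).
#include <cmath>

// prevDeg:     previous fused heading in degrees
// gyroRateDps: angular rate from the PMOD GYRO in degrees/second
// compassDeg:  absolute heading from the PMOD CMPS in degrees
// dt:          time step in seconds
// alpha:       trust in the gyro integration (close to 1.0)
double fuseHeading(double prevDeg, double gyroRateDps,
                   double compassDeg, double dt, double alpha = 0.98) {
    // Gyro integration gives a smooth short-term estimate...
    double predicted = prevDeg + gyroRateDps * dt;
    // ...which is pulled slowly toward the drift-free compass.
    double fused = alpha * predicted + (1.0 - alpha) * compassDeg;
    // Wrap to [0, 360).
    fused = std::fmod(fused, 360.0);
    if (fused < 0.0) fused += 360.0;
    return fused;
}
```

A full implementation would also blend across the 0/360 degree wrap correctly (e.g., by blending the shortest angular difference) rather than mixing raw angles as this sketch does.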

Step 3: The Hardware Implementation

The vision system was implemented in the programmable logic (PL). It consists of HDMI signal decoding, RGB to HSV conversion, Mean-Shift tracking, drawing a blue rectangle around the tracked object, a motor PWM controller, and encoding to the VGA standard.
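For reference, the per-pixel RGB to HSV conversion the pipeline performs looks like the plain C++ below. The hardware version is an RTL/HLS implementation, so treat this as a behavioral sketch only; the OpenCV-style 0..179 hue scale is our assumption.

```cpp
// Behavioral sketch of per-pixel RGB -> HSV conversion; the PL pipeline
// implements the same math in hardware, not this code.
#include <algorithm>
#include <cstdint>

struct HSV { uint8_t h, s, v; };  // h in 0..179 (assumed scale), s/v in 0..255

HSV rgbToHsv(uint8_t r, uint8_t g, uint8_t b) {
    uint8_t maxC  = std::max({r, g, b});
    uint8_t minC  = std::min({r, g, b});
    uint8_t delta = maxC - minC;

    HSV out{0, 0, maxC};                         // value = max channel
    if (maxC == 0 || delta == 0) return out;     // black or gray: hue undefined
    out.s = static_cast<uint8_t>(255 * delta / maxC);

    int h;                                       // hue in degrees
    if (maxC == r)      h =   0 + 60 * (g - b) / delta;
    else if (maxC == g) h = 120 + 60 * (b - r) / delta;
    else                h = 240 + 60 * (r - g) / delta;
    if (h < 0) h += 360;
    out.h = static_cast<uint8_t>(h / 2);         // compress 0..359 into 0..179
    return out;
}
```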

The bitstream of the working Mean-Shift algorithm is included below, so you can test it by following these steps.

1. Download the bitstream.

2. Run Vivado (version 2016.2 recommended).

3. Connect Zybo board via USB cable.

4. Open Hardware Manager.

5. Open target -> Auto Connect.

6. Program device -> select the correct bitstream -> Program.

Step 4: Wiring

1. Connect the power supply and the USB cable to the computer.

2. Connect your video camera to the Zybo board with an HDMI cable.

3. Set the camera resolution to 1280x720 (720p).

4. Connect the Zybo to the monitor using a VGA cable.

5. Plug the PCB into the PMOD connectors in the following order:

- Left motors: PMOD JE1

- Right motors: PMOD JE2

- GND: PMOD GND

Step 5: System Initialization

When the bitstream is uploaded and all the necessary wires are connected, the image from the camera should be visible on the screen. The blue rectangle in the middle of the screen marks the tracked area. To initialize tracking, put the chosen object inside the rectangle and turn on switch 0. Tracking is now active, and the blue rectangle follows the object. If you want to start the robot, disconnect the VGA cable after initialization, taking care not to jolt the robot too much. When the cable is disconnected, turn on switch 1, which starts the motors. Move the selected object, and the robot should follow it.
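Under PetaLinux, PL registers such as the switch state can be inspected from the ARM side through /dev/mem. The sketch below is hypothetical: the AXI GPIO base address and the bit assignments are placeholders that would come from the Vivado address editor of this particular design, not confirmed values.

```cpp
// Hypothetical PetaLinux-side peek at PL control registers via /dev/mem.
// GPIO_BASE and the bit layout are placeholders (assumptions), to be
// taken from the Vivado address editor of the actual design.
#include <cstdint>
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

constexpr off_t GPIO_BASE = 0x41200000;  // assumed AXI GPIO base address

int main() {
    int fd = open("/dev/mem", O_RDWR | O_SYNC);
    if (fd < 0) { perror("open /dev/mem"); return 1; }

    // Map one page covering the GPIO registers.
    void* map = mmap(nullptr, 0x1000, PROT_READ | PROT_WRITE,
                     MAP_SHARED, fd, GPIO_BASE);
    if (map == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    volatile uint32_t* gpio = static_cast<volatile uint32_t*>(map);
    uint32_t switches = gpio[0];  // assumed: bit 0 = switch 0, bit 1 = switch 1
    std::printf("tracking init (SW0): %u, motors enabled (SW1): %u\n",
                switches & 1u, (switches >> 1) & 1u);

    munmap(map, 0x1000);
    close(fd);
    return 0;
}
```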
