The aim of this project is to create a vision system capable of tracking moving objects by positioning a pan-tilt camera. The project involves building a movable camera mount from off-the-shelf servomechanisms. The task is to keep the tracked object in the center of the frame by positioning the servos.

The ZYBO evaluation board, which contains a heterogeneous Zynq SoC, was used as the computing platform.

Automatic object tracking is used in security, surveillance, and military applications.

A short movie has been prepared to present the current status of the project.

Step 1: Tracking Algorithm

Tracking differs from other image processing algorithms in that it must use information from more than one video frame. This is particularly evident when several objects similar to the tracked one appear in the frame; information about the objects' past locations is then required to distinguish them.

Significant tracking problems can be caused by a change in the target's orientation or by its high speed relative to the frame rate. With a moving camera, motion blur introduces additional errors.

Example tracking algorithms:

  • Tracking by detection
  • Mean-shift
  • Particle filter
  • KLT (Kanade-Lucas-Tomasi)

The algorithms above were tested in MATLAB to evaluate their effectiveness on recorded test sequences. The chosen algorithms (tracking by detection and mean-shift) were then implemented in programmable logic, and the output (the object position) was sent to the processor system over an AXI interface.
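As an illustration, colour-based tracking by detection can be reduced to thresholding each pixel against an RGB range and taking the centroid of the matching pixels as the object position. The sketch below is a minimal C model of that idea; the function name, frame layout (RGB888, row-major) and thresholds are illustrative, not the project's actual PL code:

```c
#include <stdint.h>

/* Result of one detection pass: centroid of matching pixels,
 * plus a flag telling whether any pixel matched at all. */
typedef struct { int x; int y; int found; } position_t;

/* Scan an RGB888 frame and return the centroid of all pixels whose
 * channels fall inside the given [min, max] ranges. */
position_t detect_centroid(const uint8_t *rgb, int width, int height,
                           uint8_t rmin, uint8_t rmax,
                           uint8_t gmin, uint8_t gmax,
                           uint8_t bmin, uint8_t bmax)
{
    long sum_x = 0, sum_y = 0, count = 0;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            const uint8_t *p = rgb + 3 * (y * width + x);
            if (p[0] >= rmin && p[0] <= rmax &&
                p[1] >= gmin && p[1] <= gmax &&
                p[2] >= bmin && p[2] <= bmax) {
                sum_x += x;
                sum_y += y;
                count++;
            }
        }
    }
    position_t pos = { 0, 0, 0 };
    if (count > 0) {
        pos.x = (int)(sum_x / count);  /* mean x of matching pixels */
        pos.y = (int)(sum_y / count);  /* mean y of matching pixels */
        pos.found = 1;
    }
    return pos;
}
```

In the real system this scan runs in programmable logic pixel-by-pixel as the frame streams in, which is why only running sums and a count need to be kept.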

Step 2: System Construction

The constructed system consists of:

  • Camera
  • Computing platform (ZYBO development board)
  • Servomechanisms
  • Servomechanism controller (Pololu Maestro)
  • Sensors
  • Power supply
  • Pointer

The ZYBO is connected to the camera, the servo controller and the position sensors. To change the camera's position, a command is sent to the controller, which generates pulses of appropriate duration for the servos. The controller is also capable of supplying sufficient current for the motors. A logic level converter had to be used for communication between the controller and the ZYBO.
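For reference, the Maestro accepts short serial frames; its compact-protocol "Set Target" command takes the target pulse width in quarter-microseconds, split into two 7-bit bytes. The sketch below builds such a frame in a caller-supplied buffer (the function name and buffer handling are illustrative):

```c
#include <stdint.h>

/* Build a Pololu Maestro compact-protocol "Set Target" frame.
 * target_qus is in quarter-microseconds, so a 1500 us servo pulse
 * corresponds to target_qus = 6000. buf must hold 4 bytes; the
 * frame length (4) is returned. */
int maestro_set_target(uint8_t *buf, uint8_t channel, uint16_t target_qus)
{
    buf[0] = 0x84;                      /* Set Target command byte     */
    buf[1] = channel;                   /* servo channel (e.g. 11, 12) */
    buf[2] = target_qus & 0x7F;         /* low 7 bits of the target    */
    buf[3] = (target_qus >> 7) & 0x7F;  /* high 7 bits of the target   */
    return 4;
}
```

The frame would then be written to the Maestro's RX line through the logic level converter mentioned above.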

The sensors connected to the board are:

  • Accelerometer
  • Magnetometer
  • Gyroscope

With these sensors it is possible to measure the camera's orientation.
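One common way to combine such measurements is a complementary filter, which blends the integrated gyroscope rate (accurate short-term) with the accelerometer-derived angle (drift-free long-term). The project does not state which fusion method it uses, so the sketch below is only a generic illustration; the function name, `alpha` and the sample period are assumptions:

```c
/* One complementary-filter update for a single camera axis.
 * prev_angle      - previous angle estimate, degrees
 * gyro_rate_dps   - gyroscope rate, degrees per second
 * accel_angle_deg - angle computed from the accelerometer, degrees
 * dt_s            - sample period, seconds
 * alpha           - blend factor, e.g. 0.98 (trust gyro short-term) */
float complementary_step(float prev_angle, float gyro_rate_dps,
                         float accel_angle_deg, float dt_s, float alpha)
{
    return alpha * (prev_angle + gyro_rate_dps * dt_s)
         + (1.0f - alpha) * accel_angle_deg;
}
```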

The motors are driven at 7.2 V, generated from a 12 V power supply with a step-down converter. A laser pointer is used to visualize the camera's target.

The ZYBO is connected to a PC over a Bluetooth interface, so operating parameters can be changed online without having to reconfigure the device.

Step 3: Camera Positioning

The center of the tracked object is calculated for each frame. The control error derived from this position is applied as the input to a regulator implemented in the processor system. The regulator calculates a new position for the servomechanisms, which is then sent to the controller.

To test the regulator and find suitable parameters, a mathematical model of the system was derived and implemented in Simulink.

At this stage of the project, position feedback from the sensors was not yet applied in the control algorithm. An incremental PID was selected as the regulator, added to the Simulink model, and tested. Based on the model responses, the following regulator parameters were chosen:

  • P=0.4
  • I=0.1
  • D=0.05

The chosen parameters provided stability and fast positioning of the system.
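An incremental (velocity-form) PID computes a *change* in output from the last three errors, which avoids integrator wind-up bookkeeping and suits position commands sent to a servo controller. A minimal sketch with the gains listed above; the struct and function names are illustrative, not the project's code:

```c
/* Incremental PID state: gains, the two previous errors,
 * and the current output (the servo position command). */
typedef struct {
    float kp, ki, kd;
    float e1, e2;   /* errors at steps k-1 and k-2 */
    float u;        /* accumulated output          */
} inc_pid_t;

/* One update: u_k = u_{k-1} + Kp*(e_k - e_{k-1}) + Ki*e_k
 *                 + Kd*(e_k - 2*e_{k-1} + e_{k-2})            */
float inc_pid_step(inc_pid_t *pid, float error)
{
    float du = pid->kp * (error - pid->e1)
             + pid->ki * error
             + pid->kd * (error - 2.0f * pid->e1 + pid->e2);
    pid->e2 = pid->e1;
    pid->e1 = error;
    pid->u += du;
    return pid->u;
}
```

In this form, a constant error keeps nudging the output (the integral action), while the proportional and derivative terms contribute only while the error is changing.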

Step 4: Implementation

The attached .zip file contains two Vivado projects (version 2016.3). The first (magisterium) implements tracking by detection (using the object's color); the second (magisterium_meanshift) uses the mean-shift algorithm. To reproduce the described project:

  1. Connect the ZYBO to your computer.
  2. Connect the ZYBO's MIO 15 pin and ground to the controller's RX pin and ground.
  3. Connect the servomechanisms to the controller (pan to channel 11 and tilt to channel 12).
  4. Run Xilinx SDK (preferably version 2016.3).
  5. Run a serial communication terminal and connect it to the Zynq port.
  6. Program the programmable logic (number 1 in the included picture).
  7. Program the processor system (number 2 in the included picture).
  8. If the program loaded successfully, the message "Program run successfully!" is displayed.
  9. Send the appropriate commands to the Zynq over the UART interface.

List of commands:

  • Changing servomechanisms' positions - 0x00 0xHH 0xLL 0xHH 0xLL
  • Changing max angular velocity - 0x01 0xHH 0xLL 0xHH 0xLL
  • Changing max torque - 0x02 0xHH 0xLL 0xHH 0xLL
  • Starting autonomous work (tracking) - 0x04 0x-- 0x-- 0x-- 0x--

The first byte determines the command type and the next four are its data. The first two data bytes are for the pan servo, the next two for the tilt servo.
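Decoding such a frame on the receiving side is straightforward. The sketch below assumes the layout described above (a command byte, then a big-endian pan word and a big-endian tilt word); the struct and function names are illustrative:

```c
#include <stdint.h>

/* Decoded 5-byte UART command frame. */
typedef struct {
    uint8_t  type;  /* 0x00 position, 0x01 velocity, 0x02 torque, 0x04 autonomous */
    uint16_t pan;   /* value for the pan servo  */
    uint16_t tilt;  /* value for the tilt servo */
} cmd_t;

/* Split a received frame into command type and the two 16-bit
 * data words (high byte first, per the 0xHH 0xLL notation). */
cmd_t decode_cmd(const uint8_t frame[5])
{
    cmd_t c;
    c.type = frame[0];
    c.pan  = (uint16_t)((frame[1] << 8) | frame[2]);
    c.tilt = (uint16_t)((frame[3] << 8) | frame[4]);
    return c;
}
```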

It is also possible to inspect the block designs used in the projects: open the attached projects in Vivado and open the block diagram. There, the color ranges used by the tracking-by-detection algorithm can be changed. Double-click the block marked in the included picture of the block diagram; the range for each RGB color channel can then be modified. After changing a range, the bitstream must be regenerated and exported to the SDK projects.

Step 5: Evaluation

A short movie was recorded to present the current status of the project. The system with the tracking-by-detection algorithm is shown first, and the system with the mean-shift algorithm at the end.
