Introduction: Ball Tracking Robot

In this project, I am going to show how to make a ball tracking robot: a robot that identifies a ball and follows it. It is essentially a simple form of automated surveillance that can be used in the modern world. So let's hop in and start building...

NOTE: This is part of an assignment submitted to Deakin University, School of IT, for SIT-210 Embedded Systems Development.

Supplies

https://www.hackster.io/junejarohan/ball-tracking-robot-7a9865

Step 1: Introduction

Today's surveillance has a major drawback: it relies on human involvement, and humans, as we all know, are easily distracted. It is therefore important to build systems that can monitor regions autonomously and continuously, identify unwanted objects and dangers, and simultaneously make decisions and respond accordingly. Object tracking with intelligent systems and computers is essential to achieving this kind of automated surveillance.

Any outdoor surveillance system must be able to track objects moving in its field of view, classify them, and detect some of their activities. I have developed a method to track and classify such objects in realistic scenarios. Object tracking with a single camera is performed using background subtraction followed by region correspondence, which takes into account multiple cues including the velocities, sizes, and distances of bounding boxes.

Step 2: Materials and Software Used in This Project

Hardware Components Used:

  • Raspberry Pi (x1)
  • Raspberry Pi Camera Module (x1)
  • Ultrasonic Sensor (x3)
  • SparkFun Dual H-Bridge motor drivers L298 (x1)
  • DC Motor (x2)
  • Breadboard (x1)
  • Connecting Wires

Software Used:

  • OpenCV
  • Python

Step 3: What to Do?

A crucial point when detecting images frame by frame is to avoid frame drops: if the bot misses the direction of the ball's movement because of dropped frames, it can enter what I call a limbo state. The bot also enters this limbo state when the ball goes out of the camera's range; in that case, it makes a 360-degree turn to scan the space around it until the ball comes back into the frame, and then starts moving in its direction.
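The recovery behaviour above boils down to a per-frame decision: track when the ball is visible, spin when it is not. Here is a minimal sketch; the function and action names are mine, not from the original code:

```python
def plan(detections):
    """Map per-frame ball detections to motor actions.

    detections -- list of booleans, True when the ball was found in that frame
    Returns the action for each frame: "track" (drive toward the ball)
    or "spin" (rotate in place until the ball reappears).
    """
    return ["track" if seen else "spin" for seen in detections]

# The ball leaves the frame for two frames, then comes back:
print(plan([True, False, False, True]))  # ['track', 'spin', 'spin', 'track']
```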

For the image analysis, I take each frame and mask it with the colour of the ball. I then find all the contours in the mask, pick the largest one, bound it in a rectangle, draw that rectangle on the main image, and compute the coordinates of the rectangle's centre.

Finally, the bot tries to bring the ball's coordinates to the centre of its coordinate axis. This is how the robot works. It can be further enhanced with an IoT device such as a Particle Photon, which can notify you when the ball is detected and being followed, or when the robot has lost track of it and is returning to base.
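The centring step can be sketched as a deadband controller over the ball's horizontal offset. The ±20 pixel threshold mirrors the one used in the main loop code later on; the function name and the sign convention (negative = ball left of centre) are my assumptions, and the original code may define the offset with the opposite sign:

```python
def steer(centre_x, deadband=20):
    """Pick a turn direction from the ball's horizontal offset.

    centre_x -- x-offset of the ball's centre from the middle of the
                frame, in pixels (assumed negative when the ball is
                left of centre)
    Returns "left", "right", or "forward" when the ball is centred.
    """
    if centre_x < -deadband:
        return "left"      # ball well left of centre: turn toward it
    if centre_x > deadband:
        return "right"     # ball well right of centre: turn toward it
    return "forward"       # ball roughly centred: drive straight

print(steer(-45), steer(45), steer(5))  # left right forward
```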

For the image processing, you need to install OpenCV on your Raspberry Pi, which was quite tricky for me.

You can get all the information you need to install OpenCV through this link: click here

Step 4: Schematics

Above, I have provided the schematics for my project along with the Printed Circuit Board (PCB) layout.

And here are some of the main connections that you need to do:

• First of all, the Raspberry Pi Camera Module is connected directly to the Raspberry Pi.

• The VCC pins of the ultrasonic sensors are connected to a common supply rail, and likewise the GND pins to a common ground; the remaining two pins (trigger and echo) of each ultrasonic sensor are connected to GPIO pins on the Raspberry Pi.

• The Motors are connected using the H-Bridge.

• The Power is supplied using the Battery.

I have also added a video that might help you understand how the ultrasonic sensor works.

You can also follow this link if you cannot find the video above.
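For reference, the sensor's echo time converts to distance like this: sound travels at roughly 34300 cm/s at room temperature, and the pulse travels to the object and back, so the one-way distance is half the round trip. The function name here is mine, for illustration:

```python
def echo_to_cm(elapsed_s):
    """Convert an ultrasonic echo round-trip time (seconds) to distance in cm.

    The pulse travels out and back, so the one-way distance is half the
    round trip. Speed of sound is taken as ~34300 cm/s.
    """
    return elapsed_s * 34300 / 2.0

# A 1 ms round trip corresponds to about 17.15 cm
print(echo_to_cm(0.001))
```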

Step 5: How to Do?

I made this project as a basic robot that can track a ball. The robot uses a camera to do the image processing, taking frames and tracking the ball. To track the ball, various features such as its colour, size, and shape are used.

The robot looks for a hard-coded colour, searches for a ball of that colour, and follows it. I chose the Raspberry Pi as the controller for this project because it lets me use its camera module, gives great flexibility in code since it runs Python, which is very user-friendly, and supports the OpenCV library for analysing the images.

An H-Bridge is used to switch the direction of rotation of the motors or to stop them.

Finally, the bot tries to bring the ball's coordinates to the centre of its coordinate axis; this is how the robot works. As mentioned earlier, this can be enhanced with an IoT device such as a Particle Photon to notify you when the ball is detected, being followed, or lost. To do this, we would use IFTTT, an online software platform that connects devices and performs certain actions on specific triggers.

Step 6: Pseudo-Code

Here is the pseudo-code for the detection part using OpenCV where we detect a ball.
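The pseudo-code itself appears to be attached as an image on the project page and may not show here; in outline, the per-frame detection goes like this (the HSV masking step is an assumption about the colour-masking approach described earlier):

```
capture a frame from the camera
convert the frame to the HSV colour space
mask the frame with the lower/upper bounds of the ball's colour
find all contours in the mask
if at least one contour exists:
    take the largest contour
    bound it in a rectangle and draw the rectangle on the frame
    centre_x = rectangle centre x minus frame centre x
    steer so that centre_x moves toward 0
else:
    enter the limbo state: spin until the ball is back in frame
```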

Step 7: Code

Above are snippets of the code, and below is a detailed description of it.

# import the necessary packages

WE IMPORT ALL THE NEEDED PACKAGES

from picamera.array import PiRGBArray     # the Pi cannot reliably capture frames with VideoCapture at this resolution, so we use the picamera API instead
from picamera import PiCamera
import cv2                                # OpenCV, used for the image analysis
import RPi.GPIO as GPIO
import time
import numpy as np

NOW WE SET UP THE HARDWARE AND ASSIGN THE PINS CONNECTED ON RASPBERRY PI

GPIO.setmode(GPIO.BOARD)
GPIO_TRIGGER1 = 29      #Left ultrasonic sensor
GPIO_ECHO1 = 31
GPIO_TRIGGER2 = 36      #Front ultrasonic sensor
GPIO_ECHO2 = 37
GPIO_TRIGGER3 = 33      #Right ultrasonic sensor
GPIO_ECHO3 = 35
MOTOR1B=18  #Left Motor
MOTOR1E=22
MOTOR2B=21  #Right Motor
MOTOR2E=19
LED_PIN=13  #If it finds the ball, then it will light up the led
# Set pins as output and input
GPIO.setup(GPIO_TRIGGER1,GPIO.OUT)  # Trigger
GPIO.setup(GPIO_ECHO1,GPIO.IN)      # Echo
GPIO.setup(GPIO_TRIGGER2,GPIO.OUT)  # Trigger
GPIO.setup(GPIO_ECHO2,GPIO.IN)
GPIO.setup(GPIO_TRIGGER3,GPIO.OUT)  # Trigger
GPIO.setup(GPIO_ECHO3,GPIO.IN)
GPIO.setup(LED_PIN,GPIO.OUT)
# Set trigger to False (Low)
GPIO.output(GPIO_TRIGGER1, False)
GPIO.output(GPIO_TRIGGER2, False)
GPIO.output(GPIO_TRIGGER3, False)

THIS FUNCTION USES THE ULTRASONIC SENSORS TO COLLECT THE DISTANCES FROM THE OBJECTS AROUND OUR BOT

def sonar(GPIO_TRIGGER,GPIO_ECHO):
      start=0
      stop=0
      # Set pins as output and input
      GPIO.setup(GPIO_TRIGGER,GPIO.OUT)  # Trigger
      GPIO.setup(GPIO_ECHO,GPIO.IN)      # Echo

      # Set trigger to False (Low)
      GPIO.output(GPIO_TRIGGER, False)

      # Allow module to settle
      time.sleep(0.01)

      # Send 10us pulse to trigger
      GPIO.output(GPIO_TRIGGER, True)
      time.sleep(0.00001)
      GPIO.output(GPIO_TRIGGER, False)
      begin = time.time()
      # Wait for the echo to go high, with a timeout so the loop cannot hang
      while GPIO.input(GPIO_ECHO)==0 and time.time()<begin+0.05:
            start = time.time()
      # Wait for the echo to go low again (pulse received back)
      while GPIO.input(GPIO_ECHO)==1 and time.time()<begin+0.1:
            stop = time.time()
      # Sound travels to the object and back at ~34300 cm/s
      elapsed = stop-start
      distance = elapsed*34300/2.0
      return distance

GETTING THE DC MOTORS TO WORK WITH THE RASPBERRY PI

GPIO.setup(MOTOR1B, GPIO.OUT)
GPIO.setup(MOTOR1E, GPIO.OUT)

GPIO.setup(MOTOR2B, GPIO.OUT)
GPIO.setup(MOTOR2E, GPIO.OUT)

DEFINING FUNCTIONS TO OPERATE THE ROBOT AND MAKE IT MOVE IN DIFFERENT DIRECTIONS

def forward():
      GPIO.output(MOTOR1B, GPIO.HIGH)
      GPIO.output(MOTOR1E, GPIO.LOW)
      GPIO.output(MOTOR2B, GPIO.HIGH)
      GPIO.output(MOTOR2E, GPIO.LOW)
     
def reverse():
      GPIO.output(MOTOR1B, GPIO.LOW)
      GPIO.output(MOTOR1E, GPIO.HIGH)
      GPIO.output(MOTOR2B, GPIO.LOW)
      GPIO.output(MOTOR2E, GPIO.HIGH)
     
def rightturn():
      GPIO.output(MOTOR1B,GPIO.LOW)
      GPIO.output(MOTOR1E,GPIO.HIGH)
      GPIO.output(MOTOR2B,GPIO.HIGH)
      GPIO.output(MOTOR2E,GPIO.LOW)
     
def leftturn():
      GPIO.output(MOTOR1B,GPIO.HIGH)
      GPIO.output(MOTOR1E,GPIO.LOW)
      GPIO.output(MOTOR2B,GPIO.LOW)
      GPIO.output(MOTOR2E,GPIO.HIGH)
def stop():
      GPIO.output(MOTOR1E,GPIO.LOW)
      GPIO.output(MOTOR1B,GPIO.LOW)
      GPIO.output(MOTOR2E,GPIO.LOW)
      GPIO.output(MOTOR2B,GPIO.LOW)

MAKING THE CAMERA MODULE WORK AND ADJUSTING THE SETTINGS

#CAMERA CAPTURE
#initialize the camera and grab a reference to the raw camera capture
camera = PiCamera()
camera.resolution = (160, 120)
camera.framerate = 16
rawCapture = PiRGBArray(camera, size=(160, 120))
 
# allow the camera to warmup
time.sleep(0.001)

NOW IMPLEMENTING THE MAIN LOOP, WHERE THE BOT FOLLOWS THE BALL AND AVOIDS ANY OBSTACLE IN THE WAY

while True:
    # distance from the front ultrasonic sensor
    distanceC = sonar(GPIO_TRIGGER2,GPIO_ECHO2)
    # distance from the right ultrasonic sensor
    distanceR = sonar(GPIO_TRIGGER3,GPIO_ECHO3)
    # distance from the left ultrasonic sensor
    distanceL = sonar(GPIO_TRIGGER1,GPIO_ECHO1)

    # centre_x comes from the detection step described earlier: the
    # horizontal offset of the ball's centre from the middle of the frame

    if distanceC < 10:
        # something is closer than 10 cm in front: sidestep it, then
        # come back toward the ball
        if distanceR >= 8:
            rightturn()
            time.sleep(0.00625)
            stop()
            time.sleep(0.0125)
            forward()
            time.sleep(0.00625)
            stop()
            time.sleep(0.0125)
            leftturn()
            time.sleep(0.00625)
        elif distanceL >= 8:
            leftturn()
            time.sleep(0.00625)
            stop()
            time.sleep(0.0125)
            forward()
            time.sleep(0.00625)
            stop()
            time.sleep(0.0125)
            rightturn()
            time.sleep(0.00625)
            stop()
            time.sleep(0.0125)
        else:
            # boxed in on both sides: stay put
            stop()
            time.sleep(0.01)
    else:
        # otherwise the way ahead is clear: move forward
        forward()
        time.sleep(0.00625)

        if distanceC > 10:
            # bring the ball's coordinates to the centre of the camera's axis
            if centre_x <= -20 or centre_x >= 20:
                if centre_x < 0:
                    flag = 0
                    rightturn()
                    time.sleep(0.025)
                else:
                    flag = 1
                    leftturn()
                    time.sleep(0.025)
                    forward()
                    time.sleep(0.00003125)
                    stop()
                    time.sleep(0.00625)
            else:
                # the ball is centred and close: light up the LED
                GPIO.output(LED_PIN,GPIO.HIGH)
                time.sleep(0.1)
                stop()
                time.sleep(0.1)

    #cv2.imshow("draw",frame)
    rawCapture.truncate(0)  # clear the stream in preparation for the next frame

DO THE NECESSARY CLEANUP

GPIO.cleanup() #free all the GPIO pins

Step 8: External Links

Link to the demonstration video: click here (YouTube)

Link to the code on GitHub: click here (GitHub)