Introduction: Autonomous Soccer Robot

The objective of this project is to create robots that are able to play soccer. To make that work, we built a system that detects the robots and the ball by their colors and finds their respective positions and angles. This information is sent to the robots, which decide what to do according to their own code. For this project we made two robots, one for each team, but the system can handle more than that if you want.

Supplies

  1. Computer
  2. Webcam
  3. 3 (or more) ESP32 boards
  4. 2 (or more) robot structures
  5. 2 (or more) 9 V batteries (or another power supply)
  6. 4 (or more) DC motors
  7. 2 (or more) L298N H-Bridge modules

Step 1: Vision System

The vision system is made of a camera and an ESP32 connected to a Linux computer. The camera must be aimed at the field (which should ideally be white, to make it easier to detect the other colors).

Step 2: Detecting Colors

Using Python and the OpenCV library, we can detect each robot's position by checking its colors. To install OpenCV, you can check out this link.
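
If you already have Python 3 and pip on the computer, installing the prebuilt package from PyPI usually works as well: just run pip install opencv-python (NumPy is pulled in automatically). This is only an alternative to the installation guide linked above.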

After getting OpenCV, we can start thinking about the logic to detect the robots. We are using a yellow ball, a blue and a green circle for the first robot, and a blue and a pink circle for the other one. To detect each color, we have to mask the image captured by the camera by defining a range of colors in HSV. For that, we use two dictionaries, one for the lower and one for the upper bounds of each color. When you mask the image, everything turns black except the pixels that fall inside the range you selected, which turn white. Then you can look for contours in the masked image.

Here's an example of how to detect the color yellow:

detect_yellow.py

import cv2
import numpy as np

# Read the camera
cap = cv2.VideoCapture(0)
# Define range of colors in HSV
lower = {'Yellow': (15, 90, 95)}
upper = {'Yellow': (25, 220, 205)}
# Define color in BGR
color = {'Yellow': (0, 255, 255)}

while True:
    # Take each frame
    _, frame = cap.read()
    # Convert BGR to HSV
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Define mask
    mask = cv2.inRange(hsv, lower['Yellow'], upper['Yellow'])
    # Find contours in the mask
    cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if len(cnts) > 0:
        # Find largest contour in the ball mask
        contour = max(cnts, key=cv2.contourArea)
        # Draw contour
        cv2.drawContours(frame, [contour], 0, color['Yellow'], 2)
    res = cv2.bitwise_and(frame, frame, mask=mask)
    # Show screens
    cv2.imshow('frame', frame)
    cv2.imshow('mask', mask)
    cv2.imshow('res', res)
    k = cv2.waitKey(1) & 0xFF
    if k == ord("q"):
        break

cv2.destroyAllWindows()

Step 3: How to Find the Range of Each Color

This was by far the hardest part. Finding the best colors for each robot and for the ball took a long time, since we had to adjust the ranges a lot, mainly because of the back light. Ideally, do this project in a room where the lighting doesn't change much, so you don't have to keep retuning the color ranges.

To make it easier to find the range for each color, we used this code: https://github.com/alkasm/colorfilters. With it, we could easily change the hue, saturation and value (HSV) bounds until the mask hides everything except the color we want; all we had to do was save an image captured by the camera and open it in the tool. To save the image, we wrote a short script called "photo.py".

photo.py

import numpy as np
import cv2
cap = cv2.VideoCapture(0)
# Capture frame-by-frame
ret, frame = cap.read()
# do what you want with frame
# and then save to file
cv2.imwrite('/home/marco/Soccer_Robot/colorfilters-master/image.jpg', frame)
# When everything done, release the capture
cap.release()
cv2.destroyAllWindows()
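
If you prefer not to use the colorfilters repository, a simple alternative is an OpenCV window with six HSV trackbars. The sketch below is a minimal example that was not part of the original project (the file name "hsv_picker.py" and the image path are placeholders): move the sliders until only the target color stays white in the mask, then copy the six values into the lower and upper dictionaries.

hsv_picker.py

import cv2

def nothing(x):
    pass

# Load the photo saved by photo.py (placeholder path)
image = cv2.imread('image.jpg')
hsv = cv2.cvtColor(image, cv2.COLOR_BGR2HSV)

# One trackbar per HSV bound
cv2.namedWindow('mask')
for name, maximum, start in [('H low', 179, 0), ('S low', 255, 0), ('V low', 255, 0),
                             ('H high', 179, 179), ('S high', 255, 255), ('V high', 255, 255)]:
    cv2.createTrackbar(name, 'mask', start, maximum, nothing)

while True:
    lower = (cv2.getTrackbarPos('H low', 'mask'),
             cv2.getTrackbarPos('S low', 'mask'),
             cv2.getTrackbarPos('V low', 'mask'))
    upper = (cv2.getTrackbarPos('H high', 'mask'),
             cv2.getTrackbarPos('S high', 'mask'),
             cv2.getTrackbarPos('V high', 'mask'))
    mask = cv2.inRange(hsv, lower, upper)
    cv2.imshow('mask', mask)
    if cv2.waitKey(30) & 0xFF == ord('q'):
        print('lower =', lower, 'upper =', upper)
        break

cv2.destroyAllWindows()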

Step 4: Position and Angle Detection

Now that we know how to mask an image to find a color, we need to find its position. After finding the contour of your color, the position can easily be found with a few OpenCV calls, as shown in "detect_yellow_position.py".
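
Concretely, the centroid comes from the contour's image moments (cx = M10/M00, cy = M01/M00), and minEnclosingCircle gives a radius that is handy for placing labels. Here is a standalone sketch using the same OpenCV calls as the project scripts; the blob is a dummy circle drawn just so the snippet runs on its own:

import cv2
import numpy as np

# Dummy binary mask with one white blob, standing in for the output of cv2.inRange
mask = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(mask, (320, 240), 40, 255, -1)

# Turn the mask into a position
cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
if len(cnts) > 0:
    contour = max(cnts, key=cv2.contourArea)             # largest blob
    ((x, y), radius) = cv2.minEnclosingCircle(contour)   # rough size, useful for drawing labels
    M = cv2.moments(contour)
    if M["m00"] != 0:                                    # avoid division by zero
        center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
        print("center:", center, "radius:", int(radius)) # (320, 240) and ~40 for the dummy mask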

Well, that's all for the ball... But we also need to know which way the robots are facing, so the logic for the robots is a little more involved.

To get the robots' angles, we use two colors on each one, so we can draw a line connecting the center of the first color to the center of the second. With this line and some trigonometry, we can get the angle.
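
As a standalone illustration of the trigonometry (not part of the project files, and the example centers are made up), the angle of the line from the team-color center to the position-color center comes from atan2, converted to degrees and normalized to 0-360:

import math

# Hypothetical centers found from the two color masks (in pixels)
center_team = (200, 150)      # center of the team color
center_position = (230, 120)  # center of the position color

# Angle of the line from the team color to the position color, radians to degrees
angle = math.atan2(center_position[1] - center_team[1],
                   center_position[0] - center_team[0]) * 180 / math.pi
# Keep the angle in the 0-360 range, as the robots expect
if angle < 0:
    angle = 360 + angle

print(int(angle))  # 315 for the centers above, since the image y axis points down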

To get all the information about a robot, we first mask the image captured by the camera using the ranges of the robot's two colors combined. From that combined mask we get the robot's position. After that, we make two separate masks, one for each color, and find their centers. The next step is to calculate the angle, and then we have all the information we need.

detect_yellow_position.py

import cv2
import numpy as np

# Read the camera
cap = cv2.VideoCapture(0)
# Define range of colors in HSV
lower = {'Yellow': (15, 90, 95)}
upper = {'Yellow': (25, 220, 205)}
# Define color in BGR
color = {'Yellow': (0, 255, 255)}

while True:
    # Take each frame
    _, frame = cap.read()
    # Convert BGR to HSV
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Define mask
    mask = cv2.inRange(hsv, lower['Yellow'], upper['Yellow'])
    # Find contours in the mask and initialize the current (x, y) center of the ball
    cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    center = None
    if len(cnts) > 0:
        # Find largest contour in the ball mask
        contour = max(cnts, key=cv2.contourArea)
        # Get circle info
        ((x, y), radius) = cv2.minEnclosingCircle(contour)
        M = cv2.moments(contour)
        if M["m00"] != 0:
            center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
        else:
            center = None
        if center is not None:
            # Print position and contour
            cv2.putText(frame, 'Yellow' + str(center), (int(x - radius), int(y - radius)), cv2.FONT_HERSHEY_SIMPLEX, 0.6, color['Yellow'], 2)
            cv2.drawContours(frame, [contour], 0, color['Yellow'], 2)
    res = cv2.bitwise_and(frame, frame, mask=mask)
    # Show screens
    cv2.imshow('frame', frame)
    cv2.imshow('mask', mask)
    cv2.imshow('res', res)
    k = cv2.waitKey(1) & 0xFF
    if k == ord("q"):
        break

cv2.destroyAllWindows()

detect_robots.py

import cv2
import numpy as np
import math

# Read the camera
cap = cv2.VideoCapture(0)
# Define range of colors in HSV
lower = {'Blue': (40, 170, 115), 'Green': (63, 50, 50), 'Pink': (125, 100, 100)}
upper = {'Blue': (120, 255, 210), 'Green': (98, 250, 140), 'Pink': (179, 255, 255)}
# Define colors in BGR
colors = {'Blue': (255, 0, 0), 'Green': (0, 255, 0), 'Pink': (255, 0, 255)}
# Create masks
masks = {}
# Create lists of position and team colors
positions = {'Blue'}
teams = {'Green', 'Pink'}

while True:
    # Take each frame
    _, frame = cap.read()
    # Convert BGR to HSV
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Define masks
    for key in colors:
        masks[key] = cv2.inRange(hsv, lower[key], upper[key])
    mask_all = np.full((frame.shape[0], frame.shape[1]), 0, dtype=np.uint8)
    # For each team
    for team in teams:
        # For each robot of each team
        for position in positions:
            # Create a mask for the robot
            mask = cv2.bitwise_or(masks[team], masks[position])
            # Find contours in the mask and initialize the current (x, y) center of the robot
            cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
            center = None
            if len(cnts) > 0:
                # Find largest contour in the robot mask
                contour = max(cnts, key=cv2.contourArea)
                # Get circle info
                ((x, y), radius) = cv2.minEnclosingCircle(contour)
                M = cv2.moments(contour)
                if M["m00"] != 0:
                    center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
                else:
                    center = None
                if center is not None:
                    # Create a mask for the contour
                    mask_c = np.full((frame.shape[0], frame.shape[1]), 0, dtype=np.uint8)
                    cv2.drawContours(mask_c, [contour], -1, (255, 255, 255), -1)
                    # Mask the team color and the position color
                    mask_team = cv2.bitwise_and(mask_c, mask_c, mask=masks[team])
                    mask_position = cv2.bitwise_and(mask_c, mask_c, mask=masks[position])
                    # Find their centers
                    cnts_team = cv2.findContours(mask_team.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
                    center_team = None
                    if len(cnts_team) > 0:
                        contour_team = max(cnts_team, key=cv2.contourArea)
                        ((x_team, y_team), radius_team) = cv2.minEnclosingCircle(contour_team)
                        M_team = cv2.moments(contour_team)
                        if M_team["m00"] != 0:
                            center_team = (int(M_team["m10"] / M_team["m00"]), int(M_team["m01"] / M_team["m00"]))
                        else:
                            center_team = None
                    else:
                        center_team = None
                    cnts_position = cv2.findContours(mask_position.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
                    center_position = None
                    if len(cnts_position) > 0:
                        contour_position = max(cnts_position, key=cv2.contourArea)
                        ((x_position, y_position), radius_position) = cv2.minEnclosingCircle(contour_position)
                        M_position = cv2.moments(contour_position)
                        if M_position["m00"] != 0:
                            center_position = (int(M_position["m10"] / M_position["m00"]), int(M_position["m01"] / M_position["m00"]))
                        else:
                            center_position = None
                    else:
                        center_position = None
                    if (center_team is not None) and (center_position is not None):
                        # Mask the contour
                        cv2.drawContours(mask_all, [contour], -1, (255, 255, 255), -1)
                        # Calculate robot's orientation
                        angle = math.atan2(center_position[1] - center_team[1], center_position[0] - center_team[0]) * 180 / math.pi
                        if angle < 0:
                            angle = 360 + angle
                        # Print position and angle
                        cv2.line(frame, center_position, center_team, colors[team])
                        cv2.putText(frame, position, (int(x - radius), int(y - radius)), cv2.FONT_HERSHEY_SIMPLEX, 0.6, colors[team], 2)
                        cv2.putText(frame, str(center) + ', Angle:' + str(int(angle)), (int(x - radius), int(y - radius + 20)), cv2.FONT_HERSHEY_SIMPLEX, 0.6, colors[team], 2)
                    # Bitwise-AND mask and original image
                    res = cv2.bitwise_and(frame, frame, mask=mask)
                    res_c = cv2.bitwise_and(frame, frame, mask=mask_c)
    res_all = cv2.bitwise_and(frame, frame, mask=mask_all)
    # Show screens
    cv2.imshow('frame', frame)
    cv2.imshow('mask_all', mask_all)
    cv2.imshow('res_all', res_all)
    k = cv2.waitKey(1) & 0xFF
    if k == ord("q"):
        break

cv2.destroyAllWindows()

Step 5: Vision Code

To send the information to the robots, we use the ESP-NOW protocol. Each robot has an ESP32 that receives the data and controls the robot. There is also an ESP32 connected to the computer, which is responsible for sending the data to the others. Since it is connected to the computer via USB, we send the data from the computer to this master ESP over serial communication.
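
For reference, the string built by the vision code looks like the made-up example below: one entry per detected object ending in ';', with '|' closing the whole message (when nothing is detected, the code sends just "null|"). The positions and angles here are hypothetical; the real ones come from the detection loop in vision.py.

import serial

# Hypothetical detections: ball position, then team-position, center and angle per robot
info = 'Yellow(320, 240);' + 'Green-Blue(100, 150)270;' + 'Pink-Blue(500, 300)90;'
info = info + '|'  # the '|' marks the end of one frame of data

# Send it to the master ESP32 over USB serial (same port and baud rate as vision.py)
ser = serial.Serial('/dev/ttyUSB0', 115200, timeout=0.5)
ser.write(str.encode(info))
ser.close()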

Connecting all these steps, we get the code below as the final vision code:

vision.py

import cv2
import numpy as np
import serial
import math

# Read the camera
cap = cv2.VideoCapture(0)
# Serial
ser = serial.Serial('/dev/ttyUSB0', 115200, timeout=0.5)
# Define range of colors in HSV
lower = {'Blue': (91, 90, 70), 'Green': (63, 50, 50), 'Yellow': (16, 130, 30), 'Pink': (157, 120, 150)}
upper = {'Blue': (154, 255, 160), 'Green': (98, 250, 140), 'Yellow': (31, 200, 185), 'Pink': (179, 210, 210)}
# Define colors in BGR
colors = {'Blue': (255, 0, 0), 'Green': (0, 255, 0), 'Yellow': (0, 255, 255), 'Pink': (255, 0, 255)}
# Create masks
masks = {}
# Create lists of robots and define color of the ball
positions = {'Blue'}
teams = {'Green', 'Pink'}
ball = 'Yellow'

while True:
    # Take each frame
    _, frame = cap.read()
    # Convert BGR to HSV
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Define info to be sent
    info = ''
    # Define masks
    for key in colors:
        masks[key] = cv2.inRange(hsv, lower[key], upper[key])
    mask_all = np.full((frame.shape[0], frame.shape[1]), 0, dtype=np.uint8)
    # Find contours in the mask and initialize the current (x, y) center of the ball
    cnts = cv2.findContours(masks[ball].copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    center = None
    if len(cnts) > 0:
        # Find largest contour in the ball mask
        contour = max(cnts, key=cv2.contourArea)
        # Get circle info
        ((x, y), radius) = cv2.minEnclosingCircle(contour)
        M = cv2.moments(contour)
        if M["m00"] != 0:
            center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
        else:
            center = None
        if center is not None:
            # Mask the contour
            cv2.drawContours(mask_all, [contour], -1, (255, 255, 255), -1)
            # Print position
            cv2.putText(frame, ball + str(center), (int(x - radius), int(y - radius)), cv2.FONT_HERSHEY_SIMPLEX, 0.6, colors[ball], 2)
            # Save info
            info = info + ball + str(center) + ';'
    # For each team
    for team in teams:
        # For each robot of each team
        for position in positions:
            # Create a mask for the robot
            mask = cv2.bitwise_or(masks[team], masks[position])
            # Find contours in the mask and initialize the current (x, y) center of the robot
            cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
            center = None
            if len(cnts) > 0:
                # Find largest contour in the robot mask
                contour = max(cnts, key=cv2.contourArea)
                # Get circle info
                ((x, y), radius) = cv2.minEnclosingCircle(contour)
                M = cv2.moments(contour)
                if M["m00"] != 0:
                    center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
                else:
                    center = None
                if center is not None:
                    # Create a mask for the contour
                    mask_c = np.full((frame.shape[0], frame.shape[1]), 0, dtype=np.uint8)
                    cv2.drawContours(mask_c, [contour], -1, (255, 255, 255), -1)
                    # Mask the team color and the position color
                    mask_team = cv2.bitwise_and(mask_c, mask_c, mask=masks[team])
                    mask_position = cv2.bitwise_and(mask_c, mask_c, mask=masks[position])
                    # Find their centers
                    cnts_team = cv2.findContours(mask_team.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
                    center_team = None
                    if len(cnts_team) > 0:
                        contour_team = max(cnts_team, key=cv2.contourArea)
                        ((x_team, y_team), radius_team) = cv2.minEnclosingCircle(contour_team)
                        M_team = cv2.moments(contour_team)
                        if M_team["m00"] != 0:
                            center_team = (int(M_team["m10"] / M_team["m00"]), int(M_team["m01"] / M_team["m00"]))
                        else:
                            center_team = None
                    else:
                        center_team = None
                    cnts_position = cv2.findContours(mask_position.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
                    center_position = None
                    if len(cnts_position) > 0:
                        contour_position = max(cnts_position, key=cv2.contourArea)
                        ((x_position, y_position), radius_position) = cv2.minEnclosingCircle(contour_position)
                        M_position = cv2.moments(contour_position)
                        if M_position["m00"] != 0:
                            center_position = (int(M_position["m10"] / M_position["m00"]), int(M_position["m01"] / M_position["m00"]))
                        else:
                            center_position = None
                    else:
                        center_position = None
                    if (center_team is not None) and (center_position is not None):
                        # Mask the contour
                        cv2.drawContours(mask_all, [contour], -1, (255, 255, 255), -1)
                        # Calculate robot's orientation
                        angle = math.atan2(center_position[1] - center_team[1], center_position[0] - center_team[0]) * 180 / math.pi
                        if angle < 0:
                            angle = 360 + angle
                        # Print position and angle and save them into info
                        cv2.line(frame, center_position, center_team, colors[team])
                        cv2.putText(frame, position, (int(x - radius), int(y - radius)), cv2.FONT_HERSHEY_SIMPLEX, 0.6, colors[team], 2)
                        cv2.putText(frame, str(center) + ', Angle:' + str(int(angle)), (int(x - radius), int(y - radius + 20)), cv2.FONT_HERSHEY_SIMPLEX, 0.6, colors[team], 2)
                        info = info + team + '-' + position + str(center) + str(int(angle)) + ';'
                    # Bitwise-AND mask and original image
                    res = cv2.bitwise_and(frame, frame, mask=mask)
                    res_c = cv2.bitwise_and(frame, frame, mask=mask_c)
    res_all = cv2.bitwise_and(frame, frame, mask=mask_all)
    # Finish the info
    if info == "":
        info = "null"
    info = info + '|'
    # Convert str to bytes and send it
    ser.write(str.encode(info))
    # Show screens
    cv2.imshow('frame', frame)
    cv2.imshow('mask_all', mask_all)
    cv2.imshow('res_all', res_all)
    k = cv2.waitKey(1) & 0xFF
    if k == ord("q"):
        break

cv2.destroyAllWindows()
ser.close()

Step 6: Robot Structure

We needed small robots that wouldn't cover the ball, in a color we didn't use in the vision code. So we decided to print our own design, using a 3D printer and orange PLA. The design was made in Tinkercad, and you can check it out here.

Step 7: How the Robot Works

Each robot has an ESP32 that is responsible for receiving the information from the ESP master and for controlling the H-bridge module. This module drives the motors and is powered directly from the battery. Since it has three power pins and one of them is regulated to 5 V, we use that pin to power the ESP. We also use a switch to easily turn the robot on and off.

Step 8: Robot's Code

As mentioned before, each robot has an ESP32 to control its movements. The code for these ESPs is written in the Arduino IDE. If you don't have it, download it here and install the ESP32 boards in the Arduino IDE following this link. The code is almost the same for every robot, except for some defined variables. You might need to change the following lines:

#define team "Green"

#define pos "Blue"

#define adteam "Pink"

#define ball "Yellow"

#define quant_robots 2

#define quant_team 2

#define xgoal 10

#define ygoal 225

robot_Green-Blue.ino

#include <esp_now.h>
#include <WiFi.h>

//pins to control the robot
#define IN1 18
#define IN2 19
#define IN3 21
#define IN4 22
#define ENA 25 //enable M1
#define ENB 26 //enable M2
//define robot and ball
#define team "Green"
#define pos "Blue"
#define adteam "Pink"
#define ball "Yellow"
#define quant_robots 2
#define quant_team 2
//define limits
#define angl 20
#define distl 50
#define time 8
//define goal coords
#define xgoal 10
#define ygoal 225

String coords_str; //variable that receives the info
int goal = 0;

void setup() {
  Serial.begin(115200);
  WiFi.mode(WIFI_STA);
  Serial.print("Mac Address in Station: ");
  Serial.println(WiFi.macAddress());
  InitESPNow(); //starts ESPNow
  esp_now_register_recv_cb(OnDataRecv); //define function to receive data via ESPNow
  //define pins as output
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
  pinMode(IN3, OUTPUT);
  pinMode(IN4, OUTPUT);
  pinMode(ENA, OUTPUT);
  pinMode(ENB, OUTPUT);
  //control speed: speed(speed_m1, speed_m2)
  speed(1024, 1024); //use values between 600 and 1024
  stop(); //stops the robot
}
void loop() {
  func();
}

//***main function***//
void func() {
  //variables for positions
  int xrobot = -1, yrobot = -1, arobot = 400;
  int xball = -1, yball = -1;
  int xad[3] = {-1, -1, -1}, yad[3] = {-1, -1, -1};
  String robot = team; robot += "-"; robot += pos;
  Serial.println();
  Serial.println(coords_str);
  //finds robot
  if (coords_str.indexOf(robot) != -1) { //gets x, y and angle of the robot
    String robot_temp = coords_str.substring(coords_str.indexOf(robot), coords_str.indexOf(";", coords_str.indexOf(robot)));
    xrobot = robot_temp.substring(robot_temp.indexOf('(') + 1, robot_temp.indexOf(',')).toInt();
    yrobot = robot_temp.substring(robot_temp.indexOf(',') + 1, robot_temp.indexOf(')')).toInt();
    arobot = robot_temp.substring(robot_temp.indexOf(')') + 1, robot_temp.indexOf(';')).toInt();
  } else {
    stop();
  }
  //finds ball
  if (coords_str.indexOf(ball) != -1) { //gets x and y of the ball
    String ball_temp = coords_str.substring(coords_str.indexOf(ball), coords_str.indexOf(";", coords_str.indexOf(ball)));
    xball = ball_temp.substring(ball_temp.indexOf('(') + 1, ball_temp.indexOf(',')).toInt();
    yball = ball_temp.substring(ball_temp.indexOf(',') + 1, ball_temp.indexOf(')')).toInt();
  } else {
    stop();
  }
  //finds adversaries
  int i, j = -1;
  for (i = 0; i < (quant_robots / quant_team); i++) {
    j = coords_str.indexOf(adteam, j + 1);
    if (j != -1) {
      String ad = coords_str.substring(j, coords_str.indexOf(";", j));
      xad[i] = ad.substring(ad.indexOf('(') + 1, ad.indexOf(',')).toInt();
      yad[i] = ad.substring(ad.indexOf(',') + 1, ad.indexOf(')')).toInt();
      printc("Adversary", xad[i], yad[i]);
    }
  }
  //starts strategy
  if (xrobot != -1 && yrobot != -1 && arobot != 400 && xball != -1 && yball != -1) {
    printc("Robot", xrobot, yrobot);
    printc("Ball", xball, yball);
    strategy(xrobot, yrobot, arobot, xball, yball, xad, yad);
  } else {
    stop();
  }
}
//***functions for the strategy***//
//main
void strategy(int xrobot, int yrobot, int arobot, int xball, int yball, int xad[], int yad[]) {
  int aball = angle(xrobot, yrobot, xball, yball); //calculates the angle between the robot and the ball
  int distball = distance(xrobot, yrobot, xball, yball); //calculates the distance between the robot and the ball
  int with_ball = approach(arobot, aball, distball); //approaches the robot to the ball
  if (with_ball) {
    int agoal = angle(xrobot, yrobot, xgoal, ygoal); //calculates the angle between the robot and the goal
    int distgoal = distance(xrobot, yrobot, xgoal, ygoal); //calculates the distance between the robot and the goal
    int goal = approach(arobot, agoal, distgoal); //approaches the robot to the goal
  }
}

//checks for aligned objects
int aligned(int arobot, int a) {
  //if there's an object forward
  if ((arobot > (a - angl) && arobot < (a + angl)) || (arobot > (a - angl + 360) && arobot < (a + angl + 360))) {
    return 1;
  } else {
    //if there's an object backward
    if ((arobot > (a - angl + 180) && arobot < (a + angl + 180)) || (arobot > (a - angl + 540) && arobot < (a + angl + 540))) {
      return -1;
    } else {
      return 0;
    }
  }
}

//aligns the robot to an object
int search(int arobot, int a) {
  int object = aligned(arobot, a); //checks for aligned objects
  if (!object) { //if the object isn't aligned
    //turns the robot
    int dif = a - arobot;
    if (dif < 0)
      dif += 360;
    if (dif < 180) {
      right();
    } else {
      left();
    }
  }
  return object; //returns the alignment
}

//approaches the robot to an object
int approach(int arobot, int a, int dist) {
  int dir = search(arobot, a); //aligns the robot to the object
  if (dir) { //if they're aligned
    if (dist > distl) { //if they're distant
      if (dir == 1) { //if the object is forward
        forward();
      } else {
        if (dir == -1) { //if the object is backward
          backward();
        }
      }
    } else {
      return 1; //they're close
    }
  }
  return 0; //they're distant
}

//calculates distance
int distance(int x1, int y1, int x2, int y2) {
  return sqrt(pow((y2 - y1), 2) + pow((x2 - x1), 2)); //calculates and returns the distance between two objects
}

//calculates angle
int angle(int x1, int y1, int x2, int y2) {
  int a = atan2((y2 - y1), (x2 - x1)) * 180 / M_PI; //calculates the angle between two objects
  //fixes negative angles
  if (a < 0) {
    a += 360;
  }
  return a; //returns the angle
}
//***moving functions***//
//function for the speed
void speed(int speed1, int speed2) {
  ledcAttachPin(ENA, 0); //pin ENA to channel 0 (motor 1)
  ledcSetup(0, 15000, 10); //frequency of 15000 Hz and resolution of 10 bits for channel 0
  ledcAttachPin(ENB, 1); //pin ENB to channel 1 (motor 2)
  ledcSetup(1, 15000, 10); //frequency of 15000 Hz and resolution of 10 bits for channel 1
  //control speed
  ledcWrite(0, speed1);
  ledcWrite(1, speed2);
}

//function to stop the robot
void stop() {
  digitalWrite(IN1, HIGH);
  digitalWrite(IN2, HIGH);
  digitalWrite(IN3, HIGH);
  digitalWrite(IN4, HIGH);
  digitalWrite(ENA, LOW);
  digitalWrite(ENB, LOW);
}

//function to go backward
void backward() {
  digitalWrite(IN1, LOW);
  digitalWrite(IN2, HIGH);
  digitalWrite(IN3, LOW);
  digitalWrite(IN4, HIGH);
  digitalWrite(ENA, HIGH);
  digitalWrite(ENB, HIGH);
  delay(time);
  stop();
}

//function to go forward
void forward() {
  digitalWrite(IN1, HIGH);
  digitalWrite(IN2, LOW);
  digitalWrite(IN3, HIGH);
  digitalWrite(IN4, LOW);
  digitalWrite(ENA, HIGH);
  digitalWrite(ENB, HIGH);
  delay(time);
  stop();
}

//function to turn to the right
void right() {
  digitalWrite(IN1, LOW);
  digitalWrite(IN2, HIGH);
  digitalWrite(IN3, HIGH);
  digitalWrite(IN4, LOW);
  digitalWrite(ENA, HIGH);
  digitalWrite(ENB, HIGH);
  delay(time);
  stop();
}

//function to turn to the left
void left() {
  digitalWrite(IN1, HIGH);
  digitalWrite(IN2, LOW);
  digitalWrite(IN3, LOW);
  digitalWrite(IN4, HIGH);
  digitalWrite(ENA, HIGH);
  digitalWrite(ENB, HIGH);
  delay(time);
  stop();
}
//***communication functions***//
//print coords
void printc(String type, int x, int y) {
  Serial.print(type);
  Serial.print(": (");
  Serial.print(x);
  Serial.print(",");
  Serial.print(y);
  Serial.println(")");
}

//ESPNow initialization
void InitESPNow() {
  //If it succeeded
  if (esp_now_init() == ESP_OK) {
    Serial.println("ESPNow Init Success");
  }
  //If there was an error
  else {
    Serial.println("ESPNow Init Failed");
    ESP.restart();
  }
}

//received data
void OnDataRecv(const uint8_t *mac_addr, const uint8_t *data, int data_len) {
  char macStr[18];
  snprintf(macStr, sizeof(macStr), "%02x:%02x:%02x:%02x:%02x:%02x",
           mac_addr[0], mac_addr[1], mac_addr[2], mac_addr[3], mac_addr[4], mac_addr[5]);
  //Serial.print("Received from: ");
  //Serial.println(macStr);
  //Serial.println(data_len);
  //Serial.println((char*)data);
  //Serial.println("");
  coords_str = "";
  coords_str = (String)((char *)data);
}

Step 9: Results

The robots are finally ready to play, and you can check the results by watching the video.