Note: this Instructable requires a little knowledge of the Raspberry Pi and Arduino. For those of you with no experience using them, I suggest you follow the links below first; otherwise, enjoy the Instructable.

It's easy; just follow the links below.

Raspberry Pi tutorial (not by me)

https://www.instructables.com/class/Arduino-Class/ ( not by me)

So if you are done with the links above, or you have used these boards before, then you can easily follow my Instructable.

So let me tell you about I BOT.

I BOT is an open-source robotics and IoT platform.

The name I BOT stands for Internet robot: a robot that can perform tasks on command over the cloud or a LAN. The robot is a companion that can talk, move around with you, and change the environmental conditions for you. It also provides home security with a camera placed in its mouth and notifies you over email. I BOT is meant to be around people and make their lives happier and better. It can introduce itself and interact with the user through a set of predefined questions. Its 3D-printed body is designed so robotics hobbyists can add more hardware or features to it. It runs on a Raspberry Pi (a Broadcom ARM-based microcomputer), and its head is fitted with a camera. I BOT can thus serve as an excellent platform for programmers too, providing an opportunity to run their algorithms and see them working in real time. A robot which is a lot more than a robot.

Step 1: Components Required

This is the list of the components you will need throughout this Instructable.

There are different versions of some of the items below available in the market, so I recommend you look at the attached pictures.

However, I am including the links from where I bought mine; all the others are the same and commonly available.

List of all Components and Cost

  • 1. Capacitors
  • 10nF
  • 50nF
  • 10uF/63V
  • 100uF/50V
  • 1000uF/6.3V
  • 2. Resistors
  • 10K
  • 1M
  • 100 (2W)
  • 330
  • 470

  1. 555 timer IC (astable multivibrator)
  2. 741 op-amp IC (comparator)
  3. CD4017 (decade counter)
  4. Potentiometers: 10K, 500K, 2.2K
  5. Arduino Nano (microcontroller)
  6. Raspberry Pi 3 (microcomputer) (https://www.protocentral.com/raspberry-pi-main-boa...
  7. LDR (100K)
  8. HC-SR04 ultrasonic sensor
  9. 16x2 LCD display
  10. BC547 transistor
  11. LEDs (RGB and white)
  12. DHT11 temperature and humidity sensor
  13. DS1307 real-time clock (RTC)
  14. APR33A3 voice/speech module
  15. USB camera
  16. 8-ohm speaker
  17. Servo motors (180° and continuous rotation)

Step 2: 3D Printing

The first part of every robotics project is to get the robot's structure done.

I was worried about the design part at first, but it is easy with 123D Design by Autodesk (it is free to download and very easy to use).

The entire design was done in 123D Design.

When the design was done, it was time to print the parts, so I used my college's printer.

You can find the attached STL files and print another I BOT for yourself.

If you don't have a printer, not to worry; there are many services that can print it for you (3D Hubs, etc.).

For beginners in 3D design, I would recommend 123D Design if you want to redesign I BOT.

Download link if you want to try it out:

The material I used for my I BOT is ABS.

The design of the robot gives developers enough space to add more hardware and give the robot more features.

All the STL files are attached.

For any queries about the STL files, please mail me at brainhackerz15@gmail.com.

So the design was broken down into four parts:

  1. body
  2. head
  3. servo brackets
  4. hands

I wanted to design a body with enough space inside so that all the electronics could go in, plus space for the sensors and display unit on the front of the robot. So I came up with the above design.

Then came the head: one which would suit the body, and of course one which looks good.

I got inspiration from Pepper, the humanoid robot by SoftBank.

Next were the servo brackets for the continuous rotation servos at the bottom and for the hand servos.

By now I was short of time;

there was no time to make proper hands for the robot, but it would definitely not look good without them.

So I made dummy hands which are not that functional but are OK for now (I will change the hands soon).

Some of you might not like a few of the parts of I BOT, so feel free to modify them.

I am attaching a few screenshots of the STL files; have a look at them before you download.

For the robot to move around, I had to make a chassis (picture attached).

After you have all the parts, it is very easy to figure out which part goes where.

Step 3: Pan and Tilt Object Tracking

Sorry for making this step very long; I had to, to make sure that nothing goes wrong while you follow it.

Now that the structure is ready, I started bringing in the electronics.

The first thing I started with was testing the pan and tilt motion of the head.

To do this, I ran the servo test code, which you will find in Step 11 of this Instructable.

The following video shows the pan and tilt assembly.

Before you get started with the object tracking:

Note: remember that the Raspberry Pi GPIOs should be current-limited, so I made a small board with 330-ohm resistors in series with all the GPIOs. I recommend you do the same.

So now let's get started.

First of all, you need a USB webcam and a Raspberry Pi.

Boot up your Raspberry Pi, open a terminal window, and type the following to check whether the camera works:

lsusb                                       # view your camera info

cd /dev                                     # move to the /dev directory on your Pi

ls -l                                       # check whether the camera is listed, usually as "video0"

cd ~                                        # switch back to home

sudo apt-get install fswebcam               # install an app to take images

fswebcam -r 640x480 --no-banner image.jpg   # take a picture; the numbers are the resolution

After you are done with the above,

go to the file manager and look for image.jpg; that is the picture you took (so now you know your camera is working).

Now let's try video:

sudo apt-get install luvcview               # install an application which can stream video from your camera

luvcview -s 320x240                         # stream video

If you get a video frame, then everything is fine on your Pi.

To start with object tracking, you must have OpenCV with Python installed on your Pi.

To know more about OpenCV: http://opencv.org/

To install OpenCV and do all the other configuration, follow the steps below,

entering all of these commands in the terminal one by one.

sudo apt-get update
sudo apt-get upgrade

sudo apt-get install libopencv-dev python-opencv

cd ~

sudo apt-get update
sudo apt-get upgrade

sudo apt-get install python-numpy python-scipy python-matplotlib

sudo apt-get install build-essential cmake pkg-config

sudo apt-get install default-jdk ant

sudo apt-get install libgtkglext1-dev

sudo apt-get install v4l-utils

sudo apt-get install libjpeg8 libjpeg8-dev libjpeg8-dbg libjpeg-progs libavcodec-dev libavformat-dev libgstreamer0.10-0-dbg libgstreamer0.10-0 libgstreamer0.10-dev libxine2-dev libunicap2 libunicap2-dev swig libv4l-0 libv4l-dev python-numpy libpython2.7 python-dev python2.7-dev libgtk2.0-dev libjasper-dev libpng12-dev libswscale-dev

wget http://sourceforge.net/projects/opencvlibrary/fil...

unzip opencv-3.0.0.zip

cd opencv-3.0.0

mkdir build

cd build

cmake -D CMAKE_BUILD_TYPE=RELEASE -D CMAKE_INSTALL_PREFIX=/usr/local .. // configure the build first; these are the usual flags for an OpenCV build from source

sudo make // this one takes a lot of time, so sit back and relax or have a nap

sudo make install

sudo nano /etc/ld.so.conf.d/opencv.conf // opencv.conf is a blank file; add the following line and then save

/usr/local/lib
and then back to the terminal

sudo ldconfig

sudo nano /etc/bash.bashrc // add the following two lines at the end of the bash.bashrc file and save

PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/usr/local/lib/pkgconfig
export PKG_CONFIG_PATH

sudo shutdown -r now

After the last command, the system reboots,

and OpenCV will be installed after all this hard work!

Now let me make it easy from here on.

Just make the circuit connections as shown in the picture, then copy and paste the following code into a new Python file

by typing sudo nano whateverthefilename.py

and save it.

Run the file with:

sudo python whateverthefilename.py headed // if you want to see the video window

sudo python whateverthefilename.py headless // if you don't want the window

Note: the following code tracks a red ball with the help of the camera and the pan and tilt assembly of the robot.

It's easy to modify the code for your requirements.
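One detail worth understanding before you modify it: red sits at both ends of OpenCV's hue scale (0 to 179), which is why the tracking code thresholds two separate hue ranges and adds the resulting masks. Here is that range check sketched in plain Python, with the bounds taken from the threshold values used in the tracking code:

```python
def hue_is_red(hue):
    """Return True if an OpenCV hue value (0-179) falls in either red band.

    Red wraps around the ends of the hue circle, so the tracker thresholds
    two ranges (0-19 and 168-179) and adds the two masks together.
    """
    return 0 <= hue <= 19 or 168 <= hue <= 179

print(hue_is_red(5))     # low red band -> True
print(hue_is_red(90))    # green -> False
print(hue_is_red(175))   # high red band -> True
```

If you want to track a different colour, you only have to change these hue bounds (and you will usually need just one range instead of two).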

import RPi.GPIO as GPIO
import cv2
import numpy as np
import os
import sys
from operator import itemgetter
def main():
    headed_or_headless = ""
    if len(sys.argv) == 2 and str(sys.argv[1]) == "headed":
        headed_or_headless = "headed"
        print "entering headed mode"
    elif len(sys.argv) == 2 and str(sys.argv[1]) == "headless":
        headed_or_headless = "headless"
        print "entering headless mode"
    else:                                               # invalid arguments, show usage and exit
        print "\nprogram usage:\n"
        print "for headed mode (GUI interface) @ command prompt type: sudo python pan_and_tilt_tracker.py headed\n"
        print "for headless mode (no GUI interface, i.e. embedded mode) @ command prompt type: sudo python pan_and_tilt_tracker.py headless\n"
        return
    # end if else
    GPIO.setmode(GPIO.BCM)              # use GPIO pin numbering, not physical pin numbering
    led_gpio_pin = 18
    pan_gpio_pin = 24
    tilt_gpio_pin = 25
    pwmFrequency = 100                 # frequency in Hz
    pwmInitialDutyCycle = 14           # initial duty cycle in %
    GPIO.setup(led_gpio_pin, GPIO.OUT)
    GPIO.setup(pan_gpio_pin, GPIO.OUT)
    GPIO.setup(tilt_gpio_pin, GPIO.OUT)
    pwmPanObject = GPIO.PWM(pan_gpio_pin, pwmFrequency)
    pwmTiltObject = GPIO.PWM(tilt_gpio_pin, pwmFrequency)
    pwmPanObject.start(pwmInitialDutyCycle)         # PWM must be started before duty cycle updates take effect
    pwmTiltObject.start(pwmInitialDutyCycle)
    capWebcam = cv2.VideoCapture(0)                     # declare a VideoCapture object and associate to webcam, 0 => use 1st webcam
    print "default resolution = " + str(capWebcam.get(cv2.CAP_PROP_FRAME_WIDTH)) + "x" + str(capWebcam.get(cv2.CAP_PROP_FRAME_HEIGHT))
    capWebcam.set(cv2.CAP_PROP_FRAME_WIDTH, 320.0)
    capWebcam.set(cv2.CAP_PROP_FRAME_HEIGHT, 240.0)
    print "updated resolution = " + str(capWebcam.get(cv2.CAP_PROP_FRAME_WIDTH)) + "x" + str(capWebcam.get(cv2.CAP_PROP_FRAME_HEIGHT))
    if capWebcam.isOpened() == False:                           # check if VideoCapture object was associated to webcam successfully
        print "error: capWebcam not accessed successfully\n\n"          # if not, print error message to std out
        os.system("pause")                                              # pause until user presses a key so user can see error message
        return                                                          # and exit function (which exits program)
    # end if
    intXFrameCenter = int(float(capWebcam.get(cv2.CAP_PROP_FRAME_WIDTH)) / 2.0)
    intYFrameCenter = int(float(capWebcam.get(cv2.CAP_PROP_FRAME_HEIGHT)) / 2.0)    # use the frame height, not the width, for the vertical center
    panServoPosition = int(90)           # pan servo position in degrees
    tiltServoPosition = int(90)          # tilt servo position in degrees
    updateServoMotorPositions(pwmPanObject, panServoPosition, pwmTiltObject, tiltServoPosition)
    while cv2.waitKey(1) != 27 and capWebcam.isOpened():                # until the Esc key is pressed or webcam connection is lost
        blnFrameReadSuccessfully, imgOriginal = capWebcam.read()            # read next frame
        if not blnFrameReadSuccessfully or imgOriginal is None:             # if frame was not read successfully
            print "error: frame not read from webcam\n"                     # print error message to std out
            os.system("pause")                                              # pause until user presses a key so user can see error message
            break                                                           # exit while loop (which exits program)
        # end if
        imgHSV = cv2.cvtColor(imgOriginal, cv2.COLOR_BGR2HSV)
        imgThreshLow = cv2.inRange(imgHSV, np.array([0, 135, 135]), np.array([19, 255, 255]))
        imgThreshHigh = cv2.inRange(imgHSV, np.array([168, 135, 135]), np.array([179, 255, 255]))
        imgThresh = cv2.add(imgThreshLow, imgThreshHigh)
        imgThresh = cv2.GaussianBlur(imgThresh, (3, 3), 2)
        imgThresh = cv2.dilate(imgThresh, np.ones((5,5),np.uint8))
        imgThresh = cv2.erode(imgThresh, np.ones((5,5),np.uint8))
        intRows, intColumns = imgThresh.shape
        circles = cv2.HoughCircles(imgThresh, cv2.HOUGH_GRADIENT, 3, intRows / 4)      # fill variable circles with all circles in the processed image
        GPIO.output(led_gpio_pin, GPIO.LOW)
        if circles is not None:                     # this line is necessary to keep program from crashing on next line if no circles were found
            GPIO.output(led_gpio_pin, GPIO.HIGH)
            sortedCircles = sorted(circles[0], key = itemgetter(2), reverse = True)
            largestCircle = sortedCircles[0]
            x, y, radius = largestCircle                                                                       # break out x, y, and radius
            print "ball position x = " + str(x) + ", y = " + str(y) + ", radius = " + str(radius)       # print ball position and radius
            if x < intXFrameCenter and panServoPosition >= 2:
                panServoPosition = panServoPosition - 2
            elif x > intXFrameCenter and panServoPosition <= 178:
                panServoPosition = panServoPosition + 2
            # end if else
            if y < intYFrameCenter and tiltServoPosition >= 62:
                tiltServoPosition = tiltServoPosition - 2
            elif y > intYFrameCenter and tiltServoPosition <= 133:
                tiltServoPosition = tiltServoPosition + 2
            # end if else
            updateServoMotorPositions(pwmPanObject, panServoPosition, pwmTiltObject, tiltServoPosition)
            if headed_or_headless == "headed":
                cv2.circle(imgOriginal, (int(x), int(y)), 3, (0, 255, 0), -1)               # draw small green circle at center of detected object
                cv2.circle(imgOriginal, (int(x), int(y)), int(radius), (0, 0, 255), 3)      # draw red circle around the detected object
            # end if
        # end if
        if headed_or_headless == "headed":
            cv2.imshow("imgOriginal", imgOriginal)                 # show windows
            cv2.imshow("imgThresh", imgThresh)
        # end if
    # end while
    cv2.destroyAllWindows()                     # remove windows from memory
    GPIO.cleanup()                              # release the GPIO pins on exit
# end main
def updateServoMotorPositions(pwmPanObject, panServoPosition, pwmTiltObject, tiltServoPosition):
    panDutyCycle = ((float(panServoPosition) * 0.01) + 0.5) * 10
    tiltDutyCycle = ((float(tiltServoPosition) * 0.01) + 0.5) * 10
    pwmPanObject.ChangeDutyCycle(panDutyCycle)      # apply the new duty cycles so the servos actually move
    pwmTiltObject.ChangeDutyCycle(tiltDutyCycle)
# end function
if __name__ == "__main__":
    main()

Step 4: Temperature and Humidity With Time and Date Display

Actually the temperature and humidity display and the light-dependent RGB eyes are all handled by a single microcontroller. To make it easier to understand, I will explain them in separate parts with separate code; of course you can combine both to run on a single microcontroller at any point.

So first let's start with the display part.

I wanted my robot to constantly display something on it, so I planned to put up the time, date, temperature, and humidity for the time being; this will be changed later as required.

So I put in a DS1307 RTC (real-time clock, for the time and date) and a DHT11 for temperature and humidity, which is easy to use as it directly gives a digital output for both together.

With a couple of libraries, I came up with the following code.

So try it:

Make the circuit connections as shown in the picture,

then copy and paste the following code into the Arduino IDE.

Before you compile or upload the code, you will have to download and install the libraries it uses in your Arduino IDE: TimeLib, DS1307RTC, and a DHT11 library (dht.h); LiquidCrystal ships with the IDE.

Download them from the links below:






To install a zip file downloaded from the links above:

  1. Go to Sketch,
  2. under Sketch > Include Library,
  3. choose Add .ZIP Library,
  4. and from there browse to your downloads and select the zip file you downloaded.
  5. Then you are done installing.

Before you burn the following code onto the Arduino, don't forget to set the time on the RTC.

You can do this by loading the SetTime example sketch from the RTC library, which I have attached.

I have also attached the main code; it is the same as below.

#include <Wire.h>
#include <TimeLib.h>
#include <DS1307RTC.h>
#include <LiquidCrystal.h>
#include <dht.h>            // DHT library providing the dht class and read11(), matching the code below

dht DHT;
#define DHT11_PIN 12
LiquidCrystal lcd(2, 3, 4, 5, 6, 7);

void setup() {
  Serial.begin(9600);
  lcd.begin(16, 2);
  while (!Serial) ;                       // wait for serial
  Serial.println("DS1307RTC Read Test");
  lcd.print("DS1307RTC Read Test");
}

void loop() {
  tmElements_t tm;
  int chk = DHT.read11(DHT11_PIN);        // one read gives both temperature and humidity
  lcd.clear();
  lcd.print("Temp: ");
  lcd.print((int)DHT.temperature);
  lcd.print(" Hum: ");
  lcd.print((int)DHT.humidity);
  if (RTC.read(tm)) {
    Serial.print("Ok, Time = ");
    print2digits(tm.Hour);
    Serial.write(':');
    print2digits(tm.Minute);
    Serial.print(", Date (D/M/Y) = ");
    Serial.print(tm.Day);
    Serial.write('/');
    Serial.print(tm.Month);
    Serial.write('/');
    Serial.println(tmYearToCalendar(tm.Year));
    lcd.setCursor(0, 1);
    lcd.print(tm.Hour); lcd.print(':'); lcd.print(tm.Minute);
  } else {
    if (RTC.chipPresent()) {
      Serial.println("The DS1307 is stopped.  Please run the SetTime");
      Serial.println("example to initialize the time and begin running.");
    } else {
      Serial.println("DS1307 read error!  Please check the circuitry.");
    }
  }
  delay(1000);
}

void print2digits(int number) {
  if (number >= 0 && number < 10) {
    Serial.write('0');                    // pad single digits with a leading zero
  }
  Serial.print(number);
}

Step 5: The Led Chaser on the Side of the Robot

I wanted I BOT to have circular lighting animations on its side, similar to Pepper the humanoid.

So I planned to put an LED chaser on the side, but in a circular pattern.

Make a circular PCB and fix the LEDs onto it in a circular pattern, so that when the LEDs chase, they form a ring-like animation, as shown in the pics.

Now for the PCB design; it's really easy.

It is very easy to design your own layout with DipTrace. http://diptrace.com/download-diptrace/

Try it yourself, or download the attached JPEG image.

To etch a PCB:

  • First print the layout on an OHP sheet (or any glossy paper),
  • then put it on a cleaned, shiny copper-clad board
  • and iron it on with a clothes iron.
  • Then dip it in the etching solution
  • and wait some 15 minutes.
  • Then take the board out of the solution and clean it with water and a toothbrush to remove the ink.
  • And there you go!

After you have etched the PCB, solder on the components, and it's all done; enjoy the LED-chasing ears on the robot.

Make another PCB for all the circuitry in the diagram, to keep it compact so that you can place it inside the head of the robot.

Step 6: Light-Dependent RGB LEDs for the Eyes

So here is the second job of the microcontroller that handles the display.

I wanted some colour-changing eyes for I BOT, so I put in some LEDs.

But the question was when to change colours.

Initially I planned to interface them with the Pi along with the camera, and have the eyes change colour depending on the person's facial expressions.

Then I came up with a simpler idea: interface them with the same microcontroller used for the display, since it is not doing much work, whereas I had a lot to do on the Pi.

So I brought in an LDR and made the RGB LEDs light-intensity dependent

by just adding a few extra lines of code to the previous Arduino sketch.

Read the analog value from the LDR, divide it into ranges, and specify what to do when light of different intensities falls on the robot's head (the LDR is fitted on the robot's head).

I have attached the code for this; just combine the previous Arduino code with this one.
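The range-splitting idea described above looks roughly like this. I am sketching it in Python for clarity, but the real version is just a few lines in the Arduino sketch, and the thresholds and colours here are made-up examples, not the values from my attached code:

```python
def eye_colour(ldr_reading):
    """Map a 0-1023 analog LDR reading to an RGB eye colour.

    The thresholds below are illustrative; tune them to your LDR
    and the lighting in your room.
    """
    if ldr_reading < 300:        # dark room
        return (0, 0, 255)       # blue
    elif ldr_reading < 700:      # normal indoor light
        return (0, 255, 0)       # green
    else:                        # bright light on the robot's head
        return (255, 0, 0)       # red

print(eye_colour(150))   # (0, 0, 255)
print(eye_colour(500))   # (0, 255, 0)
print(eye_colour(900))   # (255, 0, 0)
```

On the Arduino side the same thing is an analogRead() followed by an if/else ladder driving the RGB pins.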

Step 7: Robot Introducing Itself (Google Text-to-Speech API)

There are many different ways to play voice clips on the Arduino or Raspberry Pi.

First, let me tell you about playing voice from the Raspberry Pi:

  1. Go to this website: https://lingojam.com/RobotVoiceGenerator
  2. Type in whatever you want the robot to speak or play on the Pi.
  3. Download the mp3 file
  4. and save it in the Pi's home directory.
  5. Go to the terminal
  6. and type:
  7. sudo apt-get -y install mpg321
  8. mpg321 youraudiofilename.mp3

There you have something playing through the speakers on the Pi.

You can also play an mp3 file from Python code:

import os
from time import sleep

while True:
    os.system('mpg321 -q youraudiofilename.mp3')  # plays in the foreground, so the loop waits for it to finish
    sleep(5)                                      # pause between repeats

Another way is Google text-to-speech.

Put whatever you want the robot to speak between the quotes

and just run the file:

import os

text = "speak to me"
tts_text = text.replace(" ", "+")                # URL-encode the spaces
tts_url = "http://translate.google.com/translate_tts?tl=en&q=" + tts_text
os.system('wget "' + tts_url + '" -O test.mp3')  # fetch the synthesized speech
os.system("mpg321 test.mp3")                     # mpg321 plays mp3; aplay only handles wav

Step 8: Robot Introducing Itself (Using the APR33A3 Speech Module)

The APR33A3 is an excellent speech module: you can record 11 minutes of audio through an on-board mic into 8 different channels and play them back using a microcontroller.

  • Just record the voice or speech you want the robot to speak,
  • and play it using the microcontroller by giving the channel pin an active-low signal from your code.

To know more about how to use it:

Step 9: Power Source to the Robot

The robot was now almost done, but it was being powered by 3 different adapters with different voltage and current ratings.

So I calculated the maximum amount of current it could draw (which I found to be 1.8 A)

and decided to power it from a single 12V power supply.

But only half of the things on the robot were 12V compatible, so I brought in a buck converter

and brought the voltage down to 5V.

But I wanted 12V too,

so I made a small circuit on perfboard, as shown in the picture,

which has a 12V rail as well as a 5V rail.
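The current calculation behind a figure like 1.8 A is just a sum of the worst-case draws on each rail. Here is a sketch of that budget check; the per-module currents below are illustrative assumptions, not my measured values:

```python
# Hypothetical worst-case current draws in amps, grouped by rail.
loads_5v = {"raspberry pi": 1.0, "arduino nano": 0.2, "servos": 0.4}
loads_12v = {"led chaser board": 0.2}

total_5v = sum(loads_5v.values())
total_12v = sum(loads_12v.values())

# The buck converter must handle the whole 5V budget, and the 12V
# supply must cover both rails (ignoring converter losses).
print("5V rail: %.1f A" % total_5v)
print("12V rail: %.1f A" % total_12v)
print("supply must deliver at least %.1f A" % (total_5v + total_12v))
```

Do this sum with your own parts' datasheet numbers before picking the supply and the buck converter rating.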

Step 10: Robot Sending an Email If Someone Comes in Front of the Robot

Note: don't forget to have a common ground.

The robot uses a NodeMCU module to connect to the WiFi network around it.

Of course I could have used the Raspberry Pi to do this, but the Pi is already doing a lot of hard work,

so I got a NodeMCU.

Make the circuit connections as shown in the picture, copy the following code, and paste it into the Arduino IDE.

Select the proper settings for the NodeMCU (COM port, baud rate, etc.) and upload.

This code measures the distance of anyone in front of the robot; if the distance is less than 30 cm, it notifies the owner of the robot with an email to their Gmail.

Also include the ESP8266WiFi.h library.
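The email trigger boils down to converting the HC-SR04 echo time into centimetres and comparing it against the 30 cm threshold: sound takes roughly 29.1 µs to travel one centimetre, and the echo pulse covers the round trip, so the time is halved first. Here is that conversion as a small Python sketch (the NodeMCU sketch does the same arithmetic in C):

```python
def echo_to_cm(duration_us):
    """Convert an HC-SR04 echo pulse width (microseconds) to centimetres.

    The pulse times the round trip, so halve it, then divide by the
    ~29.1 us sound needs to travel one centimetre.
    """
    return (duration_us / 2.0) / 29.1

def should_send_alert(duration_us, threshold_cm=30):
    """True when the measured object is closer than the alert threshold."""
    return echo_to_cm(duration_us) < threshold_cm

print(round(echo_to_cm(1164), 1))   # 20.0 cm -> someone is close
print(should_send_alert(1164))      # True
print(should_send_alert(2910))      # 50 cm away -> False
```

A reading of 0 usually means the echo timed out, so in practice you should also ignore non-positive distances before alerting.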

#include <ESP8266WiFi.h>
#include "Gsender.h"
#pragma region Globals
int trigpin = 05;                                // D1
int echopin = 04;                                // D2
long duration, cm, inches;
const char* ssid = "MSB";                        // WIFI network name
const char* password = "bautomate";              // WIFI network password
uint8_t connection_state = 0;                    // Connected to WIFI or not
uint16_t reconnect_interval = 10000;             // If not connected wait time to try again
#pragma endregion Globals

uint8_t WiFiConnect(const char* nSSID = nullptr, const char* nPassword = nullptr)
{
    static uint16_t attempt = 0;
    Serial.print("Connecting to ");
    if(nSSID) {
        WiFi.begin(nSSID, nPassword);
        Serial.println(nSSID);
    } else {
        WiFi.begin(ssid, password);
        Serial.println(ssid);
    }
    uint8_t i = 0;
    while(WiFi.status() != WL_CONNECTED && i++ < 50) {
        delay(200);
        Serial.print(".");
    }
    ++attempt;
    Serial.println("");
    if(i == 51) {
        Serial.print("Connection: TIMEOUT on attempt: ");
        Serial.println(attempt);
        if(attempt % 3 == 0)
            Serial.println("Check if access point available or SSID and Password\r\n");
        return false;
    }
    Serial.println("Connection: ESTABLISHED");
    Serial.print("Got IP address: ");
    Serial.println(WiFi.localIP());
    return true;
}

void Awaits()
{
    uint32_t ts = millis();
    while(!connection_state) {
        delay(50);
        if(millis() > (ts + reconnect_interval) && !connection_state) {
            connection_state = WiFiConnect();
            ts = millis();
        }
    }
}

void setup()
{
    Serial.begin(115200);
    pinMode(trigpin, OUTPUT);
    pinMode(echopin, INPUT);
    connection_state = WiFiConnect();
    if(!connection_state)   // if not connected to WIFI
        Awaits();           // constantly trying to connect
}

void loop()
{
    // trigger the HC-SR04 and time the echo pulse
    digitalWrite(trigpin, LOW);
    delayMicroseconds(2);
    digitalWrite(trigpin, HIGH);
    delayMicroseconds(10);
    digitalWrite(trigpin, LOW);
    duration = pulseIn(echopin, HIGH);
    cm = (duration / 2) / 29.1;              // convert echo time to centimetres
    Serial.print(cm);
    Serial.println(" cm");
    if(cm > 0 && cm < 30) {                  // someone is closer than 30 cm
        Gsender *gsender = Gsender::Instance();   // Getting pointer to class instance
        String subject = "alert!";
        if(gsender->Subject(subject)->Send("swagath.b.18@gmail.com", "activity detected")) {
            Serial.println("Message send.");
        } else {
            Serial.print("Error sending message: ");
            Serial.println(gsender->getError());
        }
        delay(60000);                        // wait a minute before sending another alert
    }
    delay(500);
}

Step 11: Controlling Servos on the Robot

The servos need a PWM input to position themselves at a particular angle.

The servos at the bottom of the robot are continuous rotation servos.

If they are given a PWM signal which represents 90 degrees on a normal 180-degree servo, it means stop;

0 means full speed forward,

and 180 means full speed backward.

dutyCycle = (((Angle) * 0.01) + 0.5) * 10

Thus, if you substitute the desired angle into the above formula, you get the duty cycle; if a PWM signal with this

duty cycle is applied to the servo, the servo positions itself at that particular angle.

The formula might vary slightly for different servos, but not to worry; using this one should not cause much of a problem.
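As a quick sanity check of the formula, here it is as a small Python function evaluated at the three interesting angles for a continuous rotation servo (full forward, stop, full backward):

```python
def duty_cycle_for_angle(angle):
    """Duty cycle (%) for the 100 Hz servo PWM signal, per the formula above."""
    return ((angle * 0.01) + 0.5) * 10

print(round(duty_cycle_for_angle(0), 1))     # 5.0  -> full speed forward
print(round(duty_cycle_for_angle(90), 1))    # 14.0 -> stop (continuous rotation servo)
print(round(duty_cycle_for_angle(180), 1))   # 23.0 -> full speed backward
```

At 100 Hz the period is 10 ms, so these duty cycles correspond to pulse widths of 0.5 ms, 1.4 ms, and 2.3 ms, which is why 14 shows up as the initial duty cycle in the code.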

Following is an example code;

modify it to make the robot do the things you like.

import RPi.GPIO as GPIO

def main():
    gpio_pin = 21
    GPIO.setmode(GPIO.BCM)                  # use GPIO pin numbering, not physical pin numbering
    GPIO.setup(gpio_pin, GPIO.OUT)
    pwmObject = GPIO.PWM(gpio_pin, 100)     # frequency = 100 Hz
    pwmObject.start(14)                     # initial duty cycle = 14 (90 degrees)
    while True:
        strAngle = raw_input("enter angle (0 to 180): ")
        intAngle = int(strAngle)
        dutyCycle = ((float(intAngle) * 0.01) + 0.5) * 10
        pwmObject.ChangeDutyCycle(dutyCycle)    # apply the new position
    # end while
# end main
if __name__ == "__main__":
    main()



About This Instructable




Bio: robotic hobbyist, PCB designer, coder, and 3D designer
More by md arfan: I-BOT, biped robot