General Description:

This project covers the development of a fully functional mobile prototype, the "Rover Station", responsible for capturing environmental data such as temperature, humidity and luminosity. The idea is to aggregate other functions in the future, emulating what would be a Mars Rover. This prototype is for educational purposes only and was part of my Capstone Project for the COURSERA - University of California, Irvine course "An Introduction to Programming the Internet of Things (IoT)".

User & Design considerations:

  • The Rover will be remotely controlled by an Android device with Bluetooth capability. The intention is that the station will move along a path where data must be captured. Data will be captured and transmitted continuously, whether the Rover is stationary or moving.
  • The user should receive visual feedback (a live video stream) from the Rover
  • The captured data will be analyzed through a public website (in this case: thingspeak.com)
  • The data will be available to users in graphic and table formats
  • Pre-defined tweet alarms will be generated locally by the station or by the website
  • The Rover will have some autonomous capability to avoid obstacles, protecting itself in case of bad driving by the user.
  • Common "off-the-shelf" components will be used, and the total cost should be under USD 150

Step 1: Design Options

Based on the project requirements, two options were considered: a single processor responsible for all tasks (in this case, a Raspberry Pi), or a dual-processor design with the requirements "split" between an Arduino and an RPi:

1. Processor 1: RPi

  • Responsible for data capture
  • Web communication
  • Streaming Video
  • Send social media messages

2. Processor 2: Arduino

  • Motor control (movement and camera positioning)
  • Obstacle avoidance
  • Remote Control communication

In terms of cost, using two processors is in fact less expensive than the single-processor option. This is because the Arduino is a very cheap item, less expensive than the RPi HAT necessary to run the servos with the RPi. Another difference is the BT module: for the Arduino, a very cheap HC-06 Bluetooth 3.0 slave module can be used, costing half the price of the BT dongle that would have to be added to the RPi. So, the dual-processor design was chosen.

Step 2: BoM - Bill of Materials

Step 3: Raspberry Pi Streaming Video Setup and Test

  1. Install PIP:
    • sudo apt-get install python-pip
  2. Install the picamera library:
    • pip install picamera
  3. Install the flask Python library:
    • sudo pip install flask
  4. Download Miguel’s Flask video streaming project:
  5. In the project folder, edit the app.py file and comment out this line:
    • #from camera import Camera
  6. Un-comment this line:
    • from camera_pi import Camera
  7. Save the file app.py
  8. Run ifconfig to find out the local IP address of your Raspberry Pi “yourLocalIPaddress”.
  9. Start the Flask server by running this command:

    • python app.py
  10. A message will be printed on the monitor:
  11. Open a web browser and go to this address:
    • “yourLocalIPaddress”:5000

Step 4: Install DHT11 Sensor at RPi

  1. First, get the library from GitHub:
  2. Installing the library:
    • sudo apt-get update
    • sudo apt-get install build-essential python-dev python-openssl
    • cd /home/pi/Adafruit_Python_DHT
    • sudo python setup.py install
  3. Test the sensor by running the file AdafruitDHT.py. Enter as parameters: 11 (sensor DHT11) and 4 (the GPIO pin where the sensor is connected):
    • sudo python /home/pi/Adafruit_Python_DHT/examples/AdafruitDHT.py 11 4
  4. The result should be the temperature and humidity read by the sensor
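The same read can also be sketched from your own Python code (a minimal sketch, assuming the Adafruit_Python_DHT library installed above and the sensor on GPIO 4; the formatting helper is my own):

```python
def format_reading(temperature, humidity):
    # Adafruit_DHT.read_retry() returns None values on a failed read
    if temperature is None or humidity is None:
        return "Failed to read DHT11 sensor"
    return "Temp={0:0.1f}C  Humidity={1:0.1f}%".format(temperature, humidity)

def main():  # call main() on the Pi itself
    import Adafruit_DHT  # available only after the install steps above
    humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT11, 4)
    print(format_reading(temperature, humidity))
```

Note that read_retry() returns the pair (humidity, temperature), in that order.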

Step 5: Sending Data to Web

    For the basic setup of the DHT11 sensor with the RPi, and for sending data to the internet, this tutorial was a great help:

    Plotting DHT11 sensor data at ThingSpeak.com using Raspberry Pi

    From what was learned, the important steps are:

    1. Setting up a channel at ThingSpeak.com
    2. Running the Python code below for tests
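Since the full test code is attached, here is only a sketch of what it does: a ThingSpeak update is a single HTTP GET against the channel's Write API Key (sketched in Python 3; the key below is a placeholder, and the field numbers must match your channel setup):

```python
import urllib.parse

WRITE_API_KEY = "YOUR_WRITE_API_KEY"  # placeholder: your channel's Write API Key

def build_update_url(api_key, temperature, humidity):
    # field1/field2 must match the fields defined in your ThingSpeak channel
    params = urllib.parse.urlencode(
        {"api_key": api_key, "field1": temperature, "field2": humidity})
    return "https://api.thingspeak.com/update?" + params

def send_once(temperature, humidity):
    # call on the Pi; free ThingSpeak channels accept about one update per 15 s
    import urllib.request
    urllib.request.urlopen(build_update_url(WRITE_API_KEY, temperature, humidity))
```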

    Step 6: Adding a "digital" Light Sensor

    The general idea for this step of the project was learned from:

    An LDR and a capacitor were connected to GPIO24. If there is light, GPIO24 will read HIGH; without light, LOW.

    The Python code used in this test:
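The original code is attached to this step; a minimal sketch of such a test (Python 3, BCM pin 24 as described above) could look like:

```python
LIGHT_PIN = 24  # BCM numbering, as wired above

def light_state(pin_is_high):
    # GPIO24 reads HIGH with light present, LOW without
    return "LIGHT" if pin_is_high else "DARK"

def main():  # call main() on the Pi itself
    import time
    import RPi.GPIO as GPIO  # available only on the Raspberry Pi
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(LIGHT_PIN, GPIO.IN)
    while True:
        print(light_state(GPIO.input(LIGHT_PIN)))
        time.sleep(1)
```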

    Step 7: Adding an Analog Light Intensity Sensor

    The next step was to get light intensity data. To add the LDR to the RPi, the best approach is to convert the analog signal from the sensor to a digital value using an external ADC (Analog-to-Digital Converter); unlike the Arduino, the RPi does not have an internal ADC. If you do not have an ADC, a good approximation is to use a capacitor charging/discharging technique. The "Raspberry Pi Cookbook" gives the solution (note that instead of the potentiometer, an LDR can be used):

    import RPi.GPIO as GPIO
    import time

    GPIO.setmode(GPIO.BCM)

    a_pin = 25
    b_pin = 23

    def discharge():
        GPIO.setup(a_pin, GPIO.IN)
        GPIO.setup(b_pin, GPIO.OUT)
        GPIO.output(b_pin, False)
        time.sleep(0.005)                # give the capacitor time to discharge

    def charge_time():
        GPIO.setup(b_pin, GPIO.IN)
        GPIO.setup(a_pin, GPIO.OUT)
        count = 0
        GPIO.output(a_pin, True)
        while not GPIO.input(b_pin):     # count loops until the capacitor charges
            count = count + 1
        return count

    def analog_read():
        discharge()
        return charge_time()

    while True:
        print(analog_read())             # higher count = less light on the LDR
        time.sleep(1)

    The best approach, however, is to use the Arduino to capture this kind of data and send it to the RPi; the result will be more accurate.

    Step 8: Sending All Data to the Web

    The previous Python code was updated to include the new sensors:

    Step 9: Sending an Alarm Tweet

    One of the characteristics of IoT is interacting with users automatically. You can program the RPi to send tweets directly, or use a feature of the ThingSpeak website (in the latter case, of course, only messages "triggered" by a condition based on the captured and uploaded data will be sent).

    Python code:

    from twython import Twython
    C_KEY = "xxxxxxxxxxxx"
    C_SECRET = "yyyyyyyyy"
    A_TOKEN = "zzzzzzzzzzzz"
    A_SECRET = "wwwwwwwww"
    api = Twython(C_KEY, C_SECRET, A_TOKEN, A_SECRET)
    api.update_status(status="IoT Capstone Project - Tweet test")

    Note that your Twitter account must allow sending tweets from the RPi. Also, you must get the keys from Twitter in order to use the Twython library available for the RPi.

    Another simple solution, as explained before, is to send a tweet directly from the website. In this case, the "React" feature of ThingSpeak.com can be used.

    Step 10: Connecting the RPi and the Arduino Using Serial Communication

    The Arduino used was the Nano, which is as powerful as the UNO but in a small form factor.

    For test purposes, a potentiometer was connected to Arduino analog port A0 and its value transmitted via serial to the RPi. In the test, the RPi reads a keyboard (connected to it, or through VNC) and, depending on the command, the Arduino LED is toggled ON/OFF.

    Below are the Arduino and Python code used in the tests:
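The attached files carry the details; the RPi side of that test can be sketched like this (assuming the pyserial library, Python 3, and the Arduino on /dev/ttyUSB0 at 9600 baud; the parsing helper and port name are my own assumptions):

```python
def parse_pot_value(line):
    # the Arduino sends the A0 potentiometer reading (0-1023) as a text line
    try:
        value = int(line.strip())
    except ValueError:
        return None
    return value if 0 <= value <= 1023 else None

def main():  # call main() on the Pi itself
    import serial  # pyserial: sudo pip install pyserial
    ser = serial.Serial("/dev/ttyUSB0", 9600)  # port name may differ
    while True:
        cmd = input("LED on/off (1/0): ")
        ser.write(cmd.encode())                # the character toggles the Arduino LED
        print(parse_pot_value(ser.readline()))
```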

    Step 11: Sending Arduino Data to the Web

    Once the Arduino is connected to the RPi, additional data captured by the Arduino can also be sent to the web, together with the data captured by the DHT11 and LDR.

    The Python code used to send data to the website was changed to also include the data captured by the Arduino (the potentiometer value).

    Step 12: Testing the Rover Motors

    At this point, assembly of the Rover begins. I decided to disassemble all sensors and start the "Arduino phase" from zero. Once the Rover was working properly, the RPi and sensors were reassembled.

    For motors, two continuous-rotation servos (SM-S4303R) were used. These servos run at a speed that depends on the pulse width received on their data input.

    • For this servo, the pulse width goes from 1.0 ms to 2.0 ms (other servos can work with different pulse widths)
    • A pulse of 1.5 ms positions the servo at neutral, i.e. "stopped".
    • A pulse of 1.0 ms commands the servo to full speed (around 70 RPM) in one direction; 2.0 ms gives full speed in the opposite direction.
    • Pulses between 1.0 and 1.5 ms, or between 1.5 and 2.0 ms, generate proportional speeds.
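The proportional relationship above can be written as a simple linear map (a Python illustration only; on the Rover this mapping is done by the Arduino servo pulses):

```python
def pulse_to_speed(pulse_ms):
    # Maps a 1.0-2.0 ms pulse width to a speed of -100%..+100%,
    # with 1.5 ms = neutral (stopped), per the SM-S4303R description above
    pulse_ms = max(1.0, min(2.0, pulse_ms))  # clamp to the valid range
    return (pulse_ms - 1.5) * 200.0

# pulse_to_speed(1.5) -> 0.0 (stopped); pulse_to_speed(1.0) -> -100.0;
# pulse_to_speed(2.0) -> 100.0 (full speed, opposite direction)
```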

    The first thing to do is send a 1.5 ms (1500 µs) pulse and verify that the motors are stopped. If not, the servos must be adjusted to a full stop (look for the yellow adjustment screw below the servo). If your servo does not have this adjustment, try changing the "1500" value until you get a full stop.

    #include <Servo.h>
    Servo leftServo;
    Servo rightServo;
    void setup() {
      leftServo.attach(5);  rightServo.attach(6);  // example pins
      leftServo.writeMicroseconds(1500);  // 1.5 ms pulse = full stop
      rightServo.writeMicroseconds(1500);
    }
    void loop() { }

    The code below can be used for a complete Rover motor test (forward, backward, full stop, turn left, turn right). If necessary, adjust the delays for the required turn angle depending on your motors (also, sometimes the left and right pulse values should differ slightly to compensate for any imbalance between the motors).

    Step 13: Assembling the Rover Structure and Adding Remote Control

    First of all, note that the robot, or "Rover", is a prototype for educational purposes, built with elastic bands, wood and clips joining the original parts. It is very simple, but works fine for its intended purpose.

    For the remote control, an Android device was chosen, because it is very easy to develop an app using MIT App Inventor 2. For this project, I developed the app shown in the photo.

    On the Arduino, an HC-06 Bluetooth module was used. If you need more information about how to use this module, please see my Instructable:

    Connecting "stuff" via Bluetooth / Android / Arduino

    The complete Arduino code (the previous one + BT) is available in the file below (do not forget that the 3 files must be inside a single folder):

    Step 14: The Android App

    The app is very simple. It has:

    • 5 buttons for direction control (FW, BW, Left, Right, Stop). When one of these buttons is pressed, a character is sent via Bluetooth to the HC-06 and the command is executed by the Arduino.
    • 1 slider for camera movement. A numeric value from 0 to 100 is sent. This value is "mapped" at the Arduino to move the camera within the servo's angle range (in my case, roughly 20° to 160°)
    • PiCam IP address input. Use the button to store it.
    • Send/receive text to/from the Arduino (use the "paper plane" button to send it)

    Below are the .aia file, if you are familiar with MIT App Inventor 2, and the .apk, if you only want to install and run the app on your phone.
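The slider-to-angle mapping done on the Arduino (with its map() function) can be illustrated like this; the 20°-160° range is the one mentioned above for my camera servo:

```python
def slider_to_angle(value, angle_min=20, angle_max=160):
    # Python equivalent of Arduino's map(value, 0, 100, angle_min, angle_max)
    value = max(0, min(100, value))  # clamp the slider range
    return angle_min + (angle_max - angle_min) * value // 100

# slider_to_angle(0) -> 20, slider_to_angle(50) -> 90, slider_to_angle(100) -> 160
```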

    Step 15: Sensors for Obstacle Avoidance

    For obstacle avoidance, an ultrasonic sensor (HC-SR04) is used. The sensor is mounted on a 180° servo motor in order to increase the area to be scanned. Note that the servo is also used as a base for the Pi-Camera, so the user gets a wider view of the area being explored. A slider on the Android app controls the camera angle.

    The sensor works by sending a sound pulse on the trigger pin (2 µs LOW; 10 µs HIGH) and registering how many microseconds the reflection of the pulse takes to return to the echo pin (remember that sound travels at 340 m/s). The function int distMeter() is used for this calculation.
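The timing-to-distance conversion inside distMeter() boils down to one line (340 m/s is 0.034 cm/µs, and the echo time covers the round trip, so it is halved); a Python sketch of it:

```python
def echo_to_cm(duration_us):
    # duration_us: microseconds measured on the HC-SR04 echo pin
    # sound travels 0.034 cm/us; the pulse goes out and back, so divide by 2
    return duration_us * 0.034 / 2

# an obstacle at the 20 cm stop threshold returns an echo of roughly 1176 us
```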

    If an obstacle is found within 20 cm (in front), the Rover stops, turns the LED ON and backs up a few centimeters. The video shows the tests with the Rover.

    The complete Arduino code (the previous one + obstacle avoidance and search servo control) is available in the files below:

    Step 16: Final Integration RPi + Arduino

    At this stage, all individual parts have been tested and are functional. Now both the RPi and the Arduino must be integrated. Note that I only included the LDR that measures light intensity (not the on/off one).

    1. First, run the Pi-Cam Python program in the flask directory, testing the camera:
      • sudo python app.py
    2. Once you can see the camera video, use CTRL-C to free the monitor and run the main Python code (the same one used at Step 11).
    3. Check the sensor values (heat the sensor, cover the light sensor, etc.). See the results on the monitor and on the website:
      • sudo python /home/pi/Desktop/Iot_capstone/iot_temp_hum_light_pot_ardu.py
    4. Run the Arduino sketch.
    5. Move the Rover with the Android app and check the video and the sensor values.
    6. Check that the Rover stops at an obstacle.
    7. Monitor the website and see the environmental data continuously displayed.

    The video shows the complete prototype being controlled by the Android app, capturing sensor data and displaying it on the internet.

    Below are the final Arduino codes:

    Step 17: Mars Rover Landed in London!

    I am very happy that this Instructable caught the attention of MagPi, the official Raspberry Pi magazine, which published an interview about the project in its May issue. The full magazine can be downloaded in PDF from the link: MagPi edition 45

    Step 18: Conclusion

    That is it! If everything works properly, your "Mars Rover emulator" is ready!


    In the future, it would be interesting to add some features to this project:

    • Guiding the Rover over the internet.
    • Adding a robotic arm, so the Rover could do some mechanical work such as obstacle removal, sample collection, etc.
    • A solar panel for the power supply.

    This way, we can really have a more realistic "Mars Rover emulation". Thanks, and I hope this project can help others learn about Raspberry Pi, IoT, Arduino, robots, etc.

    The complete updated code can be found on GitHub:


    For more tutorials and tips, please visit my Blog:


