Introduction: Zoio-Laser Project

Picture of Zoio-Laser Project

This project uses the Hokuyo URG-04LX laser as an input sensor (virtual eyes) to detect obstacles in the environment. The laser sensor data is acquired over the USB-serial port and processed on the Edison processor. The acquired data can be visualized on the web: we generate a plot (using gnuplot) as an SVG file, which is served by the Edison web server (HTTP). The laser data is used to monitor the relative angle of the closest object, and two servos with "eye globes" (painted balls) turn to look in the direction of the obstacle. The servo motors are controlled with Python PWM routines. People can interact with these moving eyes (it is very fun!).

Step 1: Materials: Eyes + Servos + Hokuyo Laser + Intel EDISON

Picture of Materials: Eyes + Servos + Hokuyo Laser + Intel EDISON

This project uses the following materials (detailed in the next steps):

  • Eye balls: 2 x hand-painted Styrofoam balls (polystyrene spheres)
  • Servo motors: 2 x Tower Pro 9g micro servos
  • Glue and double-sided adhesive tape
  • Teeth: 2 x candy teeth (Fini) for the mouth
  • Intel Edison board running Yocto Linux
  • Mini protoboard (breadboard) and small jumper cables to connect the servos to the Arduino connectors on the Edison board

Software

  • Yocto Linux Firmware Release 2.1
  • Gnuplot 4.6.5
  • URG Library 1.1.9: http://sourceforge.net/projects/urgnetwork/files/urg_library/
  • Python (pre-installed on Edison)
  • C Compiler gcc (pre-installed on Edison)

Step 2: Sensor Hokuyo URG-04LX-UG01

Picture of Sensor Hokuyo URG-04LX-UG01

The adopted "eye sensor" was a Hokuyo URG-04LX-UG01, a LIDAR (laser-based range sensor).

Information about this sensor can be found in the Hokuyo Site:

http://www.hokuyo-aut.jp/02sensor/07scanner/downlo...

The datasheet is also available at:

http://www.hokuyo-aut.jp/02sensor/07scanner/downlo...

Software Library: (Linux)

http://sourceforge.net/p/urgnetwork/wiki/Home/

urg_library-1.1.9.zip => http://sourceforge.net/projects/urgnetwork/files/...

Summary of this sensor's main features:

  • Supply voltage: 5 V (supplied from USB)
  • Measurement distance: 4 m (max.)
  • Field of view: 240°
  • Interface: USB
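Per our reading of the datasheet, the URG-04LX-UG01 covers its 240° field of view in steps of 360°/1024 (about 0.352° each), with step 384 pointing straight ahead. A minimal sketch of converting a scan step index to an angle (the step constants are assumptions taken from the datasheet; verify them against your unit's documentation):

```python
# Sketch: convert a URG-04LX-UG01 scan step index to an angle in degrees.
# The resolution and front-step values are our reading of the datasheet;
# verify against your sensor's documentation.

STEP_RESOLUTION_DEG = 360.0 / 1024   # ~0.352 degrees per step
FRONT_STEP = 384                     # step pointing straight ahead

def step_to_degrees(step: int) -> float:
    """Angle relative to the sensor's front (positive = counterclockwise)."""
    return (step - FRONT_STEP) * STEP_RESOLUTION_DEG
```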

Step 3: Teeth (Robot Face Mouth)

Picture of Teeth (Robot Face Mouth)

We used teeth candies from Fini :)

These teeth help compose the "face" of our robot, which includes 2 moving eye balls and the mouth (teeth). In a future version we plan to also add a servo motor to move the "lips" (upper and lower parts of the mouth).

The eye balls (with their servos) were attached to the Edison box using double-sided adhesive tape, and the teeth were attached to the same box below the eyes.

Step 4: Eye Balls

Picture of Eye Balls

The "eye balls" are two hand-painted Styrofoam balls (polystyrene spheres). The balls were painted with a marker pen, drawing a small black circle with another blue circle around it.

Construction:

  • First, we attached a small Lego block (4-dot Lego block) to the sphere with double-sided adhesive tape;
  • After that, we attached the servo top mounting piece (horn) to this Lego block;
  • And, finally, we attached this top piece to the servo (the piece is usually provided with the servos).

The adopted servo was a Tower Pro 9g micro servo:

http://www.dfrobot.com/index.php?route=product/pro...

(it can be replaced by any other servo motor with 180 degrees of rotation)

Servos have 3 wires: VCC (red), GND (black or white), and Control (orange or brown).
The VCC and GND wires were connected to a small breadboard and then to the Edison, using the corresponding Arduino VCC (+5 V)/GND (ground) pins. The servo control wires were connected together (a single PWM signal is sent simultaneously to both servos, since the angle is the same for both). If you want, you can use separate PWM signal controls for each eye ball.
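Hobby servos like the Tower Pro 9g typically expect a 50 Hz PWM signal whose pulse width sets the angle. A minimal sketch of the angle-to-duty-cycle math behind a script like pwm3.py (the 0.5-2.5 ms pulse range for 180° of travel is an assumption; check your servo's specification):

```python
# Sketch: map a servo angle (0-180 deg) to a PWM duty cycle.
# The 50 Hz frame rate and 0.5-2.5 ms pulse range are typical for 9g
# micro servos but are assumptions here -- check your servo's datasheet.

PWM_FREQ_HZ = 50                       # standard hobby-servo frame rate
PERIOD_MS = 1000.0 / PWM_FREQ_HZ       # 20 ms frame
MIN_PULSE_MS = 0.5                     # pulse width at 0 degrees
MAX_PULSE_MS = 2.5                     # pulse width at 180 degrees

def angle_to_duty_cycle(angle: float) -> float:
    """Return the duty cycle (0.0-1.0) for the requested angle."""
    angle = max(0.0, min(180.0, angle))                 # clamp to servo range
    pulse = MIN_PULSE_MS + (angle / 180.0) * (MAX_PULSE_MS - MIN_PULSE_MS)
    return pulse / PERIOD_MS
```

On the Edison, this duty cycle would then be written to the PWM pin driving both control wires (for example via the sysfs PWM interface or the mraa library).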

Step 5: Intel Edison - Software

Picture of Intel Edison - Software

The system runs entirely on an Intel Edison board (embedded system):
http://www.intel.com/content/www/us/en/do-it-yours...

The software is composed of 2 different modules:

MODULE 1 - Eye tracking of objects: implemented functions
File: Laser-Perception.c and pwm3.py (python script)

  1. Data acquisition from the Hokuyo URG
  2. Detection of the obstacle angle
  3. Send the angle to the servo routine (in python)
  4. Python program turns the servos to the specified angle (pwm3.py)
    Python Script: "python pwm3.py <angle> &"
  5. Restart the loop (step1)
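The detection step of Module 1 (finding the angle of the closest obstacle in one scan) can be sketched as follows. This is a sketch, not the original C code: the step-to-angle constants are our reading of the URG-04LX datasheet, and treating near-zero distances as invalid mimics how the sensor flags measurement errors.

```python
# Sketch of Module 1's detection step: given one scan (a list of
# distances in mm, indexed by step), find the angle of the closest
# valid reading. Near-zero readings are treated as invalid.

STEP_RESOLUTION_DEG = 360.0 / 1024   # URG-04LX angular resolution (assumed)
FRONT_STEP = 384                     # step facing straight ahead (assumed)

def closest_obstacle_angle(distances_mm):
    """Return (angle_deg, distance_mm) of the nearest valid reading."""
    best_step, best_dist = None, float("inf")
    for step, dist in enumerate(distances_mm):
        if dist < 20:                # skip invalid/error readings
            continue
        if dist < best_dist:
            best_step, best_dist = step, dist
    if best_step is None:
        return None                  # whole scan was invalid
    angle = (best_step - FRONT_STEP) * STEP_RESOLUTION_DEG
    return angle, best_dist
```

The resulting angle is what gets handed to the servo routine, as in step 4 above ("python pwm3.py <angle> &").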

MODULE 2 - Web Server providing the Hokuyo URG perception map
File: Laser-Web.c and gpl.txt (gnuplot script)

  1. Data acquisition from the Hokuyo URG
  2. Conversion from polar coordinates (angle/distance) to Cartesian (x, y)
  3. Write the coordinates to a file
  4. Execute a gnuplot script - Plot the perception map (gpl.txt)
  5. Save the gnuplot image into a file
  6. Copy this file to the Intel Edison web server
  7. Restart the loop (step1)
    NOTE: Web server configuration file updated...
    File: edison-config-server.js (path: /usr/lib/edison_config_tools )
    WebServer files are placed at: /usr/lib/edison_config_tools/public
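Steps 2 and 3 of Module 2 (polar-to-Cartesian conversion and writing the coordinates file) can be sketched as below. The axes convention (x pointing ahead of the sensor) and the step constants are assumptions, not the original C code:

```python
import math

# Sketch of Module 2's coordinate conversion: each (step, distance)
# reading becomes an (x, y) point in mm, written one pair per line so a
# gnuplot script can plot the perception map.

STEP_RESOLUTION_DEG = 360.0 / 1024   # URG-04LX angular resolution (assumed)
FRONT_STEP = 384                     # step facing straight ahead (assumed)

def scan_to_xy(distances_mm):
    """Convert a scan to (x, y) points; x points ahead of the sensor."""
    points = []
    for step, dist in enumerate(distances_mm):
        if dist < 20:                        # skip invalid readings
            continue
        theta = math.radians((step - FRONT_STEP) * STEP_RESOLUTION_DEG)
        points.append((dist * math.cos(theta), dist * math.sin(theta)))
    return points

def write_points(points, path="scan.dat"):
    """Write one 'x y' pair per line for gnuplot."""
    with open(path, "w") as f:
        for x, y in points:
            f.write(f"{x:.1f} {y:.1f}\n")
```

gnuplot can then render the file with something like `set terminal svg; set output "map.svg"; plot 'scan.dat' with points`, and the resulting SVG is copied into the web server's public directory.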

Source codes available at:

http://osorio.wait4.org/Intel-Edison/

Step 6: IntelMaker IoT - ZoioLaser Team

Picture of IntelMaker IoT - ZoioLaser Team

ZOIO LASER

This project was developed during the #IntelMaker #IoT BRAZIL - São Paulo at Insper / June 2015

Team:

  • Fernando Osorio (USP - ICMC - LRM Laboratório de Robótica Móvel)
  • Gabriel Sobral (Unicamp)
  • Nikolas Makiya Vichi (Unicamp)
  • Paulo Ormenese

Video:

http://osorio.wait4.org/Intel-Edison/zoio-laser.mp...
