I wanted to make a toy for my son that could interact with him easily, so I decided to build a robot that does face tracking, responds to touch, and expresses emotions.
I do not have much 3D design experience, so I started from designs I found on Thingiverse and adapted them to my needs using Tinkercad (https://www.tinkercad.com/things/1Qq7jjOXoHh) and (https://www.tinkercad.com/things/hJjcvy2X9Oy).
Little Timmy turns his head to follow whoever stands in front of him. You can pet his head and he will emit emotional sounds, and if you pet his head many times, he will show hearts in his eyes.
You can also program new behaviors, for example speech recognition like Alexa, or tracking different objects with the head.
Step 1: First Gather All Parts and Tools
1 Raspberry Pi 3
1 Raspberry Pi camera
1 Arduino or Genuino Nano V3.0 ATmega328
1 Mini USB cable
2 SG90 servos (for pan and tilt)
2 mini OLED 128x64 pixel displays (for the eyes)
1 buzzer (for sound)
1 touch sensor (to interact with the robot)
1 shield for Arduino Nano
Many DuPont F/F jumper cables
Step 2: 3D Print Settings
Little Timmy is very easy to print. I used blue filament for the head and body, white for the hands and legs, and transparent filament for the eyes.
The settings are:
Step 3: Assembly
The first step is to join the arms, hands, legs, and feet. I used small screws I had at home, although you can use glue.
The second step is to fit the servos that pan and tilt the head: one servo sits inside the body and the other inside the neck.
I used glue to attach the OLED eyes, touch sensor, camera, and buzzer. In the future I intend to modify the design so the components fit in place without glue.
Step 4: Electric Connection
To simplify the wiring I used an Arduino Nano shield.
The connection scheme is as follows:
Pin D7 --> Touch sensor
Pin D4 --> X-axis (pan) servo
Pin D5 --> Y-axis (tilt) servo
Pin D12 --> Buzzer
Both OLED screens are connected to the same I2C pins:
SDA --> A4
SCL --> A5
The Arduino and the Raspberry Pi are connected via USB.
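Over that USB link the Raspberry Pi sends servo positions to the Arduino as serial data. The exact command format is defined in the project's code; the sketch below is a minimal illustration, assuming a hypothetical "X&lt;pan&gt;Y&lt;tilt&gt;" text protocol and the pySerial library on the Pi side.

```python
def make_command(pan, tilt):
    """Build one pan/tilt command string, clamping angles to the
    SG90's 0-180 degree range. The "X..Y..\n" format is an assumption
    for illustration, not the project's actual protocol."""
    pan = max(0, min(180, int(pan)))
    tilt = max(0, min(180, int(tilt)))
    return f"X{pan}Y{tilt}\n"

if __name__ == "__main__":
    import serial  # pySerial; install with: pip install pyserial
    # The Nano usually shows up as /dev/ttyUSB0 on Raspbian.
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as arduino:
        arduino.write(make_command(90, 90).encode("ascii"))  # center the head
```

On the Arduino side, a matching sketch would parse each line and write the two angles to the servos on D4 and D5.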
Step 5: The Code
To implement face tracking I used the OpenCV library on the Raspberry Pi. I modified an example I found on GitHub so that it sends commands to the Arduino, and the Arduino controls the servos, sensor, and eyes.
To program the toy you need:
A Raspberry Pi running Raspbian with Python and the OpenCV library installed.
You can find the Arduino code and the Python code for the Raspberry Pi on my GitHub (https://github.com/bhm93/littleTimmy).
Run the program face-track-arduino.py on your Raspberry Pi to activate face tracking.
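The core idea of the tracking loop is simple: OpenCV gives you a rectangle around the detected face, and each frame you nudge the pan and tilt servos toward the face's center. The function below is a sketch of that control step only; the frame size, gain value, and function names are illustrative assumptions, not the repository's actual code.

```python
# One proportional-control step of the face-tracking loop.
# The detector (e.g. an OpenCV Haar cascade) supplies the face
# rectangle; this step nudges each servo toward the face center.

FRAME_W, FRAME_H = 320, 240   # assumed camera resolution
GAIN = 0.05                   # degrees of servo travel per pixel of error

def track_step(face, pan, tilt):
    """face is an (x, y, w, h) rectangle from the detector.
    Returns updated (pan, tilt) angles, clamped to 0-180 degrees."""
    x, y, w, h = face
    err_x = (x + w / 2) - FRAME_W / 2   # positive: face right of center
    err_y = (y + h / 2) - FRAME_H / 2   # positive: face below center
    pan = max(0.0, min(180.0, pan + GAIN * err_x))
    tilt = max(0.0, min(180.0, tilt - GAIN * err_y))
    return pan, tilt
```

In the real script this runs once per captured frame, and the resulting angles are sent to the Arduino over the USB serial link.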