Introduction: Little Timmy Robot

I wanted to make a toy for my son, a toy that could interact with him easily, so I decided to build a robot that does face tracking, responds to touch, and expresses emotions.

I do not have much knowledge of 3D design, so I started from a design I found on Thingiverse and adapted it to my needs using Tinkercad.

Little Timmy turns his head to follow whoever stands in front of him. You can caress his head and he emits emotion sounds, and if you caress his head many times, he shows hearts in his eyes.
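The caress behavior can be sketched as a simple counter. This is a minimal Python sketch, not the robot's actual firmware (which runs on the Arduino); the threshold of 5 touches and the reaction names are illustrative assumptions:

```python
# Sketch of the caress behavior: each touch plays an emotion sound,
# and after enough touches the eyes show hearts.
# HEART_THRESHOLD is a hypothetical value, not taken from the project.

HEART_THRESHOLD = 5

class CaressBehavior:
    def __init__(self):
        self.touches = 0

    def on_touch(self):
        """Handle one touch event and return the reaction to perform."""
        self.touches += 1
        if self.touches >= HEART_THRESHOLD:
            self.touches = 0          # reset after showing hearts
            return "show_hearts"
        return "play_emotion_sound"

timmy = CaressBehavior()
reactions = [timmy.on_touch() for _ in range(5)]
print(reactions)  # four emotion sounds, then hearts
```

On the real robot the touch events come from the touch sensor pin and the reactions drive the buzzer and the OLED eyes.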

You can also program new behaviors, for example speech recognition like Alexa, or tracking different objects with the head.

Step 1: First Gather All Parts and Tools

1 Raspberry Pi 3

1 Raspberry Pi camera

1 Arduino or Genuino Nano V3.0 (ATmega328)

1 Mini USB cable

2 SG90 servos (for pan and tilt)

2 mini OLED 128x64 pixel displays (for the eyes)

1 buzzer (for sound)

1 touch sensor (to interact with the robot)

1 shield for Arduino Nano

Many Dupont F/F jumper cables

Printed Pieces

Step 2: 3D Print Settings

Little Timmy is very easy to print. I used blue filament for the head and body, white for the hands and legs, and transparent filament for the eyes.

The modified files for the toy, the original design files, and my Tinkercad projects are available at the links in this step.

The settings are:


Supports: No

Resolution: 0.2 mm

Infill: 20%

Step 3: Assembly

The first step is to join the arms, hands, legs, and feet. I used small screws that I had at home, although you can use glue.

The second is to mount the servos to make a pan-and-tilt mechanism for the head. One servo sits inside the body and the other inside the neck.

I used glue to attach the OLED eyes, the touch sensor, the camera, and the buzzer. My intention is to modify the design in the future so the components fit in place without glue.

Step 4: Electric Connection

To simplify the wiring I used an Arduino Nano shield.

The connection scheme is as follows:

Pin D7 --> Touch sensor

Pin D4 --> Axis X servo

Pin D5 --> Axis Y servo

Pin D12 --> Buzzer

Both oled screens are connected to the same pins:

SDA --> A4
SCL --> A5

The Arduino and the Raspberry Pi are connected by USB.
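Over that USB link, the Raspberry Pi sends short serial commands and the Arduino drives the pins listed above. A minimal Python sketch of one possible command format; the `X<pan>Y<tilt>` protocol, the baud rate, and the device name are assumptions for illustration, not the project's actual code:

```python
# Sketch of the Pi -> Arduino serial link. The command format
# "X<pan>Y<tilt>\n" is a hypothetical protocol for illustration.

def make_command(pan: int, tilt: int) -> bytes:
    """Encode pan/tilt servo angles (0-180 degrees) as one serial line."""
    pan = max(0, min(180, pan))    # clamp to the servo's valid range
    tilt = max(0, min(180, tilt))
    return f"X{pan}Y{tilt}\n".encode("ascii")

print(make_command(90, 45))  # b'X90Y45\n'

# On the real robot you would open the Arduino's USB port with pyserial:
# import serial
# port = serial.Serial("/dev/ttyUSB0", 9600)  # device name is an assumption
# port.write(make_command(90, 45))
```

The Arduino sketch then only has to read one line at a time, parse the two angles, and call the Servo library.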

Step 5: The Code

To implement face tracking I used the OpenCV library on the Raspberry Pi. I modified an example that I found on GitHub so that it sends commands to the Arduino, and the Arduino controls the servos, the touch sensor, and the eyes.

To program the toy you need:

Arduino IDE

A Raspberry Pi with Raspbian, Python, and the OpenCV library.

You can find the Arduino code and the Python code for the Raspberry Pi on my GitHub.

Run the program on your Raspberry Pi to activate the face tracking.
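The heart of the tracking loop is turning the detected face position into servo movements. A minimal sketch of that step, assuming a 640x480 camera frame and a simple proportional correction; the frame size, the gain, and the sign of the correction are assumptions, and the actual GitHub code may differ:

```python
# Sketch of the face-tracking step: nudge the pan/tilt angles so the
# detected face moves toward the centre of the camera frame.
# FRAME_W, FRAME_H and GAIN are illustrative values, not from the project.

FRAME_W, FRAME_H = 640, 480
GAIN = 0.05  # degrees of correction per pixel of error

def track(pan: float, tilt: float, face_box):
    """face_box is (x, y, w, h), as returned by an OpenCV face detector."""
    x, y, w, h = face_box
    face_cx = x + w / 2
    face_cy = y + h / 2
    # Error: how far the face centre is from the frame centre.
    err_x = face_cx - FRAME_W / 2
    err_y = face_cy - FRAME_H / 2
    # Proportional correction, clamped to the servo range (sign depends
    # on how the servos are mounted).
    pan = max(0, min(180, pan - GAIN * err_x))
    tilt = max(0, min(180, tilt - GAIN * err_y))
    return pan, tilt

# A face left of centre nudges the pan angle:
print(track(90, 90, (100, 220, 40, 40)))  # (100.0, 90.0)
```

Each frame, the Python program would run OpenCV face detection, call something like `track()` on the largest face, and send the new angles to the Arduino over serial.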

Participated in the Toys Contest