Introduction: Eggy, (scientific) Social Signal Pi Robot

About: Just a guy who likes making stuff and loves chocolate.

Hello maker! I put a lot of effort and time into making Eggy and this Instructable. It would mean the world to me if you vote for me in the contest I'm participating in (click in the top right corner of my Instructable). Thanks! -Mark

Robots will become more integrated into our daily lives, and it is desirable that they be self-reliant. That also means humans have to be able to understand a robot's intentions. This research tries to answer the question of whether a robot needs to express emotions to let people know its intentions. To this end, a non-human-like robot was developed. The experiment conducted with this robot leads to the conclusion that observers tend to attribute a goal more readily to a robot that expresses emotions than to one that does not.

Meet Eggy! A social robot with an animatronic tail! As a bonus, it was (scientifically*) tested whether the incorporated social signals are clear to observers (23 participants).

Eggy is a simple driving robot built around a Raspberry Pi.

Since I'm a student I couldn't spend a lot of money, so some smart poor-man's solutions were needed :).

It behaves a bit like a robot vacuum: it drives around until it bumps into something. Then it backs up, turns one wheel on so it rotates about 90 degrees, and drives forward until it bumps into something again.

What makes this robot special are its social signals (non-verbal communication). It has eyes, which are a webpage displayed on an old phone. It also has a tail! More about this in the following steps!

*To the best of my ability

Step 1: Background

Imagine you walk into a room and there is an infant who is trying to walk. The infant falls, tries different methods, searches for things to hold on to. You take a look at the infant and see a happy, focused expression. You could judge that the infant is happy and trying to function in the surrounding world. You let it be, because it is developing itself. But what if, in the same situation, the infant is crying? This signals to you that the infant is in distress; it may need some assistance, or there is something seriously wrong (e.g. it is stuck and cannot get free).

This is an example of how emotional expression helps humans communicate their intentions to each other.

As many say, robots will become more integrated into our daily lives. Some will be used as a butler or guide (like Pepper); others will be less human-like (like Roomba). Both operate in a human environment and are bound to interact with humans. As stated before, this environment changes a lot, and therefore it is of high importance that the robot keeps learning and developing [1]. This learning process is trial and error.

Of course it is desirable that robots be self-reliant. But how likely is this? We think that emotional expression is a necessity that helps us understand the intentions of a robot. And sometimes a human, or maybe a robot, needs to make judgment calls (e.g. to intervene) about another robot. We would like to investigate this assumption by building a (non-)human-like robot that moves through a room. Every time the robot bumps into something, it stops, expresses an emotion and then chooses a different direction to continue. If it bumps on its right side, it moves left, and vice versa. Moerland et al. [1] state that “emotions are intimately connected to adaptive behavior”. Four emotions from the reinforcement learning model are selected (joy, distress, fear, hope), each expressed according to how likely the robot is to obtain the reward at that moment. We would like to focus on the expression of these signals. We think they are important for understanding such a (reinforcement) learning robot, so that it can communicate its current state and intentions.

In this project we would like to investigate whether emotional expression supports understanding of intentions in a Human-Robot Interaction (HRI) setting. In other words: can people understand the intentions of the robot better with emotions than without?

Step 2: Design

I've designed Eggy in Tinkercad. As you probably know, Tinkercad is a free web-based 3D drawing program which is easy to use. Attached you'll find imagery of the model, inside and outside. I've also attached the STL files so you can print the parts yourself!

Later on I added a holder for the phone, so I've printed that separately. I've printed everything on a Prusa i3 MK2.5, sliced with Slic3r at 200 microns.

In the body dome (the egg) of the robot there is room for a 5V power bank (the kind you use for your phone), a Raspberry Pi, a servo motor and geared brushless DC motors.

Everything is screwed down onto the black frame. I've integrated DIY 3D-printed microswitches into the frame. They are inspired by the ones you find in your computer mouse. You only have to slide in a tact switch.

After testing I noticed that the contact area of the lever was too small. That is why I added a bumper later on. This gives the switch a bigger contact area, so the button is more likely to be pressed when the robot bumps into a wall.

Step 3: Hardware

As said, the hardware is based on a Raspberry Pi. Together with some electronics (listed below) it gives Eggy life!

One thing I would do differently is to add a power switch outside the dome for easy power access.

The robot is built with the following materials:

  • Raspberry Pi Model B
  • 3D printed shell
  • 3 wheels, of which 2 are DC-motorized and 1 is 3D printed
  • Power bank (5V, 1500 mA)
  • Battery pack (4x 1,5V)
  • iPod Touch 2nd generation
  • 2 DC motors
  • 1 servo
  • 5 tact switches
  • 1 L293D motor driver IC; this makes it possible to drive the wheels independently, forward and backward.


Software:

  • Raspbian Stretch Lite
  • Python 2.7
  • Python libraries: wiringpi, RPi.GPIO

The maze is made with the following materials:

  • 3 Tables
  • 3 Desktop pc’s
  • 1 Closet

I used the Pi's internal pull-up resistors ("input pull-up") for the buttons, so there is no need for external resistors.
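To make the wiring concrete, here is a minimal sketch of how the tact switches and the L293D fit together in software. The pin numbers are placeholders (not Eggy's actual wiring), and the RPi.GPIO calls are shown as comments since they only run on a Pi:

```python
# Hypothetical BCM pin assignments -- check against your own wiring.
LEFT_MOTOR = (17, 27)    # L293D inputs 1A, 2A
RIGHT_MOTOR = (23, 24)   # L293D inputs 3A, 4A
BUMPERS = (5, 6)         # left and right tact switches

def motor_levels(direction):
    """Logic levels (in_a, in_b) for one L293D channel.

    One input high and the other low spins the motor; swapping them
    reverses it, and both low lets it coast.
    """
    return {"forward": (1, 0), "backward": (0, 1)}.get(direction, (0, 0))

# On the Pi itself this is applied with RPi.GPIO:
#   GPIO.setmode(GPIO.BCM)
#   GPIO.setup(list(BUMPERS), GPIO.IN, pull_up_down=GPIO.PUD_UP)  # internal pull-ups
#   GPIO.setup(list(LEFT_MOTOR + RIGHT_MOTOR), GPIO.OUT)
#   GPIO.output(list(LEFT_MOTOR), motor_levels("forward"))
# With the pull-ups enabled an unpressed switch reads 1, and a pressed one
# pulls the pin to ground, so GPIO.input(pin) == 0 means "bumped".
```

Turning 90 degrees is then simply one channel on `motor_levels("forward")` while the other gets `motor_levels("stop")`.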

Step 4: Software (pi)

This is the software, written in Python. The screenshots are of the debug window.

The software is the brain of the robot. This is how the robot 'thinks':

The starting (default) mode is "normal". The tail is not tense and the eyes are neutral.

To display positive emotions, the robot must not bump into anything for 20 seconds. Then it displays the "hopeful" emotion. When it doesn't bump into anything for another 20 seconds, it displays "happy".

When it bumps into something while it is hopeful or happy, it goes back to normal.

Then, when it bumps into something 5 times, it becomes sad. When it then bumps into something another 5 times, it becomes distressed.
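The mood logic above can be written as a small state machine. The sketch below is my reconstruction of the behaviour described in this step, not the actual code running on Eggy; the class name and the injectable clock are assumptions (the clock makes the timing testable off the robot):

```python
import time

class EggyMood(object):
    """Mood state machine: quiet time promotes the mood, bumps demote it."""

    def __init__(self, clock=time.time):
        self.clock = clock        # injectable time source, for testing
        self.state = "normal"
        self.bumps = 0
        self.last_bump = clock()

    def bump(self):
        """Call this from the tact-switch handler."""
        if self.state in ("hopeful", "happy"):
            self.state = "normal"   # the good streak is broken
            self.bumps = 0
        else:
            self.bumps += 1
            if self.bumps >= 10:    # 5 more bumps after becoming sad
                self.state = "distressed"
            elif self.bumps >= 5:
                self.state = "sad"
        self.last_bump = self.clock()

    def tick(self):
        """Call this from the main loop; returns the current mood."""
        quiet = self.clock() - self.last_bump
        if self.state in ("normal", "hopeful"):
            if quiet >= 40:         # 20 s to hopeful, 20 more to happy
                self.state = "happy"
            elif quiet >= 20:
                self.state = "hopeful"
        return self.state
```

The main loop would call `tick()` every cycle and feed the returned mood to the eyes (phone webpage) and the tail servo.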

Step 5: Social Signals

For this research we intend to make a physical robot, called “Eggsy”. This robot will be able to move freely through a room and sense its environment. This is achieved with a robot with 3 wheels, two of which are motor-powered. The robot detects when it bumps into an object using microswitches. When this happens it should go in a different direction.
We design and implement the four emotions (joy, distress, hope and fear).
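The tail expresses these emotions by taking a different pose per emotion via the servo. Here is a minimal sketch of the servo math; the tail angles, pin, and pulse-width range are my placeholder guesses, not measured values from Eggy:

```python
# Hypothetical tail pose per emotion, in degrees -- placeholder values.
TAIL_ANGLES = {"joy": 80, "hope": 60, "neutral": 45, "fear": 20, "distress": 5}

def servo_duty(angle, freq=50.0, min_ms=1.0, max_ms=2.0):
    """Convert a 0-180 degree angle to a PWM duty-cycle percentage.

    A standard hobby servo expects a pulse of 1-2 ms every 20 ms (50 Hz):
    1 ms means 0 degrees and 2 ms means 180 degrees.
    """
    pulse_ms = min_ms + (max_ms - min_ms) * angle / 180.0
    return pulse_ms * freq / 1000.0 * 100.0

# On the Pi, driven with RPi.GPIO software PWM on an assumed TAIL_PIN:
#   GPIO.setup(TAIL_PIN, GPIO.OUT)
#   tail = GPIO.PWM(TAIL_PIN, 50)
#   tail.start(servo_duty(TAIL_ANGLES["neutral"]))
#   tail.ChangeDutyCycle(servo_duty(TAIL_ANGLES["joy"]))
```

So 0 degrees maps to a 5% duty cycle and 180 degrees to 10%, with the emotions spread in between.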

There are two scenarios: one in which the robot expresses emotions and one in which it does not. The robot shows five different expressions (the four emotions plus neutral) according to the following scenarios:

  • Initially the robot expresses a neutral emotion.
  • When the robot bumps into something while expressing hope or joy it expresses a neutral emotion.
  • When the robot bumps 3 times into something within 20 seconds it expresses fear.
  • When the robot bumps 5 times into something within 20 seconds it expresses distress.
  • When the robot hasn’t bumped into something for 10 seconds while expressing fear or distress it expresses a neutral emotion.
  • When the robot hasn’t bumped into something for 20 seconds it expresses hope.
  • When the robot hasn’t bumped into something for 30 seconds it expresses joy.
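These rules can also be sketched as a small state machine. The code below is my interpretation of the bullet list above, not the code that ran on the robot; the class name, the injectable clock, and the bookkeeping of bump timestamps within the 20-second window are assumptions:

```python
import time
from collections import deque

class EmotionRules(object):
    """Emotions follow bump frequency (fear/distress) and quiet time (hope/joy)."""

    def __init__(self, clock=time.time):
        self.clock = clock           # injectable time source, for testing
        self.emotion = "neutral"     # initially neutral
        self.recent = deque()        # timestamps of bumps in the last 20 s
        self.last_bump = clock()

    def bump(self):
        now = self.clock()
        self.last_bump = now
        if self.emotion in ("hope", "joy"):
            self.emotion = "neutral"   # a bump breaks a positive streak
            self.recent.clear()
            return self.emotion
        self.recent.append(now)
        while self.recent and now - self.recent[0] > 20:
            self.recent.popleft()      # drop bumps older than 20 s
        if len(self.recent) >= 5:
            self.emotion = "distress"
        elif len(self.recent) >= 3:
            self.emotion = "fear"
        return self.emotion

    def tick(self):
        quiet = self.clock() - self.last_bump
        if self.emotion in ("fear", "distress"):
            if quiet >= 10:            # negative emotions relax after 10 s
                self.emotion = "neutral"
                self.recent.clear()
        elif quiet >= 30:
            self.emotion = "joy"
        elif quiet >= 20:
            self.emotion = "hope"
        return self.emotion
```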

Once the robot is functional it will be evaluated by observers. They are asked to observe the robot exploring its environment. We then ask observers through a survey whether they recognise the emotions of the robot and what they think the intention of the robot is. Apart from that, we may ask what they think the robot is feeling at certain times during the observation.

Step 6: Experimental Setup / Approach

Then it's a matter of turning it on!

The emotions of Eggy were tested with people in an 'arena'. We tested with and without social signals. We asked observers to watch Eggy while it drove around. The maze consisted of an obstacle area where the robot would bump into obstacles very often and thus would reach the 'distress' state very easily. On the other side of the maze was an area where the robot could move freely and therefore could reach the state of 'joy'. Afterwards they were asked to fill in a questionnaire. These are the results:

(see images)

  • Less noticeable intentions when no emotions are shown
  • Different intentions identified when no emotions are shown
  • Speed variation would be a great social signal

Because the results from this research satisfy both hypotheses, it can be concluded that people understand the intentions of a robot with emotions better than without.


First Prize in the Raspberry Pi Contest 2017

Participated in the Epilog Challenge 9

Participated in the Wheels Contest 2017