Introduction: Trashly

With this Instructable, you can build Trashly, an automatic system that helps you sort domestic garbage correctly.

Trashly activates as soon as the user approaches the recycling bins: a photo of the object to be recycled is taken, and a third-party cloud service identifies the content of the image. Based on the information received, the system then determines the type of object and opens the correct bin.

Step 1: Hardware Requirements

  • Intel Edison
  • Arduino Expansion Board
  • Grove Base Shield (in Grove Starter Kit Plus)
  • 3 180° servo motors (in Grove Starter Kit Plus)
  • 3 Grove Smart Relay (in Grove Starter Kit Plus)
  • 3 LED strips
  • 1 Grove LED + green LED (in Grove Starter Kit Plus)
  • 1 Sharp IR sensor (2Y0A21 F 03)
  • 1 webcam
  • 2 power supplies (12 V, 1.5 A)


To build Trashly we wanted to use only recycled materials found on hand or components taken from electronic waste. What better way to present a recycling system than through recycling itself!

Step 2: Software Requirements

  • NodeJS
  • Node libraries:
    • request
  • FFMPEG
  • UVC driver

Step 3: Setup Edison

To get started with Edison if you have never set it up before, see this post. The most relevant sections are "Connecting Edison" and "Connect Edison to WiFi". Although the instructions are for the Arduino breakout board, setup is similar for the Mini breakout board: Snap Edison onto the left side of the board, then connect two micro USB cables to the board and to your computer.

The setup assumes that Edison and your computer are on the same Wi-Fi network. With Edison and your computer on the same Wi-Fi network, it is also possible to connect to Edison wirelessly via SSH. This is particularly helpful when running the demo. To do so, open a new terminal window and type the following:

$ ssh root@myedison.local
root@myedison.local's password:
root@myedison:~#

Replace myedison with the name of your Edison. When prompted for your password, use the password you created when configuring Edison.

Step 4: Install and Configure Webcam

Use a UVC-compatible webcam. In my setup, I am using a Logitech QuickCam E3500.

External power (7-15 VDC) must be supplied to use Edison as a USB host. Refer to the appropriate item below (based on the board you have) to power and connect a USB device:

  • If you have the Arduino breakout board, see this document. Power must be supplied on J1 (the power jack). Plug the webcam into the USB port next to the power jack. Make sure the switch SW1 is switched towards the USB port.
  • If you have the Mini breakout board, see this document. Power must be supplied on J21 / J22, e.g. a 9V battery can be connected to J21 with a 2-pin connector. Connect a micro USB to USB OTG adapter to the webcam and plug into the micro USB port closest to J21 (lower right).

Configuring the package manager

Edison's operating system is based on Yocto Linux, which uses opkg as its package manager. AlexT's unofficial opkg repository is highly recommended for adding packages to Edison. It includes many useful packages, such as git and the UVC driver.

To configure the repository, add the following lines to /etc/opkg/base-feeds.conf:

src/gz all http://repo.opkg.net/edison/repo/all
src/gz edison http://repo.opkg.net/edison/repo/edison
src/gz core2-32 http://repo.opkg.net/edison/repo/core2-32

Update opkg:

opkg update

To check whether or not the UVC driver is installed, type the following:

find /lib/modules/* -name 'uvc'

If the UVC driver is installed, the output should look something like this:

/lib/modules/3.10.17-poky-edison+/kernel/drivers/media/usb/uvc

Now install FFmpeg: read this post.

Step 5: Connections

Each basket is controlled independently, and the connections are chosen according to what the Grove Base Shield allows (not every pin, for example, can drive a servo). Defining the three baskets as Waste (W), Organic (O), and Plastic (P), the following connections to the Grove Base Shield are made (a minimal pin-initialization sketch is shown at the end of this step):

W-Servo → D3

W-Relay → D4

O-Servo → D5

O-Relay → D7

P-Servo → D6

P-Relay → D8

The LEDs are connected to a common ground, directly to the negative pole of the first power supply, as shown in the figure (ConnessioniAlim.jpg). The positive (red) wire of each LED strip enters the connector of the corresponding relay. The positive terminal of the power supply goes directly into the second input of the relay connector.

The IR sensor is connected to the Grove Base Shield as follows:

Red → 5V

Black → GND

Yellow → A0

The webcam is connected to the USB port of the Arduino Expansion Board, which is powered by the second power supply.
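To make the wiring easier to follow in software, here is a minimal Node.js sketch using the mraa library that ships with Edison. It initializes the three servos, the three relays, and the IR sensor on the pins listed above. The setAngle helper and the pulse widths are our own illustrative choices rather than the project's actual code, so tune them for your servos.

// wiring.js -- minimal pin setup with mraa (pin numbers as listed in this step;
// pulse widths are indicative and may need tuning for the Grove mini servos)
var mraa = require('mraa');

// Servos on PWM-capable pins D3, D5, D6
var servoW = new mraa.Pwm(3);
var servoO = new mraa.Pwm(5);
var servoP = new mraa.Pwm(6);

// Relays driving the LED strips on D4, D7, D8
var relayW = new mraa.Gpio(4);
var relayO = new mraa.Gpio(7);
var relayP = new mraa.Gpio(8);

// Sharp IR sensor on A0
var irSensor = new mraa.Aio(0);

[servoW, servoO, servoP].forEach(function (servo) {
  servo.period_ms(20);   // standard 50 Hz servo signal
  servo.enable(true);
});

[relayW, relayO, relayP].forEach(function (relay) {
  relay.dir(mraa.DIR_OUT);
  relay.write(0);        // relays (and LED strips) off at startup
});

// Move a servo to an angle between 0 and 180 degrees.
// 0.5-2.5 ms pulses over a 20 ms period are a common range; adjust for your servo.
function setAngle(servo, degrees) {
  var pulseMs = 0.5 + (degrees / 180) * 2.0;
  servo.write(pulseMs / 20);
}

// Example: open the plastic bin and light its LED strip
relayP.write(1);
setAngle(servoP, 90);

Driving a relay pin high powers the corresponding LED strip from the first power supply, while the matching servo swings the lid open.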

Step 6: Assembly

We proceed to the assembly of the structure, starting with the acquisition system. The webcam and the proximity sensor are fixed together and calibrated to obtain a clear image about 20 cm away from the lens. The sensor is oriented towards the focal point, and the corresponding reading is used as the trigger for object detection. We advise orienting the webcam towards the floor, so as to keep a uniform background in each photo and simplify object recognition. The green LED was added as a reference and as an alert that an object has been detected at the right distance.
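As an illustration of how the IR reading can act as the trigger, here is a hedged sketch: it polls the analog value, lights the green LED while an object sits at roughly the calibrated distance, and fires a callback. The threshold and the LED pin are placeholders you must adapt to your own wiring and calibration (the 2Y0A21 output is non-linear).

// detect.js -- a minimal object-detection loop (illustrative; calibrate the threshold yourself)
var mraa = require('mraa');

var irSensor = new mraa.Aio(0);      // Sharp 2Y0A21 on A0
var greenLed = new mraa.Gpio(2);     // green "ready" LED -- pin chosen only for this example
greenLed.dir(mraa.DIR_OUT);

// Raw ADC value observed at ~20 cm; a placeholder, measure it on your hardware.
var THRESHOLD = 400;

function poll(onObjectDetected) {
  setInterval(function () {
    var value = irSensor.read();
    var objectPresent = value > THRESHOLD;   // closer object => higher reading on the 2Y0A21
    greenLed.write(objectPresent ? 1 : 0);
    if (objectPresent) {
      onObjectDetected(value);
    }
  }, 200);
}

poll(function (value) {
  console.log('Object at the focal point (raw reading: ' + value + '), taking picture...');
});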

The three baskets are made from cardboard boxes, gluing the upper flaps together.

The corresponding servo was placed on the closed side, so that a 90° movement allows easy opening and closing (WARNING!! With the mini servos in the Grove Starter Kit, you have to take care with the positioning and the range of movement: even a seemingly light layer of cardboard can strain them enough to burn them out... and it happens, trust me!). The pivot was created by combining two of the levers supplied with the servo, as shown in the figure. In our case, the servo was fastened with recycled tape threaded through holes drilled in the cardboard. The LEDs were applied to the front of the boxes, which were then fixed side by side (and beautifully decorated, as you can see!! ;) )

Step 7: Program Intel Edison in Node.js

0) Take a picture

A bash script calls FFmpeg to take a picture and save it:

/home/root/bin/ffmpeg/ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 -vframes 1 test.jpeg
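If you prefer to trigger the capture from Node.js rather than a separate bash script, a minimal wrapper around the same command could look like this (the output path is just an example):

// capture.js -- invoke the ffmpeg command above from Node.js
var execFile = require('child_process').execFile;

function takePicture(outputPath, callback) {
  execFile('/home/root/bin/ffmpeg/ffmpeg',
    ['-s', '320x240', '-f', 'video4linux2', '-i', '/dev/video0',
     '-vframes', '1', '-y', outputPath],
    function (err) {
      callback(err, outputPath);
    });
}

takePicture('/home/root/test.jpeg', function (err, path) {
  if (err) { return console.error('Capture failed:', err); }
  console.log('Picture saved to ' + path);
});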

1) Module for image analysis

We created a Node.js module that takes the image captured by the webcam as input and sends it to CloudSight to be analyzed; it then polls the server until the analysis is completed.
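As a sketch of how such a module can be written with the request package, the snippet below uploads the image to CloudSight and polls for the result. The endpoint, field names, and header format follow CloudSight's public REST API as documented at the time of writing, so double-check them against the current documentation and use your own API key.

// cloudsight.js -- sketch of the image-analysis module using the 'request' package
var fs = require('fs');
var request = require('request');

var CLOUDSIGHT_KEY = 'your-api-key-here';
var API = 'https://api.cloudsightapi.com';

// Upload the picture and obtain a token identifying the analysis job.
function submitImage(imagePath, callback) {
  request.post({
    url: API + '/image_requests',
    headers: { Authorization: 'CloudSight ' + CLOUDSIGHT_KEY },
    formData: {
      'image_request[image]': fs.createReadStream(imagePath),
      'image_request[locale]': 'en-US'
    }
  }, function (err, res, body) {
    if (err) { return callback(err); }
    callback(null, JSON.parse(body).token);
  });
}

// Poll the server every two seconds until the description is ready.
function waitForResult(token, callback) {
  request.get({
    url: API + '/image_responses/' + token,
    headers: { Authorization: 'CloudSight ' + CLOUDSIGHT_KEY }
  }, function (err, res, body) {
    if (err) { return callback(err); }
    var result = JSON.parse(body);
    if (result.status === 'not completed') {
      return setTimeout(function () { waitForResult(token, callback); }, 2000);
    }
    callback(null, result.name);   // e.g. "empty plastic water bottle"
  });
}

// Usage: analyze('test.jpeg', function (err, description) { ... });
module.exports = function analyze(imagePath, callback) {
  submitImage(imagePath, function (err, token) {
    if (err || !token) { return callback(err || new Error('no token received')); }
    waitForResult(token, callback);
  });
};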

2) Module for semantic analysis

The next step is to categorize the previous results, preferably obtaining the category of the object (such as plant, software, etc.). If this is not available, the relationships and properties of the object (soft, elastic, wooden, etc.) are analyzed instead.
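The post does not name the semantic service used, so the sketch below only shows the fallback logic: prefer the object's category, otherwise fall back to its properties. lookupCategories and lookupProperties are hypothetical stand-ins for whatever lookup you wire in.

// semantics.js -- sketch of the category-then-properties fallback only;
// lookupCategories and lookupProperties are hypothetical, supplied by the caller
function describeObject(name, lookupCategories, lookupProperties, callback) {
  lookupCategories(name, function (err, categories) {
    if (!err && categories && categories.length > 0) {
      // Preferred path: we got a category such as "plant" or "software".
      return callback(null, { type: 'categories', values: categories });
    }
    // Fallback: use relationships/properties such as "soft", "elastic", "wooden".
    lookupProperties(name, function (err2, properties) {
      callback(err2, { type: 'properties', values: properties || [] });
    });
  });
}

module.exports = describeObject;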

3) Creation of a knowledge base

At this point we performed supervised training to build a knowledge base of the waste categories of interest: plastic, organic, and undifferentiated waste. We fed the system a number of known images and taught it which category each one belonged to.
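The training code is not included in the original project, so here is only one simple form such a knowledge base could take: a table of keyword frequencies per bin, built from manually labelled example descriptions.

// train.js -- illustrative knowledge base: keyword counts per bin,
// built from manually labelled descriptions (not the authors' actual code)
var fs = require('fs');

// Each training sample: the description returned by the cloud service plus the correct bin.
var trainingSet = [
  { description: 'empty plastic water bottle', bin: 'plastic' },
  { description: 'banana peel on the floor',   bin: 'organic' },
  { description: 'broken ceramic mug',         bin: 'undifferentiated' }
  // ... more labelled examples
];

function buildKnowledgeBase(samples) {
  var kb = { plastic: {}, organic: {}, undifferentiated: {} };
  samples.forEach(function (sample) {
    sample.description.toLowerCase().split(/\W+/).forEach(function (word) {
      if (!word) { return; }
      kb[sample.bin][word] = (kb[sample.bin][word] || 0) + 1;
    });
  });
  return kb;
}

var kb = buildKnowledgeBase(trainingSet);
fs.writeFileSync('knowledge.json', JSON.stringify(kb, null, 2));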

4) Evaluation of unknown images

It now remains to teach the machine how to deal with unknown waste. The image is analyzed by the modules described above, which return a set of categories the object belongs to or, if none is available, a set of its properties. Each piece of this data is assigned a score for every type of waste we want to recycle; the highest total score indicates the bin into which the object must be thrown. In case of a tie, the undifferentiated bin is chosen.
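To make the scoring idea concrete, the sketch below matches the keywords of an unknown description against the knowledge base built above, picks the bin with the highest total, and falls back to the undifferentiated bin on a tie or when nothing matches. Again, this illustrates the logic described rather than reproducing the project's actual code.

// classify.js -- score an unknown description against knowledge.json and pick a bin.
// Illustrative only: a tie or an empty score falls back to 'undifferentiated'.
var kb = require('./knowledge.json');

function chooseBin(description) {
  var words = description.toLowerCase().split(/\W+/);
  var scores = { plastic: 0, organic: 0, undifferentiated: 0 };

  words.forEach(function (word) {
    Object.keys(kb).forEach(function (bin) {
      if (kb[bin][word]) {
        scores[bin] += kb[bin][word];
      }
    });
  });

  var best = 'undifferentiated';
  var bestScore = 0;
  var tie = false;
  Object.keys(scores).forEach(function (bin) {
    if (scores[bin] > bestScore) {
      best = bin;
      bestScore = scores[bin];
      tie = false;
    } else if (scores[bin] === bestScore && bin !== best && bestScore > 0) {
      tie = true;
    }
  });

  return (bestScore === 0 || tie) ? 'undifferentiated' : best;
}

console.log(chooseBin('crumpled plastic bag'));   // expected: 'plastic'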