Introduction: Sup! Community Storytelling Experience IoT Kiosk
This is a brief tutorial on how we created "Sup! Community Storytelling Experience" for the Young Maker Competition with a focus on community development and energy conservation.
The experience pairs an Internet of Things (IoT) physical control with an online digital component. Our focus was community engagement: visual prompts for action, conveyed through the IoT hardware, lead into a seamless and surprising video chat experience.
This hack took four days in total, from learning the nuances of IoT prototyping with the Intel XDK IoT Edition, to figuring out the user experience, to implementing everything seen here.
Each station requires the following hardware components:
- Intel Edison with Arduino Board
- Grove - Starter Kit Plus (includes the sensors used below)
- Grove Button
- Grove LED
- Grove LCD RGB Backlight
- A system able to run a WebSocket server (e.g. a DigitalOcean droplet)
Step 1: The Concept
Sup! is about creating a simple way for people to communicate across great distances. We purposely start the experience with an IoT control so that no one needs to bring their own mobile phone.
A single physical button brings two anonymous people together. Communities are given the opportunity to chat ad hoc; you never know who is on the other end until you press the button. In many ways, this brings a physical kiosk setting to a Chatroulette-type experience.
Communities can strengthen through regular no-cost/low-cost contact.
We were also mindful of energy concerns, crafting a solution that can work around the world in different deployment contexts (low bandwidth, low power).
With Sup!, who knows who you will talk to?
Step 2: Infrastructure Overview
Each station has both a network-connected button and a kiosk display.
The kiosk provides nothing more than a view into a single web page. We designed it this way on purpose so that a barebones computer can drive the kiosk. The only hard requirements are a webcam connection and the ability to run a modern web browser like Chrome.
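One way to set up such a barebones kiosk (a sketch of our assumption, not necessarily the exact command we used) is to launch Chromium full-screen in kiosk mode, pointed at the station's page:

```shell
# Launch the kiosk view full-screen at the station page (placeholder URL).
# --use-fake-ui-for-media-stream auto-accepts the webcam permission prompt,
# so no one has to click "Allow" at an unattended kiosk.
chromium-browser --kiosk --use-fake-ui-for-media-stream "https://example.com/station"
```

The same flags work with `google-chrome`; the URL here is purely illustrative.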
When a user holds down a button, the light on the other button lights up. This prompts the other user to press down as well.
On the server side, video communication is facilitated via WebRTC using the fantastic API provided by Icecomm.io. A dedicated dynamic web page changes state based on button presses.
When both buttons are detected as down, a connection is made between the two kiosks and the video feed starts. The moment either button is released, on either side, the video feed stops.
This interaction is purposeful:
- It keeps bandwidth usage low: the webcam and video feed are not running all the time.
- It gives visual feedback that a conversation can start.
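The both-buttons-down gating described above boils down to a tiny state machine. Here is a minimal sketch in plain JavaScript; the station names "A"/"B" and the start/stop callbacks are illustrative assumptions, not our actual server code:

```javascript
// Sketch of the both-buttons-down gating logic.
// Station ids "A"/"B" and the callbacks are illustrative only.
function createSession(onStart, onStop) {
  const down = { A: false, B: false };
  let live = false;
  return function press(station, isDown) {
    down[station] = isDown;
    const bothDown = down.A && down.B;
    if (bothDown && !live) {
      live = true;
      onStart();   // both buttons held: start the video feed
    } else if (!bothDown && live) {
      live = false;
      onStop();    // either button released: stop the feed
    }
    return live;
  };
}

// Example: track the start/stop transitions.
const events = [];
const press = createSession(() => events.push('start'),
                            () => events.push('stop'));
press('A', true);   // only A down: no video yet
press('B', true);   // both down: video starts
press('A', false);  // A released: video stops
console.log(events.join(','));  // "start,stop"
```

The real server would drive `press()` from WebSocket messages sent by each Edison, but the gating decision is exactly this simple.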
Step 3: Physical Control Setup
We used three primary sensors to shape the physical button experience. All were included in the Grove Starter Kit Plus.
- Button: Detects the presence of a person at the remote location. We track the down state; a conversation happens only while both buttons are held down.
- LED: Used to show people that someone is waiting on the other side to chat.
- RGB Backlight LCD: Provides hardware-level feedback and a call to action.
All of these sensors connect to the base shield, which sits on top of the Intel Edison board.
For deployment, you can keep the station on wall power or run the button off a 9V battery.
In our prototype, we used LEGO bricks to bring all the sensor pieces together into a single blocky form factor.
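On the Edison, the Intel XDK IoT Edition runs Node.js with the mraa library for GPIO access. The sketch below shows the shape of the station's sensor loop: it polls the local button and mirrors the remote button's state to the LED. The pin numbers (button on D4, LED on D3), the `remoteDown` flag standing in for the WebSocket message handling, and the LCD being omitted are all our assumptions for illustration; a mock is used when mraa is unavailable (i.e. off the board) so the logic can still be exercised:

```javascript
// Sketch of the station's sensor loop. Pin assignments are assumptions:
// Grove Button on D4, Grove LED on D3. LCD handling is omitted for brevity.
let mraa = null;
try { mraa = require('mraa'); } catch (e) { /* not running on the Edison */ }

function gpioIn(pin) {
  if (mraa) {
    const g = new mraa.Gpio(pin);
    g.dir(mraa.DIR_IN);
    return { read: () => g.read() };
  }
  return { read: () => 0 };  // mock input, always "up", for off-device runs
}

function gpioOut(pin) {
  if (mraa) {
    const g = new mraa.Gpio(pin);
    g.dir(mraa.DIR_OUT);
    return { write: (v) => g.write(v), last: null };
  }
  const mock = { last: null };             // mock output records last value
  mock.write = (v) => { mock.last = v; };
  return mock;
}

const button = gpioIn(4);  // assumed wiring: Grove Button on D4
const led = gpioOut(3);    // assumed wiring: Grove LED on D3

// `remoteDown` stands in for the WebSocket message handling, which would
// set it whenever the other station reports its button state.
let remoteDown = false;

function tick() {
  const localDown = button.read() === 1;
  led.write(remoteDown ? 1 : 0);  // light the LED when the remote is waiting
  return localDown;               // report local state (to send to the server)
}
```

In the real loop, `tick()` would run on a short `setInterval` and push `localDown` to the WebSocket server.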
Step 4: Demo
The demo shows the current implementation of the button-to-video-chat experience. The button lights up because the other participant has their button down. Once we push the button down on our end, the peer-to-peer video chat is created and initiated.
In the back half of the video, you can see how the buttons react to each other: pushing one button lights up the other button's LED, and vice versa. The message on the LCD gives the user context about what is happening.
Step 5: Future Possibilities
We see many ways to extend the current experience in the future:
- Adding more stations (with kiosk and button) to allow for large group video chat.
- Focusing on just the buttons, so that a large group can make a decision together.
- Using the double-confirm buttons to unlock a teenager's car keys.
- Pressing both buttons to open the door to a house.