Introduction: EdiShake (Unfinished Project)


EdiShake began as a proof of concept: a smarter, more convenient, and faster way for two people to exchange information just by shaking hands. Whether the shake happens during a meeting or at a casual meetup, two people wearing EdiShakes who shake hands get the opportunity to share their contact information virtually, in a worry-free manner.

EdiShake uses two sensors, with room to add more in the future: the MMA7660 accelerometer from the Grove Starter Kit Plus and the Grove GPS from the Grove Transportation and Safety expansion pack.

The entire project was created during the Intel IoT Roadshow in Toronto in November 2015. Unfortunately, due to the short timeframe and our limited experience working with the Edison board and its sensors, we ran into various issues that left the project unfinished, but we have documented our thought process and the way we planned EdiShake in detail.

Step 1: How It Works

Process Outline:

Goal: the device will serve as a medium of information exchange by recognizing a handshake gesture.

Steps needed to accomplish this:

1. Accelerometer

When a person shakes hands with someone else, the accelerometer sensor, which is ideally attached to their arm, will identify this handshake gesture and relay a signal to the GPS sensor for it to activate.

2. GPS

Once the GPS sensor is activated, it will record the time and coordinates of the instant the accelerometer recognized the handshake gesture into the Edison's onboard flash memory.

3. IBM BlueMix

Once the GPS records its data, the Edison will upload this information to a cloud server whenever it connects to the internet (this does not have to happen immediately after the handshake). The server will do some back-end calculations to determine exactly which other device matches the given time and GPS coordinates, then forward that device's name to the front-end client.

4. User end client

The client will find the user's profile (with whatever information they decided to input) that the matched device is linked to and send that profile back to the first person to access.
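
To make the four-stage pipeline above concrete, here is a minimal sketch (in C++, the language we used on the Edison) of the handshake record that would flow through these stages. The struct and field names are our illustration, not finalized code.

    // Hypothetical handshake record passed from the Edison to the cloud.
    // Field names are illustrative; the real log format was never finalized.
    #include <ctime>
    #include <string>

    struct HandshakeRecord {
        std::string deviceSerial; // Edison board serial number (used for matching)
        std::time_t timestamp;    // UTC time the accelerometer recognized the shake
        double latitude;          // decimal degrees from the Grove GPS
        double longitude;         // decimal degrees from the Grove GPS
    };

Two records from different boards count as a match when their timestamps are nearly identical and their coordinates agree (see Step 5).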

Step 2: Components

Hardware

Components:

- 2 Intel Edison Development Boards with Arduino

- 2 accelerometers (Grove Sensor MMA7660 3-Axis Digital Accelerometer)

- 2 GPS sensors (Grove GPS)

- 2 Grove Base Shields

- USB cables

- Power supply or 2 external batteries for the two Edison boards

Software we used:

- Eclipse IoT Edition (using C++)

- GitHub

Step 3: The Accelerometer

Detailed Explanation

The accelerometer's goal was to recognize a handshake, which would then activate the GPS in the next step.

However, recognizing a handshake is not as simple as recognizing a linear motion, since it involves all three axes. We did some research and realized that resources for existing gesture libraries were very limited. We found only two well-polished ones: Wiigee and uWave.

Wiigee was problematic, as it was designed to recognize gestures from a Wii Remote and would have been too difficult to port to the Edison's accelerometer sensor given the extremely limited timeframe of the competition.

uWave, on the other hand, seemed to be the perfect solution: it is a library written by Zhen Wang, now at Google, for his undergraduate or graduate thesis, and it can be used with any accelerometer in general. A short explanation of how it works:

- There are two modes the user can use in the library:

1. The first records a predefined gesture by the user into a log file, using the raw acceleration values.

2. The second tries to see whether a gesture the user performs matches the recorded one. This second mode is ultimately the more complex one, as it performs a series of substeps:

a. It does the same as mode 1, but instead of writing to a file, it records the values in an array.

b. Using quantization, it converts the raw acceleration values into a small set of discrete levels.

c. It compares these values to the original predefined gesture using an algorithm called Dynamic Time Warping (DTW).

d. Based on the comparison, it decides whether the gesture resembles the original recording closely enough to count as a match.
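
To make steps b-d concrete, here is a minimal C++ sketch of quantization followed by DTW. This is our own illustration of the technique, not uWave's actual code: the quantization boundaries are arbitrary assumptions, and a real implementation would run the comparison per axis (x, y, z) and sum the three distances.

    // Illustrative quantization + DTW sketch (not uWave's actual code).
    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Map a raw acceleration value (in g) to a small discrete level.
    // These boundaries are arbitrary; uWave uses a finer nonlinear scheme.
    int quantize(float g) {
        const float boundaries[] = {-2.0f, -1.0f, -0.5f, 0.5f, 1.0f, 2.0f};
        int level = 0;
        for (float b : boundaries)
            if (g > b) ++level;
        return level; // 0..6
    }

    // Classic DTW distance between two quantized sequences; a lower
    // distance means the live gesture better resembles the template.
    float dtwDistance(const std::vector<int>& a, const std::vector<int>& b) {
        const size_t n = a.size(), m = b.size();
        std::vector<std::vector<float>> d(n + 1, std::vector<float>(m + 1, 1e9f));
        d[0][0] = 0.0f;
        for (size_t i = 1; i <= n; ++i)
            for (size_t j = 1; j <= m; ++j) {
                float cost = std::fabs(float(a[i - 1] - b[j - 1]));
                d[i][j] = cost + std::min({d[i - 1][j], d[i][j - 1], d[i - 1][j - 1]});
            }
        return d[n][m];
    }

A gesture would then be accepted when the DTW distance between the live samples and the recorded template falls below a tuned threshold.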

Ultimately, we tried to import uWave into our sensor code, but we hit some unknown errors, possibly from incorrectly integrating uWave's C library into our C++ program, which contained our sensor libraries. We then tried to write a simplified version of uWave to import into our code, but we ran out of time, as quantization and DTW are fairly complex algorithms. I have provided our problematic code in two versions: the first tries to import the uWave library, and the second tries a simplified rewrite of it (along with a button sensor and LCD screen used for debugging purposes).

Here is Wang's website with the uWave library for anyone interested, as well as the Wikipedia page on the DTW algorithm:

uWave Library: http://zhen-wang.appspot.com/rice/projects_uWave....

DTW: https://en.wikipedia.org/wiki/Dynamic_time_warpin...

Step 4: The GPS

Detailed Explanation

The GPS's goal was to record the time and coordinates into a log file whenever the accelerometer prompted it to do so.

As simple as this sounds, it proved a lot harder to do using the GPS's ublox6 library. The library's output follows the National Marine Electronics Association (NMEA) data format and needed to be converted into readable values for the BlueMix server to use in the next step.

From some research, we found a simple library called Minmea that converts NMEA data into values for longitude, latitude, and time. However, this step proved more challenging than expected, for a reason similar to the accelerometer's: the Minmea library is written in C, whereas our code and the GPS library were in C++.

That said, unlike with the accelerometer, we eventually managed to make the C library fully compatible, and we have attached our code below along with a link to the Minmea GitHub library.
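
As a minimal sketch of the approach that eventually worked, here is roughly how Minmea can be called from C++ to turn an NMEA RMC sentence into a timestamp and decimal-degree coordinates. The extern "C" wrapper prevents C++ name mangling of the C symbols (the header may already guard for this, in which case the wrapper is harmless).

    // Parse one NMEA RMC sentence with Minmea from C++ (sketch).
    extern "C" {
    #include "minmea.h"
    }
    #include <cstdio>

    void logFix(const char* line) {
        if (minmea_sentence_id(line, false) == MINMEA_SENTENCE_RMC) {
            struct minmea_sentence_rmc frame;
            if (minmea_parse_rmc(&frame, line) && frame.valid) {
                // minmea_tocoord converts Minmea's fixed-point NMEA
                // representation into plain decimal degrees.
                printf("%02d:%02d:%02d lat=%f lon=%f\n",
                       frame.time.hours, frame.time.minutes, frame.time.seconds,
                       minmea_tocoord(&frame.latitude),
                       minmea_tocoord(&frame.longitude));
            }
        }
    }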

Minmea GitHub: https://github.com/cloudyourcar/minmea


Step 5: IBM BlueMix

Detailed Explanation

The Edison board uploads the GPS file to a cloud server whenever Wi-Fi is available, and the server does back-end calculations to match those GPS times/coordinates to other Edison boards. The file contains one log line per handshake, each with the time, the latitude/longitude coordinates, and the Edison board's serial number, which will be used later in Step 6.
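
As an illustration only (the exact layout was never finalized), one line of the log file might look like this:

    2015-11-21,19:32:07,43.6532,-79.3832,<board serial number>

i.e. date, UTC time, latitude and longitude in decimal degrees, and the board's serial number, comma-separated.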

Why BlueMix?

We chose IBM BlueMix because it was built with an integrated, open-source app platform that lets you easily create applications in a variety of languages. Furthermore, there was a workshop during the roadshow specifically about BlueMix, so we decided to make full use of the additional information we learned there. The process for BlueMix is relatively simple.

The application was written in node.js and records each GPS log into a larger data file (this could be upscaled into an actual database). Using a self-made algorithm, the app matches up pairs of GPS log lines, which are sent to the front-end client. Any unmatched log lines are remembered and rechecked the next time the data file is updated.

The algorithm itself involves two main steps. The first sorts the many GPS log lines from separate Edison boards into arrays grouped by date, with a time interval of no more than 15 s between any two values in an array. This makes the second half, the comparisons, much faster: it pairs up individual values using the time and, if necessary, the GPS coordinates. When two users shake hands, their two Edison boards should log nearly identical times, so the coordinates are only really needed when several shakes happen at once. With the pairs identified, the algorithm sends them back to the client and appends a special identifier symbol to any unmatched GPS lines so they are checked again next time.
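
The actual app was a node.js prototype, but the pairing idea is language-agnostic. Here is a minimal C++ sketch of the comparison half under the assumptions above; the 15 s window comes from our design, while the coordinate tolerance and all names are our illustration, not the BlueMix code.

    // Illustrative pairing logic (the real prototype was node.js).
    #include <cmath>
    #include <cstdlib>
    #include <string>
    #include <utility>
    #include <vector>

    struct LogLine {
        std::string serial; // Edison board serial number
        long timestamp;     // seconds since epoch
        double lat, lon;    // decimal degrees
        bool matched;       // start false; unmatched lines get rechecked later
    };

    // Two lines pair up when they come from different boards, were logged
    // within 15 s of each other, and lie close together. 0.001 degrees is
    // roughly 100 m; this tolerance is an assumption.
    bool isPair(const LogLine& a, const LogLine& b) {
        return a.serial != b.serial &&
               std::labs(a.timestamp - b.timestamp) <= 15 &&
               std::fabs(a.lat - b.lat) < 0.001 &&
               std::fabs(a.lon - b.lon) < 0.001;
    }

    // Naive O(n^2) pass over one sorted time bucket; lines left with
    // matched == false would be rechecked on the next file update.
    std::vector<std::pair<LogLine, LogLine>> matchPairs(std::vector<LogLine>& lines) {
        std::vector<std::pair<LogLine, LogLine>> pairs;
        for (size_t i = 0; i < lines.size(); ++i)
            for (size_t j = i + 1; j < lines.size(); ++j)
                if (!lines[i].matched && !lines[j].matched &&
                    isPair(lines[i], lines[j])) {
                    lines[i].matched = lines[j].matched = true;
                    pairs.push_back(std::make_pair(lines[i], lines[j]));
                }
        return pairs;
    }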

Unfortunately, we didn't have time to really delve into the algorithm: we spent a lot of time troubleshooting the previous two steps, and the algorithm remains a rough concept that has not been polished to work in every scenario.

Step 6: Front End Client

Detailed Explanation

Goal: match respective users to each other based on the pairs of data sent by the back-end server.

The front-end client could be implemented in several ways, none of them too complex.

1st idea: an Android/iOS app that lets users add their contact information and sends a simple notification whenever one of their handshakes gets matched.

2nd idea: social media API integrations, giving users the additional option of importing information from their profiles; whenever one of their handshakes gets matched, they receive a link to the other user's social media profile for further contact/information.

3rd idea: integrate the front end directly into BlueMix as another app, where users can create an account to log in and, as in the 1st idea, enter their information and see others' when their handshakes get matched. This may be harder to implement, as there are fewer resources available for BlueMix than for Android/iOS apps.

Overall, the client is very scalable; regardless of where it lives, its primary function is to relay the information back to the user on an easy-to-use, convenient platform.

Step 7: Additional Possibilities/Upscaling

Ultimately, we were not able to complete this project fully due to our team's limited knowledge and experience and the short timeframe we had. That said, there is a lot of potential for EdiShake to serve a greater purpose.

With the accelerometer gesture recognizer, there is the possibility of customization and expansion into other gestures, like a high five or a fist bump. The Edison itself is a very small chip, and if the sensors were shrunk down, you could incorporate it directly into a wristband or even turn the Edison itself into a smartwatch (it has specs similar to current smartwatches).

Furthermore, you could trigger additional actions off the accelerometer recognizing a gesture: a microphone sensor could record a certain interval of sound whenever you perform a given gesture, or the Edison's Bluetooth connectivity could activate a feature on your phone. Ultimately, there is a lot of scalability in this project, and it could definitely serve as the concept for something more amazing.

Thanks for reading the Instructable! If you would like to discuss the project further, just leave a message here, or turn the project into an actual reality if you are up for the challenge!