Introduction: Datalogging With Spark Core + Google Drive

The Spark Core makes it really easy to connect your project to the Internet of Things. You can use just about any sensor that works with a regular old Arduino and automagically send its readings into the cloud. But what do you do after your data vanishes into the Aether? This Instructable will show you how to make it rain (data) from the cloud into a Google Spreadsheet. The setup described here monitors temperature, humidity, light, and UV index; however, feel free to substitute any sensors that work with the Spark Core. To bake this specific setup you'll need the following ingredients:

Spark Core (https://store.spark.io/)

AM2302 (wired DHT22) Temperature + Humidity Sensor (https://www.adafruit.com/products/393)

Light Sensor (https://www.adafruit.com/products/1384)

UV Sensor (https://www.adafruit.com/products/1918)

Google account with Google Drive activated

Step 1: Connect Sensors to Spark Core

For the purposes of this Instructable we'll assume you've successfully claimed and registered the Core to your spark.io account by following the instructions that came with it.

The first thing to do is wire up all the sensors so that they can talk to the Spark Core. Each sensor needs connections to +5V, ground, and a digital or analog pin on the Spark Core. Wiring to the same pins shown in the breadboarding diagram (and described below) will save you from making a lot of edits to the sketch down the road, but feel free to use other pins if necessary.

Light Sensor signal to analog pin A6

UV Sensor signal to analog pin A7

Temperature and Humidity Sensor signal to digital pin D6
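Once wired, the two analog sensors are read through the Core's ADC. As a rough guide to interpreting those raw readings, here is a hedged sketch: it assumes the Spark Core's 12-bit ADC (0-4095) with a 3.3 V reference, and Adafruit's note that the GUVA-S12SD UV sensor's index is approximately its output voltage divided by 0.1 V. Treat both conversions as approximations, not calibrated values.

```cpp
// Convert raw ADC counts to approximate physical units.
// Assumes the Spark Core's 12-bit ADC (0-4095) referenced to 3.3 V.
double adcToVolts(int raw) {
    return raw * 3.3 / 4095.0;
}

// Per Adafruit's notes for the GUVA-S12SD, the UV index is roughly the
// sensor's output voltage divided by 0.1 V.
double uvIndexFromRaw(int raw) {
    return adcToVolts(raw) / 0.1;
}
```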

Step 2: Load Sketch Onto Spark Core

With everything wired up correctly, connect your Spark Core to a 5V power source and wait for it to boot up and the LED to start pulsing cyan. At this point you should be able to log into your spark.io account, copy the attached demo into a new sketch, and flash the new firmware onto your Core. Remember to include the Adafruit DHT library in your sketch from the Spark IDE.
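The attached demo isn't reproduced here, but the general shape of such a sketch is sketched below. This is a hedged outline, not the original firmware: the reading format, the "result" variable name, and the DHT library's method names are assumptions, so check them against the attached file and the library's documentation. The string-packing helper is plain C++ and can be tested off-device; the device-side part is fenced behind an `#ifdef` because it only compiles in the Spark build environment.

```cpp
#include <cstdio>
#include <string>

// Pack one set of readings into the string exposed as a cloud variable.
// The field layout here is an assumption; the attached demo may differ.
std::string formatReadings(double tempC, double humidity, int light, int uv) {
    char buf[96];
    std::snprintf(buf, sizeof(buf),
                  "{\"t\":%.1f,\"h\":%.1f,\"l\":%d,\"uv\":%d}",
                  tempC, humidity, light, uv);
    return std::string(buf);
}

#ifdef PARTICLE
// Device-side skeleton; compiles only in the Spark/Particle build
// environment. On newer firmware, Spark.variable(...) is spelled
// Particle.variable(...).
#include "Adafruit_DHT.h"

#define DHTPIN  D6      // AM2302/DHT22 signal pin
#define DHTTYPE DHT22

DHT dht(DHTPIN, DHTTYPE);
char resultstr[96];

void setup() {
    Spark.variable("result", resultstr, STRING);
    dht.begin();
}

void loop() {
    std::string s = formatReadings(dht.getTempCelcius(), dht.getHumidity(),
                                   analogRead(A6),   // light sensor
                                   analogRead(A7));  // UV sensor
    s.copy(resultstr, sizeof(resultstr) - 1);
    resultstr[s.size()] = '\0';
    delay(10000);  // refresh the readings every ten seconds
}
#endif
```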

Step 3: Setup a Google Spreadsheet to Receive Your Data

Log in to your Google account and head over to Drive. Select "My Drive" > "New File" > "Google Sheets" to create a fresh spreadsheet. Next, select "Tools" > "Script Editor" to create a new script. Paste in the script from the attached demo file, and insert the Device ID and Access Token from your spark.io account.
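The attached script isn't reproduced here, but its general shape looks like the following. This is a hedged sketch: the "result" variable name, the comma-separated reading format, and the DEVICE_ID/ACCESS_TOKEN placeholders are assumptions standing in for your own values. The parsing helper is plain JavaScript; `UrlFetchApp` and `SpreadsheetApp` exist only inside the Script Editor.

```javascript
// Pure helper: extract numeric readings from the Spark Cloud's JSON
// reply. Assumes the firmware exposes a "result" variable holding
// comma-separated values (an assumption; match it to your sketch).
function parseResult(body) {
  return JSON.parse(body).result.split(',').map(Number);
}

// Apps Script entry point: fetch the cloud variable and append a row.
// DEVICE_ID and ACCESS_TOKEN are placeholders for your own credentials.
function collectData() {
  var url = 'https://api.spark.io/v1/devices/DEVICE_ID/result?access_token=ACCESS_TOKEN';
  var response = UrlFetchApp.fetch(url);
  var values = parseResult(response.getContentText());
  SpreadsheetApp.getActiveSheet().appendRow([new Date()].concat(values));
}
```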

Step 4: Setup a Script to Run Automatically

Now that the script is ready, you want it to automagically fetch data from your Core and update the spreadsheet at regular intervals. Select "Resources" > "Current Project's Triggers", then choose the "collectData" function, "Time driven" events, and a timer scale and interval.
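The same trigger can also be installed from code instead of through the dialog. A minimal sketch, assuming the function from the previous step is named collectData:

```javascript
// Install a time-driven trigger programmatically (Apps Script only).
// Equivalent to choosing "collectData", "Time driven", and a one-minute
// interval in the Current Project's Triggers dialog.
function installTrigger() {
  ScriptApp.newTrigger('collectData')
      .timeBased()
      .everyMinutes(1)
      .create();
}
```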

Step 5: Make It Rain (sensor Data)!

Configured as described, the script will fetch new data every minute, so you should see the spreadsheet auto-update at one-minute intervals (or longer, if you set different parameters). Now sit back and watch the data stream in. You don't need to do anything else!

Comments

ben.landry.923 (author), 2016-06-07

Hi, I can't get your code to work... In the Google Doc I keep getting undefined values, while the value being published to Particle is good. Do you know how to fix this issue?

yurikleb (author), 2016-05-05

Hi, and thanks for the tutorial again. After about a week of use I found Google Spreadsheets a somewhat unreliable way to log the data: it skips some readings and throws errors frequently, especially if the device is off or loses its connection.

I ended up using ubidots, they just released a library for the Core and have an easy tutorial here: http://ubidots.com/docs/devices/particlePhoton.html

yurikleb made it! (author), 2016-04-25

Really great tutorial and very useful!

There are a few amendments that need to be made to the scripts:

When declaring variables on the Photon/Core, the Spark method is no longer supported; it should now be referred to as "Particle", so the syntax should be:
Particle.variable("result", resultstr);

In the Google spreadsheet script, in the UrlFetchApp call (line 4), the double quotes should be replaced with single quotes so it reads:
UrlFetchApp.fetch('https://api.spark.io/v1/devices/<<YOUR DEVICE ID>>/result?access_token=<<YOUR ACCESS TOKEN>>')

Thanks!

[attached image: IMG_20160425_133702.jpg]
Binario ZapataS (author), in reply to yurikleb, 2016-04-25

Yes, I did change my script to use Particle.variable and I also changed it to single quotes, but the problem is still there: the script doesn't run automatically, though it does run manually. Any clue why? See the image below.

[attached image: Script.jpg]
yurikleb (author), in reply to Binario ZapataS, 2016-04-26

If it works when you press the PLAY button (run the function manually) and the spreadsheet is updated with data, I see no reason why it shouldn't run with the triggers. You've probably tried these already, but:

1. Try removing and re-adding the trigger.
2. Make sure to press 'save' after adding the trigger.
3. It sometimes takes a while for the data to actually update; wait 5-10 minutes, then refresh the spreadsheet page.
4. Check your email (also the spam folder) for an error report from Google (I just got one for mine).

Binario ZapataS (author), 2016-04-24

I was not really able to make it work automatically. I did exactly what is shown in the instructions, but it just didn't work. Any ideas?

yurikleb (author), in reply to Binario ZapataS, 2016-04-25

There are a few small amendments that need to be made to the scripts; see my comments above.

MohsinR11 made it! (author), 2016-03-30

Great project, thank you! Got it working with just the temperature and humidity sensor, as the light and UV sensors are still on their way.
Just a quick comment: you need to update this project to cater for the name change to "Particle" in both the Build firmware sketch and the Google Script.

[attached image: GoogleTempHumDoc.JPG]
dmw234 (author), 2015-09-12

Thank you for the great Instructable! It seems to be exactly what I want for one of my projects. But when I use your Google script and add my device ID and token, I get this even before I can set up the trigger for the project:

Unterminated string literal. (line 4, file "Code")

So I replaced the double quotation marks with single quotation marks in line 4 and it works fine.

Thank you so much again for the Instructable! One question though: do you have any recommendations for higher temporal resolution (perhaps down to 1 second)? Unless I missed something, Drive will only allow up to 1-minute resolution (which is still awesome!).

Gusgonnet (author), 2015-06-22

Hey, I made a clone of this project using a thermistor, instead of the DHT22, to measure the temperature of my pool.

The instructions for capturing data to google sheets are both awesome and simple to follow.

Thank you for putting the effort in publishing this!

Gustavo.

About This Instructable


Bio: i'm losing my edge... to the kids with borrowed nostalgia for the unremembered 80's
More by garriaga: Solar Powered PirateBox | Control Your Friends Using the Power of Neuroscience (Remote Control Human Part II) | Datalogging With Spark Core + Plotly