Introduction: Datalogging With Spark Core + Google Drive
The Spark Core makes it really easy to connect your project to the Internet of Things. You can use just about any sensor that works with a regular old Arduino and automagically send its readings into the cloud. But what do you do after your data vanishes into the Aether? This Instructable will show you how to make it rain (data) from the cloud into a Google Spreadsheet. The setup described here monitors temperature, humidity, light, and UV index; however, feel free to substitute any sensors that work with the Spark Core. To bake this specific setup you'll need the following ingredients:
Spark Core (https://store.spark.io/)
AM2302 (wired DHT22) Temperature + Humidity Sensor (https://www.adafruit.com/products/393)
Light Sensor (https://www.adafruit.com/products/1384)
UV Sensor (https://www.adafruit.com/products/1918)
Google account with Google Drive activated
Step 1: Connect Sensors to Spark Core
For the purposes of this Instructable we'll assume you've successfully captured and registered the Core to your spark.io account by following the instructions that came with it.
The first thing to do is wire up all the sensors so they can talk to the Spark Core. Each sensor needs connections to +5V, ground, and a digital or analog pin on the Spark Core. Wiring to the same pins shown in the breadboarding diagram (and listed below) will save you from making a lot of edits to the sketch down the road, but feel free to use other pins if necessary.
Light Sensor signal to analog pin A6
UV Sensor signal to analog pin A7
Temperature and Humidity Sensor signal to digital pin D6
Step 2: Load Sketch Onto Spark Core
With everything wired up correctly, connect your Spark Core to a 5V power source and wait for it to boot up and the LED to start pulsing cyan. At this point you should be able to log into your spark.io account, copy the attached demo into a new sketch, and flash the new firmware onto your Core. Remember to include the Adafruit DHT library in your sketch from the Spark IDE.
Step 3: Set Up a Google Spreadsheet to Receive Your Data
Log in to your Google account and head over to Drive. Select "My Drive" > "New File" > "Google Sheets" to create a fresh spreadsheet. Next, select "Tools" > "Script Editor" to create a new script. Paste in the script from the attached demo file, and insert the Device ID and Access Token from your spark.io account.
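For reference, here's a minimal sketch of what such a script can look like. It assumes the firmware exposes cloud variables named temperature, humidity, light, and uv via Spark.variable(), and uses placeholder values for the Device ID and Access Token; those names are illustrative, so match them to whatever the attached demo actually uses.

```javascript
// Placeholders: paste in the Device ID and Access Token from your spark.io account.
var DEVICE_ID = 'your-device-id';
var ACCESS_TOKEN = 'your-access-token';

// Build the Spark Cloud URL for reading a Spark.variable() by name.
function buildUrl(deviceId, variable, token) {
  return 'https://api.spark.io/v1/devices/' + deviceId + '/' + variable +
         '?access_token=' + token;
}

// Pull the numeric reading out of the JSON the Spark Cloud returns.
function parseReading(jsonText) {
  return JSON.parse(jsonText).result;
}

// Fetch each sensor variable and append a timestamped row to the sheet.
// (The variable names here are assumptions; use the ones in your firmware.)
function collectData() {
  var variables = ['temperature', 'humidity', 'light', 'uv'];
  var row = [new Date()];
  for (var i = 0; i < variables.length; i++) {
    var response = UrlFetchApp.fetch(buildUrl(DEVICE_ID, variables[i], ACCESS_TOKEN));
    row.push(parseReading(response.getContentText()));
  }
  SpreadsheetApp.getActiveSheet().appendRow(row);
}
```

Each run of collectData adds one row, so the spreadsheet becomes a running log with one timestamped set of readings per fetch.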
Step 4: Set Up a Script to Run Automatically
Now that the script is ready, you want it to automagically fetch data from your Core and update the spreadsheet at regular intervals. Select "Resources" > "Current project's triggers", then choose the "collectData" function, a "Time driven" event, and a timer scale and interval.
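If you'd rather install the trigger in code than through the menu, Apps Script can do it programmatically. This is a sketch assuming your function is named collectData; run installTrigger once from the Script Editor. Note that everyMinutes() only accepts a few fixed intervals.

```javascript
// One-time setup: installs a time-driven trigger that calls collectData.
// Equivalent to choosing "Time driven" > "Minutes timer" in the triggers dialog.
function installTrigger() {
  ScriptApp.newTrigger('collectData')
      .timeBased()
      .everyMinutes(1)   // Apps Script only accepts 1, 5, 10, 15, or 30 here
      .create();
}

// Helper: checks whether a minute interval is one everyMinutes() will accept.
function isValidMinuteInterval(minutes) {
  return [1, 5, 10, 15, 30].indexOf(minutes) !== -1;
}
```

If you need a longer gap between fetches, switch the builder to an hourly or daily timer instead of stretching the minute interval.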
Step 5: Make It Rain (Sensor Data)!
Configured as described, the script will fetch new data every minute, so you should see the spreadsheet auto-update at one-minute intervals (or longer, if you chose different trigger settings). Now sit back and watch the data stream in. You don't need to do anything else!