Introduction: Spark Core/Photon Part 1 - Setup

This instructable was created as part of the Instructable Build Night at MakeICT.

The Spark Core is an Arduino-compatible board that can run many Arduino libraries, has a built-in Wi-Fi chip with antenna, and has low power requirements. All this in a very small form factor.

  • Texas Instruments CC3000 Wi-Fi module
  • STM32F103 72 MHz ARM Cortex-M3
  • 128KB flash, 20KB RAM
  • 2MB external flash
  • 802.11b/g
  • Smart Config setup

We received Spark.io Spark Cores and Spark Internet Buttons to host two project build nights, using the Cores and Buttons to learn how to set up, code, and build projects. During this time, Spark.io announced the new Spark Photon, which is priced at almost half the price of the Core. You are going to hear a lot about this awesome IoT-capable "postage stamp-sized hackable Wi-Fi module for interacting with physical things."

We used the Spark Core companion app, Spark, to talk to the Cores. In this case, with a bunch of people in a room, the first one running the app "wins" control of all the Cores. NOTE: When I tried using the Spark app, I was unable to detect the Core even though it was blinking blue.

Nice idea, seen it work, just not for me.


Step 1: What You Will Need

To setup the Spark Core you will need:

  • Contents of the Spark Core box
    • Spark Core
    • Breadboard the Spark Core arrived mounted to
    • USB cable
  • Computer connected to the internet
  • WiFi connection for the Spark Core
  • Windows drivers for the Spark Core (in the CONNECT OVER USB section)
  • PuTTY, a serial terminal application

I am only covering using the Spark Core with the Windows OS.

Step 2: Getting the Core ID and Setting Up WiFi

Spark.io's documentation is very good; however, I am going to break down the steps, referring to the Spark docs when necessary for clarity. These screenshots are from a Windows 8.1 computer.

Install the Windows driver for Spark Core

Plug in the Core with the USB cable

The Core will start with a bright white light when plugged in, changing to a blinking blue light.

This is called the Listening Mode.

Find the COM port

To find the COM port the Core is attached to, you have to open Device Manager. There are two ways:

  • Start, Run, enter devmgmt.msc, then press the ENTER key
  • Right-click on the My Computer icon, select the Manage menu item, then select Device Manager

Under Ports (COM & LPT) find your Core.
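
If you have Python on the machine, pyserial offers a quick alternative to hunting through Device Manager. This is only an optional sketch, assuming pyserial is installed (pip install pyserial); it is not part of the official Spark setup:

    # Optional: list the available serial ports with pyserial
    # instead of looking through Device Manager.
    from serial.tools import list_ports

    for port in list_ports.comports():
        # The Core will show up as a USB serial device on one of these COM ports.
        print(port)

Either way, note the COM port number (for example COM3); you will need it for PuTTY next.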

Run PuTTY

When you start PuTTY, you will need to enter some settings for the Core.

Go to the Serial screen

  • Baudrate: 9600
  • Data Bits: 8
  • Parity: none
  • Stop Bits: 1

Go to the Session screen

  • verify the settings shown in the Session screenshot
  • name the settings to be saved in Saved Settings
  • click on the Save button

Time to connect to the Spark: click the Open button. The light on your Core will go out.
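
As an aside: if you would rather open this connection from a script than from PuTTY, the same settings map directly onto pyserial. This is only a sketch, assuming pyserial is installed and that COM3 is the port you found above:

    # Open the Core's serial port with the same settings as the PuTTY screens:
    # 9600 baud, 8 data bits, no parity, 1 stop bit.
    import serial

    PORT = "COM3"  # placeholder: use the COM port you found in Device Manager

    ser = serial.Serial(
        port=PORT,
        baudrate=9600,
        bytesize=serial.EIGHTBITS,
        parity=serial.PARITY_NONE,
        stopbits=serial.STOPBITS_ONE,
        timeout=2,  # seconds to wait when reading replies
    )
    print("Connected to", ser.name)

The rest of this step is the same either way; I will stick with PuTTY below.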

You will see a blank, black screen inside PuTTY. You are going to enter two different characters:

  • i - ("i" as in ID) which displays the Spark Core ID (WRITE THIS DOWN)
  • w - displays SSID:
    • enter your Wi-Fi SSID
    • press ENTER
    • enter password
    • press ENTER

You will see a message in PuTTY:

Thanks! Wait about 7 seconds while I save those credentials...

Awesome. Now we'll connect!

If you see a pulsing cyan light, your Spark Core has connected to the Cloud and is ready to go!

If your LED flashes red or you encounter any other problems, visit https://www.spark.io/support to debug.

Spark <3 you!
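
If PuTTY will not cooperate, the same two commands can be scripted with pyserial. This is only a rough sketch that mirrors the steps above; it is not an official Spark tool, and COM3, MyNetwork, and MyPassword are placeholders you need to replace:

    # Rough sketch: read the Core ID and send Wi-Fi credentials over USB serial,
    # mirroring the "i" and "w" steps described above.
    import time
    import serial

    PORT = "COM3"            # placeholder: your Core's COM port
    SSID = "MyNetwork"       # placeholder: your Wi-Fi SSID
    PASSWORD = "MyPassword"  # placeholder: your Wi-Fi password

    # 9600 baud, 8 data bits, no parity, 1 stop bit (pyserial's default framing)
    ser = serial.Serial(PORT, 9600, timeout=2)

    # "i" asks the Core for its ID -- write this down.
    ser.write(b"i")
    time.sleep(1)
    print(ser.read(128).decode(errors="ignore"))

    # "w" starts the Wi-Fi prompts: SSID first, then password, each followed by ENTER.
    ser.write(b"w")
    time.sleep(1)
    ser.write(SSID.encode() + b"\r\n")
    time.sleep(1)
    ser.write(PASSWORD.encode() + b"\r\n")

    # The Core says it needs about 7 seconds to save the credentials.
    time.sleep(8)
    print(ser.read(512).decode(errors="ignore"))

    ser.close()

If the prompts do not advance, the Core may want a bare carriage return (b"\r") instead of b"\r\n".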

You just entered the Wi-Fi configuration into the Core. Smart Config is now running and can take up to a minute. The Core will go through the following sequence of lights:

Solid blue: WiFi credentials entered

Flashing green: connecting to Wi-Fi network

Flashing cyan: connecting to Spark Cloud

Breathing cyan: connected to Spark Cloud

Step 3: Take Ownership of Your Core

Your Core is now connected to the Spark.io cloud. All that remains is to take ownership of your Core by logging into the Spark Cloud.

In my next instructable, I will address taking ownership of your Core/Photon and using the Spark Build environment.