Introduction: Guided Running Shoe (with Arduino IoT Cloud)
This Instructable documents a functional prototype for a guided running shoe that displays directional navigation using an LED matrix mounted on a piece of footwear. The shoe points toward a programmed destination using latitude and longitude, allowing a runner or walker to navigate without looking at a screen.
This prototype uses:
- Arduino Nano ESP32
- Wi-Fi connection via a mobile device
- Arduino IoT Cloud as the data link
- Phone sensors (GPS + accelerometer) for remote navigation intelligence
Ultimately, this project is part of my ongoing study into screenless wayfinding and wearable navigation systems.
Supplies
- Arduino Nano ESP32
- Adafruit DotStar 8×8 LED matrix (64 pixels)
- 1” × 1” thin acrylic sheet
- Velcro straps (3/4" width)
- Hot glue
- Construction paper
- Stranded wire
- Leather/fabric hole punch
Step 1: Wiring Up
I wired the Arduino Nano ESP32 directly to a 64-pixel Adafruit Dot Matrix using four short jumper wires. I deliberately kept the wires short (about 20–25mm), because the display and board will eventually sit almost back-to-back once mounted, and long wires create slack, stress, and failure points.
Here’s what the connections came down to:
- VBUS (5V) → 5V on the Dot Matrix
  - This provides power to the pixels.
- GND → GND
  - A shared reference ground is essential; without this, the entire display stays dark.
- Pin 4 → DIN
  - DIN carries the actual pixel data from the board to the display.
- Pin 2 → CLK
  - CLK provides the timing pulse so the matrix knows when to shift data.
There’s really not much more to it — but it wasn’t obvious until I had done it.
Before committing everything with hot glue and Velcro, I ran a quick test by uploading a simple sketch that lights all the pixels uniformly. This is the fast and dirty truth test: if every pixel comes on, the wiring is correct; if only some flicker or nothing happens, something is wrong and you fix it now, not after building the enclosure.
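That truth test can be as small as the sketch below. This is a minimal sketch, assuming the Adafruit DotStar library and the pin assignments above (DIN on pin 4, CLK on pin 2); the color-order constant varies between units, so yours may need `DOTSTAR_BGR` or similar instead:

```cpp
#include <SPI.h>
#include <Adafruit_DotStar.h>

#define NUMPIXELS 64
#define DATAPIN   4   // DIN, as wired above
#define CLOCKPIN  2   // CLK, as wired above

Adafruit_DotStar matrix(NUMPIXELS, DATAPIN, CLOCKPIN, DOTSTAR_BRG);

void setup() {
  matrix.begin();
  // Light every pixel a dim white: easy on the eyes and on USB power
  for (int i = 0; i < NUMPIXELS; i++) {
    matrix.setPixelColor(i, matrix.Color(16, 16, 16));
  }
  matrix.show();
}

void loop() {}
```

If all 64 pixels come on, every one of the four connections is good; a dark or partially lit matrix points straight at the wiring.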
Step 2: Connecting the Phone and Arduino Through the IoT Cloud
Once your phone is added as a Device in the Arduino IoT Cloud, it will also automatically appear in the Things tab. Clicking into it reveals the live data coming from the phone sensors — accelerometer values, GPS coordinates, and anything else you’ve enabled. (If you have the paid plan, you’ll see extra phone variables available beyond the basics.)
The next move is to link those variables to the Arduino Nano.
This part is conceptually simple, but there are several traps that can leave you with a “compiling but not working” situation — which is exactly what happened to me.
Inside the Arduino’s Thing page, I used the Add Variable button to start syncing data. When the “Add Variable” window opens, you can choose Sync with Other Things, which lets you directly mirror any variable coming from the phone. For example, I linked accelerometer_linear from the phone to the same-named variable on the Nano. Keeping the names identical turns out to be surprisingly important — it helps avoid confusion later in troubleshooting.
The key piece — and the source of hours lost — is the permission mode. The variables must be set to Read & Write.
If you select Read Only, the values visually appear in the cloud dashboard but never actually propagate to the Arduino. And nothing will warn you about this.
Once I corrected that, I repeated the process for the remaining sensor variables.
At that point I had synced everything, but still saw no live data coming into the Nano. What finally fixed it was discovering that the Arduino IoT Cloud does not automatically regenerate its internal thingProperties.h file when variables change. So I deleted the entire sketch, reopened it fresh, and the cloud editor recreated a new thingProperties.h that actually matched the new variable setup.
Only then did the serial monitor begin reflecting real sensor data.
You know it’s working when:
- the values in both the phone Thing and the Arduino Thing refresh as you move the phone
- and those same values appear in your serial monitor when testing code
- and both devices show an Online status in IoT Cloud
It took longer than I wished, but it was a valuable discovery:
your code can compile successfully and still silently fail if your cloud configuration isn't fully aligned.
Step 3: The First “Light-Up” Test
Before attempting navigation logic, I wrote a simple test sketch that let me shake the phone to make the LED matrix respond.
This confirmed two very important things:
- The cloud sync was truly working in both directions
- The wiring between the Arduino and matrix was reliable
Seeing that first reaction — the LED flickering from a physical movement — is when the project felt “alive.”
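A shake test along these lines might look like the sketch below. This is a sketch under stated assumptions, not the project's actual code: the variable and callback names (`accelerometer_linear`, `onAccelerometerLinearChange`) must match whatever your generated `thingProperties.h` declares, the variable is assumed to sync as a single scalar, and the shake threshold is a made-up value to tune:

```cpp
#include <Adafruit_DotStar.h>
#include "thingProperties.h"   // generated by the Arduino IoT Cloud editor

#define NUMPIXELS 64
Adafruit_DotStar matrix(NUMPIXELS, 4, 2, DOTSTAR_BRG);

const float SHAKE_THRESHOLD = 8.0;  // m/s^2, hypothetical; tune to taste

void fillMatrix(uint8_t r, uint8_t g, uint8_t b) {
  for (int i = 0; i < NUMPIXELS; i++) {
    matrix.setPixelColor(i, matrix.Color(r, g, b));
  }
  matrix.show();
}

void setup() {
  initProperties();  // declared in thingProperties.h
  ArduinoCloud.begin(ArduinoIoTPreferredConnection);
  matrix.begin();
  fillMatrix(0, 0, 0);
}

void loop() {
  ArduinoCloud.update();  // keeps the cloud variables in sync
}

// Callback fired whenever the phone pushes a new accelerometer value.
// The name must match the one thingProperties.h generates for your Thing.
void onAccelerometerLinearChange() {
  if (fabs(accelerometer_linear) > SHAKE_THRESHOLD) {
    fillMatrix(0, 32, 0);   // shake detected: flash green
  } else {
    fillMatrix(0, 0, 0);    // still: turn the matrix off
  }
}
```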
Step 4: Making the Physical Device
The materials are very minimal: Velcro, construction paper, short wire leads, the Arduino, and a square of acrylic.
The dot matrix and Arduino are attached back-to-back, with paper hot-glued between them to prevent shorts. The top face of the Arduino also gets paper, mostly to avoid accidentally pressing the reset pin, which happened several times early on.
The Velcro wraps around the whole assembly like a small belt, allowing the board to clip to the tongue or laces of the shoe. A hole punch was used to make a clean opening for the USB-C port so the shoe can be re-programmed or charged without cutting anything apart.
The acrylic square is hot-glued to the front of the matrix, acting as a lightweight protective lens and giving a clearer visual surface during motion.
Step 5: Orientation & Calibration
Because the matrix can be rotated relative to the shoe, I wrote a north-arrow visualization so I could orient the pixel drawing patterns. I then rotated the shoe until the displayed “North” matched true north.
From there, every other arrow was calibrated relative to that north reference.
Step 6: Arrow Directions Explained
Step 7: Final Version Code
Writing the Final Code (and Surviving the Debugging)
Once the wiring and cloud syncing were stable, I shifted focus to the actual logic running the shoe. This meant writing the code that (1) interprets GPS data, (2) compares it to previous positions, (3) determines the desired heading, and (4) selects the correct arrow animation to display on the dot matrix.
A lot of this logic I wrote myself: mapping out how to compute direction, how to smooth the heading, how to convert the destination latitude/longitude into a directional arrow, and how often the GPS should be sampled to avoid noise. I also wrote all the arrow definitions for the dot matrix (North, East, South, West, and the diagonals), and laid out how the visuals should appear in relation to one another on the 8×8 grid.
The central idea was simple:
figure out which way the runner needs to turn, then display that arrow.
The implementation, however, required several subtle layers to make it robust. The logic computes bearing between the previous GPS point and the current one, to establish the movement direction of the runner, and separately computes bearing from the runner's current position to the destination. The difference between those two bearings determines the arrow.
Where ChatGPT Came In
While I established the structure, patterns, and algorithmic logic, ChatGPT served almost like a patient software assistant — helping turn those ideas into valid syntax, catching mismatched braces, and especially helping with compiling errors specific to the Arduino Cloud environment.
One repeated stumbling block was something the Arduino IoT Cloud requires:
Every variable synced from the Cloud must have a corresponding callback function in the sketch, even when the function doesn’t actually need to do anything.
Without these callbacks, the code compiles, uploads partway, and then fails at the linking stage — a frustrating behavior because the editor made it seem like the program was valid. After several rebuilds, reconfiguration attempts, and troubleshooting conversations, I realized that the callbacks had to be explicitly present, even if empty.
This was the most persistent error, and the final solution was to include the following necessary stub functions at the end of the sketch:
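As an illustration, the stubs look like this; the names shown are examples and must match the `on<Variable>Change` names that your generated `thingProperties.h` declares:

```cpp
// Stub callbacks required by the Arduino IoT Cloud for every synced
// variable, even when there is nothing to do in them.
void onAccelerometerLinearChange() {
  // intentionally empty: data is read directly from the variable elsewhere
}

void onLatitudeChange() {
  // intentionally empty
}

void onLongitudeChange() {
  // intentionally empty
}
```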
This solved a problem that didn’t feel like traditional debugging — it wasn’t an algorithm issue, but rather a structural requirement of the Arduino IoT Cloud system.
Once those were added, the code compiled cleanly and began behaving consistently.
With that in place, the system finally ran: the device compared live GPS position to prior positions, understood coarse heading, calculated the direction toward the destination, and displayed the appropriate arrow onto the shoe.
It didn’t just “work.”
It translated geography — thousands of coordinates in the real world — into a single point of light in front of you.
And that’s when I felt the meaning of the project click:
Instead of needing a screen, the wearer just looks down and moves toward the light.
Step 8: Adding Your Directions
Destinations can be added in the code by entering your desired location at https://www.latlong.net/ and pasting the resulting latitude and longitude coordinates into the code.
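For example, the destination could be stored as a pair of constants; the coordinates below are placeholders, not a real project destination:

```cpp
// Paste the values copied from latlong.net here (placeholder example)
const double DEST_LAT = 40.689247;
const double DEST_LON = -74.044502;
```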
Step 9: Final Code
Step 10: Takeaways
This document doesn’t present a finished consumer product. It presents a stepping stone — a working proof that navigation can be embodied, peripheral, and minimally distracting.
The project validated that the technology stack is feasible with simple consumer-level electronics.
Things I'm particularly proud of in this project:
- Project perseverance
- Problem solving & troubleshooting
- Connecting multiple devices
What I would work on next:
- Animations for when the Wi-Fi signal disconnects
- Adding GPS and a compass on the device for better responsiveness and accuracy
- 3D printing a housing for the electronics and a battery (if the form factor gets much bigger, then perhaps integrating it into the shoe)
- Improving how the serial monitor displays data (for better understanding)
- A dedicated Instructable and video walkthrough for the phone-to-Arduino IoT connection, since it is not easy to set up alone without sufficient know-how


