Introduction: Smart Motorcycle HUD Prototype (turn-by-turn Navigation and So Much More)

Hi!

This Instructables is the story of how I designed and built a HUD (heads-up display) platform meant to be mounted on motorcycle helmets. It was written in the context of the "Maps" contest. Sadly, I wasn't able to completely finish this project in time for the contest's deadline, but I still wanted to share my progress on it, as well as document all the trial and error I went through making it.

The idea for this project first came to me a few years ago, when I got into motorcycles and started looking into what gear I would need to buy to make my rides more enjoyable. At the time, it baffled me that the best way to get basic GPS navigation while riding was to basically attach your smartphone to your bike's handlebars. I thought to myself that surely there had to be a better way to get that kind of info on the fly.

That's when it hit me: a heads-up display could be the way to get navigation while riding, without draining your phone's battery or exposing it to the elements.

Over time, this idea matured in my mind, and I thought that having a HUD in front of me at all times would allow for many more uses than simple navigation. This is why my plan is to make the platform public and modular, so anyone can create a module that displays the information they need on their own HUD.

Although there are commercially available products that fulfill this task, none are as modular as my platform, and they also tend to be a little pricey. Anyhow, welcome to this project.

What works as of now

As stated, this project is still very much in development. Here is what currently works:

- Communication between a smartphone and an ESP32-based board (phone awake)

- Optics design done (might need small adjustments in the long run)

- Android navigation app using the Mapbox navigation SDK:

- Capable of calculating and displaying the user's position on a map, as well as a route from it to the destination

- Capable of connecting to a Bluetooth device (the device's MAC address is hardcoded for now)

- Capable of real-time navigation, including extracting and sending the upcoming maneuver's information through serial Bluetooth (only supports turns for now)

What needs work

This list contains items that are absolutely necessary for the HUD's intended use, but are not ready to be implemented yet.

- Overall design (helmet attachment, reflector angle adjustment mechanism, ...)

- Android app:

- Implement off-route detection and correction

- Ability for the user to input the destination address

- Waypoints?

- Ergonomics / Aesthetics



Supplies

- An ESP32-based development board

- Any somewhat recent Android smartphone (Bluetooth enabled)

- An SSD1306 or other supported 0.96" OLED screen (mine was 128x64 pixels, see the "The brains: Microcontroller & Screen" part)

- A reflector (any piece of acrylic/glass/plexiglass will do)

- A Fresnel lens (mine had a focal length of about 13 cm, see the "Lens choice" part)


Tools

- Soldering iron

- Breadboard

- A few jumper cables

- 3D printer / 3D printing service

Step 1: How It All Works: Design Choices Explained

The basic idea of a heads-up display is to display an image within someone's field of vision, so they don't have to look away from whatever they're doing (be it piloting a plane or driving a motorcycle, which will be our example case).


Technically, this could be achieved by simply putting a screen in front of the user's eyes. However, a screen is not transparent, and would therefore hinder the user's vision. You could instead place the screen in front of a reflective surface, which would mirror the screen's content while also being see-through enough that the user could see what's in front of them.

However, this approach has a huge flaw: the actual screen is usually much closer to the user's eyes than what the user actually has to focus on (e.g. the road ahead). This means that, in order to read what's on the reflective surface, the user's eyes would need to adapt to the display's distance (let's say 20 cm), and would then need to adapt again to focus on the road ahead (~2-5 meters). The time this whole operation takes is precious time that should be spent looking at the road, and refocusing that frequently may become uncomfortable after just a few minutes.

That is why I decided to add a lens between the screen and the reflector. This lens, if chosen carefully, should allow for the creation of a virtual image of the screen (see the schematic above), which would then appear to be further away from the user's eyes than it actually is, thus requiring less abrupt accommodation (or none at all, in a perfect scenario). This design allows the user to quickly glance at the reflector, get the information they need, and instantly look back up at the road.

The role of the smartphone

Because it was unrealistic to try to implement a whole navigation application on the ESP32 alone, I decided to make an Android app that takes care of this. The app then just needs to tell the ESP32 what the user has to do to get to their destination, and the ESP32 relays that information through the HUD (see the "How the module works" figure).

Step 2: Parts - the Brains: Microcontroller & Screen

As stated above, I planned to have my module display navigation information, without actually having it calculate the positioning, tracking, and real-time navigation itself. The user's phone instead communicates with the module, and sends it the information to be displayed on the HUD.

To facilitate the communication between the user's phone and the module, I chose an ESP32-based board for this project. This choice was due to the ESP32's integrated Bluetooth capabilities, as well as a few other interesting specifications (easy-to-use non-volatile storage, dual-core CPU, enough RAM to actually drive the OLED display via I2C, ...). It is relatively simple to design PCBs around the ESP32, which I did take into account. I also have professional experience using and designing circuits with the ESP32, which definitely influenced my choice.

The choice of the screen basically came down to whatever I could find that I thought would be bright enough for my use, while also being as small as possible. I was not very worried about the screen's pixel count, as my objective was a very minimalistic and simple UI.

It should be noted that the screen's driver should be supported by a library that allows image mirroring. That is because the displayed image gets flipped when it passes through the lens and appears on the reflector, and not having to manually reverse what gets displayed is a huge weight off our shoulders as builders.

Step 3: Parts - Optics: Finding a Compromise

The optics for this project were quite hard to approach, as I had no idea what I was even looking for when I first started. After some research, I understood that what I wanted to do was to create a "virtual image" of my OLED screen, which would appear to be further away from the eye than it actually is. The ideal distance for this virtual image would be around 2-5 meters in front of the driver, as this seems to be the distance of the objects we focus on when driving (other cars, bumps on the road, etc.).

To achieve that goal, I chose to use a Fresnel lens: these are quite large and cheap, they seemed to offer a good enough focal length for my project, and they can be cut with simple scissors (which isn't the case for more refined round glass lenses). Fresnel lenses can be found under names like "pocket magnifier" or "reading card magnifier", as they are very appropriate for helping people with bad eyesight read.

Basically, the trick here was all about finding the right compromise between :

- Having a reasonable virtual image distance (that is, how far away the HUD will seem to the user, or how far the user will have to adjust their eyes to see what's on the HUD)

- Having the text on the screen not be too enlarged by the lens (which is basically a magnifier)

- Having a reasonable distance between the OLED screen and the lens, which would otherwise lead to a very bulky module

I personally ordered a few different lenses on Amazon and determined their respective focal lengths, before choosing one with a focal length of about 13 cm. I found that this focal length, with an OLED-lens distance of 9 cm, gave me a satisfying image on my reflector (see the last few images above).

As you will see on my illustrations, in order to properly focus on the displayed text, the camera used to take these pictures has to adjust as if it was focusing on a faraway object, which makes everything on the same plane as the reflector seem blurry. This is exactly what we want for our HUD.

You can find the 3D files for the lens holder here.

Step 4: Parts - a Container to Hold Them All

As I am writing this Instructables, the actual container that will hold every piece of the heads-up display is not quite designed yet. I do however have a few ideas about its general shape and how to approach certain problems (like how to hold a reflector still and make it withstand 100+ km/h winds). This is still very much a work in progress.

Step 5: Creating a Protocol for Our Module

In order to send the navigation instructions from the phone to the development board, I had to come up with a communication protocol of my own that would allow me to easily send the required data from the phone, while also facilitating its processing once received.

At the time of writing, the information that needs to be transmitted from the phone in order to navigate with the module is:

- The upcoming maneuver's type (simple turn, roundabout, merging onto another road, ...)

- The upcoming maneuver's precise instructions (dependent on the maneuver type: right/left for a turn, which exit to take for a roundabout, ...)

- The distance remaining before the upcoming maneuver (in meters for now)

I decided to organize this data using the following frame structure: ':' marks the start of a frame, '.' ends the maneuver field, ',' ends the instructions field, and ';' ends the frame.


While not a beautiful solution, this one allows us to easily separate and distinguish each field of our protocol, which facilitated the coding on the ESP32 side.

It is important to keep in mind that other information may need to be added to this protocol for future features (like the exact day and time, or the music being played on the user's phone), which would be easily feasible using the same building logic as now.

Step 6: The Code: ESP32 Side

The code for the ESP32 is currently quite simple. It uses the U8g2lib library, which enables easy control of the OLED screen (while allowing mirroring of the displayed image).

Basically, all the ESP32 does is receive serial data through Bluetooth when the app sends it, parse it, and display text or pictures based on this data (i.e. displaying an arrow instead of the sentence "turn left/right"). Here's the code:

/* Program to control a HUD from an Android app via serial Bluetooth */

#include "BluetoothSerial.h" // Header file for Serial Bluetooth, shipped with the ESP32 Arduino core
#include <Arduino.h>
#include <U8g2lib.h>

#ifdef U8X8_HAVE_HW_SPI
#include <SPI.h>
#endif
#ifdef U8X8_HAVE_HW_I2C
#include <Wire.h>
#endif

// OLED library constructor, needs to be changed according to your screen
U8G2_SSD1306_128X64_ALT0_F_HW_I2C u8g2(U8G2_MIRROR, /* reset=*/ U8X8_PIN_NONE);

// State machine detected_field values + variable
#define maneuverField 1
#define instructionsField 2
#define distanceField 3
#define endOfFrame 4
int detected_field = endOfFrame;

BluetoothSerial serialBT; // Object for Bluetooth
char incoming_char;

char maneuver[10];
char instructions[10];
char distance[10];
char tempManeuver[10];
char tempInstructions[10];
char tempDistance[10];

int nbr_char_maneuver = 0;
int nbr_char_instructions = 0;
int nbr_char_distance = 0;

boolean fullsentence = false;

void setup() {
  Serial.begin(9600); // Start serial monitor at 9600 bauds
  u8g2.begin(); // Init OLED control
  serialBT.begin("ESP32_BT"); // Name of the Bluetooth signal
  Serial.println("Bluetooth Device is Ready to Pair");
}

void loop() {
  if (serialBT.available() && !fullsentence) { // Characters being received via Bluetooth serial
    incoming_char = serialBT.read();
    Serial.print("Received:"); Serial.println(incoming_char);
    switch (detected_field) {
      case maneuverField:
        Serial.println("Detected field : maneuver");
        if (incoming_char == '.') { // Next field detected
          detected_field = instructionsField;
        } else { // Fill the maneuver type info array
          maneuver[nbr_char_maneuver] = incoming_char;
          nbr_char_maneuver++;
        }
        break;
      case instructionsField:
        Serial.println("Detected field : instructions");
        if (incoming_char == ',') { // Next field detected
          detected_field = distanceField;
        } else { // Fill the instructions info array
          instructions[nbr_char_instructions] = incoming_char;
          nbr_char_instructions++;
        }
        break;
      case distanceField:
        Serial.println("Detected field : distance");
        if (incoming_char == ';') { // End of frame detected
          detected_field = endOfFrame;
          Serial.print("maneuver:"); Serial.println(maneuver);
          Serial.print("instructions:"); Serial.println(instructions);
          Serial.print("distance:"); Serial.println(distance);
          fullsentence = true;
          update_Display(); // Full frame received, parse it and display received data
        } else { // Fill the distance info array
          distance[nbr_char_distance] = incoming_char;
          nbr_char_distance++;
        }
        break;
      case endOfFrame:
        if (incoming_char == ':') detected_field = maneuverField; // New frame detected
        break; // Otherwise do nothing
    }
  }
}

void update_Display() {
  // Cache each char array to avoid possible conflicts
  memcpy(tempManeuver, maneuver, nbr_char_maneuver);
  memcpy(tempInstructions, instructions, nbr_char_instructions);
  memcpy(tempDistance, distance, nbr_char_distance);

  parseCache(); // Parse and process char arrays
  fullsentence = false; // Sentence processed, ready for the next one
}

void parseCache() {
  u8g2.clearBuffer(); // Clear the internal memory
  u8g2.setFont(u8g2_font_ncenB10_tr); // Choose a suitable font

  // char arrays -> String conversion, mandatory to use the substring() function
  String maneuverString = tempManeuver;
  String instructionsString = tempInstructions;

  // Implementing protocol here. Only supports turns for now.
  if (maneuverString.substring(0, 4) == "turn") { // Check for maneuver type
    Serial.print("TURN DETECTED");
    if (instructionsString.substring(0, 5) == "right") { // Check specific instructions and display accordingly
      u8g2.drawStr(5, 15, "Right"); // Placeholder: draw your right-arrow bitmap here instead
    } else if (instructionsString.substring(0, 4) == "left") { // Check specific instructions and display accordingly
      u8g2.drawStr(5, 15, "Left"); // Placeholder: draw your left-arrow bitmap here instead
    } else u8g2.drawStr(5, 15, "Err."); // Invalid instructions field
  }
  /* Implement other maneuver types (roundabouts, etc..)
   * else if (tempManeuver == "rdbt") {
   *   ...
   * }
   */

  u8g2.drawStr(5, 30, tempDistance); // Display remaining distance
  u8g2.sendBuffer(); // Transfer internal memory to the display

  // Reset all char arrays before next reading
  memset(maneuver, 0, 10);
  memset(instructions, 0, 10);
  memset(distance, 0, 10);
  memset(tempManeuver, 0, 10);
  memset(tempInstructions, 0, 10);
  memset(tempDistance, 0, 10);

  // Reset number of elements in arrays
  nbr_char_distance = 0;
  nbr_char_instructions = 0;
  nbr_char_maneuver = 0;
}

Step 7: The Code: Android Side

For the smartphone app, I decided to use Mapbox's navigation SDK, as it offers a lot of useful features when it comes to building a navigation map from scratch. It also provides many useful listeners, which definitely help in making this module work. I also used harry1453's android-bluetooth-serial library for Android, as it made Bluetooth serial communication a lot easier to put together.

If you want to build this app at home, you'll need to get a Mapbox access token, which is free up to a certain number of requests per month. You will have to put this token in the code, and build the app on your side. You will also need to code in your own ESP32's Bluetooth MAC address.

As it stands, the app can guide you from your current location to any location you click on the map. As mentioned in the intro, however, it doesn't support any maneuver other than turns, and does not handle going off-route yet.

You can find the whole source code on my GitHub.

Step 8: What's Next?

Now that the app is functional enough to actually guide its user along a set route (provided there are no deviations from it), my main focus will be to improve the smartphone app and implement the few capabilities that would make the module a viable navigation device. This includes enabling Bluetooth communication from the phone even when its screen is off, as well as support for other types of maneuvers (roundabouts, merging, ...). I will also implement a rerouting feature for when the user deviates from the original route.

When all of this is done, I will improve the container and its attachment mechanism, 3D print it, and try to take the module for a first run.

If all goes well, my long term objective is to design a custom PCB for the embedded electronics of this project, which would save a lot of space on the final product.

I might also add other features to this module in the future, including a time display, as well as a phone notification alert, which could make an icon appear when the user receives a text message or a call. Finally, as a huge music fan, I would love to add Spotify capabilities to this module. However, at this point in time, this is only a nice-to-have.

Step 9: Conclusion and Special Thanks!

As stated in the intro, although this project is far from finished, I really wanted to share it with the world, in hopes it might inspire somebody else. I also wanted to document my research on this subject, as there isn't really a lot of hobbyist interest in AR and HUDs, which I think is a shame.

I want to give a huge thank you to Awall99 and Danel Quintana, whose respective augmented reality projects inspired me a lot in the making of this module.

Thank you all for your attention, I will be sure to post an update when this project gets improved in the near future. In the meantime, see you all later!

Participated in the Maps Challenge