Smartglove for Cyclists




Introduction: Smartglove for Cyclists

In this instructable, I will detail how I made this "smartglove" and its LED panel so that cyclists or other road users can ride/drive safer. Here is a short demo so you can see how it works (there is another video on my bike at the end of the instructable):

How it works:

The glove is built around an Arduino board that collects data from a gyroscope and an accelerometer. The Arduino code uses a tiny machine learning (tinyML) model to perform gesture recognition: each hand movement is analyzed and recognized (hand tilted to the left, right, front, back, etc.). The resulting signal is then sent through Bluetooth Low Energy (BLE) to another microcontroller connected to an LED matrix (placed on a backpack, for example). According to the signal received by the microcontroller, patterns light up on the LED matrix so that other road users can see what the cyclist is doing (for example right, left and straight arrows, or text).

Origins of the project:

This is what motivated me to make this project:

  • First, I go to work by bike, and I ride more than 1 hour every day (approximately 22 km). This is always a lot of fun, except that I live in one of the most crowded cities in France, and incidents between cars and cyclists are frequent. This is especially true since Marseille is the worst city for cyclists (in France), with a huge lack of bicycle tracks (as you can read here). Therefore, this is a project to improve cyclist safety, but also to report the lack of consideration from the city toward cyclists.
  • Second, it is a project to help road users communicate and understand each other better. From my point of view, most incivilities I see between road users happen because some users misinterpret the behavior of others, scaring them and leading to aggressiveness. With this device I want road users to understand each other better: arrows indicate the direction, and text can be displayed (but I strictly stick to nice and constructive text, to avoid starting conflicts).

Why is it called "smartglove"?

I initially started this project in winter, and the cold weather was what motivated me to build this device into a glove. But I quickly realized it was a bad idea, since it is quite hot here in summer. So placing the device in a box and attaching it to the hand was the perfect solution. But what should such a device be called? Since I had no better name, I kept the term "glove", and there we are!

"Smart" comes from the tiny machine learning technique used for this project.


This project is a mix of 2 main projects I found. I did not start from scratch but rather took advantage of these projects to go further and make a more advanced one. These are the projects I got inspiration from:

  • The gesture recognition with Arduino Nano 33 BLE SENSE, that you can read by clicking here.
  • The other source of inspiration is not a specific project, but rather the concept of LED panels for cyclists. There are a lot of such projects: some are backpacks with an integrated LED matrix, others just consist of a matrix that can be attached anywhere. But in all cases, these LED panels are controlled with a remote (not with gesture recognition).


For 3D printed parts:

  • A 3D printer, or access to a fablab or online 3D printing service;
  • PLA.

For electronics parts:

  • An Arduino Nano 33 BLE SENSE;
  • Another microcontroller with BLE (Arduino Nano 33 BLE, Arduino Nano 33 BLE SENSE, Arduino Nano 33 IoT, ESP32, etc.). I decided to use an ESP32 board, as I will explain later;
  • An LED strip (WS2812B). I used 160 LEDs to make an LED matrix of 20×8;
  • A quad level shifter, 3V to 5V (74AHCT125);
  • A 1000 µF capacitor;
  • SPST switches (x3);
  • Perfboards;
  • Wires and jumper wires;
  • A 9V battery;
  • A power bank.


For other parts:

  • M3 screws and nuts;
  • Hook-and-loop fastener.

Step 1: Prerequisites (Microcontroller, Codes)

After reading this post concerning Arduino boards and machine learning, I decided I had to give it a try. Since there are new Arduino Nano boards, I made a table to compare their characteristics, in order to make better choices before buying one.

All of these boards are really interesting, but the only one that can be used for gesture recognition is the Arduino Nano 33 BLE SENSE, since it is the only one with built-in sensors that supports TensorFlow Lite. Another interesting point is that the Arduino Nano 33 IoT, BLE, and BLE SENSE boards all have Bluetooth, so any of them can be used on the LED matrix side to receive the BLE signals.

As you will see later, the codes I uploaded on the Arduino boards are based on many Arduino codes I found. So before starting, I decided to test these codes with the examples I found.

Play with BLE

The Bluetooth connection in this project is crucial because it is how the signal is sent from the sensors to the LED panel. I never connected 2 Arduino boards with BLE before this project. So I practiced with the following example from the ArduinoBLE library:

  • LedControl sketch, used with the Arduino Nano 33 BLE Sense and a button with a pull-up resistor connected to pin 2. This example scans for BLE peripherals until one advertising the service UUID "19b10000-e8f2-537e-4f6c-d104768a1214" is found. Once discovered and connected, it remotely controls the BLE peripheral's LED when the button is pressed or released.
  • LED sketch, used with Arduino Nano 33 IoT.

Unfortunately, I ran into many problems with the LED sketch, and 3 boards "broke" while uploading it. I have no idea where this problem comes from, but I decided to replace the Arduino board with another BLE-capable microcontroller: an ESP32 board. With this new board, I used the following:

  • BLE_write sketch from the BLE ESP32 ARDUINO Library. I added a few changes to make it work with the Arduino Nano 33 BLE SENSE board. I invite you to compare the BLE_write sketch and the Smartglove_BLE_LED-matrix sketch I wrote and uploaded in step 10.
Play with built-in RGB LEDs

Did you know that the Arduino Nano 33 BLE SENSE board has built-in RGB LEDs? Why use them in this project? To verify that the gesture recognition worked. The idea is to check that the signal has been sent to the LED panel: since the panel will probably be attached to the cyclist's back, it is otherwise quite difficult to make sure the gesture was recognized and the signal was sent.

Nothing complicated: I just slightly modified the Blink example. As you can see below, there is a red LED on pin 22, a green LED on pin 23 and a blue LED on pin 24. Note that a "LOW" output turns an LED on, and a "HIGH" output turns it off.

/* Blink example modified for the built-in RGB LEDs (Arduino Nano 33 BLE Sense). */
const int LED_BUILTIN_RED = 22;
const int LED_BUILTIN_GREEN = 23;
const int LED_BUILTIN_BLUE = 24;

// the setup function runs once when you press reset or power the board
void setup() {
  // initialize the RGB LED pins as outputs
  pinMode(LED_BUILTIN_RED, OUTPUT);
  pinMode(LED_BUILTIN_GREEN, OUTPUT);
  pinMode(LED_BUILTIN_BLUE, OUTPUT);
}

// the loop function runs over and over again forever
void loop() {
  digitalWrite(LED_BUILTIN_RED, LOW);     // turn the red LED on (these LEDs are active LOW)
  delay(1000);                            // wait for a second
  digitalWrite(LED_BUILTIN_RED, HIGH);    // turn the red LED off
  delay(1000);                            // wait for a second
  digitalWrite(LED_BUILTIN_GREEN, LOW);   // turn the green LED on
  delay(1000);
  digitalWrite(LED_BUILTIN_GREEN, HIGH);  // turn the green LED off
  delay(1000);
  digitalWrite(LED_BUILTIN_BLUE, LOW);    // turn the blue LED on
  delay(1000);
  digitalWrite(LED_BUILTIN_BLUE, HIGH);   // turn the blue LED off
  delay(1000);
}
Play with gesture recognition and tinyML

And finally, I read the Get started with machine learning on Arduino guide and practiced with the gesture recognition example. This example is divided into 3 main parts:

  • Recording data with the IMU_Capture code (with an Arduino Nano 33 BLE Sense);
  • Training the model with the previously recorded data, on Google Colab: find it here (with a computer);
  • Using the trained model on the Arduino with IMU_Classifier, to recognize the gestures (with the Arduino board again).

Step 2: The Glove (1/6): Electronics

From step 2 to step 7, I use a diagram to help you better understand the process of making the glove. The highlighted part of the diagram corresponds to the part of the project described in the current step.

Concerning the circuit of the glove, it is very simple:

  • The Arduino board;
  • A 9V battery (I used a rechargeable 9V battery);
  • An SPST switch.

Step 3: The Glove (2/6): Case

The case is simply composed of two 3D printed parts:

  • The first part in yellow contains the Arduino board, the battery, and the switch. I designed holes in this part so the battery can be charged and the Arduino board can be programmed without removing the lid;
  • The second part in black (the lid) to protect the battery and the board.

I used a strip of hook-and-loop fasteners to attach this box on the hand.

Finally, I also designed a logo that I glued on the lid. It represents a cyclist viewed from the top, with 3 arrows (straight, left, right). The 4th arrow is detached from the others because bikes don't go backward.

Step 4: The Glove (3/6): Data Recording

Once the device was ready, it was time to record data: the goal is to record each gesture many times. I set a threshold on the gyroscope, so when the gyroscope values exceed this threshold, the Arduino board starts printing the recorded data to the serial monitor.

These are the gestures I recorded:

  • Arm pointing to the left (regular gesture for cyclists to indicate a turn to the left);
  • Brake (gesture of the fingers to reach the brake);
  • Hand back tilt;
  • Hand front tilt;
  • Hand left tilt;
  • Hand right tilt.

Of course, you can add other gestures.

To record these data, I wrote a program that lights an LED of a different color every 20 movements, so I know when to change gestures. I connected the Arduino board to my computer with the serial monitor open and placed the computer in my backpack.

Once all the gestures were recorded, the last thing to do is to copy the data displayed on the monitor and save them in "csv" files (see the files uploaded in the next step).
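
The capture logic itself is simple: recording only starts once the summed absolute gyroscope rates cross the threshold. Here is a minimal sketch of that test, written in Python for readability (the threshold value is a placeholder you would tune to your own gestures):

```python
def motion_detected(g_x, g_y, g_z, threshold=300.0):
    """Return True when the summed absolute gyroscope rates (deg/s)
    exceed the capture threshold, mirroring the capture-sketch logic."""
    return abs(g_x) + abs(g_y) + abs(g_z) >= threshold

# At rest, sensor noise stays well below the threshold: no recording.
print(motion_detected(5.0, -12.0, 3.0))     # False
# A deliberate wrist tilt produces large rates: recording starts.
print(motion_detected(280.0, -90.0, 15.0))  # True
```

Summing the absolute rates (rather than checking each axis separately) means any sufficiently energetic movement triggers a recording, regardless of its direction.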

Step 5: The Glove (4/6): Training

For the training, I used this link and just modified a few lines of code. Before starting the training, I encourage you to plot your data and make sure every movement is similar within each "csv" gesture file.

In "Upload data", upload all your files.

In "Graph Data (optional)", add one of your filenames:

filename = "Arm_left.csv"

Then modify this line to plot only the gyroscope data:

#index = range(1, len(df['aX']) + 1)
index = range(1, len(df['gX']) + 1)

Comment out the following lines, so they look like this (again, I don't use the accelerometer data):

#plt.plot(index, df['aX'], 'g.', label='x', linestyle='solid', marker=',')
#plt.plot(index, df['aY'], 'b.', label='y', linestyle='solid', marker=',')
#plt.plot(index, df['aZ'], 'r.', label='z', linestyle='solid', marker=',')
#plt.xlabel("Sample #")
#plt.ylabel("Acceleration (G)")

In "Parse and prepare the data", add all the names you used for your data:

#GESTURES = ["punch", "flex",]
GESTURES = ["Arm_left", "Brake", "Hand_back-tilt", "Hand_front-tilt", "Hand_left-tilt", "Hand_right-tilt"]

Also change the number of samples per gesture, if you changed it in the initial Arduino code:


The last thing to change is to comment out the acceleration lines:

# normalize the input data, between 0 to 1:
# - acceleration is between: -4 to +4
# - gyroscope is between: -2000 to +2000
      tensor += [
          #(df['aX'][index] + 4) / 8,
          #(df['aY'][index] + 4) / 8,
          #(df['aZ'][index] + 4) / 8,
          (df['gX'][index] + 2000) / 4000,
          (df['gY'][index] + 2000) / 4000,
          (df['gZ'][index] + 2000) / 4000
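
As a quick sanity check on the normalization above: gyroscope readings in [-2000, +2000] deg/s must map linearly to [0, 1] before being fed to the network. For example:

```python
def normalize_gyro(g):
    """Map a gyroscope reading from [-2000, +2000] deg/s to [0, 1],
    matching the (g + 2000) / 4000 expression used in the notebook."""
    return (g + 2000) / 4000

print(normalize_gyro(-2000))  # 0.0
print(normalize_gyro(0))      # 0.5
print(normalize_gyro(2000))   # 1.0
```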

Read and run every cell on this page, and at the end you can download the trained model.

Step 6: The Glove (5/6): Arduino Code

The final code I use on the smartglove is a mix of the codes presented in step 1: the IMU_Classifier sketch (for gesture recognition), the LedControl BLE sketch (for sending the signal), and the modified Blink sketch (for feedback on the built-in RGB LEDs).

I will not describe them since this Instructable would be way too long, but I encourage you to read the original codes for a better understanding.

Add your new model to the code, and you are ready to test it!

Step 7: The Glove (6/6): Test

As you can see on the video below, the LEDs light up differently according to the gesture recognized:

Step 8: The LED Panel (1/4): Electronics

As I said previously, I encountered several problems when uploading the LED sketch from the ArduinoBLE library to the Arduino Nano 33 board, and decided to use an ESP32 board instead. That is why you can see both boards in the pictures above.

Since both the Arduino Nano 33 BLE SENSE and ESP32 boards work at a 3.3V logic level, while the WS2812B LEDs expect a 5V data signal, I added a quad level shifter (74AHCT125) to do the level shifting, as recommended in the Adafruit best practices guide.

I also added a 1000 µF capacitor to prevent the initial onrush of current from damaging the pixels.

And I made this circuit on a perfboard.

You can also see that I used both outputs of the power bank, since I was afraid the LED matrix would draw too much current. The matrix and the microcontroller are therefore powered from different outputs of the power bank.
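
If you build a 20×8 matrix from a single WS2812B strip, the strip usually zigzags across the rows (a serpentine layout), so odd rows run right to left. The FastLED NeoMatrix library handles this with layout flags, but the mapping itself is easy to check. Here is a sketch of the index computation, assuming serpentine wiring with rows of 20 LEDs (adjust to your actual wiring):

```python
WIDTH = 20  # LEDs per row (assumption: 20x8 matrix, serpentine wiring)

def xy_to_index(x, y):
    """Map matrix coordinates to a strip index for serpentine wiring:
    even rows run left-to-right, odd rows run right-to-left."""
    if y % 2 == 0:
        return y * WIDTH + x
    return y * WIDTH + (WIDTH - 1 - x)

print(xy_to_index(0, 0))   # 0   (first LED of the strip)
print(xy_to_index(19, 0))  # 19  (end of the first row)
print(xy_to_index(19, 1))  # 20  (second row starts at the same edge)
```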

Step 9: The LED Panel (2/4): Case

For the LED panel, I wanted a modular case. The case is therefore made of several parts (also because I have a tiny 3D printer), and I placed many holes so screws can be used easily.

To attach the panel, I used a strip of hook-and-loop fasteners once again.

You can download all the files I designed.

Step 10: The LED Panel (3/4): Arduino Code

The final code is a mix of the following codes, after a few modifications:

  • "BLE_Write" example from the "BLE ESP32 ARDUINO" library.
  • "MatrixGFXDemo64" example from the "FastLED NeoMatrix" library.
Once again I will not describe them in detail, but I encourage you to read and compare the original codes with the one I wrote. Feel free to ask any questions in the comments section!
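
Whatever sketch you end up with, the panel-side logic boils down to mapping each byte received over BLE to a pattern. Here is a minimal sketch of that dispatch, in Python for readability (the command and pattern names are hypothetical, not the ones from my code):

```python
# Hypothetical command-to-pattern mapping; the BLE characteristic would
# deliver one of these bytes each time a gesture is recognized.
PATTERNS = {
    b"L": "left arrow",
    b"R": "right arrow",
    b"S": "straight arrow",
    b"B": "brake signal",
}

def pattern_for(command):
    """Return the pattern to display, or None for unknown commands."""
    return PATTERNS.get(command)

print(pattern_for(b"L"))  # left arrow
print(pattern_for(b"X"))  # None
```

Returning None for unknown bytes means a corrupted or unexpected BLE message simply leaves the panel unchanged rather than displaying a wrong signal.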

Step 11: The LED Panel (4/4): Test

And it is time to test it! Each time a gesture is recognized, a signal is sent to the LED panel and a pattern is displayed. You can also note that an LED lights up on the smartglove according to the gesture recognized.

In case you missed it, this is the video shown in the introduction:

Step 12: Final Test & Conclusion

Here is a video of the device when I ride my bike!

To conclude, I am very happy with this device. It helped me get much more confident with tinyML and BLE. Since then I also bought another Arduino Nano 33 IOT board, and I am currently doing a very exciting project with it that I will publish soon (don't miss it)! But there are also things that I would change if I ever do a second version:

  • A better lid for the smartglove. The current one is just placed on the box and stays in place because it fits tightly. But during a ride, my hand hit an object and the lid fell off and broke. So in the next version, I will probably use screws for the lid.
  • A better case for the LED panel. I quickly found out that the case I made lacks easy access to the microcontroller's USB port, so debugging or changing the code requires unscrewing the entire case. The power bank also cannot be accessed and charged without removing the screws.
  • More data for the training. Some gestures are occasionally not recognized, and some are recognized in place of others. This is probably due to a lack of data (only 20 movements per gesture). More movements would give a better model and fewer errors in gesture recognition.

It took me many months to make and write this instructable, so if it is not clear or a file is missing, please leave a comment below and I'll try to answer as best I can.

I hope you liked it, and see you soon for other exciting projects! :)

Second Prize in the Arduino Contest 2020




    25 days ago

    I would also like to recreate your project, but I have some problems doing so. For my idea, I want to place the Arduino and the LED strip on the glove itself: for example, if I want to turn left, I just reach out my hand and the LED strip starts blinking. I also want to develop a Bluetooth app that indicates upcoming turns from navigation (Google Maps): for example, if the upcoming turn is right in 500 m, the LEDs blink 5 times to show it. So my question is: do I need to use 2 Arduino Nano 33 BLE Sense boards (left and right gloves)?


    2 months ago

    Hello Maria,
    First of all, congratulations on the project. I have had the same idea for five long years; my learning curve has been longer because electronics is not my profession but a hobby, and I could only make progress on weekends. I started with basic courses in electronics, Arduino, sewing and embroidery, motivated by wearables, and finally I found a course on the Coursera platform where I learned that this kind of project is needed worldwide, and that it is a good introduction for children to traffic signs and their importance, building a language of icons around the safety of cyclists and pedestrians.
    At some point I tried to use the ESP32, but it kept getting more complicated to understand how it works, so I opted to use only the accelerometer and the motion sensor, with WS2812B LED strips, which I think are fabulous and easy to program. I have already designed and built the jacket on which I plan to implement it.
    Knowing today that this initial idea from about 5 years ago can be brought to reality motivates me to take up the abandoned project again. Your publication motivates me to do it, and I will clear up my doubts about the ESP32 (as the page says, "do it yourself") and will consult you if something comes up.
    Another time I will show you my work.


    Reply 2 months ago

    Hi, thank you for your comment. I'd be very interested to see your project, don't hesitate to share the latest progress and pictures!


    1 year ago

    Love the idea, but I don't know if it's just me or could the 'left' and 'right' signals be misinterpreted by drivers as 'its OK to pass on my left/right', due to their similarity with road signs? I think I'd prefer simple car-like blinkers instead, but it's just a minor alteration to the design :)


    Reply 1 year ago

    Thank you for your comment, I did not realize about this but indeed these right/left signals could be misinterpreted. I'll change this!


    Reply 1 year ago

    Merci BEAUCOUP OP! I cycle constantly as a mode of transport and also (used to) bike commute to work (before lockdown and work from home). I agree with your philosophy about bicycles versus drivers, 100%. So I will certainly be making one! However it will be my first Arduino project so I will be heavily reliant on your instructions.
    Great comment about the little tweak here. Is this "blinker" change mentioned easy for a beginner to implement?
    Thank you kindly for helping me stay alive, on behalf of my family friends and I; what a beautiful invention!!


    Reply 1 year ago

    Thanks! This blinker change should be easy to modify, but since the code is quite long it might be a bit scary. If you have any problem, feel free to contact me so I can help you!
    BTW, if you are concerned about cyclist/car interactions, you might be interested in the previous instructable I wrote: a bell for cars to warn cyclists without scaring them :)


    1 year ago

    Wow! Incredible idea. Both for snow and for cycling, this kind of project seems like a great success to me. It makes life easier for the rider as well as for the people around them. I think that commercializing this product could prevent many accidents. Great work!


    1 year ago

    Good afternoon,
    Do you know if it can be done with two ESP32 boards and accelerometers? The Arduino is very expensive compared to the ESP32.
    Great project!


    1 year ago

    Cool! I wonder if folks could misread the left turn as "turn to my left" though?


    Reply 1 year ago

    Thank you for your comment. Indeed this could be an issue, but it is easy to change the pattern! :)


    Answer 1 year ago

    Thank you!
    Cost: if you buy everything brand new, I'd say between 90€ and 100€ (the most expensive parts being the Arduino Nano 33 BLE Sense board and the LED strip, at about 30€ each). But you can definitely decrease the price by using a much smaller LED panel!
    Time spent: you will probably spend 1 full weekend if you already have skills in 3D printing and soldering.


    Reply 1 year ago

    Yes indeed! It is another possibility for the next versions, adding bright LEDs on the glove!


    1 year ago

    Fabulous, I will try to read the code to see if I can understand it.
    Thank you very much for your work.


    Reply 1 year ago

    Thank you very much! If you have trouble understanding the code, I can help you!


    Reply 1 year ago

    Thanks a lot.
    I'll start by installing the code on my PC and will keep you informed.
    Greetings from Spain.
    Sorry for my English ;-)


    1 year ago on Step 12

    P.S. A solar panel on a detachable waistcoat could provide daytime power, or the device could be wired to a detachable dynamo. The switch could be attached to the handlebar nearest the hand (left or right) you would use. Extreme cold temperatures might affect the circuit, as would rain.