Introduction: Streaming Video From SD Card to Nokia LCD With Arduino
For a long time I thought playing back video with audio was not possible on the Arduino. Then one day I decided to give it a shot and it turned out to be simpler than I thought.
I decided to write the steps in an instructable for the benefit of others considering trying it out themselves.
This write-up is about how to create such a video for playback. It is not intended as a full-fledged set of instructions on how to build a device like this!
However, I will tell you where to get one, or where to find instructions for building one, if you are interested.
Step 1: The DIY "Gamebuino" Device and How to Get One
The "Gamebuino" (actually a DIY "Fakebuino") used in this video is just a Nokia 5110 monochrome LCD display wired up to an Arduino Nano (ATMega328). My build also includes an SD card breakout and a PAM 8403 amplifier breakout. All components can be bought very cheaply on Ebay. Simply search with the terms in bold above.
Alternatively, you can buy a device like this ready-made at www.gamebuino.com. Aurelien Rodot, the mastermind behind the Gamebuino device, has open-sourced his hardware and library. You will find all the information you need on his project website.
I have included a Fritzing schematic of how my own device is wired up. Just be aware that it is not a 1:1 Gamebuino-compatible device; it is just my own version of the thing. Take note: the PAM 8403 in the schematic does not represent a bare chip; the pins refer to connections on a PAM 8403 breakout board.
Also, I will not explain how to use a Nokia 5110 LCD or an SD card breakout with the Arduino, even if asked. Both topics have been extensively covered by others, and you will have to do your own research.
Step 2: Ripping the Video Into Numbered Bitmap Frames With Ffmpeg
The video used in this demonstration was saved from YouTube. Again, I will not explain that process here; just Google it. In any case, I saved the video as a low-resolution 3gp file, since I knew it was going to end up at a very low resolution in the end anyhow.
So, after you have a source video you wish to rip, you need to download and install ffmpeg if you do not already have it.
The video was ripped into 84x48 grayscale bitmap files at 20 frames per second using the following command line:
ffmpeg -i myvideo.3gp -r 20 -f image2 -vf format=gray -s 84x48 .\output\%d.bmp
Explanation of parameters:
- ffmpeg is the program. If installed correctly, you should be able to call this from any directory on your Win/Linux box
- myvideo.3gp is the name of the source video. You need to run this command from the directory containing the source video; otherwise ffmpeg will not find it. The video can be any format; here it just happens to be a .3gp. It could be a .mpg or .mp4 or whatnot.
- -r 20 is the frame rate at which we will rip the frames, 20 frames per second
- -f image2 is the output format (output video as image frames, not a single video file)
- -vf format=gray means make the output format grayscale
- -s 84x48 is the pixel size of the output frames (84 by 48 pixels for Nokia 5110 LCD)
- .\output\%d.bmp is the output directory and naming pattern for the ripped frames. You need to create an "output" folder before you run this command! The %d.bmp simply means the files will be named 1.bmp, 2.bmp, 3.bmp, ...
Step 3: Ripping the Audio to 8-bit RAW With Audacity
Go get and install Audacity, if you haven't got it already.
Audacity needs both the LAME and ffmpeg libraries to work! Check under Preferences -> Libraries that Audacity lists version numbers for both LAME and ffmpeg. If no version numbers are given, Audacity has not located the libraries and the next steps will not work! Again, I am not going to give technical advice on using Audacity. That may seem harsh, but believe me, there are hundreds of tutorials on Audacity already, and I am not going to repeat that information here.
Next, open the video file in Audacity. Thanks to ffmpeg, Audacity can rip the audio directly from the video (this is why we need the libraries from the previous step, you see).
NOW, set the project rate to 10080 Hz (see image). This is important because it will keep the video and audio in sync. Each 84x48 1-bit frame is 84 x 48 / 8 = 504 bytes of image data, and at 20 frames per second that is 504 x 20 = 10080 bytes of image data per second, so we also need 10080 bytes (samples) of audio per second. THIS IS THE MOST IMPORTANT TRICK IN THE WHOLE THING!
NEXT, export the audio as 8-bit unsigned headerless RAW (see image)
SAVE the file as "audio.raw" into the output folder containing the ripped .bmp frames
Step 4: Dithering the Frames With Imagemagick "mogrify"
Next, we need to make 1-bit black & white images from the grayscale frames we ripped.
For that, we need to download and install ImageMagick.
If ImageMagick is correctly installed, you should be able to run it from the command line in any directory, on both Windows and Linux.
WARNING: Mogrify will operate directly on the .bmp files. If you want to keep the originals that you ripped, back them up before you proceed.
Now, go to the output directory, where you have the .bmp frames and the audio.raw.
To convert to 1-bit monochrome using simple threshold:
mogrify -dither None -monochrome *.bmp
To convert to 1-bit monochrome using Floyd-Steinberg dithering:
mogrify -dither FloydSteinberg -monochrome *.bmp
THIS WILL TAKE TIME, EVEN ON A FAST MACHINE
Step 5: Rejoining Video and Audio Into a Single Stream
Now, we need two tools that I wrote myself.
You can either download the Windows executables I made (don't worry, they are simple command line utilities; there are no viruses or that sort of bulls*) from the link I provide here, or you can copy-paste the code and compile it in your favourite environment.
1. Put jvid.exe (or the executable you compiled from jvid.cpp) into the output folder and run it
This will read each .bmp file header, decide whether foreground and background need to be flipped, strip the image data out of the data area, and join all frames' image data into one chunk in a file called output.jvd
2. Put jmerge.exe (or the executable you compiled from jmerge.cpp) into the output folder and run it
This will join together output.jvd (the image stream) and audio.raw (the 8-bit raw audio file you created earlier). BOTH have to be in the same directory as the jmerge executable. The output is a file called video.jvd
VIDEO.JVD is the end result video file that will be put on the SD card !
Question: why didn't you make a nicer interface for your tools? Why did you choose such silly names?
Answer: this was an experiment. I am not interested in developing this further. Look at the code and make better tools yourself.
EDIT: In case you get a "missing libgcc..." error with jvid.exe or jmerge.exe, I have added the libgcc runtimes (Libraries.zip) to the downloads. Download them and extract into the same folder.
Step 6: Playback Program (sketch) for the Arduino
Now that we have copied video.jvd onto the SD card, we only need the player program on the Arduino/Gamebuino/Fakebuino/whatever you have created to play back the video.
In order to use this sketch, you need two libraries for the Arduino:
1. TimerOne library
2. PetitFS library (already included in the Gamebuino library, if you downloaded that)
There are some explanations within the sketch on how the code works but basically you are on your own. Can't be bothered to clean it up. Sorry.
Step 7: Conclusion & Caveat Emptor
[Latin, Let the buyer beware.] A warning that notifies a buyer that the goods he or she is buying are "as is," or subject to all defects.
What I mean: use all this information at your own risk. Tech support on this article will be minimal, due to lack of time.
Trying to cram video onto an 84x48 pixel screen was a lot of fun. I had no previous knowledge of the tools needed for this exercise and figured it out pretty much by myself - though I knew what I needed, thanks to prior work by others.
Thanks to Myndale for his Gamebuino audio streaming demo, and to Elm-chan for the wonderful PetitFS library. Thanks to Aurelien Rodot for the inspiration of his Gamebuino project. Thanks to Joseph Rautenbach, who did the Rick Astley video on Arduino, which had no sound, but which got me thinking about how it could be done. You can see his video here.