If you've ever wanted to have "heat sensing vision," look no further! Thermal cameras are becoming cheaper and easier to use, which means they're better documented and more accessible for hobbyists. In this Instructable, I'll provide an introduction to the physics behind thermography and a few basic sensors that you can use in your next electronics project. Whether you want to give your robot a sense of temperature without touch, or want to find the source of a drafty window, thermography has a lot of practical uses. Let's jump in!
Step 1: What Is a Thermal Image?
Before we dive into using thermal imaging sensors, let's define a few terms and go over the physics behind the cameras to better understand what we are actually doing. I'll do my best to reasonably explain the terms and concepts mentioned.
A thermal image (thermogram) is a digital representation of a scene and a measure of the thermal radiation emitted by the pictured objects. Thermal images are captured via thermographic cameras, which are devices capable of sensing this radiation in the form of infrared light. A thermal image allows us to remotely sense the temperature of an object or at least accurately tell its temperature relative to its environment. This is useful as it allows us to essentially "see" in the dark as well as perceive the temperatures of many objects remotely.
Step 2: What Is Thermal Radiation?
All matter above absolute zero (−459.67°F, or −273.15°C) emits electromagnetic radiation as a function of its temperature. This is referred to as thermal radiation, and an idealized perfect emitter is called a blackbody. Around room temperature, most objects emit this radiation at infrared wavelengths. As its temperature increases, an object begins to emit light in the visible spectrum, starting with a dim red glow, then becoming white hot (meaning it's covering most of the visible spectrum), and eventually releasing most of that energy at ultraviolet wavelengths and beyond.
Infrared light, or IR, consists of the wavelengths just beyond the red end of the visible spectrum, too long for our eyes to perceive. All electromagnetic radiation carries energy, but infrared light is readily absorbed by matter, which increases its kinetic energy and therefore its temperature. Since all matter emits IR light as thermal radiation, and that emission is a function of temperature, accurately sensing the IR radiation allows us to create a thermal image.
Step 3: How Do We Detect Thermal Radiation?
Now that we know we want to detect long IR wavelengths to get a sense of an object's temperature, how do we actually detect it? With sensors, of course! In this case, we'll talk about bolometers, since these are the types of sensors in the cameras we'll be discussing.
In a basic sense, a bolometer is a simple sensor that absorbs thermal radiation and changes resistance as a result. This change in resistance can be electrically measured, and the incident radiation (which should be a function of the object's temperature) can be determined. A traditional bolometer is a fairly large device, so the tiny sensors arrayed inside these cameras are called microbolometers.
Step 4: What Is a Thermal Camera?
So, with an array of bolometers, we've got the basic means of detecting IR radiation from an object, and since that radiation is a function of the object's temperature, we can begin to depict the thermal scene on our own terms. A thermal camera, however, needs to account for a few properties of the subject in order to work properly. What are the thermal properties of the thing we are trying to measure? They depend on a few factors:
Absorption: Different materials absorb certain wavelengths of light at varying levels, affecting their thermal energy. Understanding absorption is relevant to the other factors influencing the radiation detected by the camera.
Transmission: Materials also transmit certain wavelengths of light, absorbing some and reflecting others. Since no object is viewed alone in a vacuum, radiation from other sources may be transmitted through the subject and into the camera as well.
Emissivity: For a given temperature, different materials emit thermal radiation at widely varying levels. For example, wood is an effective emitter, while polished aluminum is a poor one. Emissivity roughly inversely correlates with electrical conductivity; while not a precise relationship, it's an easy way to get a sense of how emissive an object is. This is one of the most important properties to recognize: at the same actual temperature, two materials of dissimilar composition can appear to be at different temperatures!
Reflection: All EM radiation can be reflected as well. Depending on the surface conditions of an object and its reflectivity, other sources of EM radiation may bounce off the subject and strike the camera's sensors. Being indistinguishable from the radiation emitted directly by the subject, these reflections will skew the thermal image as well. D'oh.
Step 5: Thermal Camera Costs and Technology
If you've looked before, you may have noticed that thermal cameras, even very low resolution models, are wildly expensive, often costing many hundreds if not thousands of dollars. Why the steep price? If they're so useful, why aren't they becoming cheaper? Well, they are, but here's a few reasons why this is difficult:
Use of, and therefore demand for, regular visible-spectrum cameras is higher, and those technologies, like most microelectronic production, are based around good old silicon. Most of the optical technologies for creating electronics are designed for working with silicon processes. Thermal focal-plane arrays (FPAs) require more exotic materials and must be built with separate hardware. Also, these exotic components cannot make up the rest of the supporting circuitry (processing, power management, memory, etc.), so they are bonded to a more common and easily made silicon-based circuit to handle those tasks. +$
Due to the nature of the manufacturing process, the sensors for each pixel respond non-uniformly, which requires additional processing to create a uniform signal base. Given an identical source of thermal radiation, even two adjacent sensors will respond inconsistently, so the sensor unit as a whole needs to be calibrated. Calibration for each sensor can also be done at the factory, but this takes extra time and testing since every sensor will be unique. +$$
Like regular cameras, protecting and focusing light involves the use of lenses and windows. While creating visibly transparent lenses is common and much easier, ordinary glass lenses are opaque to the long-wave IR spectrum. IR-transparent materials are, you guessed it, exotic stuff as well, which means more custom, low-volume hardware is required for processing and manufacturing the lenses. +$$$
Step 6: Using the Melexis 90621
The first sensor we'll talk about and use is the Melexis MLX90621 (the 90620's pinout is compatible), which is a 16x4 pixel thermal camera.
While you won't exactly be able to record an IMAX-quality film with this kind of sensor, 16x4 pixels is more than enough to detect a human, or the direction of motion of a living object. It has a temperature range of -20 to 200 degrees Celsius. What's more, this is an I2C device, so adding it to a sensor-heavy project won't cost any precious extra pins. The only catch to using the 90621 is a Vdd requirement of 2.6 volts. It is 3.3V tolerant, but Melexis suggests the lower voltage, as this is the potential at which the sensor is calibrated (more accuracy!). It can also operate at up to 500 Hz per frame, which is plenty fast for most applications. Not to mention, it's quite tiny, being only 10 mm in diameter and 18 mm in length.
Step 7: Simple Thermal Imaging Camera: Parts and Materials
Intel Edison with Arduino breakout board
MLX90621 thermal sensor
(2 meter) RGB DotStar LED strip
(6600 mAh) LiPo battery
5V boost converter
I2C logic level shifter
LM317 voltage regulator
female header strip
1 uF electrolytic capacitor
(2x) 0.1 uF ceramic capacitor
(2x) 1K resistor
(2x) 4.7K resistor
plywood sheet
Step 8: Electrical Design
Power is provided by a large (>5000 mAh) LiPo battery, which is immediately boosted from 3.7V to 5V. The 5V is fed into the Edison Arduino breakout board via a spliced micro USB cable (injecting power directly via the 5V pin is not recommended). The MLX90621 is calibrated to run best at 2.6V and draws less than 10 mA, so a simple LM317 linear regulator brings the 5V down for it. The DotStar strips are connected directly to the 5V output of the boost converter.
The Intel Edison lives on an Arduino breakout board and controls the show. It communicates over I2C, via a level shifter, with the MLX90621 sensor. The Edison also drives the DotStar clock and data pins via two digital pins. The DotStar LEDs have a very basic shift-register control scheme, so they can be connected to any digital pins and defined in software.
Step 9: Software
The thermal test rig is a simple Arduino sketch running on the Edison. You can find the sketch along with the library archived and attached above.
10 Get new thermal data from the sensor and load it into a 16x4 array
20 Map raw sensor values to 0 to 100C (the sensor can read beyond this range, but this is a simple test)
30 Map pixel temperature values to color values in a new array, with red increasing and blue decreasing towards 100C
40 Shift the color values out to the DotStar strip
Step 10: Going Further With the FLIR Lepton
If you're looking for a few more pixels than the Melexis sensor, look no further than the FLIR Lepton module. FLIR recently began selling lower cost thermal camera modules that were meant to attach to cell phones. They also released the raw sensor, which allowed Pure Engineering to develop a breakout board to access the Lepton module's I2C and SPI interfaces. The unit has an 80x60 pixel resolution (ie ginormous) and can pop out frames at 9Hz. This isn't a technical limitation, but a political one, as high speed thermal imaging is useful for military purposes, so the units are export restricted. The Pure Engineering GitHub page has image capture software for the Intel Edison as well as other platforms.