Electronics: Introduction to Infrared Light
Many electronic circuits detect the invisible light that’s commonly called infrared. Infrared light is light whose frequency is just below the range of visible red light. Specifically, infrared is light whose frequency falls between 1 THz and 400 THz (one terahertz, or THz, is one trillion cycles per second). The infrared spectrum falls right between microwaves and visible light.
There’s an inverse relationship between frequency and wavelength. In other words, the lower the frequency, the longer the wavelength. If you describe infrared in terms of its wavelength rather than its frequency, infrared waves are longer than the waves of visible light, but shorter than microwaves.
The wavelength of infrared light is between 0.75 and 300 micrometers (a micrometer is a millionth of a meter). Thus, at the low-frequency edge of the infrared spectrum, the infrared waves are about one third of a millimeter long. At the high-frequency end, the waves are about one thousandth of a millimeter long. If the waves get any shorter than that, they become visible light.
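The inverse relationship between frequency and wavelength follows from the fact that light travels at a fixed speed: wavelength equals the speed of light divided by frequency. A quick sketch of that arithmetic confirms the numbers above:

```python
# Relationship between frequency and wavelength: c = frequency * wavelength,
# where c is the speed of light in a vacuum.
C = 299_792_458  # speed of light in meters per second

def wavelength_m(frequency_hz):
    """Return the wavelength in meters for a given frequency in hertz."""
    return C / frequency_hz

# The infrared band described above, 1 THz to 400 THz:
low_end = wavelength_m(1e12)     # lowest infrared frequency
high_end = wavelength_m(400e12)  # highest infrared frequency

print(f"1 THz   -> {low_end * 1e6:.0f} micrometers")   # about 300 micrometers
print(f"400 THz -> {high_end * 1e6:.2f} micrometers")  # about 0.75 micrometers
```

Notice how the lowest frequency produces the longest wave (about a third of a millimeter) and the highest frequency produces the shortest (about a thousandth of a millimeter), matching the wavelength range given above.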
Infrared light is often used to detect objects that we can’t see in visible light. One common application of this is night vision. According to a principle of physics called Planck’s law, all matter emits electromagnetic radiation if its temperature is above absolute zero. Some of that radiation is in the infrared spectrum, so devices that can detect infrared light can literally see in the dark.
To enhance the effect, some night-vision devices actually illuminate an area with infrared light. Because the human eye can’t see the infrared light, the area illuminated still appears dark to us, but to a detector sensitive to infrared light, the area is lit up and fully visible.
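You can see why warm objects show up so well on infrared detectors by using Wien’s displacement law, a consequence of Planck’s law that gives the wavelength at which a warm body radiates most strongly. A short sketch (the temperatures chosen here are just illustrative examples):

```python
# Wien's displacement law (a consequence of Planck's law):
#     peak_wavelength = b / T
# where b is Wien's displacement constant and T is temperature in kelvin.
WIEN_B = 2.898e-3  # Wien's displacement constant, in meter-kelvins

def peak_wavelength_um(temp_kelvin):
    """Peak emission wavelength, in micrometers, for a body at temp_kelvin."""
    return WIEN_B / temp_kelvin * 1e6

print(peak_wavelength_um(310))   # human body (~37 C): about 9.3 micrometers
print(peak_wavelength_um(5778))  # the sun's surface: about 0.5 micrometers
```

A person at body temperature radiates most strongly around 9 micrometers, which is squarely in the infrared band, whereas an object would have to be as hot as the sun’s surface before its peak emission moved into visible light. That’s why a night-vision detector picks out people and animals against a cool background.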
Another common application of infrared light is for wireless communications across short distances. The best-known infrared devices are television remote controls. The remote control unit contains a bright infrared light source, and the television itself includes an infrared detector.
When you point the remote control at the television and push a button, the remote control turns on the infrared light source and encodes a message on it. The receiver picks up this signal, decodes the message, and does whatever the message directs it to do — turns up the volume, changes the channel, and so on.
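The “encoding” amounts to switching the infrared light on and off in carefully timed pulses. Real remotes modulate the light on a carrier (often around 38 kHz) and use manufacturer-specific timings; the sketch below is loosely modeled on one common scheme (the NEC protocol), with a made-up command code, so treat it as an illustration rather than a real protocol implementation:

```python
# A simplified sketch of how a remote might encode a command byte as timed
# pulses of infrared light. A "mark" is light on; a "space" is light off.
# The durations (in microseconds) are loosely based on the common NEC
# scheme and are illustrative only.

def encode_byte(command):
    """Turn an 8-bit command into a list of (mark, space) durations."""
    pulses = [(9000, 4500)]  # a long "start" burst so the receiver can sync
    for bit in range(8):
        if (command >> bit) & 1:
            pulses.append((560, 1690))  # a 1 bit: short mark, long space
        else:
            pulses.append((560, 560))   # a 0 bit: short mark, short space
    pulses.append((560, 0))  # a final mark to close out the last space
    return pulses

# Encoding a hypothetical "volume up" command code, 0x18:
for mark, space in encode_byte(0x18):
    print(f"IR on for {mark} us, off for {space} us")
```

The television’s detector reverses the process: it measures the gaps between bursts of infrared light and turns the long and short spaces back into ones and zeros.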
Like visible light, infrared light can be blocked by solid objects, and it can bounce off reflective objects. That’s why the remote won’t work if your spouse is standing between you and the television. But it’s also why you can get around your spouse by pointing the remote at a window. The infrared waves bounce off the glass and, if the angle is right, arrive at the television.
The first wireless remote control was developed by Zenith in 1955. It used ordinary visible light, could turn the TV on or off, and could change channels.
It had one nasty defect: You had to position your television in the room so that light from an outside source (such as the setting sun shining through a window) didn’t hit the light sensor. Otherwise, the TV might shut itself off right in the middle of the evening news when the sun reached just the right angle and hit the sensor.
Remotes today use complicated encoding schemes to avoid such random misfirings. You’re probably familiar with the procedure you must go through when programming a remote control to work with a particular television. This programming is necessary because there’s no widely accepted standard for how the codes on a remote control should work, so each manufacturer uses its own encoding scheme.
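One piece of those encoding schemes is a device address sent along with each command: the receiver acts only on frames addressed to it, which is part of why your TV remote doesn’t change the channel on your neighbor’s stereo. Here’s a sketch of that idea, with made-up address and command values:

```python
# A sketch of why a receiver ignores signals meant for other devices:
# each decoded frame carries a device address along with the command,
# and the decoder acts only when the address matches its own. The
# address and command values below are invented for illustration.

MY_ADDRESS = 0x20  # hypothetical address this television responds to

def handle_frame(address, command):
    """Act on a decoded frame only if it is addressed to this device."""
    if address != MY_ADDRESS:
        return "ignored"  # frame meant for some other device
    actions = {0x18: "volume up", 0x19: "volume down", 0x10: "power"}
    return actions.get(command, "unknown command")

print(handle_frame(0x20, 0x18))  # -> volume up
print(handle_frame(0x47, 0x18))  # -> ignored (another manufacturer's code)
```

Programming a universal remote is essentially telling it which manufacturer’s addresses and command tables to transmit, since every brand defines its own.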