Electronics Projects: How to Use LEDs to Detect Polarity
In this project, you build an electronic gadget that uses two LEDs to indicate the polarity of an input voltage. The voltage is provided by a 9 V battery connected to the circuit via a DPDT knife switch that's wired to reverse the battery polarity. The two LEDs and their corresponding resistors are mounted on a small solderless breadboard.
But before jumping into the project, a little background on LEDs is in order. A light-emitting diode (also called an LED) is a special type of diode that emits visible light when current passes through it. The most common type of LED emits red light, but LEDs that emit blue, green, yellow, or white light are also available.
The schematic symbol for an LED is a standard diode symbol with two small arrows pointing away from it, representing the emitted light.
The two leads protruding from the bottom of an LED aren't the same length: The shorter lead is the cathode, and the longer lead is the anode.
Whenever you use an LED in a circuit, you must provide some resistance in series with the LED. Otherwise, too much current will flow: the LED will light brightly for an instant and then burn itself out. In this example, the LED is connected to a 9 V DC supply through a 470 Ω resistor.
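As a quick sanity check on that example, you can apply Ohm's law to see how much current the 470 Ω resistor allows. This short Python sketch assumes a typical red LED with a 2 V forward-voltage drop, as described later in this article:

```python
# Quick check of the worked example: a red LED on a 9 V supply
# through a 470 ohm series resistor.
supply_v = 9.0
led_drop_v = 2.0    # typical red LED forward-voltage drop (assumed)
resistance = 470.0  # series resistor in ohms

# Ohm's law: I = V / R, where V is the voltage across the resistor
current_a = (supply_v - led_drop_v) / resistance
current_ma = current_a * 1000

print(f"LED current: {current_ma:.1f} mA")  # about 14.9 mA
```

At roughly 14.9 mA, the current stays comfortably under the 20 mA limit discussed below.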
To determine the value of the resistor you should use, you need to know these three things:
The supply voltage: For example, 9 V.
The LED forward-voltage drop: For most red LEDs, the forward-voltage drop is 2 V. For other LED types, the voltage drop may be different. Check the specifications on the package if you use other types of LEDs.
The desired current through the LED: Usually, the current flowing through the LED should be kept under 20 mA.
Once you know these three things, you can use Ohm's law to calculate the desired resistance. The calculation requires just four steps, as follows:
Calculate the resistor voltage drop.
You do that by subtracting the voltage drop of the LED (typically 2 V) from the total supply voltage. For example, if the total supply voltage is 9 V and the LED drops 2 V, the voltage drop for the resistor is 7 V.
Convert the desired current to amperes.
In Ohm's law, the current must be expressed in amperes. You can convert milliamperes to amperes by dividing the milliamperes by 1,000. Thus, if your desired current through the LED is 20 mA, you must use 0.02 in your Ohm's law calculation.
Divide the resistor voltage drop by the current in amperes.
This gives you the desired resistance in ohms. For example, if the resistor voltage drop is 7 V and the desired current is 20 mA, you need a 350 Ω resistor.
Round up to the nearest standard resistor value.
The next higher standard resistor value above 350 Ω is 390 Ω. If you can't find a 390 Ω resistor, a 470 Ω resistor will do the trick.
Note that the minor increase in resistance means that slightly less current will flow through the LED, but the difference won't be noticeable. However, you should avoid going to a lower resistor value. Lowering the resistance increases the current, which can damage the LED.
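The four steps above translate directly into a few lines of code. The following Python sketch works through the calculation and then rounds up to a standard value; the `E12` list is the common E12 series of preferred resistor values, which is an assumption about the resistors you have on hand (kits based on the finer E24 series offer more choices):

```python
# Standard E12 resistor values for one decade; real resistors repeat
# these at each power of ten (e.g. 39, 390, 3.9k, ...).
E12 = [10, 12, 15, 18, 22, 27, 33, 39, 47, 56, 68, 82]

def led_resistor(supply_v, led_drop_v=2.0, current_ma=20.0):
    """Return (ideal resistance in ohms, next E12 value at or above it)."""
    # Step 1: voltage the resistor must drop
    resistor_v = supply_v - led_drop_v
    # Step 2: convert the desired current from milliamperes to amperes
    current_a = current_ma / 1000
    # Step 3: Ohm's law, R = V / I
    ideal = resistor_v / current_a
    # Step 4: round up to the nearest standard E12 value
    decade = 1
    while True:
        for base in E12:
            standard = base * decade
            if standard >= ideal:
                return ideal, standard
        decade *= 10

ideal, standard = led_resistor(9)   # 9 V supply, red LED, 20 mA
print(f"Ideal: {ideal:.0f} ohms, use: {standard} ohms")
```

For the 9 V example in the text, this reports an ideal value of 350 Ω and picks 390 Ω as the standard value, matching the result worked out above.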