I enjoy messing around with electronics. I have no formal training, but I understand the basic concepts of a volt, an amp, an ohm, etc. and I can distinguish a cathode from an anode. But I can't seem to figure out how best to light some LEDs...
The LEDs themselves are nothing too special: white, yellow, red, green. I understand that they each require different voltages to operate correctly without cratering. I recently bought an LED "driver" that claimed to offer variable voltages between 2.5 and 12 volts. I don't know how that's possible, but whatever: I bought one. It seems to be putting out around 10.5 volts DC.
I bought some 390-ohm resistors and hooked one up to this driver (I had visited a web site that calculated 390 ohms as about the correct resistance for a single yellow 20 mA LED). When I test the voltage on the other side of the resistor, it still reads 10.5 volts. But I figured the resistor was somehow managing this down to a level my LEDs could handle. I then hooked it up to one of my flickering yellow LEDs: it flickered twice and then burned itself out.
No wonder at 10.5 volts, right? But then what purpose was the resistor serving in this setup? I would never hook up 10.5 volts directly to any LED, but when I visit a site that calculates the correct resistor for me, buy that resistor, use it, and my LEDs STILL burn up, I figure I just don't know what I am doing. I'd like to be able to convert my socket voltage (125V AC) down to one I can use for most any LED (2.2-2.4V DC), but I just don't know how to do that. I thought a resistor running off of a 10.5V transformer was the answer, and it clearly was not. Which is why I am here. Someone please tell me what I am doing wrong before I burn up any more LEDs. Thanks!
Answers & Comments
Verified answer
Variable or settable voltages are easy with a voltage regulator circuit, but never mind that. You need to link to this "LED driver" so we can figure out what it is and what it's doing. I suspect it is not behaving like a plain voltage source at all - many LED "drivers" are actually constant-current supplies.
In the meantime, don't use it. Just use an ordinary DC power supply.
About the resistor:
It is not really the purpose of the resistor in an LED circuit to "reduce the voltage" (for that we would use two resistors in a "voltage divider" arrangement) but rather to limit the current. Think of an LED as a dead short that nevertheless somehow manages to have a non-zero "forward voltage drop", Vf in diagrams. For yellow LEDs Vf is about 2.2 volts.
For this particular LED you need to limit the current to 20 mA. (btw, most "20 mA" LEDs have an "absolute maximum" rating of 30 mA, so 20 mA is not absolute.) As you already know, if you just connect the LED across a power supply with no R, far more than 20 mA will flow and the LED will die from overheating. But this is not because the voltage is too high, it's because there is nothing in the circuit to limit the current.
(NOTE: As with most things in electronics this is not *exactly* true. The LED does present a very very small effective series resistance. So does the power supply or battery. But these internal Rs are so small that we can ignore them in circuits like this.)
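To put rough numbers on that note, here is a minimal Python sketch. The ~1 ohm internal resistance is an assumed, illustrative figure (real values vary), not a datasheet number:

# Why "no voltage is safe" without a series R. The 1-ohm internal
# resistance below is an assumed, illustrative number only.
V_SUPPLY = 10.5    # supply voltage, volts
VF_YELLOW = 2.2    # typical yellow LED forward drop, volts
R_INTERNAL = 1.0   # assumed LED + supply internal resistance, ohms

def led_current(series_r_ohms):
    # Model the LED as a fixed Vf drop plus a tiny internal R.
    return (V_SUPPLY - VF_YELLOW) / (R_INTERNAL + series_r_ohms)

print(led_current(0) * 1000)    # ~8300 mA with no series R: instant burnout
print(led_current(390) * 1000)  # ~21 mA with a 390 ohm R: safe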
If you hook your LED to a fancy current-limited power supply, set the supply for 20 mA, and turn it on, it won't matter what you set the output voltage to... except that it has to be more than the LED's Vf, or no current will flow at all.
(NOTE: Don't actually do this. Many PSUs do not limit current quickly enough to protect LEDs without a proper series R.)
So - assume you have a 10.5 volt supply and you want to limit current to 20 mA, or 0.02 A. If all you had to worry about was the 10.5 volts, we could just calculate
10.5 volts
-------------- = 525 ohms
0.02 amps
But you have this additional part in series in the circuit, the LED. It presents an effective resistance of zero (this is why NO voltage is safe to use without some form of current limiting). But it "drops" 2.2 volts. That means that in a circuit with a proper valued R, if you set it up like so:
supply+ -----/\/\/\/\-----|>|------ supply -
and you put voltmeter probes across the LED, you would measure 2.2 volts, *regardless* of what the supply voltage was. The rest of the supply voltage is "dropped" across the R. (This is basic - Kirchhoff's voltage law: the voltages dropped across all of the series elements always add up to the total supply V.)
So to use the R to limit current in the circuit to 20 mA, we have to calculate R based only on the voltage dropped across the R. By Ohm's law, R = E (volts) / I (amps), so
10.5 volts - 2.2 volts
------------------------------- = 415 ohms
0.02 amps
390 is the closest standard value. Current would then be 21.3 mA (do you see why?), which is fine. Since the LED and the R are in series, and current is the same everywhere in a series circuit, that 21.3 mA is exactly what flows through the LED.
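If you'd rather let code do the arithmetic, here is the same calculation as a short Python sketch (same numbers as the worked example above; 390 ohms is the nearest standard E12 value):

# Series resistor sizing, per the worked example above.
V_SUPPLY = 10.5   # supply voltage, volts
VF = 2.2          # LED forward voltage drop, volts
I_TARGET = 0.020  # target LED current, amps

print(V_SUPPLY / I_TARGET)        # 525.0 ohms - the naive answer ignoring Vf

# The resistor only drops what the LED doesn't: V_SUPPLY - VF.
r_ideal = (V_SUPPLY - VF) / I_TARGET
print(r_ideal)                    # 415.0 ohms

r_standard = 390                  # closest standard (E12) value
i_actual = (V_SUPPLY - VF) / r_standard
print(i_actual * 1000)            # ~21.3 mA - the answer to "do you see why?"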
About the voltage measurement:
The reason you couldn't measure any drop across the resistor is that you were likely measuring with a DMM, and the DMM was the only load on the R. Most DMMs present an input impedance (call it resistance) of 10 Mohm, which means effectively no current was flowing. Your circuit with just the R and the DMM looked like this:
supply+ ----/\/\ 390 ohms /\/\/-----/\/\/\ DMM 10 Mohm /\/\/\----- supply-
The voltage drop across each element in a series circuit is proportional to that element's R. Your meter made up practically all of the resistance in the loop, so it took practically all of the voltage drop - of course it read the same as the supply voltage. The R would have to be up in the megohm range before you'd notice much change in the meter reading, the way you had it hooked up.
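Here is that divider with the meter plugged in, assuming the typical 10 Mohm input impedance mentioned above:

# Why the meter read the full supply voltage: the 390 ohm R and the
# meter's 10 Mohm input impedance form a voltage divider.
V_SUPPLY = 10.5   # volts
R_SERIES = 390    # ohms
R_METER = 10e6    # typical DMM input impedance, ohms (assumed)

v_meter = V_SUPPLY * R_METER / (R_SERIES + R_METER)
print(v_meter)    # ~10.4996 V - displays as 10.5 V on the meter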
Let's say that instead you did something like this:
10.5V supply+ ----/\/\ 390 ohms /\/\/--(point A)---/\/\/ 1000 ohms \/\/\----- supply-
and you measured from point A to supply-. What voltage would you see?
Total R in the circuit is obviously 1390 ohms. The 390 ohm R drops 390/1390 of the total V, and the 1000 ohm R drops 1000/1390. So
10.5V x (1000/1390) = 7.6 volts
and that is what you'd measure from point A to ground.
Another way: 10.5V / 1390 ohms = 7.6 mA of current is flowing. The voltage that appears across the 1K resistor is then
E = IR (ohm's law)
E = 0.0076 A x 1000 ohms
E = 7.6 V
It's really the same principles at work. But sometimes you know the current already so it's easier to just multiply by the R.
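Both routes, in a few lines of Python (same numbers as above; the exact values are 7.55 V and 7.55 mA, which round to the 7.6 figures used here):

# The 390/1000 divider measured from point A, computed both ways.
V_SUPPLY = 10.5
R1, R2 = 390, 1000           # ohms; point A sits between R1 and R2

# Way 1: divider ratio
v_a = V_SUPPLY * R2 / (R1 + R2)

# Way 2: series current first, then E = I x R across the 1K
i = V_SUPPLY / (R1 + R2)     # ~0.00755 A (7.55 mA)
v_a_again = i * R2

print(v_a, v_a_again)        # both ~7.55 V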