Simple Resistor Question



So you ran one particular LED at about 2V above its expected Vf "to see what it would do". That's fine; they're your LEDs to do with as you please.

 

Maybe you got lucky and found a fairly resilient one. Around 1980, when I got my first home computer, it had a 1MHz CPU. With a very small amount of rewiring I changed the clock frequency to 2MHz and it still worked, twice as fast. I was lucky: my CPU ran at a higher speed than it was rated for. I swapped in another 1MHz-rated CPU (nominally the same part) and it didn't work at 2MHz.

 

The point is, an individual device may work at a higher rating than it's designed for, but it's not guaranteed - you could have put another LED in and it may not have survived the test.  I have overrun LEDs in the past (not as violently as my previous 'experiment') and they could best be described as 'sort of working' afterwards - they lit, but not nearly as well as before they were over-voltaged. Overrunning your LEDs is quite likely to degrade their performance and shorten their life.

 

The effect is best described here http://led.linear1.org/what-happens-if-i-overdrive-an-led/

 

Remember when you look at an LED datasheet, it will give Vf minimum, Vf typical and Vf maximum at a particular test current (usually the recommended operating current for the device). This is the manufacturer's tolerance for the range of Vfs that an LED of this type might be supplied with: you might match a particular LED to a voltage source, but take another LED and it won't match, or will behave differently under the same overvoltage. Put an appropriate current-limiting resistor in series with a higher-voltage power supply and you can guarantee the LED will work.
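The series-resistor rule described above is simple enough to sketch in a few lines. This is the standard Ohm's-law calculation, not anything specific to one poster's LEDs; the supply voltage, Vf and current in the example are made-up illustrative values.

```python
def led_resistor(v_supply, v_f, i_led):
    """Return the series resistance (ohms) needed to set the LED current.

    v_supply: supply voltage (V)
    v_f:      LED forward voltage at the target current (V)
    i_led:    target LED current (A)
    """
    if v_supply <= v_f:
        raise ValueError("supply voltage must exceed the LED forward voltage")
    # The resistor drops whatever voltage the LED doesn't, at the chosen current.
    return (v_supply - v_f) / i_led

# Example: 12 V supply, red LED with Vf around 2.0 V, 10 mA target current.
r = led_resistor(12.0, 2.0, 0.010)  # 1000 ohms
```

In practice you would round up to the next standard (E-series) resistor value, which slightly lowers the current and errs on the safe side.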

 


23 hours ago, DavidCBroad said:

Exactly. See https://youtu.be/OQqZYi5R4K4 for a video of testing an LED last evening.

 

 

David, I have to admire your tenacity in your anti-resistor crusade, but I have to ask: to what purpose? Most of us will probably power our LEDs from re-purposed wall warts or phone chargers, whose voltage will inevitably not match the Vf of any LED used, so we will have to resort to resistors anyway. The few who use batteries will inevitably ask themselves whether it is worth risking their LEDs to save a few pence in resistors and a few seconds of soldering. So again I say: WHY?

 

Richard


I stumbled upon this datasheet.

 

Looks like these are the kind that might be used in strings for decoration. The interesting thing about them is that they include two zeners that either bypass an "open" LED or shunt excess current around an LED to protect it from over-current.

 

The main thing about LEDs is their intensity (brightness) is really a function of the current, and the easiest and most reliable way to produce a consistent current is to use a resistor from a higher voltage source - as has been stated many times already in this thread. But if you want to muck around with other methods, have at it ;)

 

Incidentally, above 5mA this particular LED shows a dynamic resistance of about 30 ohms (changes in voltage approximately proportional to changes in current).
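That 30 ohm figure is exactly why direct voltage drive is so touchy: above the knee the LED behaves like a voltage source in series with a small dynamic resistance, so a tiny supply-voltage error produces a large current error. A rough sketch, using the 30 ohms quoted above and an assumed 3.0 V knee at the 5 mA reference point (the knee voltage is illustrative, not from the datasheet):

```python
# Small-signal model of an LED above its knee (assumed example values).
V_KNEE = 3.0    # assumed forward voltage at the 5 mA reference point (V)
I_REF = 0.005   # reference current at the knee (A)
R_DYN = 30.0    # dynamic resistance quoted in the thread (ohms)

def led_current(v_applied):
    """Estimate LED current when driven directly from a voltage source."""
    # Each extra volt above the knee adds 1 V / 30 ohms = ~33 mA of current.
    return I_REF + (v_applied - V_KNEE) / R_DYN

# Overshooting the knee by just 0.3 V triples the current: 5 mA -> 15 mA.
i_nominal = led_current(3.0)
i_overdriven = led_current(3.3)
```

A series resistor of a few hundred ohms swamps that 30 ohm slope, which is why the resulting current barely moves with LED-to-LED Vf variation.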

