this post was submitted on 18 Apr 2024
11 points (100.0% liked)

LEDs conduct more current as they get warmer, and differences between individual LEDs mean you cannot easily put them in parallel: a constant-current DC supply will be fine for some of the LEDs but will overload others. To equalize the current, a series resistor is used with each individual LED.
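Rough numbers for what that per-LED resistor costs (a quick sketch; the 12 V rail, ~3 V white LED and 20 mA target are assumed placeholder values, not measurements of any particular strip):

```python
# Quick sketch: cost of a per-LED ballast resistor on a constant-voltage rail.
# 12 V rail, 3.0 V forward voltage and 20 mA target current are assumed examples.
V_SUPPLY = 12.0    # V
V_F      = 3.0     # V, nominal LED forward voltage
I_TARGET = 0.020   # A, desired LED current

r_series   = (V_SUPPLY - V_F) / I_TARGET   # resistor value that sets the current
p_resistor = (V_SUPPLY - V_F) * I_TARGET   # power burned as heat in the resistor
p_led      = V_F * I_TARGET                # power actually delivered to the LED

print(f"R = {r_series:.0f} ohm")
print(f"resistor: {p_resistor * 1000:.0f} mW, LED: {p_led * 1000:.0f} mW "
      f"({p_resistor / (p_resistor + p_led):.0%} of the input wasted)")
```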

Now, those resistors waste a bit of power. Are they really necessary? If you put several LEDs in series the individual differences become negligible at some point and a constant current supply will suffice for several strips of series LEDs in parallel.

How many LEDs would this require? Another possibility would be to have a single resistor in series with a whole string of LEDs.
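One rough way to get a feel for the "how many" question is to look at how the spread of a string's total forward voltage shrinks as the string gets longer. A small Monte Carlo sketch, where the nominal forward voltage and its part-to-part spread are assumed example values:

```python
# Rough illustration: with independent per-LED forward-voltage variation, the
# *relative* spread of a whole string's voltage shrinks roughly as 1/sqrt(N).
import random
import statistics

V_F_NOM   = 3.0   # V, nominal forward voltage (assumed)
V_F_SIGMA = 0.1   # V, per-LED standard deviation (assumed)

for n_series in (1, 3, 10, 30):
    totals = [sum(random.gauss(V_F_NOM, V_F_SIGMA) for _ in range(n_series))
              for _ in range(10_000)]
    rel_spread = statistics.stdev(totals) / statistics.mean(totals)
    print(f"{n_series:2d} LEDs in series: relative Vf spread ~ {rel_spread:.1%}")
```

This only captures static part-to-part spread; thermal effects still matter when several strings share one supply.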

I got some LED strips off AliExpress that run on 12 V, and each individual LED has a resistor in series with it. I believe this to be quite wasteful and that it would be better to have several LEDs in series with a current regulator instead. The LEDs will end up in an autonomous greenhouse where power efficiency is important.

top 4 comments
[–] wirehead@lemmy.world 6 points 7 months ago

You need to control the current going through the LED, either by wasting the excess as heat (a resistor or linear controller) or via a switching power supply.

Non-intelligent LED strips (where the whole strip is one color, as opposed to the intelligent kind where each LED can be a different color) generally don't use a resistor per LED, because a row of LEDs can be put in series with a single resistor. There's usually a marking to designate where you can cut, placed so that each segment has several LEDs per resistor, since each LED drops somewhere in the 2-3 V range depending on color.
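With assumed typical white-LED values, that works out to the familiar three-LEDs-plus-one-resistor segment on a 12 V strip:

```python
# Typical "dumb" 12 V strip segment: a few LEDs in series plus one resistor
# that drops whatever the rail has left. Vf and segment current are assumed.
V_SUPPLY = 12.0   # V
V_F      = 3.0    # V per white LED (assumed; other colors sit lower, around 2 V)
I_SEG    = 0.020  # A per segment (assumed)

n_leds    = 3                               # the usual 12 V segment layout
v_string  = n_leds * V_F                    # ~9 V across the LEDs
r_ballast = (V_SUPPLY - v_string) / I_SEG   # the single resistor drops the rest

print(f"{n_leds} LEDs drop {v_string:.0f} V, ballast R ~ {r_ballast:.0f} ohm, "
      f"resistor's share of the input ~ {(V_SUPPLY - v_string) / V_SUPPLY:.0%}")
```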

Strips are a design compromise built around convenience, of course. But there are plenty of engineering trade-offs here, because a switching power supply burns some energy just running itself as well.

Manufacturers of finished LED products frequently make bright LEDs by putting a series-parallel array of LED chips on a single substrate, with the chips pre-selected to be similar. But if you are building your own strips, you can use a constant-current switching supply to run a series string of LEDs off a relatively high voltage, somewhere in the 24 V to 48 V range; you'd want to select relatively bright individual LEDs so you don't need to build a bunch of constant-current switching supplies.
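A rough sizing sketch for that series-string approach; the supply voltage, forward voltage and driver headroom below are all assumed example numbers:

```python
# Rough sizing of a series string on a constant-current buck driver: the string
# voltage has to stay under the supply minus some driver headroom. All assumed.
V_SUPPLY   = 48.0   # V, assumed supply
V_F        = 3.0    # V per LED, assumed
V_HEADROOM = 2.0    # V reserved for the buck driver, assumed

n_max = int((V_SUPPLY - V_HEADROOM) // V_F)
print(f"up to {n_max} LEDs in series on {V_SUPPLY:.0f} V "
      f"(string ~ {n_max * V_F:.0f} V, headroom ~ {V_SUPPLY - n_max * V_F:.0f} V)")
```

A buck-type constant-current driver needs the supply to stay a bit above the string voltage, hence the headroom term.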

[–] WaterWaiver@aussie.zone 6 points 7 months ago* (last edited 7 months ago) (1 children)

Yes, it's possible to run them without resistors if you put them all in series and use a current-limited power supply. That's how some LED lighting products do it, just not common LED strips.

Common LED strips are designed for convenience over efficiency. You feed them 12 V and you can cut them to any shorter length without worry. You can't do that as easily with series configurations.

> and a constant current supply will suffice for several strips of series LEDs in parallel.

Yes and no. I've seen lots of series-parallel products fail with blown LEDs.

For parallel LEDs to work you need three things:

  1. Very well matched LEDs (see the sketch after this list for why this matters so much).
  2. Shared heatsinking, so one LED getting hot shares some of its heat with its neighbours.
  3. Reasonable driving level. The more power you put into the LEDs the worse it gets.

These three things cost money, so they often get skimped on.
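To put a rough number on point 1: with the usual exponential diode law, even a small forward-voltage mismatch between directly paralleled LEDs turns into a large current imbalance. The ideality factor and mismatch values below are assumed examples, and real parts have some internal series resistance (ignored here) that softens the effect:

```python
# Current hogging between two directly paralleled LEDs, ideal diode law only
# (current proportional to exp(V / (n*Vt))); n and the Vf mismatches are assumed.
import math

N_IDEALITY = 2.0      # assumed diode ideality factor
V_T        = 0.026    # V, thermal voltage near room temperature

for mismatch_mv in (10, 25, 50):
    ratio = math.exp(mismatch_mv / 1000 / (N_IDEALITY * V_T))
    print(f"{mismatch_mv:2d} mV lower Vf -> that LED carries ~{ratio:.1f}x its twin's current")
```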

> The LEDs will end up in an autonomous greenhouse where power efficiency is important.

Removing the resistors from a white 12 V LED strip will (at best, in theory) increase your efficiency by about 25%.
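That figure falls straight out of the voltage budget (assumed typical values):

```python
# Where the ~25% comes from: three ~3 V white LEDs on a 12 V segment leave ~3 V
# across the ballast resistor, which at constant current is a quarter of the power.
V_SUPPLY = 12.0
V_F      = 3.0
N_LEDS   = 3

print(f"resistor's share of input power: {(V_SUPPLY - N_LEDS * V_F) / V_SUPPLY:.0%}")
print(f"efficiency with resistor: {N_LEDS * V_F / V_SUPPLY:.0%}; "
      f"without (ideal constant-current drive): ~100%")
```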

Choosing to use more LEDs and driving them at lower power levels might increase your efficiency even more than this. In 2024 you should be able to get well over 100 lumens per watt, but many LED strips overdrive the LEDs, dramatically lowering their efficiency. LED light output versus power input curves are very nonlinear: you get diminishing returns of light the more power you put in.

> autonomous greenhouse

What are you growing? Sounds suspicious. Please don't do anything illegal.

If your greenhouse is anything larger than a small test, then please install proper fire detection and suppression systems. Don't get people hurt.

[–] Rolive@discuss.tchncs.de 1 points 7 months ago

Thanks. Haha, no, I fully intend to grow vegetables. I don't even like weed. The last time I smoked it was 20 years ago; it gave me a panic attack and I haven't touched it since.

[–] SinAdjetivos@beehaw.org 2 points 7 months ago

More abstractly, what you're doing with the resistor is creating a very crude linear regulator. That's fine for most applications, and if you're careful to keep your source voltage close-ish to the forward voltage of the LED, this method can be fairly efficient.

Using an active constant-current supply (many dedicated LED driver ICs do something very similar) can be marginally better, as it allows you to reduce the waste from the linear regulation.

However, if efficiency is what you really care about, you'll need to go with a switching regulator. Here's an app note going over the basics of that approach, and again you can usually find dedicated ICs for it.

Overall I'd recommend doing a detailed power budget and really seeing whether it's worth the cost/trouble of implementing that, because while you're correct that it's usually more energy efficient, it can be significantly less efficient in terms of labor, material, maintenance and longevity (hence the prevalence of the humble resistor...)
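For what it's worth, here's a minimal sketch of that kind of back-of-envelope power budget, comparing a resistor-ballasted 12 V strip with a series string on a constant-current switching driver; every number (delivered LED power, resistor loss fraction, converter efficiency) is an assumed placeholder, not a measurement:

```python
# Back-of-envelope power budget: resistor-ballasted 12 V strip vs. a series
# string on a constant-current switching driver. All figures are assumed
# placeholders; the shared upstream AC-DC supply is left out of both columns.
P_LED_TOTAL   = 30.0   # W delivered to the LEDs themselves (assumed target)
RESISTOR_LOSS = 0.25   # fraction of input burned in ballast resistors (assumed)
SWITCHER_EFF  = 0.90   # assumed efficiency of a constant-current buck driver

p_in_resistor = P_LED_TOTAL / (1 - RESISTOR_LOSS)   # 12 V strip with resistors
p_in_switcher = P_LED_TOTAL / SWITCHER_EFF          # series string + CC switcher

print(f"resistor strip draws ~{p_in_resistor:.0f} W, "
      f"series string + switcher ~{p_in_switcher:.0f} W, "
      f"saving ~{p_in_resistor - p_in_switcher:.1f} W for the same LED power")
```

Swap in your own measured numbers; the point is just that the comparison is a couple of lines of arithmetic, and it tells you whether the extra driver hardware is worth it for your greenhouse.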