I could make a loose inference about what Lumens are.

In the later part of my childhood – in the 1970s and 1980s – we had incandescent light-bulbs, and we knew that only a small part of the light they emitted fell in the visible part of the spectrum. We often used this type of light-bulb because we had no better alternative. We knew that the visible part of the emitted light might have amounted to 10% or 15% of the consumed energy.

Granted, in Industrial or Commercial Lighting there existed other types of fixtures, such as mercury-gas-discharge tubes, which excited a phosphor with ultraviolet light so that the phosphor was made to fluoresce. Or, in some cases, simply a gas-discharge tube, with a gas-mixture whose composition was unknown to me.

But when I go to buy light-bulbs today, as an adult, like all the other customers I see Compact Fluorescent Light-Bulbs as well as LEDs, the brightness of which is stated in Lumens. What I generally find is that light-bulbs of the fluorescent family, which are meant to be equivalent to the old 100W incandescents, tend to draw approximately 23W, and are stated on the packaging to produce about 1500 Lumens.

Light-bulbs of the LED family with the same equivalence are stated to draw about 16W, and to produce about 1500 Lumens. I have actually found LEDs which are stated to draw about 17W and to produce 1600 Lumens of visible brightness, but which possess a visibly larger base than the other types.

If I could just hazard a guess, I’d say that one way to understand Lumens is to start with the Watts of light in the visible part of the spectrum, and to multiply those by 100. What this would suggest is that the most-efficient LEDs waste about 1W as heat, while the fluorescents still tend to waste a bit more energy – perhaps 8W, some of that in the form of UV light – making those approximately 65% efficient. But this would also mean that the efficiency of modern LEDs is hard to improve upon. If the brightest variety only seems to produce 1W of waste heat, out of 16W or 17W consumed, it would make most sense to infer that in that range of efficiencies, the Wattage can be translated into Lumens quite easily: More Watts will simply produce more light, and fewer Watts will produce less light. In percentages, the LEDs would seem to have an efficiency of about 94%.

If we have a new light-bulb type that draws 4.5W, but that produces visible light amounting to 350 Lumens, it would follow from this thinking that this type is also wasting about 1W out of 4.5W. In percentages, this would imply an efficiency of 78%.
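
Just to make that arithmetic explicit, here is a short Python sketch that recomputes the three examples above, under my assumed conversion-factor of 100 Lumens per visible Watt – which is, again, only a guess:

```python
# A sketch of the arithmetic above, assuming (hypothetically) that
# 1 Watt of visible light corresponds to 100 Lumens.
LUMENS_PER_VISIBLE_WATT = 100.0

def bulb_efficiency(watts_drawn, lumens_stated):
    """Return (efficiency, waste-Watts) under the assumed conversion-factor."""
    visible_watts = lumens_stated / LUMENS_PER_VISIBLE_WATT
    waste_watts = watts_drawn - visible_watts
    return visible_watts / watts_drawn, waste_watts

for name, watts, lumens in [("Fluorescent, 1500 lm", 23.0, 1500.0),
                            ("LED, 1500 lm",         16.0, 1500.0),
                            ("New 4.5W type",         4.5,  350.0)]:
    eff, waste = bulb_efficiency(watts, lumens)
    print(f"{name}: {eff:.0%} efficient, {waste:.1f} W wasted as heat")
```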

I suppose I can offer a comment on the temperatures which the bases of household LED light-bulbs reach…

Actual LEDs are diodes, and semiconductor-based diodes do not simply conduct in one direction while blocking the flow of current in the other. Instead, semiconductor-based diodes need to be forward-biased by some small voltage before they conduct. In most cases, this voltage corresponds to about 1/2V, or to some other fraction of a Volt, depending on what the gap-energy of the semiconductor is.
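
A minimal sketch of that turn-on behavior, using the Shockley diode equation – where the saturation current is only an assumed, illustrative figure:

```python
import math

# Shockley diode equation: the current stays negligible until the
# forward-bias approaches some fraction of a Volt, set by the semiconductor.
I_SAT = 1e-12          # saturation current in Amps (assumed figure)
V_THERMAL = 0.02585    # thermal voltage at room temperature, in Volts

for v_forward in (0.1, 0.3, 0.5, 0.7):
    current = I_SAT * (math.exp(v_forward / V_THERMAL) - 1.0)
    print(f"{v_forward} V -> {current:.3g} A")
```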

What this means for illumination is that the wattage of light obtained must also follow from this number of Volts – the forward-bias – multiplied by the current flowing through the LED. Because 1/2V is an impractical voltage to derive illumination-wattages from directly, there is a simple trick which manufacturers use, which is to connect several actual LEDs in series. But the facts do not change, that D.C. power is still Voltage times Amperage, and that even several diodes in series will typically not oppose the flow of current by more than some small number of Volts, even though forward-biased.
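
To illustrate that Volts-times-Amps arithmetic for a series chain, using my assumed figure of roughly 1/2V per LED:

```python
# D.C. power dissipated in a series chain of LEDs: Voltage times Amperage.
# The per-LED forward-bias of 0.5 V is only my assumed figure.
V_FORWARD_PER_LED = 0.5   # Volts, assumed

def chain_power(n_leds, current_amps):
    return n_leds * V_FORWARD_PER_LED * current_amps

print(chain_power(1, 3.0))    # a single LED at 3 A: only 1.5 W
print(chain_power(10, 3.0))   # ten LEDs in series at 3 A: 15 W
```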

Therefore, typical LED lighting fixtures need to have some sort of power-converter in their base, which converts the 110V A/C we have in Canada, or the 220V A/C they have in Germany, down to a few Volts, while efficiently multiplying the Amperage by the same ratio. That power-converter could be a simple transformer, or it could be something else entirely.
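
In the ideal, lossless case, stepping the Voltage down multiplies the Amperage up by exactly the same ratio – a sketch, with assumed figures:

```python
# Conservation of power across an idealized, lossless converter.
V_IN, V_OUT = 110.0, 5.0   # Volts: Canadian house-current, down to a few Volts
I_OUT = 3.0                # Amps fed to the LED chain (assumed)

I_IN = (V_OUT * I_OUT) / V_IN   # ideal case: input power equals output power
print(f"{I_IN:.3f} A drawn from the wall, for {V_OUT * I_OUT:.0f} W delivered")
```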

And so a typical visualization which people might have, is that a low-voltage situation cannot produce much waste heat. On the contrary: If a transformer has a secondary voltage of only 5V, but is outputting 15W, then it’s also outputting 3A of current. And any arc-welder will remind the reader that a great deal of heat can arise when higher amounts of current flow at low voltages, through practical conductors. The wire of that secondary winding still needs to be thin, just to fit as a winding – yet it must carry those 3A.
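
The heat in such a winding grows with the square of the current, no matter how low the voltage is. A sketch, in which the winding-resistance is a purely made-up figure:

```python
# I^2 * R heating in the secondary winding.  The winding-resistance
# below is hypothetical, chosen only to illustrate the effect.
P_OUT, V_SECONDARY = 15.0, 5.0         # Watts out, Volts on the secondary
R_WINDING = 0.2                        # Ohms (made-up example value)

i_secondary = P_OUT / V_SECONDARY      # 3 A
heat_watts = i_secondary**2 * R_WINDING
print(f"{i_secondary:.1f} A -> {heat_watts:.1f} W of heat in the winding")
```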

Long story short, if a body simply absorbs 1W of heat and has insufficient opportunity to dissipate that heat, its temperature will gradually increase – and only increase. At some temperature, the amount of heat dissipated will have to match the amount generated, in order for this power-converter not to burn out.
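
One way to picture that equilibrium is with a simple thermal-resistance model – where the degrees-per-Watt figure below is just my assumption for a cramped bulb-base:

```python
# Steady-state temperature of the converter: it climbs until the heat
# dissipated matches the heat generated.  R_THERMAL is an assumed figure.
P_WASTE = 1.0        # Watts of waste heat
R_THERMAL = 60.0     # degrees C of temperature-rise per Watt (assumed)
T_AMBIENT = 25.0     # degrees C

t_equilibrium = T_AMBIENT + P_WASTE * R_THERMAL
print(f"Settles near {t_equilibrium:.0f} degrees C")   # 85 C, for these figures
```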

I would suggest, then, that even after wasting only 1W of power, the bases of household LED light-bulbs usually tend to overheat, because they have insufficient features to dissipate their waste-heat. Alternatively, it could be that some manufacturers are producing 1500-Lumen LED bulbs, which therefore consume about 16W, but are only putting into the base physically-smaller power-converters, which were really only capacitated for 800-Lumen light-bulbs. In that case, the smaller power-converters could overheat for that reason.

What happens with electrical technology does not really match what happened with the older, mechanical technology. With Internal Combustion Engines, we used to get better efficiency by running an under-capacitated engine at closer to 100% of its rated performance. With electrical and electronic technology, we actually lose efficiency when doing that, and the only reason for manufacturers to do it would be to reduce the per-unit price.

Now, I would not send an Engineer to Hades, for having made that sort of design-decision.

  • Reducing manufacturing costs may be his only means to stay in business, while competing against other manufacturers.
  • I know of at least one example where the reduced manufacturing cost was passed down to the customer.

 

There’s an important way in which my over-simplification fails to explain the realities of this power-converter: I haven’t defined in what way the circuit stabilizes the amount of current which flows through the chain of LEDs. Given that passing even high currents through a diode results in an almost-constant voltage-loss, merely putting a standard rectifier-bridge (implying 2 conducting diodes) in series with a 10-LED load would result in a dramatic loss of efficiency, since the light-output depends solely on the gap-energy of the LEDs.
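
To put a rough number on that loss, using my assumed figures of about 0.5V per LED, plus a typical silicon-diode drop of about 0.7V:

```python
# Fraction of the low-voltage-side power lost in a rectifier-bridge,
# in which 2 diodes conduct at any instant.  All figures are assumed.
V_PER_LED, N_LEDS = 0.5, 10      # my assumed LED forward-bias, chain length
V_PER_BRIDGE_DIODE = 0.7         # a typical silicon-diode drop

v_leds = V_PER_LED * N_LEDS                # 5.0 V doing useful work
v_bridge = 2 * V_PER_BRIDGE_DIODE          # 1.4 V lost in the bridge
loss = v_bridge / (v_leds + v_bridge)
print(f"{loss:.0%} of the power wasted in the bridge")   # about 22 %
```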

A partial answer to this question could lie in the fact that design-strategies have existed since about 2010, which did not exist in the 1970s.

And I see that the LEDs I can buy today come in ‘Dimmable’ and ‘non-Dimmable’ varieties. So today’s technology has kept some of its secrets from my prying mind.


 

Today, the old-fashioned notion of a power-transformer has often been replaced by a switching power-supply, as is usually the case in PCs and laptop computers. Additionally, in the case of LED light-bulbs, there is no restriction against both D.C. output-leads being ‘Hot’, whereas in a PC or laptop, one D.C. output must be neutral, thereby requiring a voltage-doubling input to produce one ‘Hot’ D.C. lead.

The avoidance of transformers allows for power-converters that are smaller and lighter than old-fashioned power-transformers were.

For LED illumination, I’d guess that switching power-supplies are applicable. Here, the bridge-rectifier would be connected to the incoming A/C house-current, where losing 1 Volt means nothing. From there would follow an LC filter, a switching transistor, and then the heavy inductor, together with its rectifying transistor.
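
If what sits in the base is something like a buck (step-down) switcher, then in the ideal case, the fraction of time the switching transistor stays on is simply the ratio of output-voltage to input-voltage. A sketch, with assumed figures:

```python
import math

# Ideal buck-converter duty-cycle: D = V_out / V_in.  Figures assumed.
V_LINE_RMS = 110.0                     # Canadian A/C house-current
V_DC_IN = V_LINE_RMS * math.sqrt(2)    # roughly the rectified, filtered peak
V_OUT = 5.0                            # a 10-LED chain at my assumed 0.5 V each

duty_cycle = V_OUT / V_DC_IN
print(f"Transistor on {duty_cycle:.1%} of the time")   # about 3.2 %
```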

In switching power-supplies, even though the switching transistor is turned off for a great percentage of the time, the amount of current that flows through the inductor matches the load-current 100% of the time. And so, still, we could face the embarrassment of a higher-wattage load being connected than what the power-supply was designed for.

But a curious consequence could be that the light-bulb-designing company might be able to accomplish this by putting more LEDs in series, instead of putting two or three chains in parallel – thereby assuming that the amount of current they’re being fed stays constant, and resulting in a greater voltage being developed on the low-voltage side. If this were the case, then the switching transistor would need to stay switched on for a greater percentage of the time than it was originally meant to.

Hypothetically, if they did this, they might discover that with 5 LEDs in series, the power-converter produces 3.2A, resulting in the stated 800 Lumens, while with 10 LEDs in series, the same converter only produces 3A, resulting in the stated 1500 Lumens.
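
Those hypothetical figures are at least consistent with my earlier assumptions, as a quick cross-check shows:

```python
# Cross-checking the hypothetical figures against my earlier assumptions:
# about 0.5 V of forward-bias per LED, and 100 Lumens per visible Watt.
V_PER_LED, LUMENS_PER_WATT = 0.5, 100.0

for n_leds, amps in [(5, 3.2), (10, 3.0)]:
    watts = n_leds * V_PER_LED * amps
    print(f"{n_leds} LEDs at {amps} A -> {watts} W -> "
          f"{watts * LUMENS_PER_WATT:.0f} Lumens")
```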

(Edit 09/03/2017 :

And then, in order for the LED light-bulb to be dimmable with a duration-principle dimmer, the design could hypothetically be changed in such a way as to reduce the capacitance in the filter that should, according to good design principles, follow the power-line-side rectifier. Doing so would also result in a poorer power-factor for the light-bulbs in question, according to my assumption of constant output-current over the interval of the A/C line-voltage waveform that the dimmer has switched on. This would seem to imply an input-current waveform which is the inverse of the voltage waveform.
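
Under that constant-power assumption, the input-current would indeed run inverse to the line-voltage – highest where the voltage is lowest. A sketch, with assumed figures:

```python
import math

# Input-current under the assumption of constant power-draw while the
# dimmer's Triac conducts: i(t) = P / v(t).  Figures are assumed.
P_IN = 15.0                    # Watts, assumed constant while conducting
V_PEAK = 110.0 * math.sqrt(2)

for degrees in (15, 45, 90):   # phase-angles within the conducting interval
    v = V_PEAK * math.sin(math.radians(degrees))
    print(f"{degrees:3d} deg: {v:6.1f} V -> {P_IN / v:.3f} A")
```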

Alternatively, the feedback-loop of the power-converter of dimmable LED light-bulbs could be made slightly more complex, so that the output-current is proportional to the instantaneous input-voltage, for the parts of the waveform during which the dimmer is switched on.

Doing the latter would have as its advantage a better power-factor at the input, but as its disadvantage, that the manufacturer would find it harder to predict how much light would actually come from the LEDs, when the dimmer is at its full-brightness setting.

Either way, because the dimmers use a switching component – a Triac – one thing the designers cannot do is present a large capacitance to the dimmer at the input of the light-bulb, the voltage and therefore charge of which would need to go from zero to some non-zero value instantaneously, each time this Triac switches on. )


 

(Edit 09/03/2017 : )

One way in which this type of light-bulb has been described, is as having a blue LED with a phosphor on it that fluoresces with a distinctly yellow light, so that the light from the two sources mixes and produces naturally-colored light.

But in the case of light-bulbs that have been labeled as having a color-temperature of 2700K, I rather suspect that the layer of phosphor has been made so thick, that very little of the almost-monochromatic, blue light of the LED actually penetrates. And the advantage of doing that would be, that the phosphor in question could be a long-persistence phosphor – aka a ‘slow phosphor’ – so that a duration-based dimmer will obtain different brightness-levels from the light-bulb, without irritating the owner with flickering light.

If the manufacturer wants a ‘whiter’ color of light from the light-bulb, he can make the layer of phosphor thinner, so that some of the original blue light from the LED does pass through, and the labeling of the light-bulb as having a color-temperature of 3000K then becomes a work of art.

I have yet to see light-bulbs with that color-temperature, simultaneously labeled as being ‘dimmable’.

Dirk

 
