

Our experiences with LED light bulb replacements


But if I remember what little I do correctly, the total amount of heat generated by any electronic device is a function of its resistance, not the interplay of voltage, wattage and amperage
-Sarkand (May 20, 2014, 03:59 PM)
--- End quote ---

I think you're overthinking it. It just comes down to basic conservation of energy: the amount of energy you put into the device is the same as what comes out of it. The only question is what form the energy is emitted in. But since we're looking at devices that emit roughly the same amount of light, we can discount that portion and just compare the total amount of heat generated.

Technically, the total amount of energy being expended is easy to figure. In the USA, the voltage should be 110V. The current you're drawing, in amps, is the bulb's wattage divided by the volts. We don't know exactly how much of that is being turned into light, but it's a small number and similar across all the devices anyway, so the relative wattage of each device is a decent estimate of the relative amount of heat generated.
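As a quick sketch of that comparison (plain Python; assumes the 110V figure above, and the wattages below are just illustrative examples, not measurements from this thread):

```python
# Rough comparison of current draw and relative heat output at 110V mains.
# Wattages are illustrative examples, not figures from the thread.
VOLTS = 110.0

bulbs = {
    "60W incandescent": 60.0,
    "13W CFL": 13.0,
    "9W LED": 9.0,
}

for name, watts in bulbs.items():
    amps = watts / VOLTS  # I = P / V
    # Treating light output as small and roughly equal across bulbs,
    # wattage is a decent proxy for relative heat generated.
    print(f"{name}: {amps:.3f} A, heat roughly {watts / 60.0:.0%} of a 60W bulb")
```

So by this estimate a 9W LED puts out on the order of 15% of the heat of the 60W incandescent it replaces.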

Last year I replaced my kitchen under-cabinet lights (P11s) with LEDs.
These are the strip-type lights and can be custom made in whatever lengths you require, although Walmart carries them now in varying lengths.
They require 12 volt DC current, and a 110-to-12-volt adaptor (5 amps) was installed to supply it.
I reused the switches from the original P11 fittings to operate each light individually, and am quite happy with the coolness of the fixtures and the lighting spread afforded.

It seems from what I'm reading now that this 60W limit is not so much about electricity limit, but about how much heat the fixture bases can take.  And if that's true, and LED bulbs generate huge heat at the fixture base, I may be in trouble.
-mouser (May 20, 2014, 04:17 PM)
--- End quote ---

This is quite true, but there are caveats.  For instance, my ceiling fixtures say 60W only, no higher.  However, they are embedded ceiling fixtures with a frosted glass cover.  The 60W limit is not only due to heat, but also due to bulb burnout in an enclosed environment.  I can put a 100W incandescent in the fixture, but it will die of heat exhaustion in about a third of the time that a 60W will last.  With the CFL or LED bulbs, an infrared thermometer shows a significant reduction in captured heat.  CFLs show a lower temperature than the LEDs, but as was mentioned earlier, they're a lot more damaging to the environment.  Well, at least we know that particular damage - LEDs have yet to be EPAed  :-\.

I also used the IR gun on several desk lamps.  Heat reduction, compared to an incandescent, has been fifty to seventy-five percent, depending upon the number of LEDs involved in a particular lamp.

The most telling point is that over a two (2) to three (3) year period, I've seen a significant reduction in the electric bill, ~20%-25% in winter and ~30%-40% in summer.  That has made the conversion process more than financially attractive  :Thmbsup:.

Note:  I know more than I ever wanted to learn about this because of a stupid question I asked once in an electronics class  :'(.

I reiterate:  the heat in an electrical system is generated by resistance.  Wattage, voltage and amperage have nothing to do with it, except in how they influence resistance (push more of any of these into a medium of given resistance and more heat will be generated).  You can throw 10,000 watts at 100,000 volts across a superconducting wire or surface and generate little or no heat, since resistance is reduced to near zero in such conditions.  At absolute zero, the resistance of a conducting medium is zero; that is a law of physics - I haven't forgotten that much.  The heat is generated by the excitement of atoms unwilling to give up their electrons in order to propagate a current.  If the current is high enough, the wave is propagated - at the expense of the heat generated when the atoms are forced to give up their electrons, as they have to be raised to a higher energy level in order to do so.  In the trough of the current (wave), the atoms regain their lost electrons from the surrounding free electrons, ready to repeat the cycle at the next crest.  At least that is how I understand it.

My question in fact relates to the mechanism by which the heat (resistance) is still generated at such high levels, even in conditions of lower wattage.  I still like my earlier guess of a step-down or AC/DC transformer.  These things produce heat like crazy - to get rid of the extra electrons, I expect.

the heat in an electrical system is generated by resistance.  Wattage, voltage and amperage have nothing to do with it, except in how they influence resistance
-Sarkand (May 20, 2014, 07:42 PM)
--- End quote ---

No, these are all parts of the same trinity. Ohm's law and the power law tie them together:

V = I × R and P = V × I

We know volts is 110V. And rearranging the equations gives us

P = V² / R

So given a constant voltage of 110, talking about Ohms and Watts is really just two sides of the same coin. You can't say they've got nothing to do with each other.
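To put numbers on that (plain Python; the 60W figure is just the incandescent example already discussed in this thread):

```python
# With voltage held fixed, wattage and resistance determine each other:
# P = V**2 / R, and equivalently R = V**2 / P.
VOLTS = 110.0

def resistance_from_watts(watts: float) -> float:
    """Effective resistance (ohms) of a load drawing `watts` at VOLTS."""
    return VOLTS ** 2 / watts

def watts_from_resistance(ohms: float) -> float:
    """Power (watts) dissipated by a resistance of `ohms` at VOLTS."""
    return VOLTS ** 2 / ohms

r60 = resistance_from_watts(60.0)
print(f"A 60W bulb at 110V acts like {r60:.1f} ohms")
print(f"...and {r60:.1f} ohms at 110V dissipates {watts_from_resistance(r60):.0f} W")
```

Going one way and then the other lands you back on the same wattage, which is the "two sides of the same coin" point: fix the voltage and each resistance corresponds to exactly one wattage.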

For a much better discussion, see the page linked above. From that page (emphasis mine):

Conductive objects are always full of movable electric charges, and the overall motion of these charges is called an 'electric current.' Voltage can cause electric currents because a difference in voltage acts like a difference in pressure which pushes the conductors' own charges along. A conductor offers a certain amount of electrical resistance or "friction," and the friction against the flowing charges heats up the resistive object. The flow-rate of the moving charges is measured in Amperes. The transfer of electrical energy (as well as the rate of heat output) is measured in Watts. The electrical resistance is measured in Ohms. Amperes, Volts, Watts, and Ohms.
--- End quote ---

