junglesmacks wrote:
Ah haaaaa.. it's because you're using such a low source voltage! Sure enough.. according to the LED wizard (http://led.linear1.org/led.wiz) you are indeed pulling 1920mA assuming a forward voltage of 3.2V per LED.

BBadger wrote:
I think you're confusing power consumption with current. The power consumption remains the same if you're targeting the same amount of per-LED current with sufficient forward voltage. Only in power lines does it really make sense to use high voltage instead of large amounts of current for transmission purposes. The strips use minimal voltage because each channel of each LED is independently controlled. You may choose to use LEDs in series rather than in parallel if you're worried about the current carrying ability of the wires you're using, as well as only needing one current-limiting device; however, it may mean that your LED string as a whole stops functioning if one goes out.

I meant current, yes. Typing too fast.
But.. check out what happens if you were able to run an 18V source voltage with 5 LEDs in series.. your current draw would only be 400mA.
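Here's a quick Python sketch of the same math the LED wizard is doing, for anyone who wants to check it by hand. The 3.2V forward voltage and the 18V / 5-in-series case come from the posts above; the 12V supply, the 20mA per-LED current, and the 100-LED total are my own example numbers, so the totals are illustrative rather than exact:

Code:
# Series-string math of the kind the linear1 LED wizard performs.
V_F = 3.2      # forward voltage per LED (from the posts above)
I_LED = 0.020  # assumed 20 mA target current per string

def series_resistor(v_supply, leds_in_series, i_led=I_LED, v_f=V_F):
    """Resistor (ohms) needed to drop the leftover voltage at the target current."""
    v_left = v_supply - v_f * leds_in_series
    if v_left <= 0:
        raise ValueError("supply too low for that many LEDs in series")
    return v_left / i_led

def total_current(total_leds, leds_in_series, i_led=I_LED):
    """Supply current (amps) with the LEDs split into parallel series strings."""
    return (total_leds / leds_in_series) * i_led

# 100 LEDs (example count): one per string on a 12V supply vs five in series on 18V.
print(series_resistor(12.0, 1), total_current(100, 1))  # 440 ohms, 2.0 A
print(series_resistor(18.0, 5), total_current(100, 5))  # 100 ohms, 0.4 A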
I never thought about this dynamic of pulling less current (and wasting less power in the dropping resistors) by using a higher source voltage and running longer strings of LEDs in series.
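To put numbers on BBadger's point: the power the LEDs themselves dissipate depends only on their forward voltage and current, so the higher-voltage, longer-string layout mostly cuts the supply current and the power burned in the dropping resistors. A rough breakdown, using the same assumed 100 LEDs at 20mA as above:

Code:
# Rough power breakdown for the two layouts above (same example assumptions).
V_F, I_LED, TOTAL_LEDS = 3.2, 0.020, 100

def breakdown(v_supply, leds_in_series):
    strings = TOTAL_LEDS / leds_in_series
    i_total = strings * I_LED              # total supply current
    p_supply = v_supply * i_total          # total power drawn from the supply
    p_leds = V_F * I_LED * TOTAL_LEDS      # power in the LEDs (same either way)
    p_resistors = p_supply - p_leds        # power wasted in the dropping resistors
    return i_total, p_supply, p_leds, p_resistors

print(breakdown(12.0, 1))  # (2.0 A, 24.0 W, 6.4 W in LEDs, 17.6 W in resistors)
print(breakdown(18.0, 5))  # (0.4 A,  7.2 W, 6.4 W in LEDs,  0.8 W in resistors)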
The main reason to take current into account in an application like this is the current-carrying capacity of the components/controllers involved. For instance.. the MOSFET board that I'm using has a max of 4A per channel due to the size of its board traces and other factors. That's why I'm personally interested in my own current load.
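Checking a planned layout against that kind of limit is quick to do. In this sketch the 4A per channel figure is the MOSFET board limit mentioned above; the 20mA per string and the string counts are made-up examples:

Code:
# Check example string counts against a per-channel current limit.
CHANNEL_LIMIT = 4.0  # amps per channel, from the MOSFET board mentioned above
I_STRING = 0.020     # assumed 20 mA per series string

for strings_on_channel in (80, 150, 250):  # example layouts
    i = strings_on_channel * I_STRING
    status = "fits" if i <= CHANNEL_LIMIT else "exceeds the 4 A limit"
    print(f"{strings_on_channel} strings -> {i:.1f} A ({status})")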
Another good reason to use a higher source voltage with more LEDs in series, when you have a large array of them, is the overall wiring workload and simplicity of the design. I would much rather wire 400 LEDs as 80 strings of 5 LEDs, each string with a single resistor, than as 400 strings with 400 resistors..
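The parts-count difference is easy to see in the same style (the 400-LED total is from the paragraph above; one current-limiting resistor per string either way):

Code:
# Wiring/parts count for 400 LEDs: strings of 5 vs one LED per string.
TOTAL_LEDS = 400

for leds_per_string in (5, 1):
    strings = TOTAL_LEDS // leds_per_string  # one resistor per string
    print(f"{leds_per_string} LED(s) per string: {strings} strings, {strings} resistors")
# -> 80 strings / 80 resistors vs 400 strings / 400 resistors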
