Hey guys,
I'm designing an LED matrix using PICAXE chips as an alternative to using proprietary LED matrix controllers, such as the LED171596A. The reason for doing this is that I have large groups, or clusters, of LEDs which will always be toggled on/off together, never individually, so writing to that many registers over I2C would simply be wasteful in terms of programming overhead. That, and I'm a maker :-)
The matrix will consist of MOSFETs switching on both the low and high sides, which means I could in theory use any sensible power supply voltage, independent of the control voltage. For current limiting, I was thinking of using the NSI45020AT1G LED driver, which provides a constant 20mA at up to 460mW to drive LEDs. So just to check my thinking here:
If the driver is rated for 460mW, that would mean it can safely drop 23V at 20mA, right? Which means the maximum possible supply voltage would be about 24V, assuming a single 1V LED connected. If I were to connect multiple LEDs in series, the voltage across the driver (and so the power dissipated in it) would fall, and I could in theory connect 22x 1V LEDs to the driver, taking the driver's 1.8V minimum overhead into account.
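To make sure I've got that arithmetic straight, here's the quick back-of-envelope check I did (just plain Python; the 1V-per-LED figure is only the placeholder from my example, and the 1.8V is the driver overhead mentioned above):

```python
# Back-of-envelope check for the NSI45020 headroom (figures from above).
P_MAX = 0.46      # driver's max power dissipation, W
I_REG = 0.020     # regulated current, A
V_OVERHEAD = 1.8  # minimum voltage the driver needs across it to regulate, V
V_LED = 1.0       # assumed forward drop per LED, V (placeholder value)

# Maximum voltage the driver can safely drop at 20mA:
v_drop_max = P_MAX / I_REG
print(f"Max safe drop across driver: {v_drop_max:.1f} V")          # -> 23.0 V

# Worst case for the driver is a single LED in the string, so:
v_supply_max = v_drop_max + V_LED
print(f"Max supply with a single 1V LED: {v_supply_max:.1f} V")    # -> 24.0 V

# At that supply, the longest string the driver can still regulate:
max_leds = int((v_supply_max - V_OVERHEAD) / V_LED)
print(f"Max series 1V LEDs at {v_supply_max:.0f} V: {max_leds}")   # -> 22
```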
So in my particular scenario, where the supply voltage will be a maximum of 16V, would it be safe in theory to string any number of LEDs in series, for example up to 14x 1V LEDs (allowing for the same 1.8V overhead), while also having single LEDs on other matrix nodes?
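For the 16V case specifically, the check I'm relying on is that the worst node for the driver is the one with just a single LED (again a rough Python sketch, with the same assumed 1V LEDs):

```python
# Worst-case driver dissipation at a 16V supply: a node with a single 1V LED.
V_SUPPLY = 16.0   # my maximum supply voltage, V
V_LED = 1.0       # assumed forward drop per LED, V (placeholder value)
I_REG = 0.020     # regulated current, A
P_MAX = 0.46      # driver's power rating, W
V_OVERHEAD = 1.8  # minimum drop the driver needs to regulate, V

p_worst = (V_SUPPLY - V_LED) * I_REG
print(f"Driver dissipation with one LED: {p_worst*1000:.0f} mW")  # -> 300 mW, under 460 mW

# Longest string the driver can still regulate at 16V:
max_leds = int((V_SUPPLY - V_OVERHEAD) / V_LED)
print(f"Max series 1V LEDs at 16 V: {max_leds}")                  # -> 14
```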
I've attached a simplified diagram of what I mean - the left of the circuit represents a node on the matrix with multiple LEDs in series, and the right represents a node with only a single LED connected. Obviously, I'll be driving this in the same way as any other LED matrix: cycling rows/columns and switching the opposite side depending on which nodes are active.
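In case it helps, here's roughly how I picture the scan loop - just Python to illustrate the logic, not the actual PICAXE program, and the frame data and pin handling are placeholders:

```python
import time

# One bitmask of column states per row (1 = that LED cluster on).
# Placeholder pattern; in practice this is whatever I want displayed.
FRAME = [0b101, 0b011, 0b110]
N_COLS = 3

def scan_once(frame, dwell_s=0.002):
    """One multiplex pass: enable one row's high-side MOSFET at a time,
    set the low-side column MOSFETs for that row, dwell, then move on."""
    for row, col_mask in enumerate(frame):
        cols_on = [c for c in range(N_COLS) if col_mask & (1 << c)]
        # On the real hardware these prints become pin writes from the PICAXE
        # to the high-side (row) and low-side (column) MOSFET gates.
        print(f"row {row}: high side ON, columns on: {cols_on}")
        time.sleep(dwell_s)

# Repeating the pass fast enough makes the whole frame appear steadily lit.
for _ in range(3):
    scan_once(FRAME)
```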