Hi group,
Here is the 800% explanation.
This is from the Batteriser Patent Application.
If one looks at the potential return of such a device in terms of lifetime of a battery, one can see significant benefits. For instance, the AA battery in the above example would use roughly the equivalent charge of the battery output in the range of 1.5V to 1.4V. This means that after 0.1V drop, the battery's life is over. If the battery could be used until its voltage reaches 0.8V, then after 0.7V drop the battery's life is over. If one were to assume that the time versus the voltage drop is a linear function, then the life of the battery could be improved by a factor of 7 in this example. However, advantageously the time versus voltage drop is not quite linear. The time it takes for the battery voltage to drop by 0.1V is longer at lower voltages versus at higher voltages. That means that if a constant current was drawn from the battery, it would take the battery a lot longer to discharge from 1.2V to 1.1V than it would from 1.5V to 1.4V. This means that the extent to which the battery life is increased could be even higher than the factor of 7 in the example above.
It is noted that the regulation circuit has a certain efficiency which cuts back on the extent to which the battery life is extended though the life time reduction is rather minimal. During operation, the regulator itself uses a certain amount of current from the battery. A lot of the available DC to DC converters have high efficiencies of around 95%. That is, of power supplied by the battery, 5% is used by the converter and the rest is available for the end user. However, the 5% efficiency loss due to use of a converter, when compared to the 700% gain in efficiency of the battery, is negligible. It is further noted, that the converter efficiency may drop as the battery voltage drops due to use. For example, as the battery voltage drops from 1.5V to 1V, the efficiency of the converter may drop down to 50% to 60%. However, 50% efficiency is still a significant improvement over the current approach of discarding the batteries because their voltage has dropped below the operable voltage range (i.e., 1.4-1.5V).
Since they talk about the battery having a linear discharge curve, we can treat this as a capacitor:
Assume an AA battery holds 2Ahr of charge, which is 2 x 3600 = 7200 coulombs. If the voltage falls 1V over a full discharge, the equivalent capacitance is 7200C / 1V = 7200F.
I=C dv/dt
Assume the powered device has a cut-off voltage of 1.4V, so the battery voltage only has to fall 0.1V before the device stops working.
Assume a 50mA current draw.
Solving for t:

t = C x dv / I = 7200 x 0.1V / 50mA = 14,400 seconds

The device runs for 14,400 seconds (4 hours).
Now I will introduce a lossless boost converter.
A boost converter preserves power, so with a constant load the power drawn from the battery is constant.
The power consumption is 1.5V x 50mA = 75mW
If the starting voltage is V0
And the end point voltage is V1
The energy stored in a capacitor between V0 and V1 is 0.5 x C x (V0² - V1²), so at constant power the discharge time is

t = 0.5 x C x (V0² - V1²) / power consumption
If I plug in the numbers I get
t = 0.5 x 7200 x (1.5² - 0.7²) / 75mW = 84,480 seconds
Even in this idealised case the time multiplier is
84,480 / 14,400 ≈ 5.9x
If the cut-off is 1.3V instead, the battery has 0.2V of headroom without the boost converter, so the baseline time doubles and the multiplier is halved: ≈ 2.9x.
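The boost-converter case can be sanity-checked the same way. This reuses the 7200F equivalent capacitance from the linear model above, with an assumed boost drop-out of 0.7V:

```python
# Ideal (lossless) boost converter: constant power drawn from the battery.
c_farads = 7200.0      # equivalent capacitance from the linear model above
p_load = 1.5 * 0.050   # 75 mW delivered to the load
v0, v1 = 1.5, 0.7      # start voltage and assumed boost drop-out voltage

# Energy between v0 and v1, delivered at constant power, gives the run time:
t_boost = 0.5 * c_farads * (v0**2 - v1**2) / p_load
print(t_boost)         # 84480.0 seconds

# Baseline run times without the boost converter:
t_linear_14 = 7200.0 * 0.1 / 0.050   # 14,400 s with a 1.4 V cut-off
t_linear_13 = 7200.0 * 0.2 / 0.050   # 28,800 s with a 1.3 V cut-off

print(t_boost / t_linear_14)  # ≈ 5.9x
print(t_boost / t_linear_13)  # ≈ 2.9x
```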
It seems like Dr. Bob didn't get the math right in the patent application !!
Adding inefficiencies, or raising the under-voltage lockout (UVLO) of the boost circuit, only makes this worse, not better.
Of course the battery discharge is not a linear function of time; it has a plateau in it. Any product that is (correctly) designed to work until after the plateau will work longer without the Batteriser than with it.
Jay_Diddy_B