Here's a scenario, using times taken from Energizer's E91 datasheet:
I took the example of a device with a 250mA constant current discharge at 21 deg C.
With that cell, such a device would run for 7.5 hours before the cell was, to all intents and purposes, dead, provided the device had a very low cutoff voltage.
So, let's have a look at the amount of runtime you would get out of that cell if the device's low-voltage cutoff were set to various levels:
This is the same data, using the same 250mA constant current scenario.
If you have a device which cuts out at 1.0V, it would cut out after just over 7 hours of operation, leaving roughly another 22 minutes of potential operation in the cell. That assumes everything else is 100% efficient: say you took the cell out of the 1.0V-cutoff device at that point and put it into a new device with the same 250mA constant current draw but a 0.6V cutoff.
If the same 250mA CC device cut out at 1.1V instead, it would stop at just under 6 hours of operation, and by the same cell-swapping trick you could potentially squeeze another 1 hour and 39 minutes of use out of the cell.
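The arithmetic above can be sketched in a few lines. The runtime figures are approximations read off the datasheet curve to match the numbers quoted, not exact values:

```python
# Approximate hours to reach each cutoff voltage, 250 mA constant
# current, read off an E91-style discharge curve (illustrative values).
runtime_to_cutoff_h = {
    1.1: 5.85,   # just under 6 hours
    1.0: 7.13,   # just over 7 hours
    0.6: 7.50,   # cell effectively dead
}

dead_h = runtime_to_cutoff_h[0.6]
for cutoff in (1.1, 1.0, 0.6):
    used_h = runtime_to_cutoff_h[cutoff]
    extra_min = (dead_h - used_h) * 60
    gain_pct = (dead_h - used_h) / used_h * 100
    print(f"{cutoff:.1f} V cutoff: {used_h:.2f} h of use, "
          f"{extra_min:.0f} min left in the cell ({gain_pct:.0f}% more)")
```

Even in the best (1.1V cutoff) case, the theoretical headroom is under 30%, nowhere near the headline claims.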
Now add the boost converter's inefficiencies (and the other inefficiencies) into this scenario, and the gains are not that much.
So there's no way you'd gain an extra 800%, or even 80%. You might gain a few minutes here and there.
Now, suppose you have the Batteriser on the cell from the moment it is inserted, brand new, into the device; see their FAQ:
"Does the battery need to be “dead” before you use the Batteriser? No, you can start using the Batteriser at any time, even a brand new battery, and you will get the benefits of extended battery life."
Then the inefficiencies of running a boost converter add up over the entire 6 hours or so that you would be running the device.
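Here's a crude, energy-based sketch of that whole-runtime penalty. The flat 90% converter efficiency is my assumed round number, not a datasheet or Batteriser figure, and real efficiency varies with load and input voltage:

```python
# If the converter is on the cell from the start, every delivered
# watt-hour costs 1/eff watt-hours from the cell, so the delivered
# runtime shrinks even though the cell is run right down to 0.6 V.
eff = 0.90            # assumed boost-converter efficiency
full_cell_h = 7.50    # hours to run the cell down to 0.6 V (as above)

for cutoff_v, bare_h in ((1.1, 5.85), (1.0, 7.13)):
    boosted_h = full_cell_h * eff   # losses applied over the whole run
    gain_min = (boosted_h - bare_h) * 60
    print(f"{cutoff_v:.1f} V device: {gain_min:+.0f} min vs the bare cell")
```

Note that for a device that already cuts off at 1.0V, the converter's losses can swallow the whole gain and then some, leaving you worse off than the bare cell.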
The curve for a constant power draw device follows the same pattern.
If you have a flashlight that is a simple bulb connected to a battery, which dims after a certain amount of use, this device will give you a constant brightness output. That's great, and it should be the promotional message. But the runtime will be LESS than with the original draw, because you don't get nuthin' from nuthin': drawing constant power from a cell while maintaining a constant output voltage means your current draw rises until either the cell gives up or the device can't handle the current and cuts out. It doesn't matter how efficient or small the boost converter is; it can't beat the laws of physics/thermodynamics/conservation of energy/Ohm's...
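That rising current is easy to illustrate. Both the 85% converter efficiency and the output power here are illustrative assumptions, not datasheet figures:

```python
# Constant-power load: to hold P = V x I at the output, the converter's
# input current must climb as the cell voltage sags.
P_out = 0.375   # W at the load (e.g. 1.5 V x 250 mA), illustrative
eff = 0.85      # assumed converter efficiency

for v_cell in (1.5, 1.2, 1.0, 0.8, 0.6):
    i_cell = P_out / (eff * v_cell)   # current pulled from the cell
    print(f"cell at {v_cell:.1f} V -> {i_cell * 1000:.0f} mA draw")
```

By the time the cell sags to 0.6V it is being asked for well over double the current it supplied when fresh, which is exactly the regime where an alkaline cell's internal resistance makes it give up.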
In fact, in the scenario of a flashlight, because the human eye's response to light is logarithmic, not linear, you might actually get longer useful performance out of the unregulated flashlight in some scenarios, because of the human eye's acceptance of, and adaptability to, a lack of light vs perceived light.
Also, this shows why you can't just divide the range between 1.3V and 0.6V into equal 0.1V steps and treat each step as holding the same share of the cell's energy. How much energy is contained in the 0.7-0.6V step? Zero. Same goes for the 0.8-0.7V step. Snails climbing out of wells notwithstanding.
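To put some illustrative numbers on it: the energy in each 0.1V band is the band's average voltage times the current times the time the cell actually spends in that band. The per-band hours below are hypothetical, merely shaped like the E91's 250mA curve (almost no time is spent below ~0.9V):

```python
# Energy per 0.1 V band = average band voltage x current x hours in band.
I = 0.250  # A, constant-current load
hours_in_band = [  # (high V, low V, hours spent in the band) - hypothetical
    (1.3, 1.2, 2.4), (1.2, 1.1, 3.4), (1.1, 1.0, 1.3),
    (1.0, 0.9, 0.3), (0.9, 0.8, 0.1), (0.8, 0.7, 0.0), (0.7, 0.6, 0.0),
]

for hi, lo, h in hours_in_band:
    e_wh = (hi + lo) / 2 * I * h
    print(f"{hi:.1f}-{lo:.1f} V band: {e_wh:.3f} Wh")
```

The bands are equal in voltage but wildly unequal in energy; the bottom two are empty, which is the whole point.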