There are a few issues you didn't mention.
As you have described, most devices will work down to 1.0-1.1 V/cell.
Consider (using simple figures) a battery depleted to 1.0 V/cell and still working in the device, drawing say 100 mA. Insert the Batterizer and the voltage gets boosted to 1.5 V/cell - but that extra 0.5 V/cell is probably going to be wasted as heat! Meanwhile, even if the boost converter were 100% efficient, you'd be draining the battery at 150 mA @ 1 V to produce 100 mA @ 1.5 V, with the device only needing 1.0 V of that - a major efficiency loss! Add in real-world boost converter efficiency (in such a tiny space, at such a low cost) and the situation is probably much worse: say 200 to 300 mA from the battery (75% to 50% boost efficiency) to deliver 100 mA to the device.
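The arithmetic above follows from power conservation: input power equals output power divided by converter efficiency. A minimal sketch, using the same illustrative figures as the paragraph (1.0 V cell, 1.5 V output, 100 mA load - not measured Batterizer data):

```python
def battery_current(device_current_a, v_out, v_batt, efficiency):
    """Current drawn from the cell by a boost converter at a given efficiency."""
    p_out = device_current_a * v_out  # power delivered to the device (W)
    p_in = p_out / efficiency         # power pulled from the battery (W)
    return p_in / v_batt              # current at the battery's voltage (A)

for eff in (1.00, 0.75, 0.50):
    i_batt = battery_current(0.100, 1.5, 1.0, eff)
    print(f"{eff:.0%} efficient boost -> {i_batt * 1000:.0f} mA from the cell")
# 100% -> 150 mA, 75% -> 200 mA, 50% -> 300 mA
```

Note the load current from the battery rises even in the impossible 100%-efficient case, simply because the cell is at a lower voltage than the output.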
And it gets worse. With that 150-300 mA coming from the battery, you waste more power as heat in the battery's internal resistance (ESR), and due to that and battery chemistry the battery voltage sags further. That raises the input/output current ratio further even at the same boost efficiency - and the boost efficiency itself probably decreases as the input voltage drops, too.
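That sag is a feedback loop: more current drops more voltage across the ESR, which lowers the terminal voltage, which demands still more current for the same input power. A sketch of solving for the settling point (the 0.5 Ω ESR for a depleted cell is an assumption for illustration, not a datasheet figure):

```python
def cell_current_with_esr(p_in, v_oc, esr, iters=50):
    """Solve i = p_in / (v_oc - i*esr) by fixed-point iteration."""
    i = p_in / v_oc                  # first guess: ignore ESR entirely
    for _ in range(iters):
        i = p_in / (v_oc - i * esr)  # terminal voltage sags as current rises
    return i

p_in = 0.200  # 200 mW pulled from the cell (the 75%-efficiency case above)
i = cell_current_with_esr(p_in, v_oc=1.0, esr=0.5)
print(f"cell current: {i * 1000:.0f} mA, terminal voltage: {1.0 - i * 0.5:.2f} V")
```

With these numbers the settled current is noticeably higher than the naive 200 mA, and the terminal voltage is already well below 1.0 V - exactly the spiral described above.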
And - look at the battery discharge curves: as you increase the current drawn, you move to the steeper discharge curves on the left. You show the wasted power as the area to the right of the cutoff-voltage box *at the same current*; instead, consider transposing the steeper curves to that location (to the right of the box).
In the best case, the device itself is using a switching regulator, such that it draws less current as the input voltage rises, so not all of that extra voltage (e.g. boosting the battery's 1.0 V up to 1.5 V at the device) is wasted as heat. But then we have the tiny Batterizer switching regulator feeding the device's switching regulator; even in this best case, you are wasting power rather than saving it.
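Cascading two converters multiplies their efficiencies, which is why even the best case loses. The figures below are assumptions for illustration, not measured numbers for either converter:

```python
boost_eff = 0.80   # hypothetical Batterizer boost stage efficiency
device_eff = 0.85  # hypothetical efficiency of the device's own regulator

# Two converters in series: the losses compound.
combined = boost_eff * device_eff
print(f"combined efficiency: {combined:.0%}")  # 68% - worse than either stage alone
```

Run direct from the battery, only the device's own 85%-efficient stage applies; put the boost stage in front and you can never do better than the product of the two.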
The only niche I can see is an old device which really does need 1.4 V/cell input, where even with all the losses of the Batterizer, you might actually extract enough extra battery juice to see a modest net gain overall. Not 800%, of course.
And - discharging the cells to low voltage may make them more likely to leak. Not as bad a pitfall as the shorting you mention (or mechanically jamming in tight battery compartments), but another downside.
---
One other niche - using 1.2 V rechargeables in a device which needs higher voltage (at modest currents). We all know of devices which haven't worked well with rechargeables, so this is a more common use case. However, if it really discharges the battery to 0.6 V, that's going to damage some rechargeables. If it boosted 1.0-1.3 V up to 1.5 V with high efficiency and cut off at 1.0 V (or 1.05 or 1.1 V) to protect the battery, it might actually have a meaningful use, oddly enough - adapting some devices to use NiMH batteries.
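The "useful" variant described above reduces to a boost stage plus an undervoltage lockout. A minimal sketch; the thresholds are the ones suggested in the text, not from any NiMH datasheet:

```python
CUTOFF_V = 1.0  # stop discharging below this to protect the NiMH cell
TARGET_V = 1.5  # output voltage the device expects

def regulate(v_cell):
    """Return the boosted output voltage, or None once the cell should be cut off."""
    if v_cell < CUTOFF_V:
        return None       # undervoltage lockout: don't deep-discharge the cell
    return TARGET_V      # boost whatever the cell provides up to the target

print(regulate(1.25))  # 1.5 - normal operation on a healthy NiMH cell
print(regulate(0.95))  # None - cell protected instead of drained to 0.6 V
```

The key difference from the Batterizer as marketed is the cutoff: without it, the boost stage happily drags a rechargeable cell into damaging deep discharge.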