It's just an oversimplification of the problem. It's only true if you have an ideal voltage source capable of infinite current with zero voltage drop. In reality, as you start drawing current from the source, its voltage will dip and it will look more and more like a current source (at least until it blows), which breaks the assumptions made in the first post.
I agree, though, that for an ideal voltage source the system efficiency will be 50%. A similar argument could be made for batteries as well. If you want to charge a battery efficiently, you don't drive it from a fixed voltage source through a resistor; you use constant current up to a set point and then constant voltage (CV) afterward. The same goes for a capacitor, at least if energy storage efficiency is your goal.
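A quick numerical sketch of the ideal-source case (my own illustration, with arbitrary R, C, and V values) shows why the 50% figure doesn't depend on the component values: the resistor dissipates the same energy the capacitor stores, no matter what R is.

```python
# Simulate charging a capacitor from an ideal fixed voltage source
# through a resistor, tracking energy dissipated in R vs stored in C.
R = 100.0    # ohms (arbitrary)
C = 1e-3     # farads (arbitrary)
V = 5.0      # source voltage (arbitrary)
dt = 1e-5    # time step, small vs tau = R*C = 0.1 s
steps = int(10 * R * C / dt)  # simulate ~10 time constants

vc = 0.0           # capacitor voltage
e_resistor = 0.0   # energy burned in the resistor

for _ in range(steps):
    i = (V - vc) / R              # instantaneous charging current
    e_resistor += i * i * R * dt  # I^2 * R dissipation this step
    vc += i * dt / C              # dV = I*dt/C

e_capacitor = 0.5 * C * vc * vc
efficiency = e_capacitor / (e_capacitor + e_resistor)
print(round(efficiency, 3))  # ~0.5 regardless of R and C
```

Changing R only changes how fast the charge happens, not the split: the source delivers Q·V while the capacitor ends up holding Q·V/2, so the other half is lost in the resistor whatever its value.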