josecamoessilva seems to think that a drop in the power supply leads to a drop in the output amplitude of an audio amplifier. That just doesn't happen unless the power supply voltage drops below the design limits of the amplifier.
No, I think he's saying that as you increase the current to the speakers the voltage drop in the cables increases (Ohm's law). This somehow 'clips' the sound.
Any half-decent speaker cable will have resistance measured in milliohms so I can't imagine it will have an audible effect, but, hey... I haven't read many books and I think digital sound doesn't have stairsteps so what do I know?
Math: A 200W amplifier produces 40V peaks. That gives about 5 amps into 8 Ohms.
A 3 meter cable with 3 mm² of copper (quite skinny by speaker cable standards) has about 34 milliohms of resistance there and back again. At 5 amps it loses about 0.17 Volts in the cable. That's roughly 0.4% loss at a 200W peak into a skinny cable.
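If anyone wants to check the arithmetic, here's a rough Python sketch. The copper resistivity constant and the cable geometry are my own assumptions; plug in your own cable and power numbers.

```python
# Back-of-envelope cable-loss check; resistivity and geometry are assumptions.
RHO_CU = 1.68e-8      # resistivity of copper at room temp, ohm*metres

length_m = 3.0        # one-way cable length (metres)
area_m2 = 3.0e-6      # 3 mm^2 conductor cross-section
r_load = 8.0          # nominal speaker impedance, ohms
v_peak = 40.0         # 40 V peak ~ 200 W peak into 8 ohms

r_cable = RHO_CU * (2 * length_m) / area_m2   # there and back again
i_peak = v_peak / (r_load + r_cable)          # peak current
v_drop = i_peak * r_cable                     # volts lost in the cable

print(f"cable resistance: {r_cable * 1000:.1f} milliohms")
print(f"volts lost in cable: {v_drop:.2f} V "
      f"({100 * v_drop / v_peak:.2f}% of the 40 V peak)")
```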
Conclusion: This diagram is massively exaggerated, methinks.
(Yes, I know it's not that simple because speakers are inductive and have to physically move the cone, etc., but those numbers approach the truth as the speaker cone reaches the end of its travel.)
Saying the speaker cone won't quite get there because of voltage loss in the cables doesn't pass the sniff test. Many other problems will be orders of magnitude bigger (rough comparison in the sketch after this list), e.g.:
* Mechanical resistance to movement in the speaker cone (they're hard to push in/out).
* The inertia of the speaker cone making it go past the desired position when it gets there.
* Inductance in the wires/coils.
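For a sense of scale, here's a quick sketch comparing that cable resistance to what's already in series inside the driver itself. The voice-coil resistance and inductance below are assumed "typical 8 ohm woofer" ballpark figures, not measurements of any particular speaker.

```python
import math

# Rough scale comparison: the cable vs. what's already inside the driver.
# Voice-coil numbers are assumed "typical 8 ohm woofer" values.
r_cable = 0.034   # ohms, the 6 m round trip of 3 mm^2 copper from above
r_coil = 6.0      # ohms, typical voice-coil DC resistance of an 8 ohm driver
l_coil = 0.5e-3   # henries, typical voice-coil inductance

for freq in (100, 1_000, 10_000):            # Hz
    x_coil = 2 * math.pi * freq * l_coil     # inductive reactance of the coil
    print(f"{freq:>6} Hz: coil reactance {x_coil:5.2f} ohm, "
          f"coil resistance {r_coil:.1f} ohm, "
          f"cable {r_cable * 1000:.0f} milliohm")
```

With those assumed numbers, even at 100 Hz the coil's own reactance is roughly ten times the cable resistance, and at 10 kHz it's closer to a thousand times.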
Note: Some manufacturers are now adding DSPs and mathematical speaker models to their amplifiers to correct for things like speaker cone inertia.
Welcome to a brave new digital world, analog fanboys.