No, they implement current limiting, and once in current-limit mode they will not allow the current to rise above that threshold.
As long as the supply does not allow the current to rise above this threshold, and the threshold is accurate, what you get is a constant current. You just need to make sure that the supply's voltage set point is high enough that it does not switch back to CV mode.
In the same way you could argue that, when the supply is in CV mode, it limits its output voltage and makes sure that it does not exceed the set point.
Here is the basic (incomplete) circuit of an old-school linear bench PSU:
There is one opamp for CV and another one for CC; that makes two control loops. The two diodes decide which mode is active, depending on load conditions and set points; as mentioned, the loop that restricts the output more wins.
But what I want to show with that drawing is that both the CV and CC loops are constructed identically: each regulates the power element to reach its set point. This is true CV, and true CC as well.
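As a rough illustration of the diode-OR idea (a simplified steady-state model I'm adding here, not the actual loop dynamics of the schematic): for a purely resistive load, the winning loop is simply whichever would drive the output to the lower voltage.

```python
# Simplified steady-state model of a dual-loop (CV/CC) bench supply
# driving a resistive load. Set points and load values are illustrative.

def psu_output(v_set, i_set, r_load):
    """Return (mode, v_out, i_out) for a resistive load r_load.

    The diode-OR means the loop demanding the LOWER output voltage wins:
    the CC loop wants i_set * r_load, the CV loop wants v_set.
    """
    v_cc = i_set * r_load  # voltage the CC loop would need to hold i_set
    if v_cc < v_set:
        return ("CC", v_cc, i_set)            # CC loop wins, current clamped
    return ("CV", v_set, v_set / r_load)      # CV loop wins, voltage clamped

# Set points: 30 V, 100 mA
print(psu_output(30.0, 0.1, 100.0))    # low resistance  -> ('CC', 10.0, 0.1)
print(psu_output(30.0, 0.1, 1000.0))   # high resistance -> ('CV', 30.0, 0.03)
```

The same min-selection picture also explains the switchover: as the load resistance rises past `v_set / i_set`, the supply slides from CC into CV mode.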
The biggest problem with CC mode on a bench power supply is the relatively large output capacitance: even in CC mode, the supply can still deliver very large transient currents from that capacitor. A true CC supply doesn't have that output capacitor. The Keithley 6220, for example, has an output capacitance in the pF range; your average bench PSU is probably in the order of 100 µF.
Agreed, I also pointed this out in a previous post, and it cannot be stressed enough. For example: you want to use your bench supply to test a low-power LED. You adjust the current set point to the LED's rating of, let's say, 10 mA. And because you think it is better to have enough voltage, you set the voltage set point to maximum, let's say 30 V. You connect the LED, and zap, it is dead. The fully charged output capacitor has killed it. Therefore: to test an LED, or any other sensitive device, start with a 0 V set point and increase slowly. Or use a dedicated low-capacitance current source, if you have one. (I don't.)
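To put numbers on it, here is the charge and energy sitting in the output capacitor in that scenario (30 V and 10 mA are from the example above; 100 µF is my assumption for a typical bench supply):

```python
# Charge and energy stored in a typical bench PSU output capacitor,
# compared against a small LED's rating. 100 uF / 30 V / 10 mA are the
# example figures from the discussion, not measured values.

C = 100e-6     # output capacitance in farads (assumed typical)
V = 30.0       # voltage set point in volts
I_led = 10e-3  # LED current rating in amps

q = C * V             # stored charge: Q = C*V = 3 mC
e = 0.5 * C * V**2    # stored energy: E = C*V^2/2 = 45 mJ

# At the LED's rated current, delivering that charge would take q / I_led
# seconds -- but the capacitor dumps it far faster, limited only by wiring
# resistance and the LED's own (failing) junction.
print(f"charge = {q*1e3:.1f} mC, energy = {e*1e3:.0f} mJ")
print(f"equivalent to {q / I_led:.1f} s of the LED's rated current")
```

Three millicoulombs is 0.3 seconds' worth of the LED's rated current, delivered in an instant; the CC loop never gets a chance to act.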