I'm going to design a linear voltage regulator for a project I'm working on. The loads in the system contain 40V MOSFETs, and this is a "nominal" 24V automotive system, so basically I have to deal with surges of 40V.
Right, the motors (brushless, with built-in drive electronics) have MOSFETs with a maximum Vds of 40V. The manufacturer would really like to see them run on 20V (50% derating); they don't mind 24V, but I'm happy with 30V. Realistically, 27V is quite likely in a vehicle.
I don't see the need for a regulator or any surge suppression.
The MOSFETs are rated to 40V, so they will be able to withstand 40V spikes, possibly even a little more, for very short periods of time. I agree that derating is desirable for reliability reasons and that it's bad practice to continuously subject components to their absolute maximum ratings, but this is a stress test rather than normal operation.
I think adding more components, such as a low-dropout regulator, would decrease the reliability of the system as a whole, since there are more components to fail. Low-dropout regulators can also be a pain to design and can sometimes oscillate under certain load conditions.
But there can be surges of 40V lasting 500ms every 5s, from a source impedance of 0.5 ohms. That will break the motors, and it's not the sort of thing a TVS or other crowbar device can handle without absorbing far too much power.
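Rough numbers on that, assuming the clamp would have to hold the rail at about 30V (the most I'd want the motors to see):

$$I_{clamp} \approx \frac{40\,\mathrm{V} - 30\,\mathrm{V}}{0.5\,\Omega} = 20\,\mathrm{A}, \qquad P_{peak} \approx 30\,\mathrm{V} \times 20\,\mathrm{A} = 600\,\mathrm{W}$$

$$P_{avg} \approx 600\,\mathrm{W} \times \frac{0.5\,\mathrm{s}}{5\,\mathrm{s}} = 60\,\mathrm{W}$$

That's roughly 300J per surge and 60W continuous average dissipation, which is far beyond what an ordinary TVS is designed to absorb.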
I don't see how that can break the motors, which will be able to withstand much higher voltages for short periods without damage.
As far as clamping is concerned, it's doable with a zener diode and a big enough transistor. I've done a similar thing before to absorb the excess voltage generated by a motor braking. I used a Darlington pair and a 30V zener diode. I doubt it's worth it in this case, though.
![](https://www.eevblog.com/forum/projects/linear-voltage-regulator-design/?action=dlattach;attach=370995;image)
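For what it's worth, in the usual arrangement (zener from the rail into the Darlington's base, with the Darlington shunting the rail; I'm going from memory here) the clamp starts conducting at roughly the zener voltage plus two base-emitter drops:

$$V_{clamp} \approx V_Z + 2\,V_{BE} \approx 30\,\mathrm{V} + 2 \times 0.65\,\mathrm{V} \approx 31.3\,\mathrm{V}$$

So pick the zener a volt or two below the level you actually want the rail held at.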