How much sample rate is sufficient depends on the rise/fall time of the signal. As a rough conversion, apply the 0.35 rule (bandwidth in MHz ≈ 0.35 / rise time in microseconds) to estimate the bandwidth, then base the sample rate on that. For example, a 100 nanosecond rise time corresponds to about 3.5 MHz, and the sample rate should be several times higher than that.
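The 0.35-rule arithmetic above can be sketched as a couple of helper functions. The 4x multiplier for the suggested sample rate is my own assumption standing in for "several times higher"; the post does not specify an exact factor.

```python
def required_bandwidth_hz(rise_time_s: float) -> float:
    """Approximate analog bandwidth needed for a given rise/fall time,
    using the 0.35 rule: BW (Hz) ~ 0.35 / rise time (s)."""
    return 0.35 / rise_time_s

def suggested_sample_rate_hz(rise_time_s: float, factor: float = 4.0) -> float:
    """Sample rate several times the 0.35-rule bandwidth.
    The default factor of 4 is an assumption, not from the post."""
    return factor * required_bandwidth_hz(rise_time_s)

# Example from the post: 100 ns rise time -> 3.5 MHz bandwidth
bw = required_bandwidth_hz(100e-9)
print(f"{bw / 1e6:.1f} MHz")  # 3.5 MHz
```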
I am used to DSOs that support equivalent-time sampling, where the effective sample rate ends up being orders of magnitude higher than strictly necessary, so I almost never need to consider it. I have never really considered a sample rate of 2.5 times the bandwidth to be sufficient.
But I believe most scopes, including LeCroy scopes, do some sort of peak detect when they decimate data for display?
I think LeCroy's older DSOs were descendants of the transient digitizers they made for physics experiments, including nuclear weapons testing, so they had large sample memories and relied on post-processing rather than on processing during decimation, which is where peak detection is normally done. So for a long time, LeCroy eschewed peak detection.
I think the first "modern" DSO with peak detection was the Tektronix 2230, first available in 1986. It had a 100 MHz bandwidth and 20 MS/s sampling, supporting peak detection down to 100 nanoseconds. It was quickly replaced by the 2232, with a 100 MS/s sample rate and peak detection down to 10 nanoseconds. Both supported equivalent-time sampling at 2 GS/s.