"Digital debugging" consists of primarily three activities. Firstly making sure that the
analogue signal (because all signals above femtoamp level are analogue signals) will be correctly interpreted by the receiver as a digital signal - i.e. ensuring "signal integrity". Secondly, once you have valid digital signals, you often need to see the precise relationship between them, especially ensuring setup and hold times are correct. Finally, sometimes you need interpret the digital signals, e.g. to determine a count or a FSM state or the contents of a serial message.
Signal integrity requires bandwidth above all else, coupled with good probing technique. 100MHz is barely adequate for modern logic.
Examining timing relationships also requires bandwidth, plus at least two input channels (two is often sufficient).
Interpretation is frequently best done with digital tools, not oscilloscopes: logic analysers for wide buses, a "Bus Pirate" for many common protocols, and even printf() statements.
The single area where digital storage scopes have a USP is for slow and single-shot events.
My preference is: a working old scope with >250MHz bandwidth, plus a Digilent Analog Discovery for slow and single-shot events (it also has a waveform generator, logic generator, and logic analyser), plus printf() statements. To that must be added imagination, understanding, and a disciplined approach to design and debugging.