Oh well, I hope I didn't confuse the newbies too much with my testing of the trigger output characteristics of the Rigol scopes...
Fungus is spot-on: in everyday use, this trigger-out jitter is completely irrelevant. It only comes into play if you want to synchronize some other piece of test equipment to the oscilloscope's trigger event (an external logic analyzer, for example).
To understand why there is trigger-out jitter on oscilloscopes with an all-digital trigger, let me try a simplified explanation, focusing only on the "traditional" edge trigger. In a classic analog oscilloscope, the trigger circuitry consists of a comparator which is fed the (amplified/attenuated) input signal at its non-inverting input, while the inverting input is connected to an adjustable DC voltage that defines the trigger level. Let's assume the timebase is in "hold" mode, i.e. waiting for the trigger signal. Whenever the input signal exceeds the preset trigger level, the comparator's output changes state from logical 0 to logical 1, which starts the timebase and initiates the sweep. That covers "positive" trigger edges; if a negative edge is selected, the comparator's output is simply inverted.
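To make the comparator-plus-edge idea concrete, here is a minimal sketch in Python (a software model for illustration only, not any scope's actual circuitry): a comparator decides level-above/below, and the trigger fires on the selected change of its output state.

```python
# Toy model of an analog-style edge trigger: a comparator followed by
# edge detection on the comparator's output. Function names are made up
# for this example.

def comparator(sample, level):
    """Comparator: 1 if the input exceeds the trigger level, else 0."""
    return 1 if sample > level else 0

def find_trigger(samples, level, slope="rising"):
    """Return the index of the first sample where the comparator output
    changes state in the selected direction, or None if it never does."""
    prev = comparator(samples[0], level)
    for i, s in enumerate(samples[1:], start=1):
        cur = comparator(s, level)
        if slope == "rising" and prev == 0 and cur == 1:
            return i
        if slope == "falling" and prev == 1 and cur == 0:
            return i
        prev = cur
    return None

# A ramp crossing the 0.5 trigger level between index 4 and 5:
print(find_trigger([0.0, 0.1, 0.2, 0.3, 0.4, 0.6, 0.8], level=0.5))  # -> 5
```

In the analog scope this "index" is of course a point in continuous time, and the comparator output directly starts the sweep; the inversion for negative slopes corresponds to the `slope="falling"` branch.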
Fortunately, the situation in the digital world is more comfortable: we don't only see the signal "after" the trigger event (which was possible on analog scopes to some degree as well, thanks to analog delay lines inserted before the vertical signal reached the deflection plates of the cathode ray tube), but can have the trigger event right in the center of the screen. This is arranged by continuously sampling the input signal into memory and only "looking" at the memory contents around the time the trigger event is recognized by a digital magnitude comparator (the digital counterpart of the analog comparator).
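The "sample all the time, look at memory when triggered" scheme is essentially a ring buffer. Here is a minimal sketch, assuming a tiny 16-sample acquisition memory and a simple rising-edge condition (all names and sizes invented for the example):

```python
# Pre-trigger capture with a circular buffer: samples are written
# continuously, so when the trigger fires the buffer already holds the
# pre-trigger history; a few more samples then place the trigger event
# in the middle of the record.

from collections import deque

DEPTH = 16  # tiny acquisition memory, just for the example

def acquire(stream, level):
    """Fill a ring buffer continuously; after a rising-edge trigger,
    take DEPTH//2 - 1 more samples so the trigger sample ends up at
    the center (index DEPTH//2) of the returned record."""
    buf = deque(maxlen=DEPTH)        # circular acquisition memory
    prev = None
    post = None                      # post-trigger samples still to take
    for sample in stream:
        buf.append(sample)
        if post is None:
            # Arm only once the buffer is full of pre-trigger history:
            if prev is not None and len(buf) == DEPTH and prev <= level < sample:
                post = DEPTH // 2 - 1
            prev = sample
        else:
            post -= 1
            if post == 0:
                return list(buf)     # trigger sample sits at index DEPTH // 2
    return None

# A step from 0.0 to 1.0 lands in the center of the captured record:
record = acquire(iter([0.0] * 20 + [1.0] * 20), level=0.5)
print(record[7], record[8])  # -> 0.0 1.0
```

A real scope does the same thing with megapoints of memory and an adjustable pre-trigger/post-trigger split, but the principle is identical.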
Now the problem is the following: the sampling engine (ADC, buffer addressing and transfer logic) is screaming along at, let's say, 1 GHz (1 Gsample per second), while the remaining logic cannot run at this speed. Let's assume the FPGA (the component the trigger logic is implemented in) runs at 125 MHz, i.e. 1/8 of the sampling frequency. In order to keep up with the sampling engine, the trigger logic has to be parallelized: there are eight magnitude comparators which look at eight samples of the input signal at a time. If the circuitry identifies a trigger event in one of the samples, it "knows" how to map the recorded samples onto the screen, since it can identify which one of the eight magnitude comparators provided the relevant information. The trigger-out line, however, is only switched after all eight "bins" have been analyzed, since the FPGA core can only react synchronously to its main 125 MHz clock. This means the "real" trigger event has happened somewhere within that interval, which is 8 ns for the figures assumed in this example, and that matches the jitter of the trigger-out measurements found on the DS/MSO1000Z and DS/MSO2000 oscilloscopes.
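The mismatch between "exact sample index known internally" and "trigger-out only updated per FPGA cycle" can be sketched like this (a toy model using the numbers from the example above, not Rigol's actual internals):

```python
# Samples arrive at 1 GSa/s, but the trigger logic only evaluates them
# once per 125 MHz FPGA cycle, i.e. in blocks of 8. The exact trigger
# sample index is known (so the waveform lands precisely on screen), but
# the trigger-out line can only change on a block boundary, giving up to
# 8 ns of variable delay between the two.

SAMPLES_PER_CYCLE = 8    # 1 GHz sampling / 125 MHz FPGA clock
T_SAMPLE_NS = 1.0        # one sample = 1 ns at 1 GSa/s

def trigger_times(samples, level):
    """Return (exact trigger time, trigger-out time) in ns, or None."""
    last_full = len(samples) - SAMPLES_PER_CYCLE + 1
    for block_start in range(0, last_full, SAMPLES_PER_CYCLE):
        block = samples[block_start:block_start + SAMPLES_PER_CYCLE]
        # Eight parallel magnitude comparators inspect the block at once:
        hits = [i for i, s in enumerate(block) if s > level]
        if hits:
            exact = (block_start + hits[0]) * T_SAMPLE_NS
            # Trigger-out only toggles once the whole block is analyzed:
            trig_out = (block_start + SAMPLES_PER_CYCLE) * T_SAMPLE_NS
            return exact, trig_out
    return None

# A step at sample 3: the scope knows t = 3 ns exactly, but the
# trigger-out edge lands on the 8 ns block boundary.
print(trigger_times([0] * 3 + [1] * 13, level=0.5))  # -> (3.0, 8.0)
```

Run it with the step at different sample positions and the gap between the two times wanders between 1 ns and 8 ns, which is exactly the trigger-out jitter seen externally.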
The jitter of the displayed waveform on the screen will be much smaller and should be in the ballpark of plus or minus half a sampling period, i.e. ±500 ps for the DS1000Z with one channel active, provided no other waveform approximation is done before the trigger is processed.
The sampling-clock PLL-related jitter that was a problem with the DS1000Z and DS2000 series several years ago (a design flaw which was corrected by a firmware update a long time ago) manifested itself as waveform jitter on the screen when the viewed portion of the waveform was displayed a considerable delay after the trigger event, i.e. after many periods of the oscilloscope's internal sampling clock had passed. It looks just the same as if the measured signal itself had phase jitter, and that is exactly the problem in this case: one cannot be sure whether the fault lies in the measured signal or in a defective oscilloscope. Only a known accurate, low-phase-noise source makes it possible to evaluate the accuracy of the oscilloscope's internal clock(s). But once again, this problem has been solved and, unless some firmware update "breaks" the fix again, is an issue of the past.
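Why does that kind of jitter grow with the delay after the trigger? Per-period timing errors of the sampling clock accumulate, so an edge many clock periods away from the trigger point wanders more than one right at the trigger. A small illustration with entirely hypothetical numbers:

```python
# Accumulated sampling-clock phase jitter: the apparent position of an
# edge N clock periods after the trigger is the sum of N independent
# per-period timing errors, so its spread grows roughly with sqrt(N).
# The 5 ps per-period figure is invented for this illustration.

import random
import statistics

def edge_timing_spread(periods_after_trigger, period_jitter_ps, runs=2000):
    """Standard deviation (ps) of the apparent position of an edge that
    occurs a given number of clock periods after the trigger event."""
    random.seed(0)  # deterministic for the example
    positions = []
    for _ in range(runs):
        # Sum of independent per-period errors = accumulated phase error:
        err = sum(random.gauss(0.0, period_jitter_ps)
                  for _ in range(periods_after_trigger))
        positions.append(err)
    return statistics.pstdev(positions)

# The spread grows roughly with the square root of the delay:
print(edge_timing_spread(1, 5.0))    # ~5 ps
print(edge_timing_spread(100, 5.0))  # ~50 ps
```

This is why the old PLL issue was invisible near the trigger point and only showed up at long delay settings, and also why a clean low-phase-noise reference signal is needed to pin the blame on the scope rather than the source.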
I hope this helps to clarify the situation a little.
Cheers,
Thomas