The Keysight 3000T doesn't always use its whole memory.
From the fastest timebase (500 ps/div) up to 20 us/div, it samples in such a way that if you capture at 500 ps/div and stop, you can "zoom out" all the way to 20 us/div.
So if you are running anywhere from 500 ps/div to 20 us/div and stop it from RUN, you get 1 MPoints of data after the trigger (minus the pre-trigger time).
In SINGLE mode you can expand out to 50 us/div, which gives 2.5 MPoints of data.
If you are running a timebase slower than 20 us/div in RUN (or 50 us/div in SINGLE), you get only the data you see on the screen, plus roughly 150 us before and after, so 650 us altogether. At 2.5 GS/s that gives 1.625 MPoints. In SINGLE mode it's all the same, except at 5 GS/s, which gives 3.25 MPoints.
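As a sanity check, the point counts above are just capture duration times sample rate. A quick sketch of that arithmetic, using the figures from this post (my own back-of-envelope math, not anything from Keysight documentation):

```python
def points(duration_ns, rate_gsps):
    """Captured points = capture duration (ns) x sample rate (GS/s)."""
    return duration_ns * rate_gsps

# SINGLE at 50 us/div x 10 div = 500 us window, at 5 GS/s:
assert points(500_000, 5) == 2_500_000        # 2.5 MPoints

# Slower timebase, RUN: 650 us window at 2.5 GS/s:
assert points(650_000, 2.5) == 1_625_000      # 1.625 MPoints

# Slower timebase, SINGLE: same 650 us window at 5 GS/s:
assert points(650_000, 5) == 3_250_000        # 3.25 MPoints
```

The numbers in the post are self-consistent once you work them through this way.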
And it changes as you go through timebases.
So I would say that Keysight's strategy is to maximize the memory used when you go to STOP or SINGLE mode. But the memory length is neither fixed nor at maximum all the time.
And in RUN mode it is not running the full memory but some sort of circular buffer, using only one frame's worth of data, to maximize the waveforms-per-second rate.
When you stop it, it reassembles all the buffers into one full capture.
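If I had to guess at the mechanism, it's something like a ring of fixed-size frames that gets stitched together on STOP. A toy sketch of that idea (pure speculation about the internals; the FrameRing class is mine, not Keysight's):

```python
from collections import deque

class FrameRing:
    """Toy model: RUN keeps one frame per trigger in a ring buffer;
    STOP stitches the retained frames into one full record."""
    def __init__(self, n_frames):
        self.frames = deque(maxlen=n_frames)  # old frames fall off the back

    def trigger(self, samples):
        self.frames.append(samples)           # RUN: one frame per trigger

    def stop(self):
        # STOP: reassemble every retained frame into one long capture
        return [s for frame in self.frames for s in frame]

ring = FrameRing(n_frames=4)
for i in range(6):                 # 6 triggers; the ring keeps the last 4
    ring.trigger([i] * 3)
record = ring.stop()               # frames from triggers 2..5, stitched
```

That would explain both the high waveform update rate in RUN and the sudden appearance of a long zoomable record at STOP.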
It is actually quite clever, but not very simple to understand.
The other strategy is the one used by LeCroy, Picoscope and some others: you can set a MAX sample size, but the scope fetches exactly and only the length of data the timebase defines. So for instance 50 us/div, at 10 horizontal divisions, will result in 500 us of data being captured. Not a point more or less. It will also sample at the fastest rate it can, maximizing the number of sample points up to the set maximum. Meaning if you set 10 MPoints MAX, at 1 GS/s, you will get 1 MPoints in a capture that is 1 ms long. In a capture 1 us long (100 ns/div, 10 div) you will get only 1000 points, because the acquisition is ruled by capturing the exact time interval. The max sample size only comes into effect when the timebase is slow enough that, at the highest sample rate, the scope would want to acquire, say, 2 GPoints of data. So you decide how much is too much, and then it starts slowing down the sample rate.
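That fixed-time-window rule is easy to express: the captured time span is fixed by the timebase, and the sample rate only drops once the window at full rate would exceed the point limit. A sketch with made-up helper names (an illustration of the rule as described above, not any vendor's actual API):

```python
def acquisition(ns_per_div, divisions, max_rate_gsps, max_points):
    """Return (points, effective rate in GS/s) under the fixed-window rule:
    the captured time span never changes; only the sample rate gives way."""
    window_ns = ns_per_div * divisions            # exact capture length
    full_rate_points = window_ns * max_rate_gsps
    if full_rate_points <= max_points:
        return full_rate_points, max_rate_gsps    # rate stays at maximum
    # point limit reached: keep the window, lower the effective rate
    return max_points, max_points / window_ns

# 100 ns/div, 10 div at 1 GS/s with a 10 MPoint limit -> only 1000 points:
assert acquisition(100, 10, 1, 10_000_000) == (1000, 1)
# 100 us/div (1 ms window) at 1 GS/s -> 1 MPoints, limit not yet reached:
assert acquisition(100_000, 10, 1, 10_000_000) == (1_000_000, 1)
# 10 ms/div (100 ms window): limit kicks in, rate drops to 0.1 GS/s:
assert acquisition(10_000_000, 10, 1, 10_000_000) == (10_000_000, 0.1)
```

Note how the point count, not the time span, is what varies, which is exactly backwards from the Keysight behavior above.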
It's really two schools of thought, optimized for different usage patterns. I think both are useful. What bothers me is that a scope could be made to do both, but nobody does it.