I was certainly thinking more of waveform storage, or capturing a record for later review; I know the display itself doesn't require many more points than its pixel width. Extra processing overhead is a non-issue in products priced in the hundreds or thousands of dollars: any semi-modern processor can handle gobs of extra data points, this isn't 1988. Also, "real time" displays are not actually true analog real time, and they don't need to be; the human viewer can only take it in so fast, as long as the proper delay is set for synchronizing the input channels at the output. On the data processing side, PC-style DDR memory (any generation) is fast enough for a billion gamers around the world who need screen resolutions and input-to-output delays far tighter than any scope operator.
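Just to illustrate the pixel-width point: here's a rough sketch of how a deep record gets squashed down to a screen-wide trace with min/max (peak-detect style) decimation. The numbers and names are made up for illustration, not anybody's actual firmware.

```python
# Rough sketch: reducing a deep acquisition record to a screen-width trace
# using min/max decimation. Record length and screen width are illustrative.

def decimate_min_max(samples, screen_width):
    """Collapse len(samples) points into screen_width (min, max) column pairs."""
    n = len(samples)
    columns = []
    for x in range(screen_width):
        start = x * n // screen_width
        stop = max(start + 1, (x + 1) * n // screen_width)
        chunk = samples[start:stop]
        columns.append((min(chunk), max(chunk)))  # one vertical bar per pixel column
    return columns

# A 1M-point record still only produces ~800 column pairs for the display,
# so record depth costs memory, not display-update speed.
record = [i % 256 for i in range(1_000_000)]   # stand-in for 8-bit ADC samples
trace = decimate_min_max(record, 800)
print(len(trace))  # 800
```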
However, I can see ADC acquisition speed being a limiting factor for higher-bandwidth scopes; in the GHz-and-up range, anyway.
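A quick back-of-the-envelope check on that, using round illustrative figures rather than any particular scope:

```python
# Why the ADC front end, not memory price, is the hard part at GHz bandwidths.
# All figures below are illustrative round numbers, not a specific product.

sample_rate_gsps = 5        # a typical sample rate for a 1 GHz-class scope, GS/s
bits_per_sample = 8
channels = 4

bytes_per_s = sample_rate_gsps * 1e9 * (bits_per_sample / 8) * channels
print(f"sustained acquisition rate: {bytes_per_s / 1e9:.1f} GB/s")   # 20.0 GB/s

# Compare with the memory numbers quoted further down:
#   DDR2-800, single channel : ~6.4 GB/s gross
#   PS4 GDDR5 (256-bit bus)  : ~176 GB/s gross
# A single commodity DDR channel can't swallow four channels at full rate,
# which is why the capture buffer has to sit right next to the ADC/ASIC.
```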
Now, I'm not actually suggesting that a scope can or should directly use a PC-style memory bus (though maybe see the last paragraph); however, low-cost DDR memory can be packaged with, or even masked right onto, an ASIC.
Also, ASICs aren't stuck with one giant production run every 10 years (that would be a foolish bit of management, given the hidden holding costs, not to mention market shifts), so a point revision adding more memory as memory becomes cheaper can be made for each run. However, the memory doesn't actually need to be part of the acquisition ASIC: the ASIC only needs to control and buffer the sample points going to the main memory, and the memory can be a second die in the same package, which stays more flexible for revisions while maintaining a reliable, short signal path. Then a proper low-cost, high-volume CPU/GPU can do the grunt work of analyzing and displaying the data stored in memory at a more leisurely pace.
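A minimal sketch of that split, purely as an illustration (the buffer depth and record size are invented): the fast front end just deposits finished records into memory and never waits, while the general-purpose CPU/GPU side pulls them out and crunches them whenever it gets around to it.

```python
# Toy model of the acquisition split: ASIC-side code only stores records,
# CPU-side code analyzes at its own pace. Sizes are arbitrary placeholders.

from collections import deque

RECORD_POINTS = 4096
DEPTH = 8                              # completed records the buffer can hold

capture_buffer = deque(maxlen=DEPTH)   # oldest record silently drops when full

def asic_store_record(raw_samples):
    """Front-end side: buffer a finished record; never waits on the CPU."""
    capture_buffer.append(raw_samples)

def cpu_process_next():
    """Back-end side: analyze/display one record at leisure, if any is waiting."""
    if not capture_buffer:
        return None
    record = capture_buffer.popleft()
    return (min(record), max(record), sum(record) / len(record))  # quick stats

# The two sides only share the buffer, so the acquisition path stays short and
# fixed while the analysis side can be ordinary commodity silicon.
asic_store_record(list(range(RECORD_POINTS)))
print(cpu_process_next())
```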
For comparison (yes, these are high-production parts, but these are also retail prices), here's where things stood 8-10 years ago, for the same era of designs that have now filtered down into $500 scopes:
An XBOX360 was $300. AMD's 65nm Phenom CPUs, with a built-in memory controller, L2 and L3 caches of several megabytes, and 3 GHz of 64-bit-wide calculation, were retailing for about $100, with mainboards (including all those pesky memory buses and a substantial onboard GPU) again $100 retail; packaged PC3 DDR3 at the time was around $30-40/GiB. Currently that price bracket will fetch an AMD APU (CPU and GPU on one die) with even better performance and efficiency and much lower demands on the supporting mainboard chipset.
Old PC2-6400 DDR2 (240-pin modules, again 10 years old) routinely runs a 400 MHz bus at double data rate over a 64-bit width (16 bytes per bus clock, for a gross hardware bandwidth of 6400 MB/s), with read/write bandwidths (benchmarked net throughput in actual general-use computers) of 2200-3000 MB/s per memory channel and latencies of 4-15 cycles.
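The gross figure falls straight out of the clock, data rate and bus width:

```python
# Sanity check of the DDR2 numbers above (round figures, single channel).
bus_clock_mhz = 400            # DDR2-800's I/O bus clock
transfers_per_clock = 2        # "double data rate"
bus_width_bytes = 8            # 64-bit module

peak_mb_s = bus_clock_mhz * transfers_per_clock * bus_width_bytes
print(peak_mb_s)               # 6400 MB/s gross, per memory channel
# Real-world benchmarks land well below that (the 2200-3000 MB/s quoted above)
# once refresh, bus turnaround and access patterns are counted.
```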
GDDR5 runs roughly 3.6 to 8 Gbit/s per pin depending on the grade; newer GDDR5X is around 12 Gbit/s per pin.
The PlayStation 4, at $255, has a total of 8 GB of GDDR5 at 176 GB/s (CK 1.375 GHz and WCK 2.75 GHz) as combined system and graphics RAM, for use with its AMD system-on-a-chip comprising 8 Jaguar cores, 1152 GCN shader processors and AMD TrueAudio.
8 GB at 176 GB/s in a $250 box! Now, I don't expect a $300 scope to have this much, as scopes are lower-production and have significant additional hardware, but it is clear that memory cost is a fairly trivial part of the design.
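For anyone wanting to check that headline number, it's just per-pin rate times bus width (the 256-bit figure is the PS4's published memory interface width):

```python
# Where the PS4's 176 GB/s figure comes from.
per_pin_gbit_s = 5.5           # GDDR5 data rate per pin (WCK 2.75 GHz, double data rate)
bus_width_bits = 256           # PS4's memory interface width

aggregate_gb_s = per_pin_gbit_s * bus_width_bits / 8
print(aggregate_gb_s)          # 176.0 GB/s
```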