Sorry to spam a link in my first comment here, but have you read about Micomsoft's (and Startech's) RGB capture cards? There is a thread about them here.
Micomsoft has the SC-500N1 (PCI-E) and the XCAPTURE-1 (USB 3.0).
Startech has the PEXHDCAP (PCI-E).
Drivers for at least some models are interchangeable between manufacturers, I think.
Capture and playback on modern PCs is a tricky thing. It might not matter if your ultimate goal is to digitally preserve the RGBS waveforms themselves - perhaps an oscilloscope capture would do the trick, or something like that (people here know about that stuff).
But for playback on a modern PC, you have to deal with a few fundamental differences between the technology types. Resolution is a problem on fixed-resolution monitors (like LCDs): if the goal is 1:1 pixel mapping, you lose the original aspect ratio and end up with something that bears little visual resemblance to the original signal. Upscaling is necessary on such devices. Even CRT monitors aren't likely to support arbitrarily-chosen resolutions made for consoles, and if they did, they probably wouldn't look right without tweaking the overscan settings.
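To make the aspect-ratio problem concrete, here's a minimal sketch. It assumes a hypothetical 256x224 console mode that was meant to fill a 4:3 screen; the numbers are illustrative, not tied to any particular console or scaler.

```python
# Sketch: 1:1 integer upscaling vs. aspect-corrected upscaling for a
# hypothetical 256x224 source intended for a 4:3 display.
from fractions import Fraction

def upscale(src_w, src_h, factor, display_aspect=Fraction(4, 3)):
    """Return (1:1 size, aspect-corrected size) for an integer upscale."""
    one_to_one = (src_w * factor, src_h * factor)
    # Keep the scaled height, but widen the frame to the intended aspect.
    corrected = (round(src_h * factor * display_aspect), src_h * factor)
    return one_to_one, corrected

# At 4x: 1:1 gives 1024x896 (8:7, noticeably too narrow);
# correcting to 4:3 widens the frame to 1195x896.
print(upscale(256, 224, 4))  # ((1024, 896), (1195, 896))
```

The point being: the 1:1 result is "pixel perfect" but visibly narrower than what the console actually put on a 4:3 CRT.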
Framerate is also an issue. On LCDs, you can expect to enjoy tearing and other glitches, even with framerate conversion. MultiSync CRTs won't even sync with the output of some RGB devices, like a standard Raiden II arcade board (though I haven't gone and tested all old consoles - standard NTSC framerate should work better here).
I realize that these things might not matter in the long run, if we can assume future displays will support variable refresh rates and behave like CRTs, without a fixed native resolution. But in terms of preserving the look, 1:1 pixel mapping is not a great compromise.
I'm not so big on video encoding, but most of the codecs I know of expect you to use some standard resolution - old consoles are all over the place here, again. On top of that, I think the most modern (efficient) codecs use 16x16 macroblocks, i.e., the resolution needs to be divisible by 16 - not whatever funky resolution NEC or Nintendo came up with. So you likely need to pad the frames slightly - unfortunately, it then won't look right once you go back to a proper 4:3 display. Upscaling to a standard resolution neatly avoids this potential pitfall.
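For illustration, here's what padding up to the 16-pixel macroblock grid looks like. The 292x224 input is just a made-up "funky" console resolution, not a reference to any specific machine or encoder.

```python
# Sketch: round a frame size up to the next 16-pixel macroblock boundary,
# as an encoder targeting 16x16 macroblocks would need.

def pad_to_macroblock(w, h, mb=16):
    """Round width and height up to the next multiple of mb."""
    pad = lambda x: (x + mb - 1) // mb * mb
    return pad(w), pad(h)

print(pad_to_macroblock(292, 224))  # (304, 224) - 12 columns of padding added
print(pad_to_macroblock(256, 224))  # (256, 224) - already aligned, untouched
```

The added padding columns are exactly the part that throws off the picture if the player doesn't crop them back out.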
@ miguelvp: You're talking about RGBHV; he's talking about RGBS. 640x480 / 320x240 seem like standard PC resolutions. 480i RGB is different - that's where the 15KHz comes from (compare to the other retrogaming standards: 24KHz, rare outside arcades and some Japanese computers, and 31KHz, seen on some progressive-scan consoles like the original Xbox). It determines horizontal resolution by a variety of pulses, and the actual horizontal (and vertical!) resolutions vary from device to device. I've never heard of 6.3MHz being required for 240p; it should be the standard 3.58MHz, assuming NTSC timings. It goes without saying that the signal is not making use of the VGA adapter.
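For anyone wondering where that 15KHz figure comes from, it falls straight out of standard NTSC timing (525 total lines per frame at ~29.97 frames per second):

```python
# Sketch: derive the ~15 kHz horizontal scan rate from standard NTSC timing.
lines_per_frame = 525            # total scanlines per frame (NTSC)
frame_rate = 30000 / 1001        # ~29.97 Hz (NTSC frame rate)

h_freq = lines_per_frame * frame_rate
print(round(h_freq))  # 15734 -> the familiar "15KHz" line rate
```

31KHz progressive modes are essentially this line rate doubled.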
Hope that wraps it up. Glad to be here, and I hope some of this is useful.