As I understand it, if I take a .raw file and immediately save it as a .jpg, I end up with exactly what I would have gotten had I saved it as a .jpg to begin with. I assume the algorithm used by the software on my computer is at least as capable as what the camera uses, so the result could even be better, although I doubt the difference would be noticeable. So the only limiting factor is the hardware, which is the same in either case.
Maybe things would be different if you used the camera vendor's own raw software (if available), but having played with two different third-party raw processors, I can tell you straight away that in many ways the results depend more on the raw software used than on the camera.
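A minimal sketch of what I mean, using the rawpy bindings to LibRaw (the library choice, file name and parameter values are my own illustrative assumptions, not any vendor's pipeline): rendering the same raw file with two different sets of processing choices gives two noticeably different JPEGs.

```python
# Render one raw file twice with different processing choices.
# Requires: pip install rawpy imageio
import rawpy
import imageio.v3 as iio

def render(raw_path, out_path, **params):
    """Demosaic one raw file and save it as an 8-bit JPEG."""
    with rawpy.imread(raw_path) as raw:
        rgb = raw.postprocess(**params)   # demosaic, white balance, tone curve...
    iio.imwrite(out_path, rgb)

# Rendition A: trust the camera's white balance, no automatic brightening
render("photo.nef", "rendition_a.jpg", use_camera_wb=True, no_auto_bright=True)
# Rendition B: let LibRaw pick white balance and brightness on its own
render("photo.nef", "rendition_b.jpg", use_auto_wb=True)
# Same sensor data, visibly different color and tonality in the output.
```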
There is a lot of processing and secret sauce involved in generating a viewable image from the raw sensor data. For starters, as I hinted before, the uncompressed image you get from this process is actually up to 3x larger than the raw data it came from, go figure.
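Back-of-the-envelope arithmetic behind that size claim (the 24 MP sensor and 14-bit depth are just assumed example figures): raw data holds one Bayer sample per photosite, while the rendered image holds three full color channels per pixel.

```python
# Rough size comparison for an illustrative 24-megapixel sensor.
pixels = 6000 * 4000                              # ~24 MP

raw_mb   = pixels * 14 / 8 / 1e6                  # 14-bit raw, one sample/pixel: ~42 MB
rgb8_mb  = pixels * 3 * 8 / 8 / 1e6               # demosaiced 8-bit RGB: ~72 MB
rgb16_mb = pixels * 3 * 16 / 8 / 1e6              # 16-bit RGB (e.g. TIFF): ~144 MB

print(f"raw ~{raw_mb:.0f} MB, 8-bit RGB ~{rgb8_mb:.0f} MB, 16-bit RGB ~{rgb16_mb:.0f} MB")
# Demosaicing turns one sample per pixel into three, so the uncompressed
# rendered image is roughly 1.7-3x the size of the raw data it came from.
```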
Raw has surprisingly little to do with the final image.
Almost every image you see on the Internet, including professional photos, is JPEG. If you have ever seen any good-looking images on the Internet, you have seen good-looking JPEGs. Raw's utility (if any) is not so much in avoiding JPEG artifacts as in gaining more control over things like color rendition and white balance, tonality, sharpening, or noise reduction. Or in processing the image beyond recognition for some supposed artistic value. Occasionally, in rescuing images that were ruined by bad camera settings during shooting.
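As a concrete illustration of that extra control, here is a hedged sketch (again with rawpy; the file name and the white-balance multipliers are made-up values you would tune by eye) of overriding a botched white-balance setting after the fact, which raw data lets you do cleanly where a JPEG only approximates it.

```python
import rawpy
import imageio.v3 as iio

with rawpy.imread("tungsten_mistake.nef") as raw:
    # Ignore the (wrong) in-camera white balance and supply our own
    # R, G, B, G channel multipliers; adjust until grays look neutral.
    rgb = raw.postprocess(user_wb=[2.2, 1.0, 1.4, 1.0], output_bps=8)

iio.imwrite("fixed_wb.jpg", rgb)
```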
If you start reading up on the topic, you will find rants by professionals who bluntly say they have no time to mess with that stuff and would rather buy a camera that can be tweaked to produce the images they want right off the bat.