Also, since radiation has a T_abs^4 dependence (absolute temperature), you get vastly more effective thermal conductance at elevated temperatures. That is, if you want a lower thermal resistance due to radiation, you MUST run it HOT!
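To put a number on that: the Stefan-Boltzmann law linearizes to a small-signal radiative conductance, h_r = 4*eps*sigma*T^3, per unit area. A quick back-of-envelope in Python (the 0.9 emissivity here is just an assumed gray-body figure, not anything measured):

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiative_conductance(t_kelvin, emissivity=0.9):
    # Small-signal radiative conductance per unit area, W/(m^2 K),
    # from linearizing P = eps*sigma*T^4 around the operating point.
    return 4.0 * emissivity * SIGMA * t_kelvin ** 3

for t_c in (25, 100, 200, 500):
    t_k = t_c + 273.15
    print(f"{t_c:4d} C -> h_r = {radiative_conductance(t_k):5.1f} W/m^2/K")

Something like 5 W/m^2/K at room temperature versus ~95 at 500C -- well over an order of magnitude, which is the whole "run it HOT" argument.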
Vacuum is a rather awful conductor, which gives some clues about the design of vacuum tubes (back in the day). The cathode in the center is hot, positioned close to the grid, which must remain cool (otherwise it goes leaky, from grid emission). The grid is usually molybdenum wire wound on copper supports, with radiator flags tacked on the top. So the grid can stay somewhat near outer glass temperatures, being heatsunk by those flags. The anode surrounds everything, and gets hot (it's where all the electrical power is going!), sometimes awfully hot (transmitter tubes made with graphite or tantalum anodes are intended to run red-hot!), hotter than the grid should be. And around everything, there's a glass envelope, which typically runs over 200C at the hottest point -- so anything inside the vacuum is running nearly double room temperature (absolute scale), which helps greatly with radiation -- as long as the baseline temperatures are tolerable!
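For scale, here's the net exchange between anode and envelope under the usual small-object-in-a-large-enclosure gray-body approximation -- every figure below (temperatures, area, emissivity) is an illustrative guess for a smallish tube, not from any datasheet:

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def net_radiated_watts(t_hot_k, t_cold_k, area_m2, emissivity):
    # Net gray-body exchange, hot inner surface to cooler surroundings.
    return emissivity * SIGMA * area_m2 * (t_hot_k**4 - t_cold_k**4)

anode_k    = 600 + 273.15  # assumed dull-red anode temperature
envelope_k = 200 + 273.15  # glass hot-spot, per the figure above
area       = 5e-4          # ~5 cm^2 of anode surface, assumed
print(net_radiated_watts(anode_k, envelope_k, area, 0.8), "W")  # ~12 W

That's a useful ~12W dumped through nothing but vacuum, precisely because both ends of the path sit so high on the absolute scale.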
In vacuum lab equipment, you have to use non-outgassing PCBs (RF teflon stuff, Rogers whatever -- FR4 is no good), which conduct heat more poorly to begin with. Thin copper traces don't account for much, so you can very easily have a jellybean op-amp dissipating not even 100mW, yet burning up because there's no short path out! Often, a huge copper braid is used, not so much for grounding as heatsinking: over such a length, even though it's copper, the thermal resistance sucks, but vacuum sucks more (ha..), so it's a net win.
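Back-of-envelope on that tradeoff, conduction down the braid (theta = L/(k*A)) versus radiation off a small package -- all dimensions assumed for illustration:

K_CU  = 400.0     # copper thermal conductivity, W/(m K)
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

# Assumed: ~10 cm braid with ~4 mm^2 effective cross-section
theta_braid = 0.10 / (K_CU * 4e-6)  # K/W, pure conduction

# Radiation from an assumed 1 cm^2 of package surface near 350 K,
# using the linearized conductance h_r = 4*eps*sigma*T^3
theta_rad = 1.0 / (4 * 0.9 * SIGMA * 350.0**3 * 1e-4)

print(f"braid:     {theta_braid:6.1f} K/W")  # ~62 K/W
print(f"radiation: {theta_rad:6.1f} K/W")    # ~1100 K/W

~62 K/W for the braid -- terrible by heatsink standards -- against ~1100 K/W for the radiative path: a net win by more than an order of magnitude.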
Lessons not all that applicable to consumery stuff (< 150C silicon, enough watts that you need a fan), but good to keep in mind all the same.
Tim