Jay_Diddy_B,
Not sure why you're focusing on efficiency and power factor...
I assumed the major issue is the harmonic currents causing voltage drops and voltage distortion.
What am I missing?
Power factor is essentially an indirect description of the magnitude of the distortion. When you un-link the "current being drawn" waveform from the "voltage supplying it" waveform, you start to introduce distortion in the way the power is being drawn from the source: the power factor of the load drops below 1.0 as you move away from a purely resistive load, where the current drawn directly tracks the input voltage. This does not necessarily change the overall efficiency, but it does change how the power is being drawn from the line.
Essentially, if the power factor of your loads is 1.0, your incoming power sine wave will still look like a sine wave. If your power factor is 0.1, you're not going to be looking at a sine wave anymore, even if it's still a sine wave coming out of the generator. You can model this in software, as above, or physically simulate it on your bench to see for yourself.
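If you'd rather model it before breadboarding it, here's a minimal sketch in Python/numpy of the classic offender: a full-wave rectifier feeding a smoothing cap through some series source resistance. All the component values (R_grid, C, R_load) are assumptions I picked for illustration, and diode drops are ignored, but it shows the flattened peaks and the low true power factor:

```python
import numpy as np

# A minimal sketch: a 120 V source with series resistance feeding an
# idealized full-wave rectifier + capacitor load. All values assumed.
f      = 60.0                  # line frequency, Hz
Vpk    = 120 * np.sqrt(2)      # peak source voltage
R_grid = 1.0                   # assumed source/wiring resistance, ohms
C      = 2200e-6               # assumed smoothing capacitor, farads
R_load = 20.0                  # assumed DC-side load, ohms

dt = 1e-5
t  = np.arange(0, 0.5, dt)     # run long enough to settle
e  = Vpk * np.sin(2 * np.pi * f * t)

v_cap  = 0.0
i_line = np.zeros_like(t)
v_term = np.zeros_like(t)

for k, ek in enumerate(e):
    if abs(ek) > v_cap:                    # bridge diodes conducting
        i = (abs(ek) - v_cap) / R_grid     # rectified current magnitude
        i_line[k] = np.sign(ek) * i        # line current follows polarity
    else:
        i = 0.0                            # diodes off, no line current
    v_cap += dt * (i - v_cap / R_load) / C # cap charges, load discharges
    v_term[k] = ek - i_line[k] * R_grid    # voltage at the load terminals

# Use only the last few settled cycles for the numbers.
s = t > 0.4
P    = np.mean(v_term[s] * i_line[s])              # real power, W
Vrms = np.sqrt(np.mean(v_term[s] ** 2))
Irms = np.sqrt(np.mean(i_line[s] ** 2))
print(f"PF = {P / (Vrms * Irms):.2f}")             # well below 1.0
```

Play with R_grid and watch the terminal waveform and PF change: a stiffer source (smaller R_grid) keeps the terminal voltage cleaner but draws even narrower, nastier current pulses.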
This is very easy to play with safely on your bench. Use a decent low-voltage transformer (preferably a toroid) off the mains, or a generated 50/60 Hz sine wave from an AF signal generator (optionally followed by a power amp or buffer stage, etc.). Add some appropriate small series resistance to simulate at least the main resistive component of the grid impedance, then attach your various loads. You can monitor visually on your scope, take samples, and do calculations based on what you see, simulating what various loads do to the real mains using that low-voltage setup.
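For the "take samples and do calculations" part: true power factor is just real power over apparent power, so if your scope exports samples you can compute it directly. (The file name here is made up; the only real requirement is simultaneous voltage/current samples covering a whole number of line cycles.)

```python
import numpy as np

def true_power_factor(v, i):
    """True PF = real power / apparent power, from simultaneous
    voltage and current samples spanning whole line cycles."""
    p_real = np.mean(v * i)                                # average instantaneous power, W
    s_app  = np.sqrt(np.mean(v**2)) * np.sqrt(np.mean(i**2))  # Vrms * Irms, VA
    return p_real / s_app

# e.g. v, i = np.loadtxt("scope_capture.csv", delimiter=",", unpack=True)
```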
The first time I ever ran into this was trying to run an inverter/charger off a portable generator at an off-grid cabin. The battery charger wouldn't put out nearly as much power as you would think was possible, even less than with the previous, smaller generator. The solution was to modify the generator: I gave up its 240 V output but doubled the stiffness of the 120 V output by simply paralleling the two output windings directly, which halves the source impedance the load sees.
I also added a fairly beefy motor-run capacitor across the line right at the generator (with a fuse, of course) to help improve the power factor by propping up the flattened peaks of the sine wave caused by the rectifier -> battery load. That matters especially when trying to equalize the cells in the battery bank, where you need every volt you can get to push them up to 15-15.5 V across each series string.
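For a rough idea of what a given cap contributes at the fundamental, the reactive power is just V^2 * 2*pi*f * C. The 40 uF below is an example value, not what I actually used:

```python
import math

# Ballpark sizing check for a motor-run cap across a 120 V / 60 Hz line.
V, f, C = 120.0, 60.0, 40e-6       # the capacitance is an assumed example
Q = V**2 * 2 * math.pi * f * C     # reactive power the cap supplies, VAR
print(f"{Q:.0f} VAR")              # ~217 VAR for these numbers
```

That only covers the displacement part; the peak-propping effect against a rectifier load is harder to capture in a single number.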
After the modification, the charging cables to the battery bank (IIRC 00 (2/0) AWG, which is about 67.5 mm²) got noticeably warm. My friend, who is an electrical engineer, came out of the cabin to the generator shed and said, "Wow, that's impressive," because he could feel the cables getting warm. Sure enough, I checked and, yep, they were noticeably warm, even though they were only carrying something like 125 A. Impressive indeed, since we could barely get above 30 A before, even after doubling the conductor cross-sectional area from the generator shed to the inverter/charger.
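For anyone wondering why 125 A warms up 2/0 noticeably: it's plain I²R, and going from 30 A to 125 A is about 17x the heat in the same copper. The run length below is an assumption:

```python
# Back-of-envelope on the cable warming. 2/0 AWG copper is roughly
# 0.25 milliohm per metre; the round-trip length is an assumed example.
R_per_m = 0.25e-3      # ohms per metre, approximate for 2/0 copper
length  = 10.0         # assumed round-trip conductor length, metres
for amps in (30, 125):
    print(f"{amps} A -> {amps**2 * R_per_m * length:.1f} W dissipated")
# 30 A -> ~2 W, 125 A -> ~39 W: same cable, ~17x the heat.
```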
I hadn't ever really considered this stuff before because, most of the time, for us mere mortals and our typical residential loads, you can just assume that the grid is an infinitely low-impedance source that will always supply your juice, regardless of how "bad" your load is. That is not actually the case in the macro picture, and most people, even many electricians and many, many engineers, do not fully grok this unless they actually work with power electronics or distribution systems on a regular basis.
Edit: Added additional explanation in first paragraph for clarity.