If you put two LEDs in series, that doesn't do anything to limit the current. An LED's voltage/current curve is anything but linear: as you raise the voltage you see essentially no current (and no light) at first, then you hit a very narrow voltage range where the voltage/current relationship is close to linear, and beyond that any extra voltage makes the current skyrocket.
So you still need a resistor.
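As a rough back-of-envelope sketch of sizing that resistor for two LEDs in series, assuming a 5 V supply, about 2 V forward voltage per LED, and a 10 mA target current (numbers picked for illustration, not from any specific part):

```python
# Rough series-resistor sizing for two LEDs in series (all values are assumptions)
V_SUPPLY = 5.0      # supply voltage, e.g. USB
V_F = 2.0           # forward voltage per LED at the target current (check the datasheet)
I_TARGET = 0.010    # target LED current in amps (10 mA)

v_resistor = V_SUPPLY - 2 * V_F          # voltage left over for the resistor to drop
r = v_resistor / I_TARGET                # Ohm's law: resistor value in ohms
p_resistor = v_resistor * I_TARGET       # power burned as heat in the resistor
p_leds = 2 * V_F * I_TARGET              # power actually delivered to the LEDs

print(f"Resistor: {r:.0f} ohm, dissipating {p_resistor * 1000:.1f} mW")
print(f"LEDs get {p_leds * 1000:.1f} mW of the {V_SUPPLY * I_TARGET * 1000:.1f} mW total")
```

With these example numbers you end up with a 100 ohm resistor burning 10 mW while the LEDs get 40 mW, which is exactly the waste discussed below.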
If you are interested in getting the most light for the power consumed, then two LEDs in series are more efficient. But if you only need a certain amount of light (for an indicator in a USB device, for example), then you are much better off driving them either with PWM or with a switching supply. There are easy-to-use chips that do this for a wide range of current needs; Diodes Inc, Allegro and many others make such devices, and there are literally thousands of options on Digikey.
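To make the efficiency comparison concrete, here is a crude sketch (same assumed 5 V supply, 2 V forward drop, 10 mA) of what fraction of the supply power actually reaches the LEDs with a resistor-limited chain versus an idealized switching driver. A real switcher has its own losses, and PWM on top of any of these simply lowers the average current when you need less light:

```python
# Crude efficiency comparison: resistor current limiting vs. an ideal switching driver
# (numbers are assumptions for illustration only)
V_SUPPLY = 5.0      # supply voltage
V_F = 2.0           # forward voltage per LED
I_LED = 0.010       # LED current in amps

# One LED plus resistor: the resistor burns everything above the forward drop
eff_one_led = V_F / V_SUPPLY
# Two LEDs in series plus a smaller resistor: less voltage wasted in the resistor
eff_two_leds = (2 * V_F) / V_SUPPLY
# Idealized switching LED driver: essentially all input power reaches the LED
eff_switcher = 1.0

for name, eff in [("1 LED + resistor", eff_one_led),
                  ("2 LEDs + resistor", eff_two_leds),
                  ("ideal switching driver", eff_switcher)]:
    print(f"{name}: {eff:.0%} of input power reaches the LED(s)")
```

With these numbers the single LED wastes 60% of the input power in the resistor, the series pair wastes 20%, and the switching driver (in the ideal case) wastes none, which is the "bottom line" below.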
Bottom line: any power you burn up as heat isn't producing light, and that's inefficient. There's also the fact that many LEDs are most efficient at very low drive levels on a "how easy is the LED to see with your eyes vs. power consumed" basis.