So you discovered the joy of Japanese datasheets. JRC, NEC or Mitsubishi, all that stuff was marketed for audio products, and the datasheets usually only specify noise as total noise output from some RIAA preamp or other "typical" audio circuit, oftentimes A-weighted.
That being said, this one has a "noise versus source resistance" plot. Of course it's the total noise over some unspecified bandwidth (lemme guess, 20-20k) with a few weighting schemes to choose from, but you can see the general pattern: it's about 1.5x worse at 200Ω than at 10Ω. Since the Johnson noise of 200Ω is 1.8nV/rtHz, adding it in quadrature to the amp's own voltage noise fully explains the increase, and no third noise source (current noise flowing in the source resistance) is apparent.
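If you want to check the arithmetic, here's a quick quadrature sum. The ~1.6nV/rtHz voltage noise is just an assumed value for illustration (the plot doesn't state it); with that figure the 200Ω Johnson noise lands almost exactly on the 1.5x ratio:

```python
from math import sqrt

k_B = 1.38e-23   # Boltzmann constant, J/K
T   = 300.0      # room temperature, K
e_n = 1.6e-9     # ASSUMED amp input voltage noise, V/rtHz (illustration only)

def johnson(r_ohm):
    """Thermal noise density of a resistor, V/rtHz."""
    return sqrt(4 * k_B * T * r_ohm)

total = {r: sqrt(e_n**2 + johnson(r)**2) for r in (10, 200)}
for r, v in total.items():
    print(f"Rs = {r:>3} ohm: total {v*1e9:.2f} nV/rtHz")
print(f"ratio 200/10: {total[200]/total[10]:.2f}x")   # ~1.5x
```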
For the record, the minimum limit of current noise is the shot noise of the input bias current. Good bipolar opamps achieve this limit, at least above some tens of Hz. Shot noise density formula due to one Schottky: √(2·q·I_B) = √(3.2e-19·I_B), so 1pA/rtHz for 3.6µA.
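A minimal sanity check of that number, just plugging the 3.6µA bias current into 2qI:

```python
from math import sqrt

q   = 1.602e-19   # electron charge, C
I_B = 3.6e-6      # input bias current from the example above, A

i_n = sqrt(2 * q * I_B)            # shot noise density, A/rtHz
print(f"{i_n*1e12:.2f} pA/rtHz")   # ~1.07 pA/rtHz
```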
I actually bought one of those chips, but haven't played with it yet.
I now have time to come back to this. I'm not seeing the trick - can you elaborate? In fact, I'm not sure I understand what you're saying - is it that the effective GBW is a function of overall amp bandwidth, rather than the constant 10 MHz or whatever is quoted in the datasheet?
Okay, I dug out my old notes. Actually, the NE5532 had only some 20~25MHz GBW at audio frequencies. The 30MHz figure was for a fake 5532 from an auction site; I have no idea what's inside of that one, but (surprisingly) it has to be some good shit.
It means that the open-loop gain at 10MHz is 1x and at 1MHz is 10x, consistent with the nominal 10MHz GBW, but at 100kHz it's about 230x and at 10kHz about 2300x, i.e. below a few hundred kHz the gain follows the ~23MHz GBW line instead.
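Multiplying gain by frequency shows the effective GBW at each point; these are just the round numbers quoted above, not measured datasheet values:

```python
# frequency (Hz) -> open-loop gain (V/V), rough numbers from the post above
points = {10e6: 1, 1e6: 10, 100e3: 230, 10e3: 2300}

for f in sorted(points):
    gain = points[f]
    print(f"{f/1e3:>6.0f} kHz: gain {gain:>5}x -> effective GBW ~{gain*f/1e6:.0f} MHz")
```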