Currently I'm simulating a single-supply inverting op-amp stage (ideally I'd like non-inverting, but I don't know how to get "fractional" gain, i.e. gain below unity, with one).
The input spans from 0 V to Vcc, which is also the op-amp's supply voltage, and it's not a rail-to-rail part.
The output starts off flat, then goes through a linear transfer section, and flattens out again.
I want to limit the input span to the linear region of the output; is there an easy way to determine where that region starts and ends?
Keep in mind that when I breadboard this I'll collect actual measured values and want to apply the same analysis to those points.
I'm using LTspice and can export the data and graph it in an external program such as Graph or Excel.
The only method that springs to mind at the moment is to take two points in the middle of the linear section, derive the linear equation through them, and then check how far the actual values deviate from that line.
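For what it's worth, that two-point approach is easy to automate on the exported sweep. Here's a minimal sketch in Python/NumPy; the transfer curve is a synthetic stand-in (an inverting gain of −2 that saturates near the rails), and the file name, sample points, and 10 mV tolerance are all assumptions you'd replace with your own data and limits:

```python
import numpy as np

# Synthetic stand-in for an exported LTspice DC sweep: replace with e.g.
#   data = np.loadtxt("sweep.txt", skiprows=1); vin, vout = data.T
vin = np.linspace(0.0, 5.0, 501)
vout = np.clip(5.2 - 2.0 * vin, 0.2, 4.8)  # flat / linear / flat shape

# Pick two points near the middle of the linear section and build the
# line y = m*x + b through them (indices chosen here are assumptions).
i1, i2 = np.searchsorted(vin, [1.0, 1.5])
m = (vout[i2] - vout[i1]) / (vin[i2] - vin[i1])
b = vout[i1] - m * vin[i1]

# Deviation of every point from that line; the "linear region" is
# wherever the deviation stays inside a chosen tolerance (10 mV here).
dev = np.abs(vout - (m * vin + b))
linear = dev < 0.010
print(f"linear input span: {vin[linear].min():.3f} V to {vin[linear].max():.3f} V")
# → linear input span: 0.200 V to 2.500 V
```

The same script works unchanged on breadboard measurements, as long as they're saved as two columns of Vin/Vout; only the tolerance needs tuning to match the noise floor of the real readings.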
Are there any simple methods I'm missing, or is my circuit fundamentally flawed?