Why can't you just use normal, proven circuits in your designs like everyone else? Your ideas somehow never fail to draw a "what the fuck" reaction here. You need to either stop trying to reinvent the wheel, or maybe apply as an application engineer at ADI or something serious....
Why not just power it with 3.3V?
If rail-to-rail is important, even a +5 V supply is still infinitely safer. Most STM32 pins can tolerate a 5 V input to some extent if you limit the current; it's a specified operating condition in their datasheet. Although in my experience it may even survive without current limiting (don't intentionally do that in production designs, duh.)
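For what it's worth, here's a back-of-the-envelope sketch of the series-resistor sizing (Python, and every number is an assumption — the injected-current limit, clamp-diode drop and worst-case overdrive have to come from your actual datasheet and circuit; it also assumes the pin clamps to VDD through a protection diode, which true 5 V-tolerant FT pins don't):

```python
# Rough series-resistor sizing for an over-voltage case (assumed numbers,
# check your part's datasheet for the real limits).
v_source_max = 5.0    # worst-case opamp output swing (V) - assumption
v_dd = 3.3            # STM32 supply (V)
v_clamp = 0.3         # approx. drop across the internal clamp diode (V) - assumption
i_inj_max = 5e-3      # assumed max injected current per pin (A), verify in datasheet

# Worst-case voltage that must be dropped across the series resistor
v_excess = v_source_max - (v_dd + v_clamp)

# Minimum series resistance to keep the clamp current within the assumed limit
r_min = v_excess / i_inj_max
print(f"R_series >= {r_min:.0f} ohm")   # ~280 ohm with these numbers
```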
Use a -0.3V negative rail or similar if you need the signal to swing to exactly zero. Something with enough headroom for the opamp output to slew into, but not enough to reverse-bias whatever input structure is on the uC.
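A minimal sanity check of that headroom, with made-up numbers (the opamp's output saturation voltage and the uC's -0.3 V absolute-maximum rating are the two things to verify against the datasheets):

```python
# Quick feasibility check for a small negative rail (assumed numbers).
v_neg_rail = -0.23    # candidate negative rail (V) - assumption
v_ol_sat = 0.05       # opamp output saturation above its negative rail (V) - assumption
v_uc_abs_min = -0.3   # typical uC absolute-maximum input below VSS (V)

v_out_min = v_neg_rail + v_ol_sat   # lowest voltage the opamp can actually drive

# Output must reach 0 V, but the rail must never violate the uC absolute minimum
print("reaches 0 V:", v_out_min <= 0.0)
print("stays above abs. min:", v_neg_rail >= v_uc_abs_min)
```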
Depending on how important the ADC accuracy is, a much better approach is to scale the ADC input into 0 - 2.048 V and use an external 2.048 V VREF. There's much less chance that a slight overdrive will mess up the VDD rail, and especially AVDD.
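A quick sketch of what that scaling looks like, assuming a 0-3.3 V source signal and a 12-bit ADC (both assumptions — adjust to your actual signal range and part):

```python
# Scaling sketch (assumed numbers): map a 0-3.3 V signal into 0-2.048 V
# for an external 2.048 V reference, and see what one LSB is worth.
v_in_full_scale = 3.3        # original signal span (V) - assumption
v_ref = 2.048                # external reference (V)
adc_bits = 12                # typical STM32 ADC resolution

divider_ratio = v_ref / v_in_full_scale          # ~0.62
lsb = v_ref / (2 ** adc_bits)                    # volts per count at the ADC pin
lsb_referred_to_input = lsb / divider_ratio      # volts per count at the signal

print(f"divider ratio  : {divider_ratio:.3f}")
print(f"LSB at ADC pin : {lsb * 1e6:.1f} uV")
print(f"LSB at input   : {lsb_referred_to_input * 1e6:.1f} uV")
```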
The opamp's max output current is 26 mA... maybe the micro can survive it for a little bit?
Use an actual resistor to limit the current, don't rely on the output drive limitation alone. Make sure the resistor value is large enough to limit the current to a safe value, but not so large that it affects the sample-and-hold circuitry. STM32 datasheets specify the maximum recommended RAIN values for a given sampling rate, look at it.
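As a rough illustration of the settling side of that trade-off — a generic 1/2-LSB RC settling criterion with assumed internal R/C values, not the exact formula or RAIN table from ST's datasheet (use AN2834 and your part's datasheet for real numbers):

```python
import math

# Sample-and-hold settling check (assumed numbers and a generic 1/2-LSB
# settling criterion - the real RAIN limits are in the STM32 datasheet).
r_ext = 1_000          # chosen series resistor (ohm) - assumption
r_adc = 1_000          # assumed internal switch resistance (ohm)
c_adc = 5e-12          # assumed internal sampling capacitance (F)
n_bits = 12

# Time for the sampling cap to settle to within 1/2 LSB through (r_ext + r_adc)
t_settle = (r_ext + r_adc) * c_adc * (n_bits + 1) * math.log(2)
print(f"required sampling time ~ {t_settle * 1e9:.0f} ns")
```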