The difference is of course the pulsed load on the 3.3V regulator. The blue LED most likely has a Vf up around 3V, so it probably draws current pulses on the order of 10mA or so. OTOH the IR LED will draw pulses of over 100mA. As the LM1086 datasheet recommends an aluminum electrolytic or tantalum output capacitor to provide a minimum ESR for stability, I doubt it's terribly happy with an undersized ceramic plus being 'kicked' hard at a frequency two decades higher than the corner of its frequency response.
The easy fix would be to feed the LED from +5V via a 39 ohm resistor, but I'd still advise plenty of local decoupling between the top end of the feed resistor and the MOSFET source to keep that pulsed current loop small and local.
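As a quick sanity check on that value (a sketch only: the ~1.4V IR LED forward voltage is an assumption, and the MOSFET's on-state drop is ignored):

    /* Back-of-envelope check of feeding the IR LED from 5V via 39R.
       Vf = 1.4V is an assumed typical IR LED forward voltage; the
       MOSFET drop is ignored. */
    #include <stdio.h>

    int main(void)
    {
        const double v_supply = 5.0;   /* supply voltage, V     */
        const double v_f      = 1.4;   /* assumed IR LED Vf, V  */
        const double r_series = 39.0;  /* series resistor, ohms */

        double i_pulse = (v_supply - v_f) / r_series;
        printf("pulse current ~ %.0f mA\n", i_pulse * 1000.0); /* ~92 mA */
        return 0;
    }

That sits within the continuous rating of most common 5mm IR LEDs while still giving a decent pulse.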
I figured that might be the case; I was hung up on the 3.3V because I eventually wanted the thing to be battery powered. As for the regulator, I looked at the datasheet, but lacking any of the recommended capacitors I had to take my chances. During testing, however, I repeatedly bypassed it, to no avail.
FWIW for future testing I will probably stick to your recommendation of 39R and 5V.
I also made a critical error in my original post where I mentioned the "IR LED gets stuck on", when in fact it kept pulsing, leading me to suspect the microcontroller has some part in the fault(s).
In order to reduce the number of "antennas" in my test setup, I have now rebuilt the circuit on a scrap bit of perf board and completely removed the regulator until further notice, instead powering the thing from my (semi-decent) bench PSU; probably also something I should have done sooner.
If space and cost are not major concerns, use three IR LEDs in series, powered from 5V, to get triple the IR, for the same current.
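As a rough sketch of the headroom involved (the per-LED Vf of ~1.35V is an assumption, and the ~90mA target mirrors the single-LED case above):

    /* Rough check of the three-LEDs-in-series idea. The per-LED Vf and
       the target current are assumptions, not measured values. */
    #include <stdio.h>

    int main(void)
    {
        const double v_supply = 5.0;
        const double v_f_led  = 1.35;  /* assumed Vf per IR LED, V   */
        const int    n_leds   = 3;
        const double i_target = 0.090; /* ~90 mA, as in the 39R case */

        double v_left = v_supply - n_leds * v_f_led;  /* ~0.95 V   */
        double r      = v_left / i_target;            /* ~10.6 ohm */
        printf("headroom %.2f V, series resistor ~ %.1f ohms\n", v_left, r);
        return 0;
    }

Note the resistor ends up small, so the actual current becomes quite sensitive to Vf spread and supply tolerance.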
Thanks for the suggestion! Though currently I'm more concerned with getting the circuit to run at all.
Failure to follow the datasheet of the voltage regulator, most likely.
In particular, use of an ancient LDO with strict requirements on the output capacitor. Modern LDO regulators provide actually low dropout and are stable with zero ESR, so there's no need for tantalums or explicit series resistors; satisfying the minimum capacitance is enough.
But that regulator is usable with the correct type and amount of output capacitance, and the massive dropout of 1.5V is just barely OK for a 5V-to-3.3V application; so add a 10µF tantalum and see what happens. Failing that, put in any random 100µF electrolytic and try your luck.
Noted! I did what I could with the parts I had and hoped I could get away with it, as I figured the circuit is not particularly demanding... Though at this stage, I could have just gone out and gotten the proper components.
Is the receiver isolated (in space) from the transmitter?
... or is the transmitted signal interfering with the original IR signal?
I seem to remember that commercial IR repeaters used to simply amplify the original signal without de-modulating, and thus with no significant delays.
Your setup seems to de-modulate the original 38kHz signal, then re-modulate and send. This causes a delay that in turn would cause interfering feedback, should the receiver see the re-transmitted, delayed signals.
If the receiver is fully isolated from the transmitter (perhaps in a different room), then none of the above applies.
Oh man, you sure know how to scare someone; I wish I could tell you I had considered that. Luckily the parts were facing opposite directions, and with my current "overhaul" I further isolated them.
However, there is definitely something funky going on that is infrared-related:
Ok, at this point I'm pretty seriously spooked.
More info:
I'm using a TSSP4P38 to receive and demodulate my IR signal.
My current test setup uses the same components as before, except for the voltage regulator, and it's all somewhat consolidated on perf board instead of a breadboard. R2 has been switched out for 39R and I'm driving the whole thing at 5V from my bench PSU.
With no LED - so no chance of feedback - my scope indicates that the quality and repeatability of the signal going to the transistor gate vary with the distance of the remote control from the receiver:
At about 10cm the signal looks terrible:
At 100cm not always perfect but pretty dang good:
And at 300cm worse again:
Note that in the lumped-together "signal blocks" the signal is still oscillating at 38kHz, so it looks like the controller, for some reason, isn't turning off the output.
Measuring the output of the IR receiver shows why: except at about 1m distance, the decoded signal is simply not representative of what the remote sends, and it happens to correspond exactly with what the micro outputs.
So is the receiver busted?
Measuring its output out of circuit doesn't suggest so at all. No matter the distance of the remote, the receiver output was consistent and, as far as I can tell, correct.
I also measured the signal the controller outputs to the MOSFET gate without having the transistor attached:
Now, distance again does not seem to make a difference. The slow signal decays after some of the pulse blocks are most likely due to the somewhat hacky way the output gets turned off; they occur when the output is deactivated while the 38kHz pulse is high.
This, however, is normally taken care of by the pull-down resistor R1:
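For what it's worth, here's a minimal sketch of how the burst could be ended cleanly in firmware, so the gate is always left driven low rather than being cut off mid-pulse and left for R1 to bleed down. write_pin() and delay_us() are hypothetical HAL stand-ins, not the actual project code:

    /* Sketch: emit one gated burst of 38kHz carrier and always park the
       output low afterwards, so the gate never floats high when the
       burst ends. write_pin()/delay_us() are hypothetical HAL stubs. */
    #include <stdbool.h>
    #include <stdint.h>

    void write_pin(bool level);   /* hypothetical: drive the MOSFET gate */
    void delay_us(uint32_t us);   /* hypothetical: microsecond busy-wait */

    /* One 38kHz carrier period is ~26.3us, i.e. roughly 13us high,
       13us low at 50% duty. */
    void ir_burst(uint32_t carrier_cycles)
    {
        for (uint32_t i = 0; i < carrier_cycles; i++) {
            write_pin(true);
            delay_us(13);
            write_pin(false);
            delay_us(13);
        }
        write_pin(false);  /* explicit: end every burst with the pin low */
    }

Because each carrier cycle completes before the burst ends, the output can never be deactivated while the pulse is high.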
Thanks for all the suggestions so far! I'll probably put the project to rest for the day, gotta clear my head...