For the JIT part, I think that's something we all must accept. After all, as the variety of devices booms, JIT is inevitable. Most programs running on Windows are already JIT-based, thanks to .NET. Almost all software running on Android is too, and some iOS software is JIT-compiled, while Apple is pushing the shift from native to JIT.
In many cases, if your program is not custom-made and algorithm-intensive, JIT can be faster than native code thanks to better-optimized libraries and "free" 64-bit support.
Pre-caching is the technology that makes JIT workable. In many cases, especially on Android, when a JIT program first runs, the OS does a thorough AOT compilation to accelerate subsequent loads. .NET offers a similar feature, but not automatically for user programs (or the evil WinSXS would be much larger).
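To see why that caching matters, here's a minimal sketch of the warm-up cost a JIT pays on a cold start, which AOT pre-compilation exists to skip. The class and method names are purely illustrative, and exact timings will vary by JVM and machine, so none are asserted here:

```java
// WarmupDemo: the first call to work() runs interpreted (or at a low
// compilation tier); after many calls the JIT compiles it to native code.
// AOT pre-caching, as described above, avoids repaying this cost on
// every launch by persisting compiled code between runs.
public class WarmupDemo {
    static long work(int n) {
        long acc = 0;
        for (int i = 0; i < n; i++) acc += (long) i * i;
        return acc;
    }

    static long time(Runnable r) {
        long t0 = System.nanoTime();
        r.run();
        return System.nanoTime() - t0;
    }

    public static void main(String[] args) {
        // Cold: work() has never run, so this call is not yet JIT-compiled.
        long cold = time(() -> work(1_000_000));
        // Warm the method up so the JIT promotes it to compiled code.
        for (int i = 0; i < 20_000; i++) work(1_000);
        long warm = time(() -> work(1_000_000));
        System.out.println("cold=" + cold + "ns warm=" + warm + "ns");
    }
}
```

Running with `-XX:+PrintCompilation` shows when `work()` actually gets compiled; the gap between the cold and warm timings is exactly what a persisted AOT compile amortizes away.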
The experience I've gleaned with JIT over the past couple of decades is that the touted benefits are very rarely, if ever, in evidence, whereas the downsides, in particular slow startup, most definitely are. .NET is possibly the worst offender I've experienced, but Java's not great either. We did a demo of a real distributed system at TechEd Barcelona about 15 years ago, at the birth of .NET. We had to pre-load everything before we took to the stage, as it took ten minutes for the splash screen to appear: hardly great for a live auditorium demo! In practical real-world deployments, I wrote a number of hacks that overrode the GAC by overwriting the cache, to stop it recompiling at the drop of a hat. More recently you've been able to precompile to avoid JIT altogether; there's a reason they added that feature!
A further problem with JIT, whether dynamic or adaptive, is the validity of what's compiled. If the optimiser changes, or, in the case of adaptive compilation, the code is compiled against an unrepresentative dataset, it is quite possible that the resulting code is wrong, so in edge cases you may see dramatic, non-deterministic slowdowns for no apparent reason. I called them edge cases, but that doesn't mean they're rare. I would much rather have consistent, deterministic object code that I can reproduce problems with and correct easily than a quicksand stew where I can't.
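The "unrepresentative dataset" failure mode can be sketched concretely. In this hypothetical example (the shape classes are mine, not from the post), HotSpot warms up a method seeing only one receiver type, so the JIT may speculate a monomorphic call and inline it; when a second type shows up later, that speculation is invalidated and the method is deoptimized and recompiled, which is precisely the kind of unpredictable mid-run slowdown described above:

```java
// Warm-up data contains only Circles, so the JIT can speculate that
// sh.area() always dispatches to Circle.area() and inline it. The first
// Square breaks that assumption, triggering deoptimization. Run with
// -XX:+PrintCompilation to watch the recompile (details vary by JVM).
interface Shape { double area(); }

final class Circle implements Shape {
    final double r;
    Circle(double r) { this.r = r; }
    public double area() { return Math.PI * r * r; }
}

final class Square implements Shape {
    final double s;
    Square(double s) { this.s = s; }
    public double area() { return s * s; }
}

public class DeoptDemo {
    static double sum(Shape[] shapes) {
        double total = 0;
        for (Shape sh : shapes) total += sh.area(); // speculated monomorphic
        return total;
    }

    public static void main(String[] args) {
        Shape[] shapes = new Shape[1000];
        for (int i = 0; i < shapes.length; i++) shapes[i] = new Circle(1.0);
        // Warm-up: the "representative" profile sees only Circles.
        for (int i = 0; i < 20_000; i++) sum(shapes);
        // Production surprise: a Square invalidates the speculative compile.
        shapes[0] = new Square(2.0);
        System.out.printf("sum after deopt trigger: %.2f%n", sum(shapes));
    }
}
```

The program's answer stays correct either way; what changes non-deterministically is the performance profile, since the timing of the deoptimization and recompile depends on the JIT's internal thresholds rather than anything visible in the source.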
IMHO, particularly in dedicated embedded systems, I just don't see any justification for JIT: it's a completely unnecessary technology that adds no value. It's not like the CPU fairies come round in the middle of the night and swap out the CPU on you.