But the elephant in the room is that a typical bare-metal UART driver, even on a modern ARM MCU, is under 10 lines of code!
Maybe three lines to turn the peripheral clock on and connect the UART to the IO pins. Which is something a library does not magically do for you either; you still need to know how to do it, via another library call that is very likely more than 3 LoC.
Then approximately two lines to configure the UART: enable it, set the baud rate register.
And finally, 2 * 2 LoC to use it: check if a byte is available and read it; check the TX empty flag and write a byte.
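To make those line counts concrete, here is a sketch of the whole driver. Note the register layout, names, and bit positions below are invented purely for illustration; every vendor's are different (STM32 USART, NXP LPUART, etc.), so consult the reference manual for the real ones:

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical register layout, for illustration only. */
typedef struct {
    volatile uint32_t CTRL;   /* control: bit 0 = enable           */
    volatile uint32_t BAUD;   /* baud rate divisor                 */
    volatile uint32_t STATUS; /* bit 0 = RX ready, bit 1 = TX empty */
    volatile uint32_t DATA;   /* read = RX byte, write = TX byte   */
} uart_regs_t;

#define UART_CTRL_ENABLE    (1u << 0)
#define UART_STATUS_RXRDY   (1u << 0)
#define UART_STATUS_TXEMPTY (1u << 1)

/* ~2 lines of configuration: set baud divisor, enable peripheral. */
static void uart_init(uart_regs_t *u, uint32_t clk_hz, uint32_t baud)
{
    u->BAUD = clk_hz / baud;
    u->CTRL |= UART_CTRL_ENABLE;
}

/* 2 LoC to receive: check the flag, read the byte. */
static bool uart_getc(uart_regs_t *u, uint8_t *out)
{
    if (!(u->STATUS & UART_STATUS_RXRDY)) return false;
    *out = (uint8_t)u->DATA;
    return true;
}

/* 2 LoC to transmit: wait for TX empty, write the byte. */
static void uart_putc(uart_regs_t *u, uint8_t c)
{
    while (!(u->STATUS & UART_STATUS_TXEMPTY)) { /* spin */ }
    u->DATA = c;
}
```

Clock gating and pin muxing (the "three lines" above) are left out since they are entirely chip-specific, but they are the same shape: a couple of register writes.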
Yes, a total beginner cannot do it in 5 minutes, but 2 weeks is a ridiculous claim, unless they are struggling with the very basics of the C language, which would prevent them from using any library as well.
I have been through this as a beginner myself. I tried to access a standard parallel character LCD using a library: 5 hours, no success. I wrote my own code in 30 minutes. As a beginner hobbyist.
I tried accessing 1-Wire thermometers using a library: 5 hours, no success. I wrote my own code in 2-3 hours. As a beginner hobbyist.
Most of those who actually do things, instead of talking theory, have the exact same experience. Note, this applies to trivial things like UART, GPIO, SPI, ADC...
Beginners often struggle to configure things. They can write crappy C code by trial and error, and they often like to do it. It being crappy doesn't matter to them or anyone else; that's the way to learn.
If "using a library" means you get an easy-to-install Windows IDE which includes said library by default, and hitting "compile" produces a working project, then yes, development is easy. But it is not easy because of the library. The IDE could equally well ship a UART example project containing those 10 lines of code! Then the beginner could actually learn what is happening, and still have that easy learning curve.
But no, that would not result in the nice vendor-lock-in aspect that the manufacturers are so keen on.
The "a library is needed because a library is needed because otherwise NIH" argument is just ridiculous. How many layers of abstraction do you need? Why stop at n when you can have n+1? Apparently you need a library to calculate the Clarke transform, which is two multiplications by constants and an addition. Do you need an RMS library to calculate RMS? Some think you do. Finally, where is the limit: do you need a separate library to calculate a + b, when the C language has a built-in "+" operator?
I propose that, instead of such an extremist principled position, we use more pragmatic metrics to decide on library usage, such as:
* The complexity of doing it yourself
* The complexity of using an available library, including the time invested in finding and installing that library
* The required level of customization. A math processing library, for example, can work with pure functions and compile for any CPU architecture just fine. An MCU peripheral library needs to interact with the interrupt system, other user code in said interrupts, other peripherals, and so on, and hence such libraries tend to cover only simplistic cases.
* Licensing...