Then one day you upgrade to v3.4 of newlang only to find your code no longer compiles, because newlang now supports iterators and 'yield' can no longer be used that way! Right away we must change code, run unit tests, create commits, review code and so on: a total waste of people's valuable time, all imposed by a language because the designers didn't think about backward compatibility.
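The same kind of thing has actually happened in C itself: C23 turned true, false and bool into keywords, so older code that used them as ordinary identifiers stops compiling under the new standard. A minimal sketch of the breakage (the program is made up, but the keyword change is real):

```c
/* Builds cleanly as C11 (e.g. gcc -std=c11), where 'true' is just an
   ordinary identifier unless <stdbool.h> is included.  Under C23
   (gcc -std=c23), 'true' is a keyword, so this same file no longer
   compiles -- exactly the kind of breakage described above.        */
#include <stdio.h>

int main(void)
{
    int true = 1;              /* legal identifier in C11, hard error in C23 */
    printf("value = %d\n", true);
    return 0;
}
```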
It is not just that. When such changes occur, possibly far in the future, the original programmers may be unavailable (busy on other projects) or may have left the company. Even if they are available, it might be many years since they wrote that code, so they have totally forgotten about it.
So for them (or others) to now go back and rework the source code can be time-consuming, problematic (new bugs might be introduced because they no longer understand the code), and an expensive loss of valuable/limited resources that could be better spent doing other useful things.
It might even be someone trying to recompile the software and sort out its dependencies, just so they can use it some time in the future. If the (then) current v3.4 can no longer compile the existing source code base (which could be open source or commercial) without many fatal errors, that will put a real damper on activity around the new language.
Some (very successful and popular) programming languages have such huge existing software bases, in source format, that making future changes to the language without breaking any existing functionality can be a real minefield.
Taking CPU instruction sets as an analogy: on the one hand, the x86/x86-64 instruction sets have amazing compatibility, going back around 40 years to the original 8086/8088 CPUs, but some people think that also brings big baggage-like problems. Alternative architectures such as Arm, on the other hand, tend to be redesigned more freely, keeping compatibility back only one or two instruction-set generations. That potentially allows the instruction set to move on from the old days and soak up modern CPU performance improvements, with minimal handicapping from the old, somewhat obsolete instruction sets of the distant past.
E.g. how many people really run native 8088 programs on their 192-core (dual-socket, 2 x 96-core) AMD Epyc processors?
I have heard that some of that backward compatibility may no longer be available, but I'm not sure. Removing it has been discussed, but I'm not sure whether it actually happened.
Windows seems to happily drop support for even rather modern/recent processors. But that could be because they get paid license fees mainly when new PCs are purchased, so it is in their interests to obsolete CPUs, even rather recent ones.
In other words, it is annoying when a programming language introduces show-stopping breaking changes between, say, versions 2.0 and 3.0. But that might (or might not) be worth it from a longer-term point of view.
E.g. despite Python sometimes changing dramatically between major release versions, it still seems to be very popular. So I wouldn't like to say one method is best; it is a tricky situation to handle.
As they say, you can't make an omelette without breaking eggs. Maybe such dramatic changes are OK if done responsibly and only very rarely, such as once every 7 or 8 years.
If you DON'T ever allow show-stopping changes, the thing (a programming language in this case) can eventually become stale and fixed, unable to move forward with any/much momentum, ultimately risking the language becoming obsolete (i.e. people stop using it) as time goes on.
If you are still primarily aiming for this to be an MCU language, there is a tendency for such projects to have relatively short development time-scales (though they COULD be very long, it depends), with whatever language version they started out with on that particular MCU remaining in use throughout the project's lifetime.
Because if you have MCU (embedded) software that works well and is considered bug-free, made with version 1.24 of the compiler nine years ago, experience tends to show that if you update the compiler to the latest version 9.79, on a much later version of Windows, various bugs that didn't exist previously suddenly appear: mismatches within the new (updated) libraries, and even entire functionality that is no longer present in the latest libraries from the manufacturer.
The latest compiler may not even work with your MCU from many years ago.
If it ain't broke, don't mess with it.
With FPGAs, and older specific FPGAs in particular, it is typically worse: you can't even use older FPGAs with the latest FPGA software packages, as support for those older parts was dropped many years/versions ago.
You started off this thread with mention of PIC MCUs, which tend to lose support for the older PIC models on the latest programmer devices, compilers and libraries. So total backward compatibility may not be necessary or even a good idea; I suppose care/caution is needed.
Also remember the K.I.S.S. (Keep It Simple, Stupid) principle. Making the compiler overly complicated, with way too much functionality, may mean it can never be completed with the available resources (such as the programming team's size and time for creating the compiler). Sometimes you have to harshly remove extra functionality to make things practicable.
I did start the thread with an emphasis on MCU applications, that's true. Perhaps I should explain/clarify what that means. To me, it means a compiled language that includes features that, although having general use, are particularly helpful to the hardware programmer. In other words, a language that can be attractive and linguistically contend with C and comparable languages used today.
For example, explicit support for a bit data type, operators for arithmetic shifts and logical shifts, several ways of overlaying types in memory, offset as well as traditional pointers, and so on.
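For comparison, here is a rough sketch of how a C programmer covers that ground today; the type and field names (status_reg_t and so on) are made up purely for illustration:

```c
#include <stdint.h>
#include <stdio.h>

/* Overlaying two views of the same byte: a raw value and named bits. */
typedef union {
    uint8_t raw;
    struct {                      /* bit-fields stand in for a bit type */
        uint8_t ready   : 1;
        uint8_t error   : 1;
        uint8_t channel : 3;
        uint8_t spare   : 3;
    } bits;
} status_reg_t;

int main(void)
{
    status_reg_t r = { .raw = 0 };
    r.bits.channel = 5;                  /* write a 3-bit field            */

    uint16_t u = 0xF0F0;
    int16_t  s = -640;
    uint16_t logical_right = u >> 4;     /* logical shift (unsigned type)  */
    int16_t  arith_right   = s >> 4;     /* arithmetic shift on most
                                            compilers (implementation-
                                            defined for signed values)     */

    uint8_t  buf[16] = {0};
    size_t   offset  = 4;                /* offset-style addressing...     */
    uint8_t *p       = buf + offset;     /* ...versus a traditional pointer */
    *p = r.raw;

    printf("%02X %04X %04X %02X\n",
           r.raw, logical_right, (uint16_t)arith_right, buf[4]);
    return 0;
}
```

C can do all of it, but much of it leans on conventions (unions, bit-fields, implementation-defined shift behaviour) rather than first-class language support, which is exactly the gap I have in mind.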
A little bit about me might also help you understand my position here; I apologize for the length and the apparent self-indulgence. I was an electronics hobbyist as a kid, when I lived in Liverpool, UK. I was about 14 or so when a mate introduced me to transistors, radio and stuff. I quickly became engrossed, and he and I did several projects together in the mid/late 1970s.
We managed to get stuff published in Everyday Electronics too: a bit of cash, but a fun experience for two teenagers. I soon became absorbed by microprocessors after they became huge in magazines like Practical Wireless and ETI; I bought a basic 6502 kit from Acorn and interfaced that to motors and all sorts.
By the age of 18/19 I got into a specialist electronics college (my mate got a place at Riversdale college in Liverpool, studied marine radio and radar, and ended up doing defense work overseas for a few years) and for the next two years studied electronics full time: DC/AC, passive circuits, transistors, amplification, RF, telephone systems, power/motors, microprocessors, digital systems, with lots of hands-on work with scopes, soldering and stuff. A pretty dense two-year education, approximately at least the first year of an EE degree IMHO.
After I finished I did some more writing for about a year, just living an ordinary life in Liverpool and still reading everything I could find on electronics and computing; this was around 1980/1981.
I started to look for a job. I hugely wanted an electronics job; I had very good marks from college, several published articles in my portfolio and so on, but could not find work. Liverpool was depressed, with high unemployment and factories closing, and a few jobs I did interview for outside of Liverpool never materialized. I think not having a proper degree was a handicap too, despite my definite competence and skills.
By fluke a friend mentioned a new 14-week government-run course in Liverpool, taught by a private firm and subsidized by the government to deal with high unemployment. Two hundred applied for twenty places; I was one of those who got through the lengthy test process.
I was asked to choose between training in IBM Assembler, COBOL and PL/I. A friend said "PL/I's a pretty serious language, great for writing systemy stuff" so I said OK what the heck and chose that.
Toward the end the firm made efforts to get us interviews. I got one in Enfield, London, and they offered me a job, so I left home and started working for the local government as a trainee PL/I programmer. It was nice, decent work, with lots of older, experienced, clever people and several lads about my age, all interested in microprocessors, BASIC, assembler, electronics, Apple, etc.
I learned a lot about large-scale projects, working in a team, paying attention to detail, and communicating technically with others. I was liked, and I liked the place.
I ended up working as a software guy thereafter; my 'dream' of a serious electronics career faded and I became a "developer", as I am to this day.
Now, as a trained, educated, intelligent electronics person, I was interested to see PL/I's features for bits, storage overlays, pointers and so on. I was very at home with technically oriented software, and people gave me "technical" work because I had insights that only an electronics engineer would have.
I took it for granted that many other languages also supported stuff like bits, pointers, offsets, etc., but slowly realized they did not. As I learned C and other languages, I recall thinking, "Man, this is crazy, that older mainframe language is way beyond these other languages, and it would be much better to write stuff in PL/I than C," and so on.
In a word, most other languages looked feeble, rushed, poorly thought out and clumsy, to me anyway; and recall that I was not and am not a lightweight in electronics.
So that's my background, and when I discuss programming languages and MCU programming, I'm not naive, as some here might want to think. I am inexperienced specifically with these devices and don't pretend to be an expert, but I do know how to develop software, I do understand electronics, and I do understand compilers, machine language, assembler, and so on.
So this "new language" if I do take it to the next level, is a very serious subject to me, and I see a place for a language that specifically supports features that will aid in MCU development work, I'm 100% definite about that. It's also very interesting, I've built a large compiler already in the past (and wrote it in C) so I am well aware of what's involved.
The PIC stuff was just comments I'd picked up on other forums: PIC developers griping about how hard it is to use C. It seems PIC is an outlier anyway, so nothing is to be gained by dwelling on that.