Author Topic: The Imperium programming language - IPL  (Read 86839 times)


Offline Sherlock Holmes (Topic starter)

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #50 on: November 23, 2022, 04:35:17 pm »
The idea of the compiler calculating NOP's to yield equal execution time through multiple code paths sounds great until you remember interrupts, which are almost always a part of embedded systems.

What are you going to do, disable interrupts for large portions of code to prevent asynchronicity? That defeats the very purpose of hardware interrupts and essentially forces a polling based architecture that won't be compatible with a huge number of embedded environments.

I suppose you could commandeer some of the target's hardware timer resources and set up a sort of timekeeping scheme, but that smacks of certain dynamic "features" which are the worst parts of C++ and many other OO-like languages.

I'm not discouraging you from the exercise, but have realistic expectations. The chance for something like this to go mainstream is very close to zero.

These are absolutely important questions; they pertain, though, to language semantics, and I've not devoted much time at all to that area yet. I'm primarily looking at the grammar, and leaning toward a PL/I-like grammar, which is simple yet offers huge flexibility for future growth of the language.

The suitability of emitting umpteen NOP instructions is down to the developer; it is for them to use or not as they see fit, and of course they could abuse or misuse it, just as we can today with many languages. So I fully accept the concerns you raise, but see them as the domain of the designer rather than of the language itself.
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #51 on: November 23, 2022, 04:41:32 pm »
That's simply not true; we can emit multiple platform-specific NOPs today by embedding assembler in C. There are assembler MCU developers out there who struggle to use C because of this kind of thing. They have carefully crafted code where they want some execution path to take exactly the same number of clock cycles as another, so they sometimes embed multiple NOPs; their designs require that.

Simple enough microcontrollers that can produce cycle-accurate, predictable timing by simple instruction counting are becoming exceedingly rare. These 8-bitters still exist, but such cycle-accuracy combined with generally very crappy performance is something people do not actually want. Instead, people buy modern high-performance microcontrollers which can still produce very accurate timing by scaling down the absolute time used per cycle, by utilizing higher clock speed. For example, a 12-cycle ISR latency on a 400 MHz MCU looks like HALF a cycle on a 16 MHz AVR/PIC; at that point you simply do not care if it sometimes takes 14 cycles due to some pipeline or branch mispredict or something. I have done pretty timing-sensitive things simply in interrupt handlers, and the advantage is ease of writing, reading, and maintaining that code. Manual cycle counting, or an automated version thereof, is simply not needed anymore, except in very rare cases, which require careful understanding anyway.

The problem with this "predictive timing language" is that it becomes tied to a certain type of hardware, and then you have replicated what the XCORE folks have done (apparently pretty well).

This is simply not true: I spoke recently with very experienced engineers who routinely use assembler for 8-bit PIC processors, which are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge, with code bloat and optimizations causing serious problems; even simple optimizations like short-circuit evaluation.

All I'm really arguing for here is a new language with features that help avoid these kinds of problems, where some aspects of the generated code are controllable in a fine-grained way; the NOP idea is just an example of what's required rather than a formally proposed language feature.
 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #52 on: November 23, 2022, 04:55:16 pm »
I am not an MCU programmer.

Why not? And why am I posting here?

Well, I am not an MCU programmer because, for my needs, I don't need it, and for hobby purposes it is, simply put, too complex for the occasional project.

I am posting here, to add a different perspective:

In my opinion, the greatest revolution in MCU programming was due to Arduino and its extremely easy and well documented SDK. This opened doors to many hobby applications otherwise inaccessible to a wide percentage of hobbyists.

So I wonder: why try to program "Yet Another C Compiler" for MCUs, if the existing programmers have no problem with the existing SDKs (as the discussion so far indicates)? Why not, instead, develop an SDK/language for the masses that allows more generic (cheaper) MCUs to be used and programmed as easily as (or even more easily than) with Arduino?

Also, another suggestion: current MCU SDKs are bloated in size! If you want to program PIC, AVR and something else, you easily fill your hard disk with three huge installations. If you want to do something nice, make it small! Preferably something that just unpacks into a single folder.

Sorry if this is beyond your purpose or if my line of thought doesn't make sense (it does to me, though).

Regards,
Vitor

I couldn't agree more. The basis for my interest is simply that C is not, and never was, designed for MCU programming; it was never designed for that problem domain at all. So my questions are really about how a language could, and should, be designed specifically for this problem domain. That's where I'm coming from.

I am also certainly not proposing "Yet Another C Compiler" either, the language might end up with a few superficial similarities to C but will likely have more in common with say PL/I as well as other languages.

The issue of some degree of fine-grained control over code generation keeps coming up too. I don't think that is ever given much consideration in programming language design; the goal for decades has been to abstract the hardware away as much as possible. Considering ideas and options for giving some kind of control over code generation is a novel concept.

As I said earlier, the "emit multiple NOPs" idea is more by way of an example, a way to convey the problem. Another way to expose that kind of control at the language level might be something like:

Code:
exact void some_function()
{
    if (A | ^B & C | ^D)
    {
        // do stuff
    }
    else
    {
        // do other stuff
    }
}

Where an attribute "exact" or "uniform" (for example) conveys the requirement that all execution paths should take the same time, so far as is reasonably possible anyway. That is a direct influence over code generation and platform specifics, so how it would work isn't the subject (yet). But there are immediate semantic implications; for example, an "exact" function could never be allowed to call or invoke any other function that was not itself "exact".




 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11780
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #53 on: November 23, 2022, 05:00:06 pm »
I'm not sure why you are trying to convince us. If you think it is a good idea, do it. Get back to us when you have it working and we'll have a look at it.

This is not the first dreamy thread about ideal languages. There is no point in dreaming up more potential ideas until there is a base compiler that at least does something.
Alex
 
The following users thanked this post: Siwastaja, newbrain

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #54 on: November 23, 2022, 05:01:24 pm »
I've only mentioned the kinds of things raised by very experienced hardware/software engineers who rely heavily on PIC. Their needs, concerns and frustrations are real, not imagined; I am simply collating broad issues and concerns raised by people who find C restrictive or inflexible.

To me, you just sound like a case of Dunning-Kruger. Those who "rely heavily on PIC" probably are "very experienced engineers" only by very flexible definition of "very experienced". Your posts have already demonstrated you having quite serious lack of basic knowledge*, probably by your learning getting hindered by excessive self-esteem. I would say: lurk moar. This forum is a great resource, read and participate in discussions on the MCU & programming subforums for 5 more years. There is a lot to learn, and by getting to read comments from some people MUCH MUCH more experienced than your PIC friends, your priority list of things you would like to see in a "C replacement" would significantly change.

*) for example, mixing up totally what "native" types mean, which is pretty important in the context of a hardware-oriented language!

But sure, there is a niche of cases where writing cycle accurate things on a 8-bit PIC is still relevant, and where a C replacement which can help do that with less work, is also relevant. But the result would not be "a new, hardware oriented programming language", but "a PIC-specific niche tool".

And sorry in advance for being blunt, don't take it too seriously, I just struggle to write in any other way, take it with a grain of salt.

I don't regard your post as blunt, only rude. Your personal opinions of my "understanding", "knowledge" and "excessive self-esteem" have no place in intellectual discourse. They are, if I might be so blunt, little more than ad hominem arguments. So can we please stick to the subject under discussion, and not your opinions of my character, motives or abilities?

Thank you.
« Last Edit: November 23, 2022, 05:05:20 pm by Sherlock Holmes »
 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #55 on: November 23, 2022, 05:04:33 pm »
I'm not sure why you are trying to convince us. If you think it is a good idea, do it. Get back to us when you have it working and we'll have a look at it.

This is not the first dreamy thread about ideal languages. There is no point in dreaming up more potential ideas until there is a base compiler that at least does something.

With all due respect, I am not the subject, sir; the subject is in the thread's title. Your opinions of me, my motives and so on are irrelevant to the subject under discussion.

 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #56 on: November 23, 2022, 05:08:23 pm »
I have a golden rule when discussing potentially emotive subjects in internet forums; I know others likely share it, but evidently some do not. The rule is that I never say anything to a person that I would not be prepared to say to them in person, sitting in a meeting room with several other people. If everyone applied that rule before posting, the world might become a better place. Anyway, back to the subject: I'm enjoying the discussion, some good constructive replies have come in and they do give food for thought.


« Last Edit: November 23, 2022, 05:10:53 pm by Sherlock Holmes »
 

Offline ataradov

Re: A new, hardware "oriented" programming language
« Reply #57 on: November 23, 2022, 05:10:36 pm »
Ok, but what do you want from us then?

The only question in the OP is what we think about the existing languages. And the majority of people who have participated in the thread seem to agree that existing languages are sufficient, to the point where there is no emergency. If something better comes up, we'll have a look at it.

Nobody wants to think of cool features you can implement in some imagined language because this is not the first and not the last thread. Nothing ever comes out of them.
 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #58 on: November 23, 2022, 05:20:04 pm »
Ok, but what do you want from us then?

I see the participants here as individual and independent voices, so I would not refer to them as "us"; you speak for yourself here. There is no "us", and to imply that there is presumes everybody here agrees with your views, which I don't think is true.

The only question in the OP is what we think about the existing languages. And the majority of people who have participated in the thread seem to agree that existing languages are sufficient, to the point where there is no emergency. If something better comes up, we'll have a look at it.

Even if that were true, what of it? Disagreement - polite, sound, reasoned disagreement - is a normal and healthy part of intellectual discourse; being disagreed with is often valuable in some way or other. I certainly do not expect universal agreement in any technical discussion.

Nobody wants to think of cool features you can implement in some imagined language because this is not the first and not the last thread. Nothing ever comes out of them.

So the argument you are now making is that you can prove with absolute certainty that discussions in this forum about programming language grammars and semantics always, absolutely, in every case, never ever yield anything of utility?

That, sir, is a philosophical opinion, not an evidence-based proposition; at least, I'd be very surprised indeed if you could offer a proof.





 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #59 on: November 23, 2022, 06:03:12 pm »
...

I wouldn't have said it better.
Mentioning PICs, some of the proposed improvements are already part of the XC compilers; others are requests that clearly come from assembly programmers who can't stop thinking in assembly and ostensibly write in what I call "Clever C" (forgetting that you CAN, and sometimes must, use separate assembly modules; just conform to the ABI and call them as C functions so the compiler is happy, and other programmers will be as well).
Dedicated strings require a heap, and personally speaking, in a C project I prefer to avoid the heap. In languages in which it is a given I simply apply another mindset (using it responsibly, of course).

I've only mentioned the kinds of things raised by very experienced hardware/software engineers who rely heavily on PIC. Their needs, concerns and frustrations are real, not imagined; I am simply collating broad issues and concerns raised by people who find C restrictive or inflexible.

I do hope you aren't planning a language that is specific to current PIC processors and current PIC peripherals.

If that isn't the case, what will your strategy be for inclusion/exclusion of a language feature? Either a small language, all of which is useful with all processors/peripherals, or a large language that contains all possibilities, only some of which are exploitable with a given processor and peripherals.

Without being able to articulate your strategy (to an audience including yourself), you will flail around in different directions.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9559
  • Country: gb
Re: A new, hardware "oriented" programming language
« Reply #60 on: November 23, 2022, 06:28:59 pm »
This is simply not true: I spoke recently with very experienced engineers who routinely use assembler for 8-bit PIC processors, which are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge, with code bloat and optimizations causing serious problems; even simple optimizations like short-circuit evaluation.
Code bloat is pretty much inevitable when using a high level language for a PIC or an 8051. There was one very focused C compiler for tiny cores. Microchip bought it, and shut down things like the 8051 version, which had been quite popular for putting fairly complex apps on 8051 cores. That left the market with no dense-code compilers for the 8051, and a reasonably dense compiler for the PIC. If you think code bloat is bad with the PIC compiler, you should try looking at some 8051 apps.
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 8888
  • Country: fi
Re: A new, hardware "oriented" programming language
« Reply #61 on: November 23, 2022, 06:52:50 pm »
This is simply not true: I spoke recently with very experienced engineers who routinely use assembler for 8-bit PIC processors, which are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge, with code bloat and optimizations causing serious problems.

(Emphasis mine.) Again being rude (to other people, too): sounds like peter-h is one of your very experienced engineers.

I can relate, I have written cycle-accurate code on 8-bitters and so on, but got into more challenging projects more than 15 years ago. I still believe what you propose is more like a niche tool than actually useful as a generic paradigm, or a necessity.

Old projects, on old HW,  written using old-time methods - let them be, or if you absolutely must change them, find people who can work like "back then". If large changes are needed, redesign using today's HW and design principles.

I'm suspicious about the need of doing large, fundamental rewrites to 8-bit projects without hardware changes, and I'm also suspicious about the supposedly large size of these 8-bit projects. They are usually quite small, given the limitations in memory.

Squeezing out the very last drops of memory or CPU performance (micro-optimization) is specialized work anyway. If you are going to sell 10000 units you just use large enough MCU and pay some dozen cents more for it and don't need to optimize until cows come home. If you are going to sell 10000000 units, then it makes financial sense to squeeze hell out of it to save $0.10 * 10000000 = $1000000, but then you can afford a real professional who does the job in a few days and does not complain about needing a new language. Been there, done that, too.
 
The following users thanked this post: newbrain, JPortici

Offline IDEngineer

  • Super Contributor
  • ***
  • Posts: 1944
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #62 on: November 23, 2022, 07:12:21 pm »
C is not and never was designed for MCU programming
True, and yet it's just about the perfect "universal language" for embedded applications. It's close to the hardware without (necessarily) requiring the user to learn the specifics of the chip. Bloat is often related to a language's abstraction level, after all. Just as bloat is often related to the programmer's inexperience with the underlying hardware.

As for a "language that is [processor] specific", that's what libraries are for. And that's not limited to C, either. All the common, basic constructs are part of the language whether it's C, C++, Python, Basic, etc. Then, if you don't want to roll your own interface to the specific hardware, you grab a library. Sure it will be bloated since libraries generally try to be all things to all people, but that's the price you pay for the convenience of being abstracted (there's that word again!) from the hardware details.

I suspect all you're going to get with an "embedded language" is something that looks like C with a bunch of libraries - and that exists today. And it will be specific to that one family... so you'll have different "language versions" for various processors. Even the API won't be the same because the on-chip peripherals differ from one family to another, so code won't be "portable" except in the coarsest sense.

Consider the complexities of limited resources: I'm short one Timer, but I figure out that on this specific chip I can have it feed into a CCP module and then use the latter's interrupt to effectively synthesize an extra Timer. How are you going to make that portable? Today's available timers are 8 bits, tomorrow's are 16 bits, different families have different numbers of 8 and 16 bit timers, some of the (E)CCP modules have unique features (example: timestamping inbound CAN messages), etc. This is why there are different libraries for different families, and why most skilled embedded developers don't bother with libraries anyway.

Again, I'm not trying to be a downer here. I love your enthusiasm and passion. But you openly stated "I've not devoted much time at all to language semantics yet". Semantics and implementation are almost the same thing in embedded environments. When you're dealing with a full blown, virtual memory, isolated execution context operating system you have the luxury of living in an academia-like "ideal world" where everything is fully abstracted. Embedded systems and microcontrollers aren't like that. Properly done, every line of embedded code is written while considering the blended hardware+firmware environment because every line can have ill effects.

I think this thread can have merit if you focus on asking people what problems they have, and then go off to consider how you'd solve those situations. So far the proposed "features" would be hindrances to me and I would aggressively avoid using them.
 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #63 on: November 23, 2022, 07:14:37 pm »
...

I wouldn't have said it better.
Mentioning PICs, some of the proposed improvements are already part of the XC compilers; others are requests that clearly come from assembly programmers who can't stop thinking in assembly and ostensibly write in what I call "Clever C" (forgetting that you CAN, and sometimes must, use separate assembly modules; just conform to the ABI and call them as C functions so the compiler is happy, and other programmers will be as well).
Dedicated strings require a heap, and personally speaking, in a C project I prefer to avoid the heap. In languages in which it is a given I simply apply another mindset (using it responsibly, of course).

I've only mentioned the kinds of things raised by very experienced hardware/software engineers who rely heavily on PIC. Their needs, concerns and frustrations are real, not imagined; I am simply collating broad issues and concerns raised by people who find C restrictive or inflexible.

I do hope you aren't planning a language that is specific to current PIC processors and current PIC peripherals.

If that isn't the case, what will your strategy be for inclusion/exclusion of a language feature? Either a small language, all of which is useful with all processors/peripherals, or a large language that contains all possibilities, only some of which are exploitable with a given processor and peripherals.

Without being able to articulate your strategy (to an audience including yourself), you will flail around in different directions.

These are good points. No, there is no special focus on PIC products; I just wanted to include some of the problems experienced in that domain in the overall picture. If there were some very obscure thing specific to some very narrow device, there would be little to gain from considering it unless the effort were low.

As for inclusion/exclusion strategies I think that's a fascinating question and I've not thought about it in any real detail yet.

It would include some kind of classification of language features, I suppose. One could start listing features (however these get defined) and then tabulating them for their utility across a range of devices.

In fact this is a very good question, and I might take a stab at starting such a table, with a view to having people more expert than I contribute to it and comment on it.

I suppose we'd even need to define an MCU for the purpose of creating this tabulation.

At a first pass, these things might be inputs to how we do this:

[images attached in the original post, not preserved in this copy]

Of course there are the peripherals to consider too.



Whether the language could or should abstract some aspects of some peripherals is an open question for me. ADCs are ADCs, but there are sometimes fiddly idiosyncrasies specific to some that are not seen in others; this is where my own knowledge is rather weak.

The bus width is a trait familiar to any language designer, so we expose numeric types that reflect that to some extent.

Interrupts are a clear thing to consider too, as are exceptions and memory allocation. There are several ways to manage memory, but most languages limit this to the concepts of static, stack and heap, and offer limited ways to interact with these; perhaps this is something that could be incorporated.

Anyway you do raise some important questions, clearly there is more to be done in this regard.



 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #64 on: November 23, 2022, 07:17:15 pm »
This is simply not true: I spoke recently with very experienced engineers who routinely use assembler for 8-bit PIC processors, which are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge, with code bloat and optimizations causing serious problems; even simple optimizations like short-circuit evaluation.
Code bloat is pretty much inevitable when using a high level language for a PIC or an 8051. There was one very focused C compiler for tiny cores. Microchip bought it, and shut down things like the 8051 version, which had been quite popular for putting fairly complex apps on 8051 cores. That left the market with no dense-code compilers for the 8051, and a reasonably dense compiler for the PIC. If you think code bloat is bad with the PIC compiler, you should try looking at some 8051 apps.

True. I wonder what underlies this "bloat"? Experienced assembly language programmers probably have great insights here, and if some "bloat" is due to the language itself rather than to poor code generation and optimization, it would be interesting to measure this quantitatively somehow.
 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #65 on: November 23, 2022, 07:19:16 pm »
This is simply not true: I spoke recently with very experienced engineers who routinely use assembler for 8-bit PIC processors, which are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge, with code bloat and optimizations causing serious problems.

(Emphasis mine.) Again being rude (to other people, too): sounds like peter-h is one of your very experienced engineers.

I can relate, I have written cycle-accurate code on 8-bitters and so on, but got into more challenging projects more than 15 years ago. I still believe what you propose is more like a niche tool than actually useful as a generic paradigm, or a necessity.

Old projects, on old HW,  written using old-time methods - let them be, or if you absolutely must change them, find people who can work like "back then". If large changes are needed, redesign using today's HW and design principles.

I'm suspicious about the need of doing large, fundamental rewrites to 8-bit projects without hardware changes, and I'm also suspicious about the supposedly large size of these 8-bit projects. They are usually quite small, given the limitations in memory.

Squeezing out the very last drops of memory or CPU performance (micro-optimization) is specialized work anyway. If you are going to sell 10000 units you just use large enough MCU and pay some dozen cents more for it and don't need to optimize until cows come home. If you are going to sell 10000000 units, then it makes financial sense to squeeze hell out of it to save $0.10 * 10000000 = $1000000, but then you can afford a real professional who does the job in a few days and does not complain about needing a new language. Been there, done that, too.

I recently came across this in my studies:

[image attached in the original post, not preserved in this copy]

Now I interpret that to mean that 8-bit devices are alive and well and very much in demand.
« Last Edit: November 23, 2022, 07:29:33 pm by Sherlock Holmes »
 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #66 on: November 23, 2022, 07:26:59 pm »
C is not and never was designed for MCU programming
True, and yet it's just about the perfect "universal language" for embedded applications. It's close to the hardware without (necessarily) requiring the user to learn the specifics of the chip. Bloat is often related to a language's abstraction level, after all. Just as bloat is often related to the programmer's inexperience with the underlying hardware.

As for a "language that is [processor] specific", that's what libraries are for. And that's not limited to C, either. All the common, basic constructs are part of the language whether it's C, C++, Python, Basic, etc. Then, if you don't want to roll your own interface to the specific hardware, you grab a library. Sure it will be bloated since libraries generally try to be all things to all people, but that's the price you pay for the convenience of being abstracted (there's that word again!) from the hardware details.

I suspect all you're going to get with an "embedded language" is something that looks like C with a bunch of libraries - and that exists today. And it will be specific to that one family... so you'll have different "language versions" for various processors. Even the API won't be the same because the on-chip peripherals differ from one family to another, so code won't be "portable" except in the coarsest sense.

Consider the complexities of limited resources: I'm short one Timer, but I figure out that on this specific chip I can have it feed into a CCP module and then use the latter's interrupt to effectively synthesize an extra Timer. How are you going to make that portable? Today's available timers are 8 bits, tomorrow's are 16 bits, different families have different numbers of 8 and 16 bit timers, some of the (E)CCP modules have unique features (example: timestamping inbound CAN messages), etc. This is why there are different libraries for different families, and why most skilled embedded developers don't bother with libraries anyway.

Again, I'm not trying to be a downer here. I love your enthusiasm and passion. But you openly stated "I've not devoted much time at all to language semantics yet". Semantics and implementation are almost the same thing in embedded environments. When you're dealing with a full blown, virtual memory, isolated execution context operating system you have the luxury of living in an academia-like "ideal world" where everything is fully abstracted. Embedded systems and microcontrollers aren't like that. Properly done, every line of embedded code is written while considering the blended hardware+firmware environment because every line can have ill effects.

I think this thread can have merit if you focus on asking people what problems they have, and then go off to consider how you'd solve those situations. So far the proposed "features" would be hindrances to me and I would aggressively avoid using them.

Again, this raises excellent points; I will re-read your post later. But I did want to ask which of these would hinder rather than help you:


  • No reserved words, thus enabling new keywords to be added over time.
  • Support 'bit' as a native data type.
  • Support 'strings' as a native type, as well as BCD/decimal.
  • Support for namespaces.
  • Computed gotos.
  • Flexible alignment, packing and padding directives.
  • Nested functions.
  • Precision timing features, such as emitting multiple NOP operations or ensuring identical execution time for (say) the case clauses in a switch.
  • Support for an async/await model.
  • Synchronization and fence abstractions.
  • Possibly expose "bit bands" linguistically, where a target supports them.
  • Support for sparse arrays.

What are the kinds of things you'd personally include in a list like this?

Consider the namespaces question: I implemented this in C recently as part of a small hobby project, and it would have been so much easier, lower maintenance, and higher performance if the language simply exposed a "namespace" keyword or something similar.

What about a computed goto:

Code:

handler = calculate_handler();

goto manage_event[handler];


We can have an array of function pointers in C, but we can't have an array of labels (goto targets). Again, this is low cost from a language design standpoint (as are namespaces) yet pretty helpful. Function pointers work too and can be coded up, but with a nested design, where blocks of code goto blocks of code that might in turn goto further blocks, the call/return/push/pop overhead starts to grow, whereas with a computed goto it vanishes.

The computed goto could prove helpful in implementing state machines without the cost incurred by tables of function pointers, especially in designs where the handlers do very little and the cost of invoking them exceeds the cost of the work they perform.


« Last Edit: November 23, 2022, 07:48:38 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15439
  • Country: fr
Re: A new, hardware "oriented" programming language
« Reply #67 on: November 23, 2022, 07:48:52 pm »
I personally think your list is a mixed bag with a lot of too specific features (such as the computed gotos and the cycle-based timing) which seem odd at best, but that's just an opinion.
For a lot of the rest, this is actually available in... Ada. The problem, sure, is that Ada tends to yield large object code, so it's not really suited to small targets.

As to specifically namespaces, I agree, but I do think the "module" approach is much, much better than raw namespaces. Namespaces are a hack.

I've been saying that C lacks modules, for a long time. They introduced modules in C++ recently, but the definition they introduced is IMO a joke. Proper modules? Just look at Modula-2, Modula-3, Oberon, Ada packages. I particularly like Modula-3 modules that can be parameterized, Ada packages as well.

 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #68 on: November 23, 2022, 07:54:09 pm »
I personally think your list is a mixed bag with a lot of too specific features (such as the computed gotos and the cycle-based timing) which seem odd at best, but that's just an opinion.
For a lot of the rest, this is actually available in... Ada. Problem sure is that Ada tends to yield large object code so not really adapted to small targets.

As to specifically namespaces, I agree, but I do think the "module" approach is much, much better than raw namespaces. Namespaces are a hack.

I've been saying that C lacks modules, for a long time. They introduced modules in C++ recently, but the definition they introduced is IMO a joke. Proper modules? Just look at Modula-2, Modula-3, Oberon, Ada packages. I particularly like Modula-3 modules that can be parameterized, Ada packages as well.

Tell me about "modules": how do they work, and what do they buy you? I'm interested in hearing this from a real-world "language user", so to speak! Are they for binary reuse, or a way to package source code for easier reuse?
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #69 on: November 23, 2022, 08:00:04 pm »
I wanted something like this "module" idea and became fixated on it recently; in fact, it was a trigger for my getting interested in the whole language project.

I crafted a way of working that would be much nicer if innately supported at the language level. You can read what I did here; I documented it for my own sake and to sanity-check myself.
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4538
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #70 on: November 23, 2022, 08:09:05 pm »
This is simply not true, I spoke recently with very experienced engineers who routinely use assembler for 8 bit PIC processors that are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge with code bloat and optimizations causing serious problems. Simple optimizations too like short-circuit evaluations.

That is because 8 bit PIC is utterly unsuited to running any kind of modern high level compiled language.
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #71 on: November 23, 2022, 08:30:09 pm »
This is simply not true, I spoke recently with very experienced engineers who routinely use assembler for 8 bit PIC processors that are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge with code bloat and optimizations causing serious problems. Simple optimizations too like short-circuit evaluations.

That is because 8 bit PIC is utterly unsuited to running any kind of modern high level compiled language.

That can't be true, Bruce. CP/M for example - an OS no less - was written in PL/M, a high-level compiled language. CP/M ran on the 8080, Z80 and 8085, all 8-bit devices, and it has been ported to other 8-bit chips too.

The device doesn't "run a language" either; it runs machine code. So long as that code has been produced sensibly for the target, it will run no differently from any other code, surely?

Also, C cannot be described as a "modern language" by any stretch; it must be fifty years old by now.




« Last Edit: November 23, 2022, 08:31:43 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4538
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #72 on: November 23, 2022, 09:16:17 pm »
This is simply not true, I spoke recently with very experienced engineers who routinely use assembler for 8 bit PIC processors that are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge with code bloat and optimizations causing serious problems. Simple optimizations too like short-circuit evaluations.

That is because 8 bit PIC is utterly unsuited to running any kind of modern high level compiled language.

That can't be true, Bruce. CP/M for example - an OS no less - was written in PL/M, a high-level compiled language. CP/M ran on the 8080, Z80 and 8085, all 8-bit devices, and it has been ported to other 8-bit chips too.

And this has what to do with PIC, exactly?

8080/z80 is about 100x more compiler-friendly than PIC. But about 10x less compiler-friendly than the also 8 bit 6809 or AVR.

Quote
The device doesn't "run a language" either, it runs machine code, so long as that code has been produced sensibly for the target it will run no differently to any code surely?

You're talking to someone who writes compilers, and also designs new CPU instructions. Thanks for the lesson.

8 bit PIC is fundamentally unsuited to at least the following features of modern programming languages:

- pointers and anything that is not located at an absolute address

- constant data structures in ROM

- recursive or reentrant functions

- deep call chains at all

- runtime code generation

Quote
Also C cannot be described as a "modern language" by any stretch, it must be fifty years old.

Ahead of its time, clearly.

Sadly, it is becoming more and more apparent that you don't know the first thing about either CPU instruction sets or programming languages, and you won't listen to those who do, so I'm out of this conversation.
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15439
  • Country: fr
Re: A new, hardware "oriented" programming language
« Reply #73 on: November 23, 2022, 09:16:56 pm »
I've used 8-bit PICs in the past, and while for the 16F series I only used assembly (had tried a C compiler back then, but it was just not worth it), for the 18F series, I only used C. The mcc18 compiler produced decent code for the 18F. Never had any issue with that. I've also used SDCC on 8051 stuff (Cypress FX1/FX2) with success. So, certainly usable.
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #74 on: November 23, 2022, 09:47:04 pm »
This is simply not true, I spoke recently with very experienced engineers who routinely use assembler for 8 bit PIC processors that are a large market. Trying to recode some parts of their existing, working designs in C is proving a huge challenge with code bloat and optimizations causing serious problems. Simple optimizations too like short-circuit evaluations.

That is because 8 bit PIC is utterly unsuited to running any kind of modern high level compiled language.

That can't be true, Bruce. CP/M for example - an OS no less - was written in PL/M, a high-level compiled language. CP/M ran on the 8080, Z80 and 8085, all 8-bit devices, and it has been ported to other 8-bit chips too.

And this has what to do with PIC, exactly?

8080/z80 is about 100x more compiler-friendly than PIC. But about 10x less compiler-friendly than the also 8 bit 6809 or AVR.

Quote
The device doesn't "run a language" either, it runs machine code, so long as that code has been produced sensibly for the target it will run no differently to any code surely?

You're talking to someone who writes compilers, and also designs new CPU instructions. Thanks for the lesson.

8 bit PIC is fundamentally unsuited to at least the following features of modern programming languages:

- pointers and anything that is not located at an absolute address

- constant data structures in ROM

- recursive or reentrant functions

- deep call chains at all

- runtime code generation

Quote
Also C cannot be described as a "modern language" by any stretch, it must be fifty years old.

Ahead of its time, clearly.

Sadly, it is becoming more and more apparent that you don't know the first thing about either CPU instruction sets or programming languages, and you won't listen to those who do, so I'm out of this conversation.

Please review our conversation. I described an engineer who was encountering significant difficulties rewriting some of his existing (large) assembler code in C. The C compiler he was using is supplied by Microchip for their 8-bit PIC devices.

So before making sweeping pronouncements like "8 bit PIC is utterly unsuited to running any kind of modern high level compiled language", you need to ask yourself why Microchip are seemingly unaware of this.

That you happen to know more than I do about PIC does not justify insults either; whatever happened to mature, mutually respectful, polite discourse...

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

