Author Topic: The Imperium programming language - IPL  (Read 86830 times)


Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #225 on: November 27, 2022, 04:15:41 pm »
Around the same time I was skeptical about the approach of the HP PA-RISC successor, the Itanic (Itanium). At that time any tiny change to the implementation required someone to re-hand-optimise inner loops; the compiler was supposed to solve that issue, but never quite managed it. In addition, just as CPU power consumption was becoming the limiting factor, the Itanic strategy was to waste power doing lots of speculative execution.
It was obvious from the very start of Itanium development that its strategy would make for extremely high power consumption. So, it seemed to be a strategy to split the market into mobile and non-mobile threads. That was always a very weird strategy for a company like Intel, when notebooks were a booming business.

Intel did not design the Itanic architecture. HP designed the Itanic.

The objective was simple. In 1989 it was known that HP PA-RISC would run out of steam in a decade's time. When that became significant to customers, they would look for alternatives. What became the Itanic was simply a means of keeping customers buying HP PA-RISC machines. By the mid to late 90s that was the server market.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Mechatrommer

  • Super Contributor
  • ***
  • Posts: 11713
  • Country: my
  • reassessing directives...
Re: A new, hardware "oriented" programming language
« Reply #226 on: November 27, 2022, 04:28:14 pm »
I've been pondering the idea of attempting to create a new (compiled) programming language specifically designed for hardware programming... tldr...
This is not as crazy as it might at first appear, I'm a very experienced software developer
leave the job to computer scientists and linguists, or at least, go discuss with them. what you are trying to do is like a very experienced engineer who wants to lay out a new theoretical physics in quantum mechanics. you may end up reinventing the wheel, or may not know what you are dealing with.. if you have to ask here, you are not the man, trust me, i'm sorry to break the mood.. i'm not the man, nor are most of us here. so rather than keep arguing endlessly, like a bunch of experienced engineers discussing quantum mechanics that they have no clue about... find some reading materials about what a "computer programming language" is all about.. it's classical knowledge, nothing new. C/C++ was designed to be as low level as it can get while maintaining human-readable syntax and avoiding coupling to any particular machine architecture, but generic and compatible enough for all. if you think you can do better, show your working proof. just a blueprint or "list of ideas" will not be enough. actually try to do it and you may find the real barrier in the middle of your journey. a "list of ideas" is easy to brag about. if you ask me what features i want in a programming language? i would say none! C/C++ provides all i need to convert the semantics of procedural processes into machine-readable (actually working) processes, through subroutines, definitions etc. if you have struggled in embedded systems, i doubt your proficiency as a software developer, esp. if you cannot differentiate between what a "language", a "compiler" and "machine dependent code" are. being able to make a parser or a compiler does not necessarily mean you can make a new "language". cheers, ymmv.
Nature: Evolution and the Illusion of Randomness (Stephen L. Talbott): It's now indisputable that... the organism's “expertise” contextualizes its genome, and it's nonsense to say that these powers are under the control of the genome being contextualized - Barbara McClintock
 
The following users thanked this post: MK14

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #227 on: November 27, 2022, 04:40:38 pm »
So of course we can discuss what the language can do; go ahead, I've been asking for some nine pages now! For example, I recently said that I think the language should support runtime access to metadata like string capacity/length, array dimensions and bounds, and so on. The C language is - IMHO - rather poor in that area, so let's address that.

Accessing implementation internals like string capacity/length: but isn't that exactly where C is the best tool you have? Higher-level languages hide all such implementation details in their string classes (that's the whole point). In C, such a string type does not exist; you have to create your own, which also means you have access to everything you want, exactly as you want it. More work, but this is why embedded developers with limited resources prefer C.

Many high-level languages actually let you find the current and maximum lengths of strings, or the rank and bounds of arrays, at runtime. C does not; sure, you can write code to create abstractions that do that if you really want, but that's really what languages are for: to reduce the need for such drudgery. If the language can reasonably do something that reduces developer effort then let's exploit that. After all, we have "while" and "until" - why have those if we can "simulate" them with a goto loop?


Arrays are indeed a weak point in C, being too primitive and requiring you to babysit implementation details, but if you actually read this thread you'll see it has been discussed in numerous replies above, mostly by Mr. N. Animal. No need to complain about it not being discussed.

On the other hand, people often write crappy code like this, as taught in classes:
Code: [Select]
#define ARRAY_LEN 4
unsigned char array[ARRAY_LEN] = {1, 2, 3, 4};

for(int i=0; i<ARRAY_LEN; i++)
   ...


When C is much more capable:
Code: [Select]
uint8_t array[] = {1, 2, 3, 4}; // automagic array length!

for(int i=0; i<NUM_ELEM(array); i++) // despite being automagic, the length is still known, at compile time! Googling the NUM_ELEM helper macro is left as an exercise for the reader. Add it to utils.h or something
    ...

Of course it's still quite primitive, but... we manage! And those who complain often do not even know what we have available. As I said earlier, use C to its fullest and it's not as crappy as some lazy examples from the 1980s make it seem.

I do not dispute that one can "manage" in C, just as one can "manage" in assembler, but the fact that one can manage in assembler does not demonstrate that the C language is therefore unnecessary.

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #228 on: November 27, 2022, 04:43:52 pm »
Around the same time I was skeptical about the approach of the HP PA-RISC successor, the Itanic (Itanium). At that time any tiny change to the implementation required someone to re-hand-optimise inner loops; the compiler was supposed to solve that issue, but never quite managed it. In addition, just as CPU power consumption was becoming the limiting factor, the Itanic strategy was to waste power doing lots of speculative execution.
It was obvious from the very start of Itanium development that its strategy would make for extremely high power consumption. So, it seemed to be a strategy to split the market into mobile and non-mobile threads. That was always a very weird strategy for a company like Intel, when notebooks were a booming business.

Intel did not design the Itanic architecture. HP designed the Itanic.

The objective was simple. In 1989 it was known that HP PA-RISC would run out of steam in a decade's time. When that became significant to customers, they would look for alternatives. What became the Itanic was simply a means of keeping customers buying HP PA-RISC machines. By the mid to late 90s that was the server market.



« Last Edit: November 27, 2022, 04:50:55 pm by Sherlock Holmes »
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #229 on: November 27, 2022, 04:48:51 pm »
I've been pondering the idea of attempting to create a new (compiled) programming language specifically designed for hardware programming... tldr...
This is not as crazy as it might at first appear, I'm a very experienced software developer
leave the job to computer scientists and linguists, or at least, go discuss with them. what you are trying to do is like a very experienced engineer who wants to lay out a new theoretical physics in quantum mechanics. you may end up reinventing the wheel, or may not know what you are dealing with.. if you have to ask here, you are not the man, trust me, i'm sorry to break the mood.. i'm not the man, nor are most of us here. so rather than keep arguing endlessly, like a bunch of experienced engineers discussing quantum mechanics that they have no clue about... find some reading materials about what a "computer programming language" is all about.. it's classical knowledge, nothing new. C/C++ was designed to be as low level as it can get while maintaining human-readable syntax and avoiding coupling to any particular machine architecture, but generic and compatible enough for all. if you think you can do better, show your working proof. just a blueprint or "list of ideas" will not be enough. actually try to do it and you may find the real barrier in the middle of your journey. a "list of ideas" is easy to brag about. if you ask me what features i want in a programming language? i would say none! C/C++ provides all i need to convert the semantics of procedural processes into machine-readable (actually working) processes, through subroutines, definitions etc. if you have struggled in embedded systems, i doubt your proficiency as a software developer, esp. if you cannot differentiate between what a "language", a "compiler" and "machine dependent code" are. being able to make a parser or a compiler does not necessarily mean you can make a new "language". cheers, ymmv.

Another "I'm content with 'C' so any kind of discussion is fruitless" post, needless to say there are a number of inaccuracies and errant assumptions in what you say, I'll leave it at that.


 

Offline Mechatrommer

  • Super Contributor
  • ***
  • Posts: 11713
  • Country: my
  • reassessing directives...
Re: A new, hardware "oriented" programming language
« Reply #230 on: November 27, 2022, 04:57:15 pm »
Many high level languages actually let you find the current and maximum lengths of strings or the rank and bound of arrays at runtime. C does not, sure you can write code to somehow create abstractions that do that if you really want, but that's really what languages are for, to reduce the need for such drudgery.
at what cost? as a language designer, you should already have the answer in your belt. btw, you can start designing a "highly managed" language if you want to; start a list of the most-used features and we would love to try... if what you have in mind is that (a highly managed language), looking at C is a bad start. start looking at other highly managed languages, maybe Python? or Rust? or whatever, i don't know, i'm not proficient in all languages... C is not a managed language, but most people use it. it's like automatic cars vs manual cars; both will have a market. i choose the manual car... let's see if the manual car ever goes extinct. C#.. anybody?
« Last Edit: November 27, 2022, 05:15:23 pm by Mechatrommer »
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #231 on: November 27, 2022, 05:15:12 pm »
Many high-level languages actually let you find the current and maximum lengths of strings, or the rank and bounds of arrays, at runtime. C does not; sure, you can write code to create abstractions that do that if you really want, but that's really what languages are for: to reduce the need for such drudgery.

at what cost? as a language designer, you should already have the answer in your belt.

Well, you need to define "cost" first, and you haven't. Does it mean man-hours to implement in the compiler? Memory used for the metadata? The performance cost of simply having the metadata, or of accessing it? It's a totally valid question, and one that will certainly have an answer once we define "cost".

btw, you can start designing a "highly managed" language if you want to; start a list of the most-used features and we would love to try... if what you have in mind is that (a highly managed language), looking at C is a bad start. start looking at other highly managed languages, maybe Python? or Rust? or whatever, i don't know, i'm not proficient in all languages... C is not a managed language, but most people use it. it's like automatic cars vs manual cars; both will have a market. i choose the manual car... let's see if the manual car ever goes extinct.

I'm certainly not advocating a "managed" language in the sense of C# or Java. I'm looking at a compiled imperative language with a grammar borrowed mostly from PL/I, but including parts of other grammars too.

At a minimum the language would include:

1. More storage classes than C (defined, based etc)
2. More data types including decimal/BCD and strings of fixed or variable length.
3. Arbitrary rank arrays with optional bound specifiers.
4. Distinguish between functions and procedures.
5. Fine control over padding and alignment and field ordering etc.
6. No object oriented features, no GC, no virtual functions and so on.
7. No need for "forward declarations"
8. Nested functions/procedures.
9. Some kind of "computed" goto support.
10. No reserved words.
11. No explicit pointer arithmetic, like ptr++ and ptr-- and so on.
12. Invocable variables (a simpler form of C's function pointer)

That's the basic initial list anyway, entirely reasonable and feasible and something I've implemented successfully in the past.

What I'd like to add is stuff that might be specific to MCU development, features that are not so relevant for conventional programming (for example, constant-execution-time functions/procedures, where the execution time is, or strives to be, independent of state or arguments).

I don't think there's an appreciation of the benefits of a no-reserved-words grammar: we can add new keywords freely and never break backward compatibility.

Look at this draft example:

Code: [Select]

func get_fastest_motors_speed (X) float(binary(128)) optimize(speed) determinate
{
   arg X binary(16);

   return (something);
}


The set of keywords decorating the function definition could grow; we could add new keywords and never break existing code (the various attributes can also be specified in any order).

That's just an example; the specific terms are made up, not to be taken as definitive, just to show the potential flexibility.





« Last Edit: November 27, 2022, 05:29:25 pm by Sherlock Holmes »
 

Offline Mechatrommer

  • Super Contributor
  • ***
  • Posts: 11713
  • Country: my
  • reassessing directives...
Re: A new, hardware "oriented" programming language
« Reply #232 on: November 27, 2022, 05:22:29 pm »
Well you need to define "cost" first and you haven't.
"resources" cost, mcu storage space and RAM to store all those drudgery avoidance codes. your language may not run very well in atTiny mcu, or asks for larger ARM processor etc. its inevitable... ;) this will open to endless debate. so just make one and put it into the ring, lets see who takes the bait.
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #233 on: November 27, 2022, 05:47:50 pm »
Well you need to define "cost" first and you haven't.
"resources" cost, mcu storage space and RAM to store all those drudgery avoidance codes. your language may not run very well in atTiny mcu, or asks for larger ARM processor etc. its inevitable... ;) this will open to endless debate. so just make one and put it into the ring, lets see who takes the bait.

A fixed-length string needs (say) 1 or 2 bytes for its capacity.
A varying string likewise needs 2 or 4 bytes: 1 or 2 for capacity and 1 or 2 for current length.

The capacity is a constant so can be shared by all strings of that capacity. That is:

Code: [Select]

dcl usernames(1024) string(64); // array of 1024 strings - total metadata is 1 byte in this specific example.


Would use only 1 byte of capacity metadata for all 1024 strings, a negligible cost in most scenarios I imagine. A similar analysis holds for arrays: a small handful of bytes, likely one for the rank and two each for the upper and lower bounds, or perhaps one could optimize that. If an array is declared in such a way that one byte suffices for each bound, then we'd do that. You can do a lot of stuff like that in a compiler; once we've generated the AST there is a huge amount of information present that can be used to make decisions like that when code generation starts.

If we adopt PL/I's support for "*" length specifiers, then we can even write code like this:

Code: [Select]

func parse_text (text) bool
{

   arg text string(*);

   L = length(text); // Get the capacity of this fixed length string

}

That function could be called from umpteen places; one call might pass a string(32), another a string(1024), another a string(8), etc. The code can nevertheless find that capacity at runtime.

There are other details of course, and this isn't a full analysis, but you get the idea. In fact one thing emerges: we could have two length-oriented builtin functions, "capacity" and "length". (Actually, I just looked: the original IBM PL/I had "maxlength" and "length". I think "capacity" is neater!)



« Last Edit: November 27, 2022, 06:00:52 pm by Sherlock Holmes »
 

Offline Mechatrommer

  • Super Contributor
  • ***
  • Posts: 11713
  • Country: my
  • reassessing directives...
Re: A new, hardware "oriented" programming language
« Reply #234 on: November 27, 2022, 05:54:32 pm »
that's too optimistic... what about the "managerial code" (framework) to handle all of those?
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #235 on: November 27, 2022, 06:08:52 pm »
This is the kind of thing I enjoy discussing. As I look at the above, another optional attribute comes up:

Code: [Select]

dcl usernames(1024) string(64) opaque; // array of 1024 strings - no string metadata

dcl gradenames(1024) opaque string(64) opaque; // array of 1024 strings - no string metadata no array metadata

/* or */

dcl usernames(1024) string(64,opaque); // array of 1024 strings - no string metadata

dcl gradenames(1024,opaque) string(64,opaque); // array of 1024 strings - no string metadata no array metadata



Here we just added a new keyword "opaque" that tells the compiler "do not retain any metadata for this datum", so it has zero overhead. We could use that on a declaration for some datum that we knew would never need such metadata. This is just thinking aloud, not saying it's a good or bad idea. But being able to control whether such metadata is generated is surely the kind of thing that helps when writing code for an MCU. Also, if such declarations are on the stack of an invoked function, avoiding metadata in small routines would be helpful; there's obviously a cost there that's unnecessary if one never needs to know lengths or ranks.

 

« Last Edit: November 27, 2022, 06:51:16 pm by Sherlock Holmes »
 

Offline Mechatrommer

  • Super Contributor
  • ***
  • Posts: 11713
  • Country: my
  • reassessing directives...
Re: A new, hardware "oriented" programming language
« Reply #236 on: November 27, 2022, 06:11:50 pm »
well, that's becoming more and more a machine-friendly grammar ;D good luck!
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9559
  • Country: gb
Re: A new, hardware "oriented" programming language
« Reply #237 on: November 27, 2022, 11:03:57 pm »
Around the same time I was skeptical about the approach of the HP PA-RISC successor, the Itanic (Itanium). At that time any tiny change to the implementation required someone to re-hand-optimise inner loops; the compiler was supposed to solve that issue, but never quite managed it. In addition, just as CPU power consumption was becoming the limiting factor, the Itanic strategy was to waste power doing lots of speculative execution.
It was obvious from the very start of Itanium development that its strategy would make for extremely high power consumption. So, it seemed to be a strategy to split the market into mobile and non-mobile threads. That was always a very weird strategy for a company like Intel, when notebooks were a booming business.

Intel did not design the Itanic architecture. HP designed the Itanic.

The objective was simple. In 1989 it was known that HP PA-RISC would run out of steam in a decade's time. When that became significant to customers, they would look for alternatives. What became the Itanic was simply a means of keeping customers buying HP PA-RISC machines. By the mid to late 90s that was the server market.
The Itanium's basic design originated at HP, and HP's own silicon designers did much or all of the design for the first silicon. However, Intel bought into this program not just as a silicon fabricator, but as their core performance strategy going forward. It seems they were so eager for something completely proprietary to them as their step to 64-bit systems that they forgot all the lessons the industry had learned up to that point: compatibility is king; software moves slowly, so make sure you can run the same software in the maximum possible number of places; old code needs to run well on new hardware; etc. The last point they had learned well when the brilliant Pentium Pro design bombed in the market because it ran old 16-bit Microsoft code too slowly. Having been burned badly by that, just a few years later they did the same thing, putting a half-baked x86 mode into the Itanium.
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #238 on: November 27, 2022, 11:13:40 pm »
While reviewing the state of fixed- and floating-point numeric data types, I came across this (emphasis mine):

Quote
Explicit support for fixed-point numbers is provided by a few computer languages, notably PL/I, COBOL, Ada, JOVIAL, and Coral 66. They provide fixed-point data types, with a binary or decimal scaling factor. The compiler automatically generates code to do the appropriate scaling conversions when doing operations on these data-types, when reading or writing variables, or when converting the values to other data types such as floating-point.

and

Quote
Moreover, in 2008 the International Standards Organization (ISO) issued a proposal to extend the C programming language with fixed-point data types, for the benefit of programs running on embedded processors.[3] Also, the GNU Compiler Collection (GCC) has back-end support for fixed-point.[4][5]

That's from Wikipedia. There's more too:

Quote
Digital Signal Processors have traditionally supported fixed-point arithmetic in hardware. But more recently, many DSP-enhanced RISC processors are starting to support fixed-point data types as part of their native instruction set. When the precision requirements of the application can be met with fixed-point arithmetic, then this is preferred since it can be smaller and more efficient than floating-point hardware. DSP algorithms often represent the data samples and the coefficients used in the computation as fractional numbers (between -1 and +1) to avoid magnitude growth of a multiplication product. Fractional data type, where there are zero integer bits, is a subset of the more general fixed-point data type.

So it seems that explicit programming-language support for such data types would be prudent. Additionally, binary floating-point support for "half" as well as "single" and "double" seems like a no-brainer too.

Finally, there really needs to be a more systematic way to represent things like this:

Quote
The fixed-point types are
short _Fract,
_Fract,
long _Fract,
long long _Fract,
unsigned short _Fract,
unsigned _Fract,
unsigned long _Fract,
unsigned long long _Fract,
_Sat short _Fract,
_Sat _Fract,
_Sat long _Fract,
_Sat long long _Fract,
_Sat unsigned short _Fract,
_Sat unsigned _Fract,
_Sat unsigned long _Fract,
_Sat unsigned long long _Fract,
short _Accum,
_Accum,
long _Accum,
long long _Accum,
unsigned short _Accum,
unsigned _Accum,
unsigned long _Accum,
unsigned long long _Accum,
_Sat short _Accum,
_Sat _Accum,
_Sat long _Accum,
_Sat long long _Accum,
_Sat unsigned short _Accum,
_Sat unsigned _Accum,
_Sat unsigned long _Accum,
_Sat unsigned long long _Accum.

I make no apologies for saying this notation is quite ridiculous; it is retrograde, looking more like some 1940s primitive machine code than a 21st-century programming language. Of course, if the grammar restricts the flexibility to support new data types, then perhaps its proponents can be forgiven.

This is precisely the kind of stuff I was really hoping to hear more about from engineers here; these are precisely the kinds of expansions of language capability I've been talking about, and this is why grammar is so important. The C grammar is a great example of technical debt that has grown and grown and grown.







« Last Edit: November 27, 2022, 11:17:41 pm by Sherlock Holmes »
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9559
  • Country: gb
Re: A new, hardware "oriented" programming language
« Reply #239 on: November 27, 2022, 11:44:46 pm »
Quote
The fixed-point types are
short _Fract,
_Fract,
long _Fract,
long long _Fract,
unsigned short _Fract,
unsigned _Fract,
unsigned long _Fract,
unsigned long long _Fract,
_Sat short _Fract,
_Sat _Fract,
_Sat long _Fract,
_Sat long long _Fract,
_Sat unsigned short _Fract,
_Sat unsigned _Fract,
_Sat unsigned long _Fract,
_Sat unsigned long long _Fract,
short _Accum,
_Accum,
long _Accum,
long long _Accum,
unsigned short _Accum,
unsigned _Accum,
unsigned long _Accum,
unsigned long long _Accum,
_Sat short _Accum,
_Sat _Accum,
_Sat long _Accum,
_Sat long long _Accum,
_Sat unsigned short _Accum,
_Sat unsigned _Accum,
_Sat unsigned long _Accum,
_Sat unsigned long long _Accum.

I make no apologies for saying this notation is quite ridiculous; it is retrograde, looking more like some 1940s primitive machine code than a 21st-century programming language. Of course, if the grammar restricts the flexibility to support new data types, then perhaps its proponents can be forgiven.

This is precisely the kind of stuff I was really hoping to hear more about from engineers here; these are precisely the kinds of expansions of language capability I've been talking about, and this is why grammar is so important. The C grammar is a great example of technical debt that has grown and grown and grown.
I think it's time the C standard flipped things around: start with length-specific types, and derive the bland, vague names like int and short int from the specific ones, mostly for backwards compatibility and for places where the size of the variable is essentially irrelevant, like a small loop counter.
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #240 on: November 28, 2022, 02:37:47 pm »
Quote
The fixed-point types are
short _Fract,
_Fract,
long _Fract,
long long _Fract,
unsigned short _Fract,
unsigned _Fract,
unsigned long _Fract,
unsigned long long _Fract,
_Sat short _Fract,
_Sat _Fract,
_Sat long _Fract,
_Sat long long _Fract,
_Sat unsigned short _Fract,
_Sat unsigned _Fract,
_Sat unsigned long _Fract,
_Sat unsigned long long _Fract,
short _Accum,
_Accum,
long _Accum,
long long _Accum,
unsigned short _Accum,
unsigned _Accum,
unsigned long _Accum,
unsigned long long _Accum,
_Sat short _Accum,
_Sat _Accum,
_Sat long _Accum,
_Sat long long _Accum,
_Sat unsigned short _Accum,
_Sat unsigned _Accum,
_Sat unsigned long _Accum,
_Sat unsigned long long _Accum.

I make no apologies for saying this notation is quite ridiculous; it is retrograde, looking more like some 1940s primitive machine code than a 21st-century programming language. Of course, if the grammar restricts the flexibility to support new data types, then perhaps its proponents can be forgiven.

This is precisely the kind of stuff I was really hoping to hear more about from engineers here; these are precisely the kinds of expansions of language capability I've been talking about, and this is why grammar is so important. The C grammar is a great example of technical debt that has grown and grown and grown.
I think it's time the C standard flipped things around: start with length-specific types, and derive the bland, vague names like int and short int from the specific ones, mostly for backwards compatibility and for places where the size of the variable is essentially irrelevant, like a small loop counter.

That's a good suggestion. I was refreshing my memory of the IEEE 754 standard and see that it defines (among other things) binary16, binary32, binary64, binary128, decimal32, decimal64 and decimal128.

The binary32 type is known as "float", binary64 as "double", and the less common binary16 as "half" - so these shorthand names are a convenience.

The .NET ecosystem does this too: it has a systematic technical name for each such type (System.Single, System.Double) alongside the less formal convenient name (float, double), fully interchangeable. I'm of the opinion that a specification like this is appealing in a new language:

Code: [Select]

...float(dec(32)), float(dec(64)), float(dec(128))
...float(bin(16)), float(bin(32)), float(bin(64)), float(bin(128)), float(bin(256))


Then a subset of these could also have alternative specifiers:

Code: [Select]

float(bin(16)) or half
float(bin(32)) or single
float(bin(64)) or double
float(bin(128)) or quad
float(bin(256)) or oct


This is very easy to support grammatically, and one could choose whichever specifier is more appropriate based on project or user preferences.
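For what it's worth, the systematic-name-plus-alias idea can already be mimicked in C with typedefs. A minimal sketch, where the float_bin32/float_bin64 names are hypothetical and assume that float and double are the IEEE 754 binary32/binary64 interchange formats (true on virtually all current platforms):

```c
#include <assert.h>

/* Hypothetical systematic names mapped onto C's existing types,
   assuming float/double are IEEE 754 binary32/binary64. */
typedef float  float_bin32;   /* convenient alias: "single" */
typedef double float_bin64;   /* convenient alias: "double" */
```

In a new language the mapping would of course run the other way: the systematic name would be primary and the alias derived from it.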








“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #241 on: November 28, 2022, 02:41:50 pm »
Ooh, I didn't notice this until now either: suffixes for literal constants:

Quote
‘hr’ or ‘HR’ for short _Fract and _Sat short _Fract
‘r’ or ‘R’ for _Fract and _Sat _Fract
‘lr’ or ‘LR’ for long _Fract and _Sat long _Fract
‘llr’ or ‘LLR’ for long long _Fract and _Sat long long _Fract
‘uhr’ or ‘UHR’ for unsigned short _Fract and _Sat unsigned short _Fract
‘ur’ or ‘UR’ for unsigned _Fract and _Sat unsigned _Fract
‘ulr’ or ‘ULR’ for unsigned long _Fract and _Sat unsigned long _Fract
‘ullr’ or ‘ULLR’ for unsigned long long _Fract and _Sat unsigned long long _Fract
‘hk’ or ‘HK’ for short _Accum and _Sat short _Accum
‘k’ or ‘K’ for _Accum and _Sat _Accum
‘lk’ or ‘LK’ for long _Accum and _Sat long _Accum
‘llk’ or ‘LLK’ for long long _Accum and _Sat long long _Accum
‘uhk’ or ‘UHK’ for unsigned short _Accum and _Sat unsigned short _Accum
‘uk’ or ‘UK’ for unsigned _Accum and _Sat unsigned _Accum
‘ulk’ or ‘ULK’ for unsigned long _Accum and _Sat unsigned long _Accum
‘ullk’ or ‘ULLK’ for unsigned long long _Accum and _Sat unsigned long long _Accum

There's clearly a problem here and something really does need to be done about it; band-aid and glue just aren't working!

A neater way is required for specifying type info for literal constants...
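One neater direction, sketched here as a hypothetical C helper (FIXED_Q15 is made up, not part of any standard): construct the typed constant explicitly from an ordinary decimal literal, instead of memorizing a suffix per type:

```c
#include <stdint.h>

/* Hypothetical constructor for a signed Q1.15 fixed-point constant.
   Representable range is [-1.0, 1.0); the +/-0.5 term rounds to nearest. */
#define FIXED_Q15(x) ((int16_t)((x) * 32768.0 + ((x) >= 0 ? 0.5 : -0.5)))
```

A new language could make this a first-class typed literal - something like 0.5 : fixed(1,15) - with the compiler performing the conversion above at compile time.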

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #242 on: November 28, 2022, 03:04:46 pm »
So here's a question for the MCU engineers:

I am reading lots of articles and blogs about the utility of fixed point arithmetic for certain classes of device. Binary fixed point is quite easy to implement as you know, but what ranges are useful?

The original IBM PL/I language supported this generic specifier:

Code: [Select]
dcl counter fixed bin (P,S);

Here P is the total number of digits and S is the "scale factor".

Within implementation defined maximums, any value for P and S was supported.

One could create variables like this:

Code: [Select]
dcl counter1 fixed bin (18,7);  /* precision=18, scale=7 */
dcl counter2 fixed bin (29,3);
dcl counter3 fixed bin (30,-6);

Furthermore, such a datum could be declared with a scale/precision that was an expression rather than just a compile-time constant - not a hard thing to support.

But what kind of flexibility is really needed? Could one solve 99% of problems with just a few precisions like 8, 16 and 32?

What about signedness? Would a rule that even precisions are always unsigned and odd precisions always signed work?
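For reference, the bookkeeping behind a PL/I-style fixed bin (P,S) value can be sketched in C; fixed_t, fx_make, fx_mul and fx_to_double are hypothetical names, and the sketch assumes non-negative scales small enough to fit the shifts:

```c
#include <stdint.h>

/* A value with raw part v and scale S represents v / 2^S.  Multiplying two
   values adds their scales, so the product is shifted back to renormalize. */
typedef struct { int32_t raw; int scale; } fixed_t;

static fixed_t fx_make(double x, int scale) {
    fixed_t f = { (int32_t)(x * (double)(1L << scale) + (x >= 0 ? 0.5 : -0.5)),
                  scale };
    return f;
}

static fixed_t fx_mul(fixed_t a, fixed_t b) {
    /* widen to 64 bits so the intermediate product cannot overflow */
    fixed_t r = { (int32_t)(((int64_t)a.raw * b.raw) >> b.scale), a.scale };
    return r;
}

static double fx_to_double(fixed_t f) {
    return (double)f.raw / (double)(1L << f.scale);
}
```

The interesting language-design question is how much of this the compiler tracks statically (as PL/I did, with P and S part of the type) versus at run time as in this sketch.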

« Last Edit: November 28, 2022, 03:46:46 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #243 on: November 28, 2022, 04:22:03 pm »
I'd be interested in hearing about some real world use cases for coroutines, a couple of people have raised this in this thread.

There isn't (or wasn't ten years ago, when I was blogging about coroutines in C#) a lot out there about coroutines; it seems they had their roots in assembler programming (in fact in a particular processor architecture of the 1950s or 1960s - a very specific instruction, if I recall...).

So if anyone has coded these in C or assembler, I'd be fascinated to learn more about their real world relevance.

From what I can see, the concept was first put to use on a Burroughs 220 in 1963.

« Last Edit: November 28, 2022, 04:32:19 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #244 on: November 28, 2022, 06:17:54 pm »
Coroutines are rather interesting, looking at the machine code implementation is fascinating from a high level language perspective.

There is some overlap of meanings in the literature, though. For example, the basic coroutine described in the 1963 paper is not quite the same idea as the more recent iterator pattern, and some richer examples of "coroutines" in languages like Kotlin or libraries like Unity are closer to async/await asynchronous enumerables, with an implied use of multiple threads. There are similarities, of course, but they are distinct mechanisms.

This simple image captures the essence of what's described in the 1963 paper:

[attached image: two procedures passing control back and forth, each resuming where it left off]

So, as far as I can tell, that's the "classic" definition of a coroutine: a mechanism where control can flow as shown, and where each procedure retains access to its stack frame as control cycles back and forth.

Of course one can have loops rather than simple linear sequences and there could be three, four or more procedures participating...

So how is that pattern leveraged in MCU work? What kinds of problems benefit from such a mechanism?
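One very common MCU incarnation is the stackless, switch-based flavour popularized by Adam Dunkels' protothreads library: on each call the function jumps back to just after the point where it last yielded, keeping its state in statics rather than a preserved stack frame. A minimal sketch (the CR_* macro names and byte_source are made up for illustration):

```c
/* Resume a function after its last yield point by switch()ing on a saved
   line number -- the trick behind protothreads. */
#define CR_BEGIN(state)    switch (state) { case 0:
#define CR_YIELD(state, v) do { state = __LINE__; return (v); case __LINE__:; } while (0)
#define CR_END(state)      } state = 0; return -1

/* Example: a coroutine that yields 10, 20, 30, then -1 when exhausted. */
static int byte_source(void) {
    static int state = 0;  /* where to resume on the next call */
    static int i;          /* locals must be static to survive yields */
    CR_BEGIN(state);
    for (i = 1; i <= 3; i++)
        CR_YIELD(state, i * 10);
    CR_END(state);
}
```

Each call to byte_source() returns the next value, so caller and coroutine alternate much as in the 1963 diagram - at the cost that locals must be static and a yield cannot occur inside a nested call, which is exactly the limitation a true stack-retaining coroutine removes.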



« Last Edit: November 28, 2022, 06:19:32 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9937
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #245 on: November 28, 2022, 06:23:14 pm »
Interesting presentation on the difference between a coroutine and a thread:

https://www.educba.com/coroutines-vs-threads/

If the language is going to be used for IoT or Edge computing, it might be helpful to have parallel computing as part of the language.  Fortran already has this feature as part of the language specification.
« Last Edit: November 28, 2022, 06:26:14 pm by rstofer »
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #246 on: November 28, 2022, 06:40:01 pm »
Interesting presentation on the difference between a coroutine and a thread:

https://www.educba.com/coroutines-vs-threads/

If the language is going to be used for IoT or Edge computing, it might be helpful to have parallel computing as part of the language.  Fortran already has this feature as part of the language specification.

That's also an informative page, very helpful.

By parallelism are you thinking of multiple cores? or preemptive multitasking on a single core?
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15439
  • Country: fr
Re: A new, hardware "oriented" programming language
« Reply #247 on: November 28, 2022, 06:56:08 pm »
Coroutines are a pretty old concept. Modula-2 introduced them in 1978.

You can find them in one form or another in a number of more recent languages. Even scripting languages such as Lua.
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9937
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #248 on: November 28, 2022, 07:31:14 pm »
By parallelism are you thinking of multiple cores? or preemptive multitasking on a single core?

More the Nvidia style of parallelism: across several thousand CUDA cores with coordinated completion.  And see this explanation from Oracle:

https://docs.oracle.com/cd/E19957-01/805-4940/6j4m1u7qk/index.html

There's a reason I bought a laptop with an RTX 3070 GPU: its 5120 CUDA cores, for parallel computing of matrix operations for machine learning.  That's a bunch of parallelism, if it can be expressed in the language.  And it's just a modest GPU; there are bigger devices as well as standalone units.
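As a point of comparison for "expressed in the language": even plain C can state this kind of data parallelism with an advisory OpenMP pragma. Compiled without -fopenmp the pragma is simply ignored and the loop runs serially, so the sketch below behaves the same either way:

```c
#include <stddef.h>

/* SAXPY (y[i] += a * x[i]): the textbook data-parallel kernel.  The pragma
   tells an OpenMP-capable compiler the iterations are independent. */
void saxpy(size_t n, float a, const float *x, float *y)
{
    #pragma omp parallel for
    for (size_t i = 0; i < n; i++)
        y[i] += a * x[i];
}
```

A new language could make the independence of iterations a first-class property of the loop construct rather than a bolted-on pragma.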

ML is all right as a theory, but it eventually needs to flow down to the factory floor; that's where the idea of IoT or Edge computing comes into play, and things like image recognition tend to take a ton of math.
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9937
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #249 on: November 28, 2022, 08:14:49 pm »
It seems to me that the multiplicity of numeric types will ultimately lead to a trainwreck.

One thing that I find problematic with Python is that variables are not declared; they just happen.  Given a variable name, you have no idea what is being represented: it could be a string, a float, an int, probably a vector, maybe a matrix, and even if you thought you knew the shape, that could change during execution.  How far back do you have to look to find where it was last modified?

Same thing with MATLAB for that matter. 

One thing I like about Modern Fortran is 'implicit none'.  Every variable has to be declared and no more of the leading character determining the type unless otherwise declared. 

I also like the 'intent' attribute:

http://www.personal.psu.edu/jhm/f90/statements/intent.html

I'm not sure what to think about functions returning multiple values and the ability to ignore pieces of the return values.  Returning two ndarrays and only keeping one seems bizarre.  But so does using indentation as a syntactic element.

I do like the idea of slicing arrays and being able to concatenate rows or columns.

Although I grumble about the lack of declarations, I do like Python's ndarray.

Is white space going to be significant?  At least in Fortran IV, it wasn't.  The statement:
DO10I=1,4
begins a DO loop, but change the comma to a period (DO10I=1.4) and it becomes an assignment to a real variable named DO10I - a side effect of the fact that Fortran ignored white space.  I never tested that concept myself.  Apparently in fixed-format Modern Fortran white space is still ignored, but in free form it is significant:

https://community.intel.com/t5/Intel-Fortran-Compiler/spaces-not-ignored-in-free-format/td-p/1112317

Embedded spaces (or underscores) in long numeric literals can be useful:
SpeedOfLight = 186_000 miles per second

 

