Author Topic: The Imperium programming language - IPL  (Read 86859 times)


Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #25 on: November 22, 2022, 08:48:36 pm »
Hmm, a very thought provoking real-world comparison of Rust and Zig.

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #26 on: November 22, 2022, 09:33:27 pm »
Quote
PL/I was also one of the first languages used to write an operating system, some years before C arose. Contrary to popular belief, C was designed primarily for ease of implementing a compiler; programming hardware was no more a goal for its designers than for several other languages of the time.
PL/1 was quite late to the "writing operating systems in a high level language" game, which is why so few were written in it. Several languages, often described as real time oriented rather than operating system oriented, were developed for building operating systems before PL/1 existed.

C was intended to be close to the metal. That's really all a systems oriented language is.

The guys that eventually developed C worked on the Multics project. That was written in PL/I (a simplified version crafted by the late Bob Freiburghouse).

The PL/I language brought together a number of emerging language features for the first time, some of which were founded on systems programming on large mainframes. That meant the language designers were familiar with interrupts, efficient layout of memory for complex, fine grained data structures, list processing and other things that were barely considered before.

The language was a major milestone in computing: the designers took what they regarded as the best features of COBOL, Fortran and ALGOL, at the time the leading languages in business, science and universities.

It was the first language to incorporate concurrency features, and it had a bit data type (precisely because bit fields are prevalent in operating system data structures).

It was perhaps the first programming language to be specified in a formal language, the Vienna Definition Language, which allowed both the grammar and the semantics to be specified jointly.

It was certainly very influential, though not so much in the world of PCs. Of course there was CP/M, an early OS for the 8080 family by Bill Gates's nemesis Gary Kildall, the founder of Digital Research. CP/M was written (this is an 8-bit OS in the 1970s, talk about resource constrained hardware) in a variant of PL/I named PL/M, by Kildall himself.

Consider:

Quote
Unlike other contemporary languages such as Pascal, C or BASIC, PL/M had no standard input or output routines. It included features targeted at the low-level hardware specific to the target microprocessors, and as such, it could support direct access to any location in memory, I/O ports and the processor interrupt flags in a very efficient manner. PL/M was the first higher level programming language for microprocessor-based computers and was the original implementation language for those parts of the CP/M operating system which were not written in assembler. Many Intel and Zilog Z80-based embedded systems were programmed in PL/M during the 1970s and 1980s. For instance, the firmware of the Service Processor component of CISC IBM AS/400 was written in PL/M.

The C language became popular not because of any inherent language prowess, nor because of some special affinity for "hardware" level coding, but because there was no commercially available PL/I implementation for early PCs, and Microsoft had never created a compiler for it (though they did sell COBOL and FORTRAN compilers back then). When MS-DOS started to dominate rather than CP/M, there was no strong motive to invest in PL/I, so C started to grow because (and this is the chief reason) it is very easy to write a compiler for C. So it was an accident of history and market forces that elevated C to prominence; then of course Borland competed with Microsoft with C, and then C++ emerged, bringing more attention to C.

Too few developers stop to learn the history of such things. PL/I was not the perfect language, none are, but it did bring together some (at the time) revolutionary ideas, and some of those are still missing from modern languages. Technology is as influenced by fashion as many other things in life!



 

Offline SiliconWizard
Re: A new, hardware "oriented" programming language
« Reply #27 on: November 22, 2022, 09:41:31 pm »
The C language became popular not because of any inherent language prowess, (...)

That can be said for almost all languages that became popular.
They all became popular because they filled some void.

C has passed the test of time though, and I'm not sure all of those new languages will.

 

Offline coppice
Re: A new, hardware "oriented" programming language
« Reply #28 on: November 22, 2022, 10:12:46 pm »
Quote
The C language became popular not because of any inherent language prowess, (...)
PL/M was released really early in the life of microprocessors - around 1974. C is listed as being earlier than that, but that was just internal to Bell Labs; it didn't get a real public viewing for a while. When I learned C at the end of the 70s it was still niche, while huge amounts of code for microprocessors existed in PL/M. This included CP/M itself and a lot of early CP/M applications. I was using a mature implementation of PL/M86 in 1979, when the first 8086s were available, and before MSDOS was. It had been developed and tuned with an 8086 simulator, and we did some of our development using that simulator, because the 8086 chips were not yet widely available. For quirky historical reasons the developer of MSDOS, before MS bought it, wrote it in assembly language. Since PL/M86 was already available, and MSDOS was developed roughly as a CP/M clone, a lot of existing code could easily have been ported to MSDOS, but I'm not clear how much was. Microsoft misjudged the market and put effort into Fortran and Cobol; they had to buy in a decent C compiler from Lattice when C suddenly took off. I don't think it's clear why PL/M suddenly died, but it could have been Intel's licensing fees. The PC suddenly made cheap king in the software market. For people like me, who had used Algol, Coral, PL/1, PL/M and other languages for several years, C was just what we were looking for.

Fortran on microprocessors wasn't that commercially valuable in the early days of MSDOS. Cobol was, but MS-COBOL was a dog. Things like MicroFocus's MicroCOBOL dominated the "let's move this app to a small machine" market.

You didn't mention the biggest influence on C, and the language from which its name comes - BCPL. BCPL and C are about as unlike PL/1 in their thinking as could be.

 

Offline Sherlock Holmes (Topic starter)
Re: A new, hardware "oriented" programming language
« Reply #29 on: November 22, 2022, 11:03:03 pm »
Quote
You didn't mention the biggest influence on C, and the language from which its name comes - BCPL. BCPL and C are about as unlike PL/1 in their thinking as could be.

Yes, they are indeed extremely different. Bob Freiburghouse was an interesting guy: he got into computing accidentally after graduating in EE (as his bio explains) and somehow got asked to write a Fortran compiler for some machine or other, perhaps while still in the navy. Anyway, he went on to set up a language business in the 60s and did well, supplying compilers for several languages to DEC, CDC, Burroughs and several others for some years, including variants of PL/I.

He describes how he developed a generic table driven code generator that allowed him to take any of his front ends and generate code for any back end; this was highly innovative at the time, when few companies were in the commercial language business.

He also worked on Multics, here's a fascinating paper of his I used as a guide when developing a PL/I compiler for Windows, I recall getting a copy of this from a library in London, this was before the internet.

This other paper explains how PL/I came to be selected for Multics development.

There are a handful of things I noticed when I first began to learn C back in 1990 that are the same as in PL/I: 1) comments use /* and */, 2) pointer dereferencing uses ->, and 3) the dot notation for structure member access. These all originated (I believe) in PL/I, which Kernighan and Ritchie must have used when working on Multics.

 

Online ataradov
Re: A new, hardware "oriented" programming language
« Reply #30 on: November 22, 2022, 11:34:55 pm »
Quote
posts like this are one of the reasons why i don't want to publish or release my-c.
Yet you can't stop mentioning it in every topic.
Alex
 

Offline DiTBho
Re: A new, hardware "oriented" programming language
« Reply #31 on: November 22, 2022, 11:45:24 pm »
Quote
There are a handful of things I noticed when I first began to learn C back in 1990

Yup, indeed the first edition of the Dragon Book offers PL/I examples mixed with C examples.
The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow
 

Offline brucehoult
Re: A new, hardware "oriented" programming language
« Reply #32 on: November 23, 2022, 01:38:57 am »
Quote
C is, and has always been, a hardware level oriented language. That's why it has always excelled in applications that need to be near the metal, like MCUs and operating systems. What would your new language bring to the table?

There are several things that a new language would bring, here's a summary of the more salient:

  • No reserved words, thus enabling new keywords to be added over time.

A recipe for programmer confusion, and nothing whatsoever to do with MCUs.

Quote
  • Support 'bit' as a native data type.
  • Support 'strings' as a native type, BCD/decimal as well.

Not native types on many or most MCUs, so not sure what you're trying to achieve there. What kind of strings? That sounds like you need a heap.

Quote
  • Support for namespaces.

Nothing whatsoever to do with MCUs. C++ already exists.

Quote
  • Computed gotos.

You want to calculate the actual machine address to jump to? In a portable high level language? Presumably with equal spacing between the different jump targets? So the programmer is going to have to know the exact instructions on the target that will be generated from each bit of code being jumped to, and their size?

Insanity.

Quote
  • Flexible alignment, packing and padding directives.
  • Nested functions.

C already has this, at least as implemented in gcc and clang. Most MCUs don't have any better support for inserting and extracting arbitrary bit fields than C has, so even if you can write it a bit more simply you are fooling yourself about the efficiency. And you can already use macros or inline functions in C.

Quote
  • Precision timing features, like emitting multiple NOP operations or ensuring identical execution time for (say) case clauses in a switch.

Most modern CPUs, including microcontrollers, can't guarantee that without taking significant other measures that the programmer really has to arrange by themselves, such as loading critical code into TIM instead of flash, turning off caches and branch prediction, and so forth.

This is not an appropriate feature for a supposedly high level language.

Quote
  • Support for an async/await model.

This requires heap-allocated closures, and I think probably garbage collection.

On a PIC? Just no.
 

Offline mcovington
Re: A new, hardware "oriented" programming language
« Reply #33 on: November 23, 2022, 02:45:50 am »
I never used PL/M but remember PL/1 very fondly, having used it a lot in the mid to late 1970s.  It was the first programming language we encountered that did not actively try to restrict what we could do, and it suited IBM mainframes very well (that's what a lot of us were using).  Most of us didn't see C until the 1980s.
 

Offline brucehoult
Re: A new, hardware "oriented" programming language
« Reply #34 on: November 23, 2022, 04:38:16 am »
Quote
I never used PL/M but remember PL/1 very fondly, (...)

I used PL/I in my first job, on a DG minicomputer. Some subset I guess. It was like Pascal, FORTRAN, and COBOL thrown into a big heap and thoroughly stirred.
 

Offline Siwastaja
Re: A new, hardware "oriented" programming language
« Reply #35 on: November 23, 2022, 07:52:58 am »
Quote
A slightly cleaned up version of C would be nice, but language inventors never stop at that because it is "boring".

+1 for this. The best chance of success is to take C and make small improvements to it, making it truly hardware-oriented. For example, you could add keywords like big_endian or little_endian so that the compiler would handle the conversions and interfaces would look clean with no manual shifting; make bitfield ordering well defined; maybe add ranges to variables, Ada/VHDL-style, which would enable the compiler to do compile-time checking and optimize better.

And of course, arbitrary width types. Maybe I want uint5_t and int129_t. As you are building a new language, they should not be type names defined in some header file like in C, but part of the type system; syntax could look like uint(8) or something.

 

Offline IDEngineer
Re: A new, hardware "oriented" programming language
« Reply #36 on: November 23, 2022, 08:23:32 am »
The idea of the compiler calculating NOPs to yield equal execution time through multiple code paths sounds great until you remember interrupts, which are almost always a part of embedded systems.

What are you going to do, disable interrupts for large portions of code to prevent asynchronicity? That defeats the very purpose of hardware interrupts and essentially forces a polling based architecture that won't be compatible with a huge number of embedded environments.

I suppose you could commandeer some of the target's hardware timer resources and set up a sort of timekeeping scheme, but that smacks of certain dynamic "features" which are the worst parts of C++ and many other OO-like languages.

I'm not discouraging you from the exercise, but have realistic expectations. The chance for something like this to go mainstream is very close to zero.
 

Offline JPortici
Re: A new, hardware "oriented" programming language
« Reply #37 on: November 23, 2022, 09:20:05 am »
...

wouldn't have said it better.
Mentioning PICs, some of the proposed improvements are already part of the XC compilers; others are requests that clearly come from assembly programmers who can't stop thinking in assembly and ostensibly write in what I call "Clever C" (forgetting that you CAN, and sometimes must, use separate assembly modules. Just conform to the ABI and call them as C functions so the compiler is happy; other programmers will be as well).
Dedicated strings require a heap, and personally speaking, in a C project I prefer to avoid the heap. In languages in which it is a given I simply apply another mindset (using it responsibly, of course).
 

Offline Siwastaja
Re: A new, hardware "oriented" programming language
« Reply #38 on: November 23, 2022, 10:54:34 am »
Quote
That's simply not true: we can emit multiple platform-specific NOPs today by embedding assembler in C. There are assembler MCU developers out there who struggle to use C because of this kind of thing. They have carefully crafted code where they want some execution path to take exactly the same number of clock cycles as some other, so they sometimes embed multiple NOPs; their designs require that.

Simple enough microcontrollers that can produce cycle-accurate, predictable timing by simple instruction counting are becoming exceedingly rare. These 8-bitters still exist, but such cycle-accuracy combined with generally very poor performance is something people do not actually want. Instead, people buy modern high-performance microcontrollers which can still produce very accurate timing simply by scaling down the absolute time used per cycle, by utilizing higher clock speed. For example, a 12-cycle ISR latency on a 400MHz MCU looks like HALF a cycle on a 16MHz AVR/PIC; at that point you simply do not care if it sometimes takes 14 cycles due to some pipeline stall or branch mispredict. I have done pretty timing-sensitive things simply in interrupt handlers, and the advantage is ease of writing, reading, and maintaining that code. Manual cycle counting, or an automated version thereof, is simply not needed anymore, except in very rare cases, which require careful understanding anyway.

The problem with this "predictive timing language" is that it becomes tied to a certain type of hardware, and then you have replicated what the XCORE folks have done (apparently pretty well).
« Last Edit: November 23, 2022, 10:59:20 am by Siwastaja »
 

Offline Bicurico
Re: A new, hardware "oriented" programming language
« Reply #39 on: November 23, 2022, 11:52:56 am »
I am not an MCU programmer.

Why not? And why am I posting here?

Well, I am not an MCU programmer because for my needs I don't need it, and for hobby purposes it is simply too complex for the occasional project.

I am posting here, to add a different perspective:

In my opinion, the greatest revolution in MCU programming was due to Arduino and its extremely easy and well documented SDK. This opened doors to many hobby applications otherwise inaccessible to a wide percentage of hobbyists.

So I wonder: why try to program "Yet Another C Compiler" for MCUs, if the existing programmers have no problem with the existing SDKs (as the discussion so far indicates)? Why not, instead, develop an SDK/language for the masses that allows using more generic (cheaper) MCUs and programming them as easily (or even more easily) as what Arduino offers?

Also, another suggestion: current MCU SDKs are bloated in size! If you want to program PIC, AVR and something else, you easily fill your hard disk with three huge installations. If you want to do something nice, make it small! Preferably something that just deflates into a single folder.

Sorry if this is beyond your purpose or if my line of thought doesn't make sense (it does to me, though).

Regards,
Vitor
« Last Edit: November 23, 2022, 11:55:09 am by Bicurico »
 

Offline tggzzz
Re: A new, hardware "oriented" programming language
« Reply #40 on: November 23, 2022, 12:28:19 pm »

There are several things that a new language would bring, here's a summary of the more salient:

  • No reserved words, thus enabling new keywords to be added over time.
  • Support 'bit' as a native data type.
  • Support 'strings' as a native type, BCD/decimal as well.
  • Support for namespaces.
  • Computed gotos.
  • Flexible alignment, packing and padding directives.
  • Nested functions.
  • Precision timing features, like emitting multiple NOP operations or ensuring identical execution time for (say) case clauses in a switch.
  • Support for an async/await model.

These are the kinds of things that I've seen other people raise or complain about, things that a freshly designed language could readily accommodate.

Items to consider...

Cache coherency, between different cores and with interrupts. That implies a memory model, as C has belatedly realised.

Non uniform memory architecture. Some bits of memory are much further away from the cores than others. "Caches are the new RAM, RAM is the new disc, disc is the new mag tape".

If you want guaranteed timing, then consider the effects of all the caches (including the TLB and others). In addition, you must understand what was necessary to achieve guaranteed timing. See the XMOS xCORE plus xC system (buy them at Digikey ;) ). That also has the equivalent of your async/await sorted out in a sound theoretical and practical way, which has since been copied in other languages.

Rather than have a computed goto, have a computed come from instruction. Yes, that has existed!
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline brucehoult
Re: A new, hardware "oriented" programming language
« Reply #41 on: November 23, 2022, 01:11:55 pm »
Quote
In my opinion, the greatest revolution in MCU programming was due to Arduino and its extremely easy and well documented SDK. This opened doors to many hobby applications otherwise inaccessible to a wide percentage of hobbyists.

This I entirely agree with.

For many people and many purposes, it is completely appropriate to use a more powerful MCU than you strictly need, and use some of that extra speed and memory capacity to run easier to use programming languages and libraries.

The Arduino library, implemented using C++ for a 16 MHz AVR, is a good step in that direction.

Some people are aghast that digitalWrite() takes 2 µs when you can do the "same thing" (not quite, actually) using a single AVR machine code instruction taking 1/16th of a µs.

Which matters not at all if you're using it to switch on your garage light, or even control a motor or R/C aircraft servo where a control loop of 1 ms or even 10 ms is absolutely fine.

And if you run the Arduino library on a 320 MHz HiFive1 or 600 MHz dual-issue Teensy 4.0 or 240 MHz ESP32 then your inefficient code runs just as well as the finest hand-crafted AVR or PIC code.

The problem then becomes that C++ and the Arduino library are too low level and inconvenient for many things. Python or JavaScript or Scheme becomes a much better idea.

And then you can have convenient strings and dictionaries and async/await programming model.

I'm not against any of that.

I'm against thinking you can or should try to mix that with using NOPs for timing, or trying to force branches of a switch to have equal execution times. JUST NO. Use a faster CPU and proper event-driven programming.

We now have an amazing situation where a 20 MHz 8 bit 2 KB RAM ATMega328 costs $2.50 in quantity, but you can get a 200 MHz 32 bit RISC-V MCU with WIFI and BlueTooth and several hundred KB of RAM for about the same price. Or, you can get a 48 MHz 32 bit RISC-V with 4 KB RAM for $0.12.

Quote
Also, another suggestion: current MCU SDKs are bloated in size! If you want to program PIC, AVR and something else, you easily fill your hard disk with three huge installations.

I don't know why people want customised Eclipse-based IDEs, or why manufacturers think they do. Command line toolchains are not very big.
 

Offline brucehoult
Re: A new, hardware "oriented" programming language
« Reply #42 on: November 23, 2022, 01:14:50 pm »
Quote
Cache coherency, between different cores and with interrupts. That implies a memory model, as C has belatedly realised.

Basically as soon as dual-core PCs started to be common.
 

Offline mcovington
Re: A new, hardware "oriented" programming language
« Reply #43 on: November 23, 2022, 01:39:42 pm »
Quote
For many people and many purposes, it is completely appropriate to use a more powerful MCU than you strictly need, and use some of that extra speed and memory capacity to run easier to use programming languages and libraries.

HEAR, HEAR!  It is entirely appropriate to use a $20 microcontroller instead of a 50-cent microcontroller in order to greatly facilitate the development of the firmware.  In a mass-produced product with simple firmware, use the 50-cent micro, and spend a month writing the program.  But for a one-off prototype (whether in industry, science, or hobby), use the Arduino or RPi and program it in a language that is easy to use and test.

If it were $20 versus $500, it would be a somewhat different trade-off.

I think there was such an issue with mainframes 50 years ago, but not so well understood.  The IBM 370 and OS were designed to run the apps, with a lot of fine tuning of disk performance, etc., and were harder to develop software on.   (IEFBR14, for those who remember it, was the reductio ad absurdum of that.)   VAX/VMS and UNIX seemed to be slanted much more towards making life easier for the programmers, on the ground that if you do that, they will make the apps run better.  And look at what almost all subsequent programming is descended from!
 

Offline newbrain
Re: A new, hardware "oriented" programming language
« Reply #44 on: November 23, 2022, 01:42:23 pm »
Quote
And of course, arbitrary width types. Maybe I want uint5_t and int129_t. As you are building a new language, they should not be defined type names in some header file like in C, but a type system, syntax could look like uint(8) or something.
Your wish has been graciously granted by the standard fairy, including similar syntax.
Nandemo wa shiranai wa yo, shitteru koto dake.
 
The following users thanked this post: Siwastaja

Offline tggzzz
Re: A new, hardware "oriented" programming language
« Reply #45 on: November 23, 2022, 02:14:30 pm »
Quote
I'm against thinking you can or should try to mix that with using NOPs for timing, or trying to force branches of a switch to have equal execution times. JUST NO. Use a faster CPU and proper event-driven programming.

Seconded.

However, the issue of predictable and predicted timing is important when considering whether deadlines will be met.
 

Offline tggzzz
Re: A new, hardware "oriented" programming language
« Reply #46 on: November 23, 2022, 02:17:34 pm »
Quote
Cache coherency, between different cores and with interrupts. That implies a memory model, as C has belatedly realised.

Basically as soon as dual-core PCs started to be common.

It was a practical problem decades earlier. As usual, the HPC mob ran into the problem earlier than most, but commercial computing wasn't far behind.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #47 on: November 23, 2022, 03:47:34 pm »
...

Couldn't have said it better.
Mentioning PICs, some of the proposed improvements are already part of the XC compilers; others are requests that clearly come from assembly programmers who can't stop thinking in assembly and ostensibly write in what I call "Clever C" (forgetting that you CAN, and sometimes must, use separate assembly modules. Just conform to the ABI and call them as C functions so the compiler is happy; other programmers will be as well).
Dedicated strings require a heap, and personally speaking, in a C project I prefer to avoid the heap. In languages where it is a given, I simply apply a different mindset (using it responsibly, of course).

I've only mentioned the kinds of things raised by very experienced hardware/software engineers that rely heavily on PIC, their needs, concerns and frustrations are real, not imagined, I am simply collating broad issues and concerns raised by people who find C restrictive or inflexible.
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #48 on: November 23, 2022, 04:29:45 pm »
A slightly cleaned up version of C would be nice, but language inventors never stop at that because it is "boring".

+1 for this; the best chance of success is to take C and make small improvements to it, making it truly hardware-oriented. For example, you could add keywords like big_endian or little_endian so that the compiler would handle the conversions and interfaces would look clean with no manual shifting, make bitfield ordering well defined, and maybe add ranges to variables ADA/VHDL-style, which would let the compiler do compile-time checking and optimize better.

And of course, arbitrary width types. Maybe I want uint5_t and int129_t. As you are building a new language, they should not be defined type names in some header file like in C, but a type system, syntax could look like uint(8) or something.

That is a possibility, improving C. However, that does impose restrictions on the grammar, and for me, as a compiler/language developer, the grammar is perhaps the most interesting area. Once decisions have been made at the grammar level there's usually no going back; one must live with the consequences for all future versions.

What you say about arbitrary width types is something I value a great deal; PL/I had that, of course, and its grammar fully supports it. There's a lot to be gained by leveraging a grammar like that.

I have a draft grammar that I'm outlining, and the way we declare items is a key part of it. There are problems with the C practice of

Code: [Select]
<typename> <identifier> ;
as the basis for declarations. The biggest of these, IMHO, is that it restricts language enhancements: the <typename> could be a user-defined name such as "await" or "yield", and then, right there, we kill any prospect of ever adding a new language keyword "await" (however that might be implemented, which is an unrelated question here).

So keyword-driven languages are an obvious choice for an extensible grammar; in the case of declarations we simply use a keyword, as PL/I, Kotlin, Rust and some other languages do.

Being able to specify values for type-related attributes is a huge help for extensibility too: bin(3), bin(99) and so on parameterize the concept of binary numbers, and the same idea can be extended to many other types.

If we declare a "string" in C we must indicate its length/size, so why is that flexibility restricted? Why can't it be used just as easily for numeric types or other types? Only the grammar matters here; once the grammar is defined there's no going back.
 


“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Siwastaja

  • Super Contributor
  • ***
  • Posts: 8888
  • Country: fi
Re: A new, hardware "oriented" programming language
« Reply #49 on: November 23, 2022, 04:30:24 pm »
I've only mentioned the kinds of things raised by very experienced hardware/software engineers that rely heavily on PIC, their needs, concerns and frustrations are real, not imagined, I am simply collating broad issues and concerns raised by people who find C restrictive or inflexible.

To me, you just sound like a case of Dunning-Kruger. Those who "rely heavily on PIC" are probably "very experienced engineers" only by a very flexible definition of "very experienced". Your posts have already demonstrated quite a serious lack of basic knowledge*, probably because your learning is hindered by excessive self-esteem. I would say: lurk moar. This forum is a great resource; read and participate in discussions on the MCU & programming subforums for 5 more years. There is a lot to learn, and by reading comments from some people MUCH MUCH more experienced than your PIC friends, your priority list of things you would like to see in a "C replacement" would change significantly.

*) for example, totally mixing up what "native" types mean, which is pretty important in the context of a hardware-oriented language!

But sure, there is a niche of cases where writing cycle-accurate code on an 8-bit PIC is still relevant, and where a C replacement that helps do that with less work is also relevant. But the result would not be "a new, hardware oriented programming language"; it would be "a PIC-specific niche tool".

And sorry in advance for being blunt, don't take it too seriously, I just struggle to write in any other way, take it with a grain of salt.
« Last Edit: November 23, 2022, 04:32:36 pm by Siwastaja »
 
The following users thanked this post: JPortici

