Author Topic: The Imperium programming language - IPL  (Read 86827 times)


Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #150 on: November 25, 2022, 06:18:13 pm »
Isn't it obvious that all the implementation dependent behaviour isn't standardised?!

What weird mental acrobatics. It being in the standard means it is standardized! The standard basically says: we did not define the effect of this operation, so don't do that unless you really know what you are doing.

Would be very nice if implementation-defined behavior was not needed in the standard, but it's still much better to have standardized implementation-defined behavior than the standard being completely silent about some corner cases. This makes the C standard good, compared to poorly standardized languages where you can't know what happens by reading the standard.

Quote
And then there's all the undefined behaviour

Exactly the same for UB! This is the strong point of C: UB being standardized, so just read and understand the standard, don't do UB, and you are good to go.

Having such a standard is not obvious at all.

Quote
Surgeons have to be rigorously trained and pass exams, and are regularly monitored by formal regulatory bodies. Anybody can call themselves a programmer and practice [sic] it without being monitored.

But anyone can also buy surgical knives freely and do whatever they want with them. And doing something with C where human lives are at stake, say designing an autopilot for a passenger aircraft, also requires at least some form of monitoring by formal regulatory bodies. Anyone is free to use a surgical knife to peel an apple, even if that poses a risk of cutting oneself. My surgeon analogy is thus valid.

Well, regarding the latter: avionics software development processes must be certified by the FAA as DO-178C compliant. For this reason most avionics software is written in Ada, not C, not C++.

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15439
  • Country: fr
Re: A new, hardware "oriented" programming language
« Reply #151 on: November 25, 2022, 06:44:48 pm »
You say "talk is cheap" yet that entire post is nothing but anecdotes, how about facts, metrics, objective data?

FWIW, to me the single objective data point is that I professionally develop firmware for embedded devices in C, have absolutely no problem with it whatsoever, and do not complain about it lacking anything critical or generating critical bugs for me. I haven't stepped on any C footgun in years; I just make the classical bugs (wrong assumptions, logical mistakes) I would make in any language, and not too many. I also don't mind supporting my code; it's not fire-and-forget.

I know a few others in my field who also do not struggle. But I'm not going to give names, just like you are not giving the names of your "very experts" who struggle with PICs.

Yeah, ditto.
 

Offline pcprogrammer

  • Super Contributor
  • ***
  • Posts: 4411
  • Country: nl
Re: A new, hardware "oriented" programming language
« Reply #152 on: November 25, 2022, 07:24:27 pm »
Seems like the Python thread all over again.

A "better" programming language will not help you become a better programmer. It is not about the language; it is the mindset that makes a good programmer. The ability to understand why something does not work as intended, and then fix it, is what counts.

For me, plain C without libraries and HAL works very well on bare-metal embedded projects. Yes, you have to do everything yourself, but over time you build up your own set of code that can be used over and over again. The benefit of this is that you know what everything does.

I have seen things go sideways with, for example, C#, where a lot is done for you, and it can take a lot of effort to fix things when the problem turns out to be a bug in the libraries that come with it. I have also had to work around bugs in JavaScript that took a lot of time to trace. Even with C++, with bounds checks performed, I have seen memory leaks within the libraries.

But like others wrote, if you like doing something like this, go ahead; I just expect that not a lot of the expert embedded developers will want to become testers of your new language.

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #153 on: November 25, 2022, 07:44:26 pm »
Would be very nice if implementation-defined behavior was not needed in the standard, but it's still much better to have standardized implementation-defined behavior than the standard being completely silent about some corner cases.
In my opinion, it is the implementation-defined behaviour that lets C evolve users-first, as opposed to standards folk dictating the evolution to users and compiler developers.  It is closely intertwined with the C extensions, but I see IB as giving leeway to change, and extensions as new growth.

Long-winded explanation of the basis of that opinion:

As I mentioned before, I use extensions to the C and C++ language, especially when they are supported by more than one compiler on the target hardware.
I also often rely on POSIX C (which adds to the standard library, but also requires specific behaviour in some places where the C standard itself is more relaxed).

In certain cases, I do rely on implementation-defined behaviour, when the compiler I use explicitly defines what its behaviour is.  It is okay, as long as the behaviour expectation is well described in the code, documentation, and build instructions.

In the case of GCC, for character-set stuff, first check the preprocessor's implementation-defined behaviour (and limits); the C compiler's implementation-defined behaviour is documented separately in the GCC manual.

GCC implementation-defined behaviour I routinely rely on (but knowing it is implementation-defined):
  • Signed integers use two's complement format, with no extraordinary bit patterns.
  • Signed integer bit shifts << and >> have no UB aspects.  They are performed on the underlying bit pattern, with right shift sign-extending on signed types ("arithmetic right shift") and zero-extending on unsigned types.  When overflow occurs, the excess bits are simply discarded.
  • Converting a value to a signed integer type that cannot represent it uses modulo arithmetic, i.e. wraps.  (The other direction, to an unsigned type, is already defined as modulo arithmetic by the C standard itself.)
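For illustration, a minimal sketch (mine, not part of the original list) of code that is well-defined only under these documented GCC guarantees:

Code: [Select]
#include <stdio.h>

int main(void)
{
    /* Arithmetic right shift: implementation-defined in ISO C,
       but GCC documents sign extension on signed types. */
    int x = -8;
    printf("%d\n", x >> 1);   /* -4 with GCC */

    /* Converting a too-large value to a signed type:
       implementation-defined in ISO C; GCC documents modulo wrapping. */
    unsigned int u = 0xFFFFFFF0u;
    printf("%d\n", (int)u);   /* -16 with GCC and 32-bit int */

    return 0;
}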
As to the extensions, I use:
  • statement expressions in function-like macros, to evaluate macro arguments only once (especially with the typeof keyword extension);
  • __float128 (usually via libquadmath, though);
  • array qualifiers working like pointer qualifiers, so that a type var[n] can be used where a const type var[n] is expected;
  • void * consuming any type of pointer in variadic argument lists;
  • initialization of objects with static storage duration by compound literals with constant elements;
  • function, variable, and type attributes;
  • the fallthrough hint in case statements;
  • deprecated and unavailable attributes in enums;
  • '\e' as an alias for '\033';
  • __alignof__ in C90/C99, like C11 _Alignof;
  • extended asm (especially in static inline helper functions defined in header files);
  • vector extensions;
  • __atomic built-ins with the C++11-compatible memory model;
  • other built-ins like __builtin_constant_p(), __builtin_types_compatible_p(), __builtin_expect(), __builtin_assume_aligned(), __builtin_prefetch();
  • and __thread as a thread-local storage-class keyword.
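To make the first of those concrete, a sketch of mine (GNU C only, not ISO C) of a single-evaluation MAX() macro built from a statement expression and __typeof__:

Code: [Select]
/* GNU C: the statement expression plus __typeof__ evaluate each
   argument exactly once, unlike the classic ternary macro
   ((a) > (b) ? (a) : (b)), which evaluates one argument twice. */
#define MAX(a, b)                \
    ({ __typeof__(a) a_ = (a);   \
       __typeof__(b) b_ = (b);   \
       a_ > b_ ? a_ : b_; })

/* MAX(x++, y) now increments x exactly once. */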

If we look at how the C standard evolved up to C99, implementation-defined behaviour let compiler developers and their users add new, useful features.  For example, the __thread TLS keyword cannot be implemented without compiler support, but it makes a major difference to multithreaded programming in POSIX C.  C11 was an unfortunate sidestep, and C17 added nothing new, but C23 seems to be back on a sane track, in my opinion.
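To illustrate the __thread example just mentioned (a minimal sketch of mine; C11 later adopted the idea as _Thread_local), a per-thread variable becomes a single storage-class keyword, with the compiler, linker, and runtime doing the heavy lifting:

Code: [Select]
/* GNU C __thread: every thread sees its own copy, with no
   pthread_key_create()/pthread_getspecific() boilerplate. */
static __thread unsigned events_handled;

void count_event(void)
{
    events_handled++;   /* thread-safe by construction */
}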

Undefined behaviour is the "here be dragons" part of C: unexplored country.  Most of it covers cases where mandating any particular behaviour would place undue implementation limits on the compilers.  For example, when multiple pre- or post-increment or decrement operators are applied to the same variable in a single expression –– say, ((++x*y++)-(++x*++y)).  You could dictate a specific behaviour, but that would severely limit the mechanisms C compilers can use to evaluate such expressions.  In particular, most currently use an Abstract Syntax Tree model and apply expression optimization to the tree before conversion to machine code, and it is likely that any left-to-right ordering is lost in the optimization phase.  If you dictated an order based on left-to-right position, ASTs would need an ordering property, and any requirements related to that would likely conflict with the optimization passes.  No, better to leave these undefined behaviour, unless someone comes up with a convincing use case that does not suffer from such limits.
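To see the point, consider this two-liner (my sketch, not from the post); GCC diagnoses it with -Wsequence-point, and Clang calls the equivalent warning -Wunsequenced:

Code: [Select]
int x = 1, y = 2;
/* Undefined behaviour: x and y are each modified twice with no
   intervening sequence point, so an AST-based optimizer is free
   to evaluate the subexpressions in any order it likes. */
int r = (++x * y++) - (++x * ++y);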
 
The following users thanked this post: Siwastaja, DiTBho

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #154 on: November 25, 2022, 09:50:52 pm »
There is no surprise that there are engineers quite content with C, or even committed to C. That is something I fully expect. But their presence does not somehow invalidate attempts to explore alternatives; that C has a huge following does not alter the fact that there are weaknesses in the language, and that some of those weaknesses can be avoided by an improved language and grammar.

Java is the new COBOL, yet it too has a dedicated following, committed devotees.

This thread is simply aimed at those people who are interested enough in languages, and in pushing the envelope, to speculate on improvements and suggest ideas that might be relevant to a new MCU-oriented language.

So I get it, I get that some consider this pointless, but some of us do not, so can we stop complaining about C being critiqued and press on with positive contributions?


“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #155 on: November 25, 2022, 09:58:07 pm »
Seems like the Python thread all over again.

A "better" programming language will not help you become a better programmer. It is not about the language; it is the mindset that makes a good programmer. The ability to understand why something does not work as intended, and then fix it, is what counts.

For me, plain C without libraries and HAL works very well on bare-metal embedded projects. Yes, you have to do everything yourself, but over time you build up your own set of code that can be used over and over again. The benefit of this is that you know what everything does.

I have seen things go sideways with, for example, C#, where a lot is done for you, and it can take a lot of effort to fix things when the problem turns out to be a bug in the libraries that come with it. I have also had to work around bugs in JavaScript that took a lot of time to trace. Even with C++, with bounds checks performed, I have seen memory leaks within the libraries.

But like others wrote, if you like doing something like this, go ahead; I just expect that not a lot of the expert embedded developers will want to become testers of your new language.

What - specifically - do you find problematic with C#? Do you have an actual, real-world example you can share?

« Last Edit: November 25, 2022, 10:00:00 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #156 on: November 25, 2022, 10:03:58 pm »
Most of the complaining here is not about specific suggestions about language improvements or changes, but more about the fact that someone is actually daring to rock the boat, discuss the subject dispassionately, and point out the elephant in the room!
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11780
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #157 on: November 25, 2022, 10:32:48 pm »
Most of the complaining here is not about specific suggestions about language improvements or changes,
Once again, this is not the case. All your suggestions were addressed and discussed.

daring to rock the boat
You would rock the boat if you came up with even a prototype compiler or a coherent description of the proposed language. You are not the first to want a new, better language. Some people did the work and ended up with Zig and Rust and a ton of others. If you don't like them, propose what you would change. But if all you would change is the syntax for computed gotos, then don't blame us that we don't care.

Do the work, propose something concrete to discuss and we will gladly do so. Right now all we have is a set of random ideas.
« Last Edit: November 25, 2022, 10:35:01 pm by ataradov »
Alex
 
The following users thanked this post: Jacon, SiliconWizard

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #158 on: November 25, 2022, 10:36:51 pm »
Most of the complaining here is not about specific suggestions about language improvements or changes, but more about the fact that someone is actually daring to rock the boat, discuss the subject dispassionately, and point out the elephant in the room!

Not quite.

It is pointing out that if someone suggests jumping out of the frying pan, they should have good reasons why they aren't jumping into the fire. Or some similar tortured analogy!
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #159 on: November 25, 2022, 10:56:36 pm »
Most of the complaining here is not about specific suggestions about language improvements or changes,
Once again, this is not the case. All your suggestions were addressed and discussed.

daring to rock the boat
You would rock the boat if you came up with even a prototype compiler or a coherent description of the proposed language. You are not the first to want a new, better language. Some people did the work and ended up with Zig and Rust and a ton of others. If you don't like them, propose what you would change. But if all you would change is the syntax for computed gotos, then don't blame us that we don't care.

Do the work, propose something concrete to discuss and we will gladly do so. Right now all we have is a set of random ideas.

You're advised to reread the entire thread if you are under the illusion that nothing concrete has been proposed; that's probably why you're confused.

“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #160 on: November 25, 2022, 10:59:48 pm »
Most of the complaining here is not about specific suggestions about language improvements or changes, but more about the fact that someone is actually daring to rock the boat, discuss the subject dispassionately, and point out the elephant in the room!

Not quite.

It is pointing out that if someone suggests jumping out of the frying pan, they should have good reasons why they aren't jumping into the fire. Or some similar tortured analogy!

Sorry to point out, but the thread is not about frying pans or jumping; perhaps you're thinking of something else? This is what's known in debating circles as a strawman argument: rather than actually debating some specific thing I proposed or suggested, you prefer to attack an imaginary argument, disagreeing with something I never even said.

I thought I was dealing with engineers, dispassionate logical thinkers; instead I'm seeing rather a lot of emotion and confusion.

« Last Edit: November 25, 2022, 11:06:47 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11780
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #161 on: November 25, 2022, 11:06:57 pm »
Everything you proposed here is trivial and was already known; none of it is enough to justify making a new language or even extending the existing ones.

Most people here are engineers, and everyone participating in this thread has expressed their opinion on your proposals. This is engineering opinion based on people's experience with the existing tools. And the consensus seems to be that none of this is really necessary. It might be nice to have if it were already there, but it is not necessary. What else is there to discuss?

If you want more safety and better defined behaviour, use Rust. Many companies and individuals are switching to it instead of C.

But if you want to think that we all are wrong, then there is nothing we can do about it.
Alex
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #162 on: November 25, 2022, 11:16:34 pm »
Everything you proposed here is trivial and was already known; none of it is enough to justify making a new language or even extending the existing ones.

I disagree.

Most people here are engineers, and everyone participating in this thread has expressed their opinion on your proposals. This is engineering opinion based on people's experience with the existing tools. And the consensus seems to be that none of this is really necessary. It might be nice to have if it were already there, but it is not necessary. What else is there to discuss?

Even if this were true it matters not; this is an informal discussion about ways one might produce a new programming language for MCU use. That you disapprove is irrelevant; your opinions are noted. Why persist in these perpetual complaints? Why do you care?

If you want more safety and better defined behaviour, use Rust. Many companies and individuals are switching to it instead of C.

As I suspected, you are confused. Please show us all the post where I said I wanted "more safety and better defined behavior"; can you do that? No, because it is a false statement; there is no such post, you either imagined it or made it up. I do hope you code better than you troll.

« Last Edit: November 25, 2022, 11:23:38 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11780
  • Country: us
    • Personal site
Re: A new, hardware "oriented" programming language
« Reply #163 on: November 25, 2022, 11:22:43 pm »
this is an informal discussion about ways one might produce a programming language for MCU use.
As this thread shows, no one here is interested in a language specifically for MCUs. Why would we be? What if your next project needs an MPU as well (as increasingly happens)? Why would we not want to use the exact same tool set?

"Elephant in the room" here is that we don't see a point in discussing this topic. Not because we are entrenched with C and refuse to even look at something else. Most of us looked at other stuff a lot, and in the end C ends up being the best option when you consider everything, not just a very narrow use case of programming low level PICs.
Alex
 

Offline Sherlock HolmesTopic starter

  • Frequent Contributor
  • **
  • !
  • Posts: 570
  • Country: us
Re: A new, hardware "oriented" programming language
« Reply #164 on: November 25, 2022, 11:28:27 pm »
this is an informal discussion about ways one might produce a programming language for MCU use.
As this thread shows, no one here is interested in a language specifically for MCUs. Why would we be? What if your next project needs an MPU as well (as increasingly happens)? Why would we not want to use the exact same tool set?

If you believe that, then why not just ignore the thread instead of clogging it with your pessimism and vitriol? Do you really have nothing better to do with your time?

"Elephant in the room" here is that we don't see a point in discussing this topic. Not because we are entrenched with C and refuse to even look at something else. Most of us looked at other stuff a lot, and in the end C ends up being the best option when you consider everything, not just a very narrow use case of programming low level PICs.

We're done, young man. I've been working with electronics, radio, and computers since before you were born. You're doing a disservice to your employer, Microchip, with this intolerant (dare I say arrogant) and rude behavior in a public forum; you are, like it or not, a representative of Microchip, and at this point I am far from impressed with your manner. You can whine and whinge about me all you like; I have nothing more to say to you. Good evening.

Now, getting back to the actual subject...



« Last Edit: November 25, 2022, 11:43:09 pm by Sherlock Holmes »
“When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth.” ~ Arthur Conan Doyle, The Case-Book of Sherlock Holmes
 

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4538
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #165 on: November 25, 2022, 11:58:35 pm »
Most of the complaining here is not about specific suggestions about language improvements or changes, but more about the fact that someone is actually daring to rock the boat, discuss the subject dispassionately, and point out the elephant in the room!

Bullshit.

It's frequently my job to "rock the boat" by proposing changes and improvements to processes, libraries, programming languages, and even new machine code instructions. You can find my name for example in the credits for the RISC-V base instruction set and the B and V extensions.

Your proposed changes address ONE aspect of the problem:

- does the change make a programmer's life slightly more convenient?

You totally ignore every other aspect, such as:

- can the proposed feature be efficiently implemented on the target devices?

- is it a significant improvement vs using a library (function, macro) or maybe code generator instead?

- is the product of the improvement and the frequency of use sufficient to justify the incremental cumulative increase in complexity of the language, compiler, manuals, training?
 

Online Nominal Animal

  • Super Contributor
  • ***
  • Posts: 6967
  • Country: fi
    • My home page and email address
Re: A new, hardware "oriented" programming language
« Reply #166 on: November 26, 2022, 02:04:11 am »
There is no surprise that there are engineers quite content with C, or even committed to C.
When talking about low level programming languages –– which is what I understand 'a hardware "oriented" programming language' to mean ––, C is just the one with the best proven track record, decades long.  It isn't that great; it's just the 'benchmark' for others, due to its widespread use and its role in systems programming and embedded development.

For examples of nearly bug-free programs written in C in the systems programming domain, go check out D. J. Bernstein's djbdns, qmail, daemontools, cdb.  This is the guy behind Curve25519, having released it in 2005.

Like it, dislike it, doesn't matter, C is just a tool.  But as a tool, its features and track record are significant.  So are its deficiencies, and that means any real effort to do better is valuable.

In comparison, C# is a managed language. .NET Micro requires at least 256k of RAM. .NET nanoFramework requires at least 64k of RAM, and runs on Cortex-M and RISC-V (ESP32-C3) cores.  So, perhaps suitable for medium to large embedded devices, but decidedly unsuitable for small ARMs and anything less than 32-bit architectures.

Ada can be used to program AVR 8-bit microcontrollers (see AVR-Ada), but it is still relatively little used.  One possible reason is that while GCC GNAT is GPL3+ licensed with a runtime library exception, AdaCore sells GNAT Pro, and the FSF/GCC GNAT is seen as "inferior", with the "proper" version being the sole product of a commercial company.  (Or maybe that's just me.)

I get that some consider this pointless
No, that's not it at all.  Not pointless, more like bass-ackwards.  We want the results too, we just have seen your approach before leading to nowhere.  We're trying to steer you to not repeat that, but actually produce something interesting.

If you start a language design from scratch, you must understand the sheer number of design choices already made for existing languages.  The choices in languages that have survived use in anger are the ones that support a programming paradigm the users find intuitive and effective.

Why did DiTBho not start from scratch, and instead pare down C to a subset with some changes and additions, to arrive at their my-C, designed for strictly controlled and enforced embedded use cases?  Because they needed a tool fit for a purpose, and it was a straightforward way to achieve it.  Results matter.

Why did SiliconWizard's Design a better "C" thread 'not go anywhere'?  It just sprawled around, with individual features and other languages discussed.  In fact, it really showed how complicated and hard it is to do better than C from scratch; with other languages like Ada discussed but nobody knowing exactly why they never got as much traction as C.  Just consider this post by brucehoult about midway in the thread, about how C with its warts and all still maps to different hardware so well.

Me, I have worked on replacing the standard C library with something better.  Because the C standard defines the freestanding environment (where the C standard library is not available) in quite some detail, unlike say C++, which has the same concept but leaves it basically completely up to implementations to define what it means, this is doable.  I aim to fix many of the issues others have with C.  With C23 around the corner, the one change I think might actually make a difference is for arrays not to decay to pointers, and instead conceptually using arrays everywhere to describe memory ranges.  Even just allowing an array parameter's size to refer to a later variable in the same argument list would make it possible to replace buffer-overrun-prone standard library functions with almost identical replacements that let the C compiler detect buffer under- and overruns at compile time.  It would only take a small addition, perhaps a builtin, to make it possible to prove via static analysis that all memory accesses are valid.
In other words, I'm looking to change the parts of C that hinder me or others, not start from scratch.
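For flavour, C99 already allows a size to qualify an array parameter when the size is declared earlier in the argument list; the wish above is essentially to lift that ordering restriction so existing interfaces like memcpy() could be retrofitted.  A minimal sketch of today's state (mine, not Nominal Animal's actual library code):

Code: [Select]
#include <stddef.h>

/* C99 variably modified parameter: n must precede buf in the list.
   'static n' promises at least n accessible elements, which gives
   the compiler enough information to diagnose some too-small
   buffers at compile time. */
void fill_zero(size_t n, unsigned char buf[static n])
{
    for (size_t i = 0; i < n; i++)
        buf[i] = 0;
}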

Am I a C fanboi?  No.  If you look at my posting history, you'll see that I actually recommend using an interpreted language, currently Python, for user interfaces (for multiple reasons).

I currently use C for some embedded work (AVRs, mainly), and a mixed C/C++ freestanding environment for embedded ARM development; I also use POSIX C for systems programming in Linux (mostly on x86-64).  (I sometimes do secure programming, dealing with privileges and capabilities; I got some experience as a sysadmin at a couple of universities, making customized access solutions for cases like a playground with many users at different privilege levels, and subsections with their own admins, including sub-websites open to the internet.  It's not simple when you're responsible for ensuring nothing leaks that shouldn't.)

Okay, so if we believe that a ground-up design from scratch is unlikely to lead to an actual project solving the underlying problems OP (Sherlock Holmes) wants to solve, what would?

Pick a language, and a compiler, you feel you can work with.  It could be C, it could be Ada, it could be whatever you want.  Obviously, it should have somewhat close syntax to what you prefer, but it doesn't have to be an exact match.  I'll use C as the language example below for simplicity only; feel free to substitute it with something else.

Pick a problem, find languages that solve it better than C, or invent your own new solution.  Trace it down to the generated machine code, and find a way to port it back to C, replacing the way C currently solves it.  Apply it in real life, writing real-world code that heavily uses that modified feature.  Get other people to comment on it, and maybe even test it.  Find out if the replacement solution actually helps with real-world code.  That often means getting an unsuspecting victim, and having them re-solve a problem using the modified feature, using only your documentation of the feature as a guide.

Keep a journal of your findings.

At some point, you find that you have enough of those new solutions to construct a completely new language.  At this point, you can tweak the syntax to be more to your liking.  Start writing your own compiler, but also document the language the compiler works with, precisely.  As usual, something like ABNF is sufficient for syntax, but for the paradigm, the approach, I suggest writing additional documentation explaining your earlier findings, and the solution approach.  Small examples are gold here.  The idea is that other people, reading this additional documentation, can see how you thought, so they can orient themselves to best use the new language.

Theory is nice, but practical reality always trumps theory.  Just because the ABNF of a language looks nice doesn't mean it is an effective language.  As soon as you can compile running native binaries, start creating actual utilities – sort, grep, bc, for example – and look at the machine code the compiler generates.  Just because the code is nice and the abstractions just perfect does not mean they are fit for generating machine code.  Compare the machine code to what the original language and other languages produce, when optimizations are disabled (for a more sensible comparison).

During this process, do feel free to occasionally branch into designing your language from scratch.  If you keep tabs on your designs as your understanding evolves, you'll understand viscerally what the 'amount of design choices' I wrote above really means.  It can be overwhelming, if you think of it, but going at it systematically, piece by piece, with each design choice having an explanation/justification in your journal and/or documentation, it can be done, and done better than what we have now.

Finally: I for one prefer passionate, honest, detailed posts over dispassionate politically correct smooth-talk.
 
The following users thanked this post: MK14, Jacon, DiTBho, pcprogrammer

Offline Wilksey

  • Super Contributor
  • ***
  • Posts: 1329
Re: A new, hardware "oriented" programming language
« Reply #167 on: November 26, 2022, 02:46:49 am »
When I first saw the title I thought one was referring to an alternative to VHDL / Verilog.
Growing up in the '80s, I was a committed user of 6502 and Z80 assembler, having owned Acorn and Amstrad machines; the BASIC language set was just not cutting it for me.  Cue the '90s, and PCs became more available; we still had BASIC (in the form of MS QBASIC and Visual Basic) alongside the new-fangled world of C/C++ (Borland, Microsoft, Watcom, etc.), Pascal, and of course assembler.  Sure, these probably existed before then, but in my neck of the woods they weren't as widely discussed or represented in the likes of magazines until then.

Years later (cut to today) there are many languages available for the PC, but chips like microcontrollers and microprocessors still have only a few options: BASIC, Pascal, or C/C++; even assembler is less favoured by modern toolchains.

I would be interested to see what a "new" language would look like and what the uptake would be; I think C/C++ has just become the standard target tool when manufacturers bring out a new chip.

It is an interesting idea, and something like this must have been tried before, so why did it not take off, and what would be different about your offering versus a previous failed one?
I don't know if it would gain enough traction to disrupt the current market, as C is supported and understood on most if not all microprocessors/microcontrollers.
 
The following users thanked this post: MK14

Offline pcprogrammer

  • Super Contributor
  • ***
  • Posts: 4411
  • Country: nl
Re: A new, hardware "oriented" programming language
« Reply #168 on: November 26, 2022, 06:08:28 am »
What - specifically - do you find problematic with C#? Do you have an actual, real-world example you can share?

I left C# behind some 20 years ago, basically not long after it came out. Maybe it has improved over time, but for the work I did and do, I never needed or wanted to use it. I did work for a company that switched to it for the front end of their product, and they had a hard time getting it to work properly, but that is also long ago.

My origin in programming started with BASIC and moved on to Z80 assembler soon after, then 8051 and 6502 assembler. Loved it very much, because you are in control all the way. Started with C in 1996 and stuck with it ever since, apart from projects where C++, ASP, PHP, JavaScript or BasicScript were needed.

I don't feel the need for a language that protects me from making mistakes.

Edit:

With embedded work, what is useful depends on the microcontroller one is using. With only 20 KB of RAM, dynamic memory management can become a pain in the bum, and for most tasks the needed memory is known in advance and set up during programming. No dynamic allocation needed, no garbage collection that often fails, and so on.
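A minimal sketch (my example, not pcprogrammer's code) of what that looks like in practice: every buffer statically allocated and sized at compile time.

Code: [Select]
#include <stdint.h>

#define RX_BUFFER_SIZE  256u
#define SAMPLE_COUNT    128u

/* No heap: buffers are reserved at link time, so the map file
   shows worst-case RAM use up front (stack aside), and there is
   no allocator to fail or fragment at run time. */
static volatile uint8_t rx_buffer[RX_BUFFER_SIZE];
static uint16_t         samples[SAMPLE_COUNT];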

When diving into the Internet of Things this comes more into play, but then you start to need beefier microcontrollers with more memory to get things done. And sure, for a lot of programmers it will be beneficial to have a language that does all the hard work for them, but when the shit hits the fan, they will have an even harder time solving the problem. That is why I, and probably ataradov, like C so much. If you don't want it, you can keep all the "luxury" out the door.

For instance, with the Arduino setup you can write something up quickly and have it work, but the moment you need something special it becomes harder to realize, because of the lack of transparency. And yes, I have looked under the hood of the Arduino system, and it means diving into a lot of subdirectories and header files to find anything specific. Same with the STM32 HAL code. In the end I find it easier to just read the reference manual of a microcontroller and write code for it at the lowest level possible without resorting to assembler.
« Last Edit: November 26, 2022, 06:59:33 am by pcprogrammer »
 
The following users thanked this post: MK14

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #169 on: November 26, 2022, 09:11:25 am »
Most of the complaining here is not about specific suggestions about language improvements or changes, but more about the fact that someone is actually daring to rock the boat, discuss the subject dispassionately, and point out the elephant in the room!

Bullshit.

It's frequently my job to "rock the boat" by proposing changes and improvements to processes, libraries, programming languages, and even new machine code instructions. You can find my name for example in the credits for the RISC-V base instruction set and the B and V extensions.

Yup; my background was similar.

Quote
Your proposed changes address ONE aspect of the problem:

- does the change make a programmer's life slightly more convenient?

You totally ignore every other aspect, such as:

- can the proposed feature be efficiently implemented on the target devices?

That's the one that should kill most language proposals. Too often it doesn't, and the result is Domain Specific Languages rather than Domain Specific Libraries.

DSLanguages almost always grow until they become tangled and incomprehensible. They also lack tool support, which is a vital ecosystem component of widely used languages.

Quote
- is it a significant improvement vs using a library (function, macro) or maybe code generator instead?

- is the product of the improvement and the frequency of use sufficient to justify the incremental cumulative increase in complexity of the language, compiler, manuals, training?

Spot on.

Don't forget that it is difficult to employ staff to work on anything which won't help them to get another job.

However, creating your own language is fun, just as creating your own processor/ISA is fun. But don't expect other people to want to use it!
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #170 on: November 26, 2022, 09:26:04 am »
When diving into the Internet of Things this comes more into play, but then you start to need beefier microcontrollers with more memory to get things done. And sure, for a lot of programmers it will be beneficial to have a language that does all the hard work for them, but when the shit hits the fan, they will have an even harder time solving the problem. That is why I, and probably ataradov, like C so much. If you don't want it, you can keep all the "luxury" out the door.

For instance, with the Arduino setup you can write something up quickly and have it work, but the moment you need something special it becomes harder to realize, because of the lack of transparency. And yes, I have looked under the hood of the Arduino system, and it means diving into a lot of subdirectories and header files to find anything specific. Same with the STM32 HAL code. In the end I find it easier to just read the reference manual of a microcontroller and write code for it at the lowest level possible without resorting to assembler.

Yup.

I used a naked "8-pin Arduino" in my Vetinari clock, because that was the quick way to get an undemanding result.

But when making something that required very low power consumption, I simply wrote C to peek/poke the ATmega328 registers. That avoided having to work out how the Arduino libraries might be frustrating me.
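For readers who haven't worked at that level, a hedged sketch of the kind of register peek/poke meant here, assuming avr-gcc and <avr/io.h> (register names per the ATmega328 datasheet):

Code: [Select]
#include <avr/io.h>
#include <avr/sleep.h>

/* Cut power by writing the ATmega328's registers directly,
   with no Arduino library code in the way. */
static void enter_low_power(void)
{
    ADCSRA &= ~(1u << ADEN);                /* ADC off */
    PRR |= (1u << PRADC) | (1u << PRTIM1)   /* gate unused     */
         | (1u << PRSPI) | (1u << PRTWI);   /* peripheral clocks */

    set_sleep_mode(SLEEP_MODE_PWR_DOWN);
    sleep_enable();
    sleep_cpu();                            /* wake on interrupt */
    sleep_disable();
}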
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 
The following users thanked this post: MK14, pcprogrammer

Offline iMo

  • Super Contributor
  • ***
  • Posts: 5263
  • Country: ag
Re: A new, hardware "oriented" programming language
« Reply #171 on: November 26, 2022, 09:47:07 am »
A need for a hw-oriented language? There is Forth  :D
Reader's discretion is advised...
 
The following users thanked this post: newbrain, MK14

Offline brucehoult

  • Super Contributor
  • ***
  • Posts: 4538
  • Country: nz
Re: A new, hardware "oriented" programming language
« Reply #172 on: November 26, 2022, 10:02:32 am »
Why did SiliconWizard's Design a better "C" thread 'not go anywhere'?  It just sprawled around, with individual features and other languages discussed.  In fact, it really showed how complicated and hard it is to do better than C from scratch; with other languages like Ada discussed but nobody knowing exactly why they never got as much traction as C.  Just consider this post by brucehoult about midway in the thread, about how C with its warts and all still maps to different hardware so well.

I note that one of the desires there was "give easy access to the flags". I pointed out that many ISAs don't even *have* flags. A lot more people in this group have been gaining experience with such CPUs in the 16 months since that post :-) But even on ISAs that have flags, there are big differences. The DEC/Motorola/ARM group have a family similarity, with differences, but the Intel family is really quite different.

Access to the NZ flags (SZ on Intel) is trivial, so when people say they want access to the flags what they usually mean is carry (C,CY), sometimes overflow (V,OF), rarely parity or nibble carry (both on Intel).

Even for the people who just want Carry, I highly doubt they are prepared for the different implementations of the carry flag with respect to subtraction and compare.

And it's simply not hard to express what you ACTUALLY WANT in portable C anyway. The compiler will automatically use the flags where available.

Code: [Select]
unsigned long add_carry(unsigned long x, unsigned long y, unsigned long a, unsigned long b)
{
    /* x + y plus the carry out of (a + b); compilers map the comparison to adc. */
    return (a + b) < a ? x + y + 1 : x + y;
}

x86:

Code: [Select]
        mov     rax, rdi
        add     rdx, rcx
        adc     rax, rsi
        ret

Exactly what you'd hope for.

ARMv7:

Code: [Select]
        cmn     r2, r3
        it      cs
        addcs   r1, r1, #1
        add     r0, r0, r1
        bx      lr

The code could be better, actually, but at least it's branch-free.

ARMv8:

Code: [Select]
        cmn     x2, x3
        adc     x0, x1, x0
        ret

Fine.

RISC-V:

Code: [Select]
        not     a2, a2
        sltu    a2, a2, a3
        add     a0, a0, a1
        add     a0, a0, a2
        ret

No flags needed. Branch-free too.

Interesting that both ARM versions and RISC-V all chose to use "compare with negative" instead of adding, but that works too.


 
The following users thanked this post: newbrain, MK14, Nominal Animal, DiTBho

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #173 on: November 26, 2022, 10:10:05 am »
A need for a hw-oriented language? There is the Forth  :D

Once, 40 years ago, I had a couple of weeks to bring up a new peripheral created by someone else. To reduce the amount of typing, I implemented a quick and very dirty macro expansion tool. It grew a bit and was sufficient for the job, but its limitations were already becoming apparent. Fortunately it could be thrown away when the job ended, and it was.

I learned two things, which have stood the test of time:
  • tools need to be expandable when they hit their limits. Scrottly little domain specific languages always become cancerous through overgrowth
  • I should have used an existing tiny language sufficient for that job: Forth

I've seen that first lesson repeated (not by me!) several times over the succeeding decades.
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

Offline tggzzz

  • Super Contributor
  • ***
  • Posts: 20768
  • Country: gb
  • Numbers, not adjectives
    • Having fun doing more, with less
Re: A new, hardware "oriented" programming language
« Reply #174 on: November 26, 2022, 10:12:19 am »
Even for the people who just want Carry, I highly doubt they are prepared for the different implementations of the carry flag with respect to subtraction and compare.

And it's simply not hard to express what you ACTUALLY WANT in portable C anyway. The compiler will automatically use the flags where available.

Bignums aren't easy in portable C :)
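(For the record, the portable idiom is workable, just clumsy next to a native adc instruction; a sketch of mine of one limb-wise addition:)

Code: [Select]
#include <stdint.h>

/* r = a + b over 'limbs' 32-bit limbs; the carry out of each limb
   is recovered portably from unsigned wrap-around. */
void bignum_add(uint32_t *r, const uint32_t *a,
                const uint32_t *b, int limbs)
{
    uint32_t carry = 0;
    for (int i = 0; i < limbs; i++) {
        uint32_t s = a[i] + carry;
        uint32_t c = (s < carry);   /* carry from adding carry-in */
        r[i] = s + b[i];
        carry = c + (r[i] < s);     /* carry from adding b[i]     */
    }
}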
There are lies, damned lies, statistics - and ADC/DAC specs.
Glider pilot's aphorism: "there is no substitute for span". Retort: "There is a substitute: skill+imagination. But you can buy span".
Having fun doing more, with less
 

