Leaving ubiquity and popularity and availability aside, what is it about C that gives it any kind of status as a systems programming language?
I think it was the concept of a pointer, *, which often resolved to a memory address, and that address could be just about anything. Fun things like **thing were simple to write (but maybe not so simple to understand). Also the inclusion of the full set of bit operators: and, or, xor, shifts. The standard library includes system-type functions like malloc(). A lot of C maps well to machine instructions. And the lack of any nanny code like array bounds checking made it fast (but dangerous).
It persists because it is still dangerous. Many languages try to avoid the danger and impose rules that are 95% helpful and 5% frustrating.
In my opinion.
Well, pointers were there in languages that influenced the design of C.
To take a less well known example, consider Algol68. INT x states that x is an immutable integer; REF INT x states that x is a reference to an integer, which is a variable; REF REF INT x states that x is a reference to a reference to an integer, i.e. a pointer. In Algol 68 the types [10]INT, [10]REF INT, REF [10]INT and REF [10]REF INT are all distinct types, and all useful. They closely correspond to the types declared by these C declarations:
typedef const int i[10];      /* [10]INT         */
typedef int *const ir[10];    /* [10]REF INT     */
typedef int (*ri)[10];        /* REF [10]INT     */
typedef int *(*rir)[10];      /* REF [10]REF INT */
http://www.cap-lore.com/Languages/aref.html
I wonder if the OP realises that Algol68 has many of the features he deems desirable, e.g. "if" can be a variable, and it has a symbol set (with alternatives) that works on the various character sets available on computers at the time. And those are just the start. https://opensource.com/article/20/6/algol68 Truly, Algol is an improvement on most of its successors.
I suspect the OP is unaware of https://rosettacode.org/wiki/Rosetta_Code which gives the same program in many different languages. Examples: there are 68 implementations of Dijkstra's algorithm, and bitwise operations implemented in 144 languages.
That really ought to give him something to "compare and contrast" before committing to something for his language.
Algol68 was the language that most influenced PL/I, which I've spoken about at length here. There was also a build of it that accepted Russian keywords; I wonder if you were aware of that.
I was aware of the Russian Algol68 variant, thanks. PL/I never made an impression on this side of the pond; the nearest was a brief flurry of interest in PL/M in the late 70s.
Are you claiming there were no computer installations in the UK that used PL/I?
You seem to be interested in expressing bit operations. Which of the 144 examples are you thinking of following? Do you prefer the VHDL or the Smalltalk concepts? If none of the 144, what will be in your 145th variant?
I'm considering symbolic operators for left/right logical shift, right arithmetic shift, rotate left and rotate right, and possibly population count; is that what you're asking me?
I have to ask, politely: would you please stop insinuating that I am "unqualified", in some way or other, to discuss this subject? Repeatedly making disparaging remarks and insulting comments is really not what I expect in any forum like this.
It isn't necessarily disparaging to regard someone as unqualified; we have all been unqualified. The key point is what any such person does or does not do to become more qualified. "Learning the literature/history" is necessary.
The key point is that you are conflating (no doubt intentionally) one's credentials with one's argument. You attack a person's arguments or ideas not with counterarguments but with insults and demeaning comments. One doesn't attack or undermine an argument by attacking or undermining the person presenting it. This is a common fallacy committed by inexperienced or disingenuous debaters, those who prefer rhetoric to logic.
If I made some claim to a mathematician, say, developed some proof of some conjecture, the mathematician would never taint their evaluation of that proof by judging my personality, my qualifications, what literature I might have read, or what experiences I might have had; he'd argue and respond on the basis of facts and logic. Consider this from Noam Chomsky:
In my own professional work I have touched on a variety of different fields. I've done my work in mathematical linguistics, for example, without any professional credentials in mathematics; in this subject I am completely self-taught, and not very well taught. But I've often been invited by universities to speak on mathematical linguistics at mathematics seminars and colloquia. No one has ever asked me whether I have the appropriate credentials to speak on these subjects; the mathematicians couldn't care less. What they want to know is what I have to say. No one has ever objected to my right to speak, asking whether I have a doctor's degree in mathematics, or whether I have taken advanced courses in the subject. That would never have entered their minds. They want to know whether I am right or wrong, whether the subject is interesting or not, whether better approaches are possible—the discussion dealt with the subject, not with my right to discuss it.
Emphasis mine.
You might want to give his words some serious thought, perhaps get more familiar with the literature as it were.
I went to the trouble in my very first post of mentioning that I was experienced with compiler design and development. I anticipated that some might be skeptical, that a language is a large undertaking and not to be treated glibly; I know all that, and therefore summarized what I had done before.
Yet you continue to insinuate I'm some kind of idiot.
Implementing a compiler and selecting which concepts to use/discard when creating a language have very little in common.
How did you establish that view? It seems to be an opinion, and an imprecise one at that. It also now seems you don't doubt my capacity to design and build a real working compiler, but you do doubt my ability to "select concepts" to include in a new language. I can select whatever I want to select, though; I started the thread.
From your statements in this thread, it appears that your experience is limited to a relatively narrow subset of the conceptual types of languages in existence[1].
Which of my "statements" are you interpreting that way? This sounds like a "No true Scotsman" argument in the making frankly.
You fail to engage with suggestions that would widen (possibly beneficially) your understanding. For example, you've completely ignored the possibility of your language including one facility that is very useful in resource-constrained embedded systems: fixed point arithmetic.
Would you like me to show you the posts where I advocated fixed-point binary and decimal types and arithmetic? Perhaps you overlooked them. If I showed them to you, they'd prove you wrong on this specific point, wouldn't they? Seriously, just say the word and I'll show you those posts.
I don't remember you discussing how multicore/multithread parallel processing will be expressed in your language; that will be critically important in the future.
If you want to discuss that, then do so. Have you asked me specifically about this area and been ignored? Why do you think I should be raising things that you regard as relevant? Just raise them. Berating me for not raising points that you want to raise seems, frankly, pathetic.
I try to become less inexperienced in the things that matter to me. Knowing how and when to use languages that are fundamentally and conceptually very different is one such case. OTOH, implementing a compiler is completely uninteresting to me.
So, just to be very clear: you've never personally designed and written lexical analyzers, grammars, parsers, code generators or optimizers, then?
[1] e.g. C, Pascal, PL/M, Delphi and Coral-66 are all procedural languages with little difference between them. Conceptually very different languages are VHDL, LISP, Smalltalk, Forth, OCCAM/xC, etc.
I think you meant that the former are "imperative" languages, but no matter; different languages are different. I'm glad you brought that to my attention.