Author Topic: Does anybody learn C any more?  (Read 38879 times)


Online coppercone2

  • Super Contributor
  • ***
  • Posts: 10840
  • Country: us
  • $
Re: Does anybody learn C any more?
« Reply #25 on: August 18, 2019, 06:25:31 pm »
It's 2019, people!
C is obsolete, we gotta learn LISP Smalltalk Java Haskell Intercal Rust :scared:

c-sharp ??
 

Offline Mechatrommer

  • Super Contributor
  • ***
  • Posts: 11713
  • Country: my
  • reassessing directives...
Re: Does anybody learn C any more?
« Reply #26 on: August 18, 2019, 06:40:42 pm »
It's 2019, people!
C is obsolete, we gotta learn LISP Smalltalk Java Haskell Intercal Rust :scared:
Why, in 2019, if they have to invent a new language, do they have to bring the fucking ";" statement terminator along? chr(13) and chr(10) have been used forever as carriage return and line feed; the ";" is a relic carried down from C/C++ :palm: Everything else in C/C++ is a good thing, except this (which makes C only 99.99% my favourite language). Why?
The use of a statement delimiter predates C by a long way. It was introduced because of problems found in early languages, like Fortran, where the end of the line meant the end of the statement. Fortran fixed this with...
For a new language there's no excuse, regardless... many newer languages such as Python don't require ";", and the cute, lovely, sweet BASIC hasn't required it for decades... any OS already uses some sort of CR and/or LF that could be interpreted as ";". End of statement could be handled with a single "END" keyword (or whatever) per program unit, not thousands of ";" we have to type on every line... really inefficient... I can accept ";" as a separator between "lines" of code placed on a single text-editor line, or for clarity if anybody chooses to, but not a ";" + CR/LF requirement when I don't want it. ";" + CR/LF can easily be interpreted as a few extra empty line(s)...

ps: currently programming in RobotC for something serious... :P
Nature: Evolution and the Illusion of Randomness (Stephen L. Talbott): It's now indisputable that... the organism's “expertise” contextualizes its genome, and it's nonsense to say that these powers are under the control of the genome being contextualized - Barbara McClintock
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9559
  • Country: gb
Re: Does anybody learn C any more?
« Reply #27 on: August 18, 2019, 07:18:09 pm »
It's 2019, people!
C is obsolete, we gotta learn LISP Smalltalk Java Haskell Intercal Rust :scared:
Why, in 2019, if they have to invent a new language, do they have to bring the fucking ";" statement terminator along? chr(13) and chr(10) have been used forever as carriage return and line feed; the ";" is a relic carried down from C/C++ :palm: Everything else in C/C++ is a good thing, except this (which makes C only 99.99% my favourite language). Why?
The use of a statement delimiter predates C by a long way. It was introduced because of problems found in early languages, like Fortran, where the end of the line meant the end of the statement. Fortran fixed this with...
For a new language there's no excuse, regardless... many newer languages such as Python don't require ";", and the cute, lovely, sweet BASIC hasn't required it for decades... any OS already uses some sort of CR and/or LF that could be interpreted as ";". End of statement could be handled with a single "END" keyword (or whatever) per program unit, not thousands of ";" we have to type on every line... really inefficient... I can accept ";" as a separator between "lines" of code placed on a single text-editor line, or for clarity if anybody chooses to, but not a ";" + CR/LF requirement when I don't want it. ";" + CR/LF can easily be interpreted as a few extra empty line(s)...

ps: currently programming in RobotC for something serious... :P
The terminator for a C statement is ";", not ";" + CR/LF.
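To spell that out with a minimal, made-up snippet: the compiler treats everything up to the ";" as one statement, no matter how many line breaks are inside it.
Code: [Select]
#include <stdio.h>

int main(void)
{
    /* One statement spread over three source lines; only the ';' ends it. */
    int total = 1 +
                2 +
                3;

    printf("%d\n", total);   /* prints 6 */
    return 0;
}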
 

Offline Kleinstein

  • Super Contributor
  • ***
  • Posts: 14859
  • Country: de
Re: Does anybody learn C any more?
« Reply #28 on: August 18, 2019, 08:01:35 pm »
With µCs, C is still pretty much the standard. Due to GCC one can use ADA or maybe even FORTRAN if one really wants, but not many do that. One can use C++ on larger µCs, but it is still slightly tricky to really use the C++ features rather than just taking the slightly more stringent checks while still writing C-like code.

Today's C99 or ISO C is quite different from the original C that was used to write early UNIX.

I don't like C very much, but it's a common evil one has to live with, just like Windows or x86: still around, though technically full of  :palm: :wtf: |O.
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15439
  • Country: fr
Re: Does anybody learn C any more?
« Reply #29 on: August 18, 2019, 08:19:56 pm »
one can use ADA or maybe even FORTRAN if one really wants, but not many do that

ADA, yes; FORTRAN I've never seen used on MCUs. If you have any related project in mind, I'd be curious to see it. ;D
I don't remember much of FORTRAN, but I'd guess it would not be easy to directly access memory/registers with it. If there is a way, again I'd be curious.

I don't like C very much, but it's a common evil one has to live with, just like Windows or x86: still around, though technically full of  :palm: :wtf: |O.

Well, not to get into yet another fruitless language war, but I don't really see the point in "still around". It's a tool and it does the job.
Forks are still around as eating utensils; should we use connected Bluetooth things instead to put food in our mouths? Hammers are still around to bang in nails; does that make them obsolete? ;D
« Last Edit: August 18, 2019, 08:21:42 pm by SiliconWizard »
 

Offline techman-001

  • Frequent Contributor
  • **
  • !
  • Posts: 748
  • Country: au
  • Electronics technician for the last 50 years
    • Mecrisp Stellaris Unofficial UserDoc
Re: Does anybody learn C any more?
« Reply #30 on: August 18, 2019, 08:29:37 pm »
For the hobbyist, C was the first higher level language available on early 8080 and Z80 machines.  So, no, I didn't learn it in the last few years.  I have been using it for nearly 40 years (starting around '80).  I preferred Pascal but that came along a bit later.

I would still be inclined to use C as a language for embedded programming simply because I don't believe in dynamic memory allocation (heap) on memory limited embedded processors.  I can avoid the standard string functions (that use a heap) and everything works out fine.

C and Fortran are the only languages I actually use. I have tried, off and on, over the last 20 years or so to develop an interest in C++ and I just can't quite 'get it'. It's not the mechanics of the language (well, OK, it is), it's the 'why' I can't wrap my head around. Same with Python. I want to like it, I admire the built-in structures, but then they bastardize the syntax with indent levels. Not that I don't indent code levels, I just want some kind of delimiters, like '{' and '}'.

I'm just too old to learn new stuff and, given that the old stuff is still in use, have no real desire to push beyond my comfort zone.  It's not like I need (or even want) a job.

I can relate to the languages you describe in much the same way, being an 'old timer' myself. I also liked Fortran and Cobol when I tried them about 20 years ago.

However, C wasn't the first higher-level language available on early 8080 and Z80 machines (as also mentioned by Rick Law in a later post):

https://www.forth.com/resources/forth-programming-language/
In 1976, Robert O. Winder, of RCA’s Semiconductor Division engaged FORTH, Inc. to implement Forth on its new CDP-1802 8-bit microprocessor [Rather, 1976b], [Electronics,1976]. The new product, called “microFORTH,” was subsequently implemented on the Intel 8080, Motorola 6800 and Zilog Z80, and sold by FORTH, Inc. as an off-the-shelf product. microFORTH was successfully used in numerous embedded microprocessor instrumentation and control applications in the United States, Britain and Japan.
« Last Edit: August 18, 2019, 08:31:32 pm by techman-001 »
 

Offline 0culus

  • Super Contributor
  • ***
  • Posts: 3032
  • Country: us
  • Electronics, RF, and TEA Hobbyist
Re: Does anybody learn C any more?
« Reply #31 on: August 18, 2019, 08:30:48 pm »
Long live C! Long live buffer overflows!  :-DD
 

Offline coppice

  • Super Contributor
  • ***
  • Posts: 9559
  • Country: gb
Re: Does anybody learn C any more?
« Reply #32 on: August 18, 2019, 08:36:57 pm »
For the hobbyist, C was the first higher level language available on early 8080 and Z80 machines.  So, no, I didn't learn it in the last few years.  I have been using it for nearly 40 years (starting around '80).  I preferred Pascal but that came along a bit later.
...
...

If you consider BASIC and FORTH high-level languages, both were available for CP/M on the 8080. The Z80 was a clone (a functional clone, not a hardware clone) of the 8080 with many enhancements, so the 8080 predates the Z80. I had BASIC on my SOL-20 (8080) in 1977, before C was available for the 8080; CP/M BASIC was the standard then.

I don't rule out that C was out there for the 8080 before '77, but I was religiously reading the main computer magazine available at the time, Byte Magazine, and I don't recall any advertisements for C compilers. Anything available on CP/M would likely have been before DOS.

Altair BASIC was made by Microsoft as Altair-Microsoft BASIC in 1975, before DOS existed. Microsoft did not make MS-DOS until IBM contracted them to create DOS for the IBM PC (which used the 8088, a mixed 8/16-bit improvement on the 8-bit-only 8080). That was when MS purchased an OS from Seattle Computer Products and turned it into DOS for the IBM PC. The IBM PC with PC DOS was introduced in 1981. So Microsoft BASIC pre-dates MS-DOS and PC-DOS.

Running in the under-64K of RAM the PC could use for DOS, any C compiler back then was primitive. You cannot consider C then to be anything like C later, when "extended memory" came into play. I think the Greenleaf C compiler was the first "real" industry-standard C compiler for the PC. It was later purchased by Microsoft as the first version of Microsoft C, which then became the industry standard.

Now if you are talking about "Structured Language", I agree with you.  BASIC and FORTH are high level languages, but not Structured Languages.  Borland Pascal came much later than C.
The first high level language available on most early microprocessors was a stripped down form of PL/1. PL/1 was really gaining traction in the early 70s, but was too big and complex a language to be fully implemented on small machines. So, Motorola, Intel and a number of others came up with MPL, PL/M and various other names for their own stripped down PL/1 dialect.
 

Offline magic

  • Super Contributor
  • ***
  • Posts: 7246
  • Country: pl
Re: Does anybody learn C any more?
« Reply #33 on: August 18, 2019, 09:45:44 pm »
Well, not to get into yet another fruitless language war, but I don't really see the point in "still around". It's a tool and it does the job.
Forks are still around as eating utensils; should we use connected Bluetooth things instead to put food in our mouths?
They both work but one has surely done more accidental damage than the other :D
The entirety of <string.h>, for example, should never have been created. I would really prefer it if everybody had their own string library; at least life would be harder for hackers, and maybe a precious few of those libraries wouldn't be as bad as the standard one.

Or the syntax of declarations: it's horrible. Just recently I thought I was very smart because I knew how to declare a function returning int[2], but it didn't work. "Expected something something", very helpful, thank you. I spent a minute trying different combinations of parentheses even though I was quite sure I had got it right the first time. I finally gave up, went to the Internet, and it turns out that fixed-size arrays are the only type which cannot actually be returned from a function :wtf:
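For anyone else who runs into this, here is a minimal sketch of the two usual workarounds: wrap the array in a struct (which is returned by value), or have the caller own the array and return a pointer to it. The names pair, minmax and fill are invented for the example, and the declaration of fill shows exactly the syntax being complained about.
Code: [Select]
#include <stdio.h>

/* C won't let a function return int[2] directly, but an array wrapped
   in a struct is copied out by value just fine. */
struct pair { int v[2]; };

static struct pair minmax(int a, int b)
{
    struct pair p = { { a < b ? a : b, a < b ? b : a } };
    return p;
}

/* The other escape hatch: fill a caller-owned array and return a
   pointer to it. The return type is "pointer to array of 2 int". */
static int (*fill(int (*out)[2], int a, int b))[2]
{
    (*out)[0] = a;
    (*out)[1] = b;
    return out;
}

int main(void)
{
    struct pair p = minmax(7, 3);
    int buf[2];
    fill(&buf, 1, 2);
    printf("%d %d %d %d\n", p.v[0], p.v[1], buf[0], buf[1]);  /* 3 7 1 2 */
    return 0;
}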
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9937
  • Country: us
Re: Does anybody learn C any more?
« Reply #34 on: August 18, 2019, 09:59:10 pm »
PL/M was available to the well-financed, not to hobbyists.  Digital Research did release PL/I for the CP/M system but it was a good deal after C.  Think BDS C in '79 versus PL/I in 1983.  That was long after I wrote an 8080 Assembler using PL/I for a grad school requirement ('75).  IBM 360/45 - a truly grim machine but affordable at the time.

https://en.wikipedia.org/wiki/BDS_C
https://winworldpc.com/product/digital-research-pl-i-compiler/1x

FWIW, the PL/M compiler was written in Fortran as a cross compiler.

Yes, Altair BASIC was available MUCH earlier - like when the machine was released in '75 - but BASIC wasn't what I would consider a high-level language or a reasonable application language (being interpreted, you had to release your source, although later on there were some true compilers). There was also the requirement to actually purchase overpriced MITS memory boards in order to qualify to purchase Altair BASIC. I still have Bill Gates' article in a user group newsletter ranting about people stealing his software.

Let's not forget Li-Chen Wang's Palo Alto Tiny Basic ('75).  When I brought up a new 8085 system, this was one of the first things I would port.  The Intel Monitor was useful but I could get more done with Tiny Basic.

https://en.wikipedia.org/wiki/Li-Chen_Wang

UCSD Pascal came out around 1977, but I didn't get a copy until 1980. It was a much more practical language for business applications. It's still a terrific language and I'm happy to see it included in 2.11BSD for the PiDP11/70 project.

Things were pretty fluid in the late '70s and early '80s. Byte Magazine was important, but Dr. Dobb's was a more definitive source for what was happening.

I got my Altair in '76, shortly before finishing grad school. A couple of years later I had a home-built floppy disk controller (based on the Western Digital FD1771 chip) and dual drives. The datasheet says April '79; OK, I was an early adopter. I played with CP/M for a very long time, made some money writing custom BIOSes for folks and, in fact, I have it running on a 50 MHz Z80 with all 16 logical drives and most of the toys. I also got heavily involved with UCSD Pascal around '80. You would be amazed at how well CP/M 2.2 runs when clocked at 50 MHz!

It's been an interesting ride over the last 40 years.

All of the above in the context of microcomputers.  For mainframes:  PL/I was introduced in 1964, COBOL in 1959, Algol in 1958 and FORTRAN in 1957.  C is a newcomer, first released in 1972.  Just a "johnny come lately" entry to computer languages.  Let's not give it more credit than it's due.
« Last Edit: August 18, 2019, 10:06:33 pm by rstofer »
 
The following users thanked this post: techman-001

Offline Mechatrommer

  • Super Contributor
  • ***
  • Posts: 11713
  • Country: my
  • reassessing directives...
Re: Does anybody learn C any more?
« Reply #35 on: August 18, 2019, 11:29:46 pm »
Long live C! Long live buffer overflows!  :-DD
And the lazy bunch of programmers, aka codewarrior speed-demon wannabes. That can easily be overcome with a custom array/linked-list/tree class; Python or Java just have overbloated versions of that... 2 transactions per second on a 6 GHz, 4-12 core CPU, cough cough. People ask for more power because their simulation and graphics are lagging, and the programmer's excuse is that it's a math-intensive algorithm... long live the CPU ::)
Nature: Evolution and the Illusion of Randomness (Stephen L. Talbott): It's now indisputable that... the organism's “expertise” contextualizes its genome, and it's nonsense to say that these powers are under the control of the genome being contextualized - Barbara McClintock
 

Offline djacobow

  • Super Contributor
  • ***
  • Posts: 1169
  • Country: us
  • takin' it apart since the 70's
Re: Does anybody learn C any more?
« Reply #36 on: August 18, 2019, 11:42:18 pm »
I just joined a new company as a mixed hw / embedded firmware guy, and I was disappointed to find out their codebase is 100% straight C.

I've been using C++ for embedded projects for years, and though I don't really like C++ as an application programming language, it's actually pretty great for embedded. The key is to restrict yourself in what parts of C++ you use. Stay away from most of the STL and dynamic memory allocation in general. But templates instead of macros? Heck yeah. Default function arguments? Yes. Classes? Sure, when they're helpful. consts rather than defines? For sure.
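A tiny sketch of what that restricted subset can look like; kSystemClockHz, minOf and uartInit are invented names, not from any real HAL, and the actual register writes are left as comments.
Code: [Select]
#include <cstdint>

// constexpr instead of #define: typed, scoped, and visible in the debugger.
constexpr std::uint32_t kSystemClockHz = 16000000u;

// A template instead of a MIN() macro: no double evaluation of arguments.
template <typename T>
constexpr T minOf(T a, T b) { return (a < b) ? a : b; }

// Default arguments instead of a family of wrapper functions.
void uartInit(std::uint32_t baud = 115200, bool twoStopBits = false)
{
    const std::uint32_t divisor = kSystemClockHz / (16u * baud);
    (void)divisor;      // a real driver would write this to the baud-rate register
    (void)twoStopBits;  // ...and set the framing bits to match
}

int main()
{
    uartInit();            // 115200 baud, one stop bit
    uartInit(9600, true);  // explicit overrides
    return minOf(0, 1);
}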

« Last Edit: August 19, 2019, 03:01:04 am by djacobow »
 

Offline 0culus

  • Super Contributor
  • ***
  • Posts: 3032
  • Country: us
  • Electronics, RF, and TEA Hobbyist
Re: Does anybody learn C any more?
« Reply #37 on: August 19, 2019, 12:14:34 am »
Long live C! Long live buffer overflows!  :-DD
And the lazy bunch of programmers, aka codewarrior speed-demon wannabes. That can easily be overcome with a custom array/linked-list/tree class; Python or Java just have overbloated versions of that... 2 transactions per second on a 6 GHz, 4-12 core CPU, cough cough. People ask for more power because their simulation and graphics are lagging, and the programmer's excuse is that it's a math-intensive algorithm... long live the CPU ::)

Lazy? Buffer overflows are literally the #1 cause of vulnerabilities, have been for decades, and C is literally responsible for it. We can do a lot better and still be fast, e.g. Rust.

I'm not saying C is all bad, but when you start composing complex systems out of it (such as an OS kernel), you are asking for trouble. Because programmers, even the best ones...and this might blow your mind...are humans.
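For readers who haven't seen the failure mode spelled out, a minimal sketch (greet_unsafe and greet_safe are invented names): the unchecked copy is the entire bug, and stating the destination size turns an overflow into a truncation.
Code: [Select]
#include <stdio.h>
#include <string.h>

/* The classic bug: the caller controls the length, the callee never checks it. */
void greet_unsafe(const char *name)
{
    char buf[16];
    strcpy(buf, name);                    /* overflows buf for any name of 16+ chars */
    printf("hello %s\n", buf);
}

/* Same job with the size stated explicitly; snprintf truncates instead of smashing. */
void greet_safe(const char *name)
{
    char buf[16];
    snprintf(buf, sizeof buf, "%s", name);
    printf("hello %s\n", buf);
}

int main(void)
{
    greet_unsafe("ok");                   /* fine only because this input is short */
    greet_safe("a string much longer than sixteen characters");
    return 0;
}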
 

Offline hamster_nzTopic starter

  • Super Contributor
  • ***
  • Posts: 2812
  • Country: nz
Re: Does anybody learn C any more?
« Reply #38 on: August 19, 2019, 12:23:04 am »
I just joined a new company as a mixed hw / embedded firmware guy, and I was disappointed to find out their codebase is 100% straight C.

I've been using C++ for embedded projects for years, and though I don't really care for C++ as an application programming language, it's actually pretty great for embedded. The key is to restrict yourself in what parts of C++ you use. Stay away from most of the STL and dynamic memory allocation in general. But templates instead of macros? Heck yeah. Default function arguments? Yes. Classes? Sure, when they're helpful. consts rather than defines? For sure.

I mostly agree with the "C with a few helpful bits from C++" philosophy - what are your thoughts on using virtual functions, virtual classes and base classes/inheritance in general for embedded work? I flip-flop between liking and hating it.

Dynamic memory allocation is pretty hard to avoid if you get passed external data to process (e.g. my current joy is JSON files). My own view is that you are better off developing a nice "the way we do things here" standard, so everybody is in agreement on what patterns are good and what patterns are bad, allowing oversights to be spotted quickly. Also, having tools to see what is going on is good - e.g. a custom wrapped malloc()/free() that lets you know everything during development is a good idea, IMO.
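In that spirit, a minimal sketch of such a development-time wrapper; dbg_malloc, dbg_free and the MALLOC/FREE macros are invented for the example, it assumes C11 for max_align_t, it is not thread-safe, and a real one would probably also track high-water marks.
Code: [Select]
#include <stdio.h>
#include <stdlib.h>
#include <stddef.h>

/* Header stored in front of each allocation so free() knows its size.
   The union keeps the pointer handed back to the caller aligned for any type. */
typedef union { size_t size; max_align_t align; } alloc_hdr;

static size_t live_allocs = 0;
static size_t live_bytes  = 0;

void *dbg_malloc(size_t n, const char *file, int line)
{
    alloc_hdr *h = malloc(sizeof *h + n);
    if (!h) {
        fprintf(stderr, "%s:%d malloc(%zu) FAILED\n", file, line, n);
        return NULL;
    }
    h->size = n;
    live_allocs++;
    live_bytes += n;
    fprintf(stderr, "%s:%d malloc(%zu) -> %p (live: %zu blocks, %zu bytes)\n",
            file, line, n, (void *)(h + 1), live_allocs, live_bytes);
    return h + 1;
}

void dbg_free(void *p, const char *file, int line)
{
    if (!p)
        return;
    alloc_hdr *h = (alloc_hdr *)p - 1;
    live_allocs--;
    live_bytes -= h->size;
    fprintf(stderr, "%s:%d free(%p) (live: %zu blocks, %zu bytes)\n",
            file, line, p, live_allocs, live_bytes);
    free(h);
}

/* Use these instead of calling malloc()/free() directly during development. */
#define MALLOC(n) dbg_malloc((n), __FILE__, __LINE__)
#define FREE(p)   dbg_free((p), __FILE__, __LINE__)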
Gaze not into the abyss, lest you become recognized as an abyss domain expert, and they expect you keep gazing into the damn thing.
 

Offline rhodges

  • Frequent Contributor
  • **
  • Posts: 339
  • Country: us
  • Available for embedded projects.
    • My public libraries, code samples, and projects for STM8.
Re: Does anybody learn C any more?
« Reply #39 on: August 19, 2019, 12:45:49 am »
I just joined a new company as a mixed hw / embedded firmware guy, and I was disappointed to find out their codebase is 100% straight C.
I would be glad to read your reasons for this. I usually read the assembly output from my code to "trust but verify" that it is correct. How do you verify that your C++ output is correct? Is it your experience with the compiler that helps you recognize which patterns produce good code?

My limited experience with embedded C++ is with KDE, where one of our programmers had a performance issue. I looked into the code, and there were several similar, but slightly different code paths, and they busted the I-Cache. A simple C function with ONE code path would have reduced the cache problem.

In the embedded world, LOOK AT YOUR ASSEMBLY CODE when time is an issue. Of course, before writing embedded code, READ AND KNOW the architecture.
Currently developing STM8 and STM32. Past includes 6809, Z80, 8086, PIC, MIPS, PNX1302, and some 8748 and 6805. Check out my public code on github. https://github.com/unfrozen
 

Offline Rick Law

  • Super Contributor
  • ***
  • Posts: 3487
  • Country: us
Re: Does anybody learn C any more?
« Reply #40 on: August 19, 2019, 01:12:28 am »
If you have picked up plain old ANSI C in the last few years, how and why did you learn it?

I originally learnt C as it was the standard compiled 3GL for PC programming during the 16-bit era (along with maybe Pascal). I still use it often and respect it as the Swiss Army knife of programming languages.

What would drive people to take the leap from the highly abstracted world of things like Python to the 'madness' of C today?

It can't just be Arduino and embedded development...

(I'm currently writing a REST API service in C for only one reason - performance. Python gave 2 transactions per second, my C implementation gives 200... Most likely with a bug count to match!)

Now that I have unloaded on some of the issues I disagreed with... a good time to segue back to the OP.

I learned C by myself, but I learned FORTRAN in college.

Here is why:
By the time I learned C, I had already learned a few other languages by myself, including assembler and ALGOL. I learned C because it has clean structured-language constructs like ALGOL, but I couldn't find ALGOL for a small machine at a self-paid purchase price affordable to a student. Due to budget, I was really only looking at C, BASIC and FORTH as choices. In my experience, a structured language is a lot easier to develop and maintain than an unstructured one, particularly when requirements change. With C being a structured language and light on resource requirements at run time, it was the best choice. My programming work was not directly work/study related, but for fun. I did FORTH and BASIC just for the hack of it, but with no apps I wanted to keep.

By the time budget Pascal (Borland) became available, I had no need for it. Besides, I felt it was too resource-heavy.

I did a few years (5 < n < 10) of professional programming (mostly in C) as a path to management. Programming continued to be something I did for fun when time permitted.

I learned C++ somewhere along the way, but frankly I found it too wasteful of resources. With my background (being used to resource constraints), I use C++ when I must, but avoid it when I can - and I include "reuse" and maintenance as part of the C vs. C++ decision. I think if you are a good C programmer, there is nothing to gain by using C++ over C.
 

Offline rsjsouza

  • Super Contributor
  • ***
  • Posts: 6071
  • Country: us
  • Eternally curious
    • Vbe - vídeo blog eletrônico
Re: Does anybody learn C any more?
« Reply #41 on: August 19, 2019, 01:48:19 am »
On the note of structured/unstructured languages, I recall how cool it was the first time I saw Borland's Turbo Basic 1.0. Pretty neat, although at that time I already had my feet deep in Pascal.
Vbe - vídeo blog eletrônico http://videos.vbeletronico.com

Oh, the "whys" of the datasheets... The information is there not to be an axiomatic truth, but instead each speck of data must be slowly inhaled while carefully performing a deep search inside oneself to find the true metaphysical sense...
 

Offline djacobow

  • Super Contributor
  • ***
  • Posts: 1169
  • Country: us
  • takin' it apart since the 70's
Re: Does anybody learn C any more?
« Reply #42 on: August 19, 2019, 03:08:29 am »
I just joined a new company as a mixed hw / embedded firmware guy, and I was disappointed to find out their codebase is 100% straight C.

I've been using C++ for embedded projects for years, and though I don't really care for C++ as an application programming language, it's actually pretty great for embedded. The key is to restrict yourself in what parts of C++ you use. Stay away from most of the STL and dynamic memory allocation in general. But templates instead of macros? Heck yeah. Default function arguments? Yes. Classes? Sure, when they're helpful. consts rather than defines? For sure.

I mostly agree with the "C with a few helpful bits from C++" philosophy - what are your thoughts on using virtual functions, virtual classes and base classes/inheritance in general for embedded work? I flip-flop between liking and hating it.

Dynamic memory allocation is pretty hard to avoid if you get passed external data to process (e.g. my current joy is JSON files). My own view is that you are better off developing a nice "the way we do things here" standard, so everybody is in agreement on what patterns are good and what patterns are bad, allowing oversights to be spotted quickly. Also, having tools to see what is going on is good - e.g. a custom wrapped malloc()/free() that lets you know everything during development is a good idea, IMO.

There are times you can't avoid it, but you can do stuff to restrain the pain. One thing I have done is force libraries to use a pool I have pre-allocated; at least when I'm done with that library, I can reclaim the pool. JSON is a pain, I agree, but I have to say it makes development on the "other" side so much better that I'm happy to take the hit on the embedded side. There are some tiny low-footprint JSON libraries out there. However, years ago when memory was super tight, I wrote a simple library to parse JSON strings "in situ" with no tree structure created anywhere. Every time you wanted something from the JSON stream you had to rescan the string, but in our case performance wasn't so important. The library is here; I haven't used it in years. It's not heavily tested, but it worked for my purposes. (https://github.com/djacobow/djs)
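A minimal sketch of that kind of pre-allocated pool; the names and the 4 KB size are arbitrary, it assumes C11 for alignof/max_align_t, and a real library hook would also need a matching (possibly no-op) free.
Code: [Select]
#include <stddef.h>
#include <stdint.h>
#include <stdalign.h>

/* Bump-pointer pool: the library allocates out of a static buffer,
   and everything is reclaimed at once with a single reset. */
#define POOL_SIZE 4096u

static uint8_t pool[POOL_SIZE];
static size_t  pool_used = 0;

void *pool_alloc(size_t n)
{
    /* Round the request up so returned pointers stay aligned for any type. */
    const size_t a = alignof(max_align_t);
    const size_t rounded = (n + a - 1) & ~(a - 1);

    if (rounded > POOL_SIZE - pool_used)
        return NULL;            /* pool exhausted: fail loudly, no silent heap fallback */

    void *p = &pool[pool_used];
    pool_used += rounded;
    return p;
}

void pool_reset(void)
{
    pool_used = 0;              /* "frees" everything the library allocated */
}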

I'm working on a personal project right now on an 8-bit AVR where I have made use of inheritance and virtual functions, and so far... no regrets. The alternative was function pointers, and this is just the same thing, but neater. Well, maybe not neater, as I think the syntax is probably more typing, not less. But I get nice type checking and a warm feeling that everything fits together.
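A sketch of that trade-off with made-up driver names: the abstract base class does exactly what a struct of function pointers would, but the compiler builds the table and refuses to compile a driver that forgets or mistypes a method.
Code: [Select]
#include <cstdint>
#include <cstddef>

// The "function pointer table" written as a class.
struct Driver {
    virtual void init() = 0;
    virtual void write(std::uint8_t b) = 0;
    virtual ~Driver() = default;
};

struct UartDriver : Driver {
    void init() override                { /* configure the UART registers here */ }
    void write(std::uint8_t b) override { (void)b; /* push b into the TX register */ }
};

struct SpiDriver : Driver {
    void init() override                { /* configure the SPI peripheral here */ }
    void write(std::uint8_t b) override { (void)b; /* clock b out over SPI */ }
};

// Code that only knows the interface, just like the function-pointer version would.
void send(Driver &d, const std::uint8_t *buf, std::size_t len)
{
    for (std::size_t i = 0; i < len; ++i)
        d.write(buf[i]);
}

int main()
{
    UartDriver uart;                     // statically allocated, no heap involved
    uart.init();
    const std::uint8_t msg[] = { 'o', 'k' };
    send(uart, msg, sizeof msg);
    return 0;
}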
 

Offline djacobow

  • Super Contributor
  • ***
  • Posts: 1169
  • Country: us
  • takin' it apart since the 70's
Re: Does anybody learn C any more?
« Reply #43 on: August 19, 2019, 03:12:40 am »
I just joined a new company as a mixed hw / embedded firmware guy, and I was disappointed to find out their codebase is 100% straight C.
I would be glad to read your reasons for this. I usually read the assembly output from my code to "trust but verify" that it is correct. How do you verify that your C++ output is correct? Is it your experience with the compiler that helps you recognize which patterns produce good code?

My limited experience with embedded C++ is with KDE, where one of our programmers had a performance issue. I looked into the code, and there were several similar, but slightly different code paths, and they busted the I-Cache. A simple C function with ONE code path would have reduced the cache problem.

In the embedded world, LOOK AT YOUR ASSEMBLY CODE when time is an issue. Of course, before writing embedded code, READ AND KNOW the architecture.

I will sometimes compile with -S to see what the compiler is doing, and I am more likely to do it for performance-critical code, but I'll be honest and say I don't do it that much these days. Most of the code is not performance-critical, and my productivity is more important. You never know for sure that a compiler is doing what you told it to do (or even what you thought you told it to do), but I'm not sure that's a solid reason not to use compilers and more advanced languages. Of course, unit tests help build confidence in the output. I will also cross-compile on my host when I can and run unit tests in a hosted environment, if it's not too much trouble to get working. That helps with the correctness of your code, but it doesn't help at all if the target compiler is screwing up in a way your host compiler doesn't.
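As an illustration of the hosted-unit-test idea, a self-invented minimal sketch: the code under test (a little saturating add) touches no hardware, so the same file compiles on the PC and the asserts run there long before the target ever sees it.
Code: [Select]
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Target-side code under test: no register access, so it builds anywhere. */
uint8_t sat_add_u8(uint8_t a, uint8_t b)
{
    unsigned sum = (unsigned)a + b;
    return (sum > 255u) ? 255u : (uint8_t)sum;
}

/* Host-side test runner: compile this file natively and just run it. */
int main(void)
{
    assert(sat_add_u8(0, 0) == 0);
    assert(sat_add_u8(100, 100) == 200);
    assert(sat_add_u8(200, 100) == 255);   /* saturates instead of wrapping */
    assert(sat_add_u8(255, 255) == 255);
    puts("all tests passed");
    return 0;
}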
 

Offline thermistor-guy

  • Frequent Contributor
  • **
  • Posts: 390
  • Country: au
Re: Does anybody learn C any more?
« Reply #44 on: August 19, 2019, 03:14:44 am »
...
The first high level language available on most early microprocessors was a stripped down form of PL/1. PL/1 was really gaining traction in the early 70s, but was too big and complex a language to be fully implemented on small machines. So, Motorola, Intel and a number of others came up with MPL, PL/M and various other names for their own stripped down PL/1 dialect.

I have fond memories of developing embedded code in PLM80 on 80C51 family devices and variants. The compiler was very predictable, so, with experience, you could write the source in a way that gave the execution times you needed.

Right now I'm learning Python, for a quantitative finance application. C# and Python seem to be pervasive in that world.

I'd happily go back to C for embedded applications, if it felt like the right tool for the job.
 

Offline magic

  • Super Contributor
  • ***
  • Posts: 7246
  • Country: pl
Re: Does anybody learn C any more?
« Reply #45 on: August 19, 2019, 07:20:12 am »
All of the above in the context of microcomputers.  For mainframes:  PL/I was introduced in 1964, COBOL in 1959, Algol in 1958 and FORTRAN in 1957.  C is a newcomer, first released in 1972.  Just a "johnny come lately" entry to computer languages.  Let's not give it more credit than it's due.
C deserves credit for being the first portable macro assembler and for enabling probably the first portable operating system. It also did away with all those stupid special statements and commands which plague old imperative languages, replacing them with plain functions written in C. You can implement the full C standard library in C; try that with Pascal's writeln. The languages you listed really are DSLs in comparison; I would never call COBOL or FORTRAN a "general purpose" language. Maybe Algol, if it's true what they say about its similarities to Pascal, particularly the kinds of Pascal that have been adapted to low-level programming by adding pointers and whatnot.

Buffer overflows are literally the #1 cause of vulnerabilities, have been for decades, and C is literally responsible for it. We can do a lot better and still be fast, e.g. Rust.

I'm not saying C is all bad, but when you start composing complex systems out of it (such as an OS kernel), you are asking for trouble.
As usual, no C thread is safe from the Rust Evangelism Strikeforce.

Here's a quick one for you fanboys: I have a billion short-lived heap objects which need to be indexed in a few long-lived trees/hashmaps - by reference, of course; I'm not going to make ten copies of each.

How does Rust help me prevent dangling references, other than "hire sufficiently competent programmers, use unsafe and employ some code review and unit testing, as you would in C++, but please teach them this new esoteric language because C++ is so last year".

programmers [...] are humans
You would be surprised. Some are robots.
« Last Edit: August 19, 2019, 07:25:55 am by magic »
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15439
  • Country: fr
Re: Does anybody learn C any more?
« Reply #46 on: August 19, 2019, 02:17:23 pm »
Well, not to get into yet another fruitless language war, but I don't really see the point in "still around". It's a tool and it does the job.
Forks are still around as eating utensils; should we use connected Bluetooth things instead to put food in our mouths?
They both work but one has surely done more accidental damage than the other :D

And you may be surprised which it is. :D

https://www.cpsc.gov/s3fs-public/hazard_housewares.pdf
 

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Does anybody learn C any more?
« Reply #47 on: August 19, 2019, 03:05:52 pm »
Because programmers, even the best ones...and this might blow your mind...are humans.

You are right. The only reason I still use C is that I can "do Rust" only on x86 (and maybe on Arm), while I can "do C" on MIPS (where LLVM is weak), HPPA (where LLVM is unsupported) and PPC (where LLVM is experimental). That is my problem  :-//
« Last Edit: August 19, 2019, 03:12:12 pm by legacy »
 

Offline rstofer

  • Super Contributor
  • ***
  • Posts: 9937
  • Country: us
Re: Does anybody learn C any more?
« Reply #48 on: August 19, 2019, 03:11:38 pm »

I'm not saying C is all bad, but when you start composing complex systems out of it (such as an OS kernel), you are asking for trouble. Because programmers, even the best ones...and this might blow your mind...are humans.

And yet C is used for all the main OS kernels on the planet. I don't keep up - I am just too old - but is there a single OS of any consequence written in Rust? I tend to think of OS kernels as the most complex code around, and Rust is purported to be a systems programming language. There should be some example OSes out there somewhere.

Buffer overflows were always due to sloppy code, not the C language, and certainly not the standard libraries. Many string functions have a 'size' parameter to prevent overrun; it's up to the programmer to use the functions correctly. Use strncpy(), not strcpy().

Better yet, write your own versions and, among other things, they probably won't use the heap.  Win-win!  And you will know that they are thread safe.  Win-win-win!
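In the write-your-own spirit, a minimal sketch (bounded_copy is an invented name, similar in intent to BSD's strlcpy): unlike strncpy() it always null-terminates, it never pads, and it touches no heap. If truncation matters, compare the return value with strlen(src).
Code: [Select]
#include <stddef.h>

/* Copy at most dstsize-1 characters and always null-terminate.
   Returns the number of characters actually copied. */
size_t bounded_copy(char *dst, const char *src, size_t dstsize)
{
    size_t i = 0;

    if (dstsize == 0)
        return 0;               /* nothing we can safely write */

    while (i < dstsize - 1 && src[i] != '\0') {
        dst[i] = src[i];
        i++;
    }
    dst[i] = '\0';
    return i;
}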

Sometimes you have to wonder:  How did the fellows at Bell Labs ever create Unix and C at the same time (more or less)?
 
The following users thanked this post: PlainName, SiliconWizard

Offline legacy

  • Super Contributor
  • ***
  • !
  • Posts: 4415
  • Country: ch
Re: Does anybody learn C any more?
« Reply #49 on: August 19, 2019, 03:27:14 pm »
Sometimes you have to wonder:  How did the fellows at Bell Labs ever create Unix and C at the same time (more or less)?

The first Unix had only 1% of the complexity of a modern Linux kernel. The PDP-11, compared to a modern RISC (multi-core PPC), has just a fraction of the complexity, and it had no problems with unaligned access, cache coherence, etc. The datasheet for the Freescale e500 is 4000 pages!!! The latest Clements book about the m68k is no more than 400 pages including exercises and teaching material, and the first UNIX was not designed with SMP, AMP, threads, and all the complexity that we have nowadays. The latest Linux 5.3 for HPPA2 is 22 MB with everything compiled static; XINU, which is a near heir of UNIX, is no more than 512 KB on a 68020 machine.
 

