Author Topic: Is ST Cube IDE a piece of buggy crap?  (Read 227460 times)


Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15444
  • Country: fr
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #450 on: March 20, 2022, 08:36:07 pm »
Yes yes. NULL and 0 are the same thing per the standard, so... as newbrain said, if it's not actually 0 behind the scenes, the compiler should silently handle this. (Now I've never encountered a platform for which the compiler would transform a 0 pointer to some other value than 0, but that may exist.)

Now obviously in cases where your own "invalid pointer" value should be something else - that you would want complete control over - just define your own constant and use it instead of NULL. That would avoid problems and avoid confusing everyone reading your code.

An example of that is in the Windows API, where the invalid "handle" value is not 0, but the largest integer that can be represented for a pointer. While "handles" are usually pointers, you're not supposed to know, and those are considered "opaque" types, so you shouldn't make any assumption. Defining your own pointer types, while often frowned upon, can be a decent alternative to limit confusion and not have to bother with what NULL or 0 really is for a pointer on your particular system, if that could be an issue.

 

Online nctnico

  • Super Contributor
  • ***
  • Posts: 28113
  • Country: nl
    • NCT Developments
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #451 on: March 20, 2022, 08:59:56 pm »
Things like if (!my_pointer)  yadayada_error(); is just bad coding since NULL can be defined as any value.
No, this is wrong on two counts:
  • !my_pointer is guaranteed to yield true if my_pointer is the NULL pointer.
  • NULL is defined as either the constant 0 or the same cast to (void*)

References (C11):
  • 6.5.3.3 Unary arithmetic operators, §5:
    The result of the logical negation operator ! is 0 if the value of its operand compares unequal to 0, 1 if the value of its operand compares equal to 0. The result has type int. The expression !E is equivalent to (0==E).
    Now, 0 in a pointer context is the NULL pointer so the expression will do the right thing.
  • 6.3.2.3 Pointers, §3:
    An integer constant expression with the value 0, or such an expression cast to type void *, is called a null pointer constant . 66) If a null pointer constant is converted to a pointer type, the resulting pointer, called a null pointer, is guaranteed to compare unequal to a pointer to any object or function.
    So NULL cannot be defined in other ways, and 0 is a perfectly good NULL pointer.

If the actual machine NULL pointer needs to be something different for whatever reason, it all must happen by compiler magic.

EtA: OTOH, if one does something like (uintptr_t)p and p is a NULL pointer, I don't see anything in the standard that would prevent getting something different from 0 (but I should check more carefully - there's definitely nothing in 6.3.2.3).
:-+
Still - in general - you have to ask yourself how robust your code is in terms of maintainability and portability across C compilers / C versions when you have to delve deep into the specification and rely on a specific C standard for your code to do what you think it should do. I prefer to code in a way that is as unambiguous as possible about what it should (or shouldn't) do.
« Last Edit: March 20, 2022, 09:04:08 pm by nctnico »
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Offline newbrain

  • Super Contributor
  • ***
  • Posts: 1773
  • Country: se
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #452 on: March 20, 2022, 09:40:20 pm »
Still -in general-,
[...]
relying on a specific C standard
In general, I agree (e.g. - in general - I eschew bit fields as they are a minefield of implementation defined behaviour).
In particular, the guarantees above have been in the language since C89 (and probably in K&R), so I take them for granted.
I admit I usually write !my_pointer and feel no remorse.
(Out of curiosity, I'll check tomorrow if we have a specific design rule at work - I program for fun, very seldom at work nowadays).
Nandemo wa shiranai wa yo, shitteru koto dake.
 

Offline NorthGuy

  • Super Contributor
  • ***
  • Posts: 3251
  • Country: ca
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #453 on: March 20, 2022, 09:50:51 pm »
That said, I could not completely avoid them, in things like this

Code: [Select]
data = 0;
for (address = 0x080e0000; address < 0x080fffff; address += 4)
{
    if ( (*(volatile uint32_t*)address) != data )
        error++;
    data += 4;
}

Compilers usually provide ways to declare an array located at a specified address. By using such an array, you can avoid using pointers.

You can even build linked lists, trees, or other things alike using array indices instead of pointers.
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15444
  • Country: fr
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #454 on: March 20, 2022, 10:20:07 pm »
Things like if (!my_pointer)  yadayada_error(); is just bad coding since NULL can be defined as any value. BTW: IMHO you should always use conditional statements with an explicit value anyway.

Yeah, I don't like that at all either. But not particularly because those are pointers; as you added, I dislike it in general. The reason lies in the confusion there has always been in C due to its original lack of a true boolean type, so that any expression could serve as a condition. This looks bad IMHO and is a recipe for obfuscation. But that comes largely from my preference for strongly-typed approaches.

Sure, C added booleans in C99 (which actually act as such, more or less), but since it does not enforce booleans for conditional expressions (for obvious compatibility with previous revisions - that would break a gigantic amount of existing code), and implicitly converts to and from integers, that doesn't change a whole lot. You can still combine booleans with integers and pointers and make a whole unreadable mess.

For fun, just look at this:
Code: [Select]
#include <stdbool.h>

bool foo(int n, void *p)
{
        bool b = n > 10;
        int m = b && p + 1;
        return m < 2;
}

Which would not even give a single warning even with "-Wall -Wconversion".
Perfectly valid C, but what the heck.  :-DD
« Last Edit: March 20, 2022, 10:26:54 pm by SiliconWizard »
 

Offline newbrain

  • Super Contributor
  • ***
  • Posts: 1773
  • Country: se
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #455 on: March 20, 2022, 11:42:18 pm »
Perfectly valid C, but what the heck.  :-DD
I beg to differ.
Had p been any pointer but a pointer to void, that code would have been compliant.
As it is, it violates the constraint found in 6.5.6 Additive operators, §2:
Quote
Constraints
For addition, either both operands shall have arithmetic type, or one operand shall be a pointer to a complete object type and the other shall have integer type. (Incrementing is equivalent to adding 1.)

By definition, void is not a complete type, so this is not valid C.
Note that it's not UB but just plainly wrong: a 'shall' that is not satisfied inside a constraint is a constraint violation, as opposed to outside one, where it would be UB.

Now, gcc is especially lenient here.
It will issue a warning with -std=c11 -pedantic <- That's in my default options. Maybe it says something about me  :blah:

gcc accepts arithmetic on void pointers as an extension (treating them as pointers to char) - as if C needed more type relaxation...
« Last Edit: March 20, 2022, 11:54:02 pm by newbrain »
Nandemo wa shiranai wa yo, shitteru koto dake.
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15444
  • Country: fr
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #456 on: March 21, 2022, 12:05:29 am »
Perfectly valid C, but what the heck.  :-DD
I beg to differ.
Had p been any pointer but a pointer to void, that code would have been compliant.
As it is, it violates the constraint found in 6.5.6 Additive operators, §2:
Quote
Constraints
For addition, either both operands shall have arithmetic type, or one operand shall be a pointer to a complete object type and the other shall have integer type. (Incrementing is equivalent to adding 1.)

By definition, void is not a complete type, so this is not valid C.
Note that it's not UB but just plainly wrong: a 'shall' that is not satisfied inside a constraint is a constraint violation, as opposed to outside one, where it would be UB.

Now, gcc is especially lenient here.
It will issue a warning with -std=c11 -pedantic <- That's in my default options. Maybe it says something about me  :blah:

gcc accepts arithmetic on void pointers as an extension (treating them as pointers to char) - as if C needed more type relaxation...

Ahah, it was just meant as a what-the-heck thing. Replace the void with any type you like. I just spat it out quickly. But it does show yet another pitfall. =)
And good catch, I'm sure many people wouldn't have noticed. But I think the point is taken.

I like C's flexibility, but IMHO, all those implicit conversions and type compatibility are atrocious from any reasonable point of view. And I think its flexibility wouldn't have greatly suffered without that.

Yes, gcc allows void * arithmetic by default, treating it as char *. It's pretty questionable. Clang does exactly the same by default (and like gcc, adding '-pedantic' gives a warning), essentially because it mimics gcc's options and defaults as much as it can, to be a drop-in replacement for gcc. (Talk about a way to push Clang/LLVM rather than strive to make it correct by default.) Anyway.
 
The following users thanked this post: newbrain

Offline abyrvalg

  • Frequent Contributor
  • **
  • Posts: 837
  • Country: es
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #457 on: March 21, 2022, 12:59:28 am »
I eschew bit fields as they are a minefield of implementation defined behaviour
Noticed that bitfields yield significantly more optimised code with GCC on Cortex-M; an example: https://godbolt.org/z/G5ErEfrMc
 

Offline NorthGuy

  • Super Contributor
  • ***
  • Posts: 3251
  • Country: ca
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #458 on: March 21, 2022, 01:15:33 am »
I like C's flexibility, but IMHO, all those implicit conversions and type compatibility are atrocious from any reasonable point of view.

Not from any. You view C as an HLL. If you view C as a macro assembler, these features are not horrible, but quite desirable.
 

Offline westfw

  • Super Contributor
  • ***
  • Posts: 4316
  • Country: us
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #459 on: March 21, 2022, 02:17:25 am »
I don't really understand the apparently fervent requirement for "never" having to "drop down" into ASM code to do things.

I wish "they" would spend less time "undefining" behaviors that people have used for years, and more time trying to come up with a standardized way to efficiently insert ASM code.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11780
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #460 on: March 21, 2022, 02:38:45 am »
All modern compilers implement bitfields the same way. The spec authors stopped paying attention to them a long time ago, but thankfully the industry has established reasonable and consistent behavior. And yes, they result in good code optimizations.

I would not worry about using them too much. If you have a compiler that somehow manages to get them wrong, then you are likely to have other issues anyway.
Alex
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15444
  • Country: fr
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #461 on: March 21, 2022, 03:08:36 am »
I eschew bit fields as they are a minefield of implementation defined behaviour
Noticed that bitfields yields significantly more optimised code with GCC on Cortex-M, an example: https://godbolt.org/z/G5ErEfrMc

That's interesting. I had taken a look at this for x86_64 and didn't notice a difference. I just took a look with your example, and indeed the code in both cases is rather similar.
But of course it all depends on the instruction set. With RISC-V, the code is also almost the same in both cases.

So that's something to keep in mind - yes, bit fields seem more efficient on Cortex-M targets. With GCC. With Clang/LLVM, the generated code is exactly the same in both cases!
« Last Edit: March 21, 2022, 03:10:15 am by SiliconWizard »
 

Offline abyrvalg

  • Frequent Contributor
  • **
  • Posts: 837
  • Country: es
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #462 on: March 21, 2022, 02:00:51 pm »
And this may actually be useful. I ran into an issue where I inadvertently used an uninitialized pointer and the compiler removed the whole body of the function, replacing it with a single "udf" instruction. If the code contains undefined behaviour - just mark it as such and replace it with a clear undefined instruction. This actually helped me to quickly find the source of the issue, as I got a clear and repeatable fault.
IMO this is an example of how deeply the C world is broken - a compiler plants a bomb in the code and we are happy about the explosion being clear and repeatable. Why not just raise a compile-time error?? Could someone show an example of deliberately leaving this "udf" in production code for good?
 

Offline newbrain

  • Super Contributor
  • ***
  • Posts: 1773
  • Country: se
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #463 on: March 21, 2022, 02:32:05 pm »
(Out of curiosity, I'll check tomorrow if we have a specific design rule at work - I program for fun, very seldom at work nowadays).
I checked, and no - no rule.
BUT: we follow SEI-CERT C rules and recommendations, and the notation if (!p) ... to check for NULL pointers is sometimes used in the "compliant" examples, though I admit the prevalent one is p != NULL.
So, six of one, half dozen of the other...
Nandemo wa shiranai wa yo, shitteru koto dake.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11780
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #464 on: March 21, 2022, 04:05:52 pm »
IMO this is an example of how deeply the C world is broken - a compiler plants a bomb in the code and we are happy about the explosion being clear and repeatable. Why not just raise a compile-time error??
I do agree with that. I would much rather see an error there. I have not looked deeper (and actually could not reproduce that behaviour in an artificial test later), but with -W -Wall there were no warnings of any sort. Maybe an even stricter set of flags would have worked.

And not being able to reproduce it easily also tells me that it is some very specific optimization that does this, not a coherent check. But at least it would be nice to see a message when the compiler decides to issue "udf" outside of an intentional assembly section.
« Last Edit: March 21, 2022, 04:08:23 pm by ataradov »
Alex
 

Offline NorthGuy

  • Super Contributor
  • ***
  • Posts: 3251
  • Country: ca
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #465 on: March 22, 2022, 05:11:08 pm »
IMO this is an example of how deeply the C world is broken ...

C is not broken. GCC is overdeveloped.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11780
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #466 on: March 22, 2022, 05:23:06 pm »
GCC faithfully implements the standard to the fullest extent. So, yes, it is the standard that is broken.
Alex
 
The following users thanked this post: nctnico

Offline NorthGuy

  • Super Contributor
  • ***
  • Posts: 3251
  • Country: ca
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #467 on: March 22, 2022, 06:37:07 pm »
GCC faithfully implements the standard to the fullest extent. So, yes, it is the standard that is broken.

There are different ways to implement the standard. Nothing prevents you from implementing it in a stupid, unreasonable way. I'll try to illustrate this with an example.

Standard says "undefined" is "behavior, upon use of a nonportable or erroneous program construct or of erroneous data, for which this International Standard imposes no requirements". IMHO "undefined" should be seen as a carte blanche for the compiler. The compiler does not need to worry about the consequences - it is up to the programmer.

For example, (x >> y) is undefined when y is 32 or greater (assuming 32-bit integers). The compiler does not need to worry about what happens when y >= 32. This lets it find the most efficient implementation: when the ISA has a shift instruction, the compiler can simply use that instruction. What the instruction produces for y >= 32 doesn't matter. If it did, the compiler would have to generate extra code to handle y >= 32 properly. Thus "undefined" allows the compiler to avoid generating unnecessary code. Such is the benefit of "undefined".

Another compiler may choose a different approach - it may verify whether y is indeed below 32, and if it isn't, produce some sort of prescribed "undefined" behaviour. As a result, instead of a single instruction, you'll get bloated code checking for a condition which never happens.

Both approaches formally "faithfully implement the standard to the fullest extent", but the former approach is reasonable while the latter is stupid.

 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11780
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #468 on: March 22, 2022, 06:42:40 pm »
Sure, and in the case of dereferencing a NULL pointer, the compiler chose to issue the most optimal code - a single "udf" instruction instead of a whole function that would return unpredictable results anyway. There is no run-time check; it could have checked for this statically.

There is no easy way to pass judgement on what is "stupid"  and what is not. It all depends on the situation and there is room to discuss.

My personal preference would be for the standard to define behaviour in all cases. The definition should make for a sensible implementation at least on the current platforms. But if not possible - then a bloated version should be implemented. But this goes back to the "high level assembler" vs a full programming language discussion.
Alex
 

Offline SiliconWizard

  • Super Contributor
  • ***
  • Posts: 15444
  • Country: fr
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #469 on: March 22, 2022, 06:59:42 pm »
What IS reasonable for a compiler is to issue proper warnings when encountering an undefined behavior per the standard. The rest is irrelevant IMO. It may implement UBs in the simplest/easiest/most "efficient" way, which sounds reasonable too, but I don't think it should really matter. Developers should just not leave UB in their code, otherwise it's their entire responsibility. If the punishment is very inefficient and convoluted code, then so be it. It's after all a possibly "reasonable" punishment.

Implementation-defined behavior is a different beast. Developers may make use of it in particular cases, and of course then they have to know what they're doing and what the compiler is doing exactly.

Sure UBs in general are questionable. But you're not going to change this in C. So the best approach is to just avoid them in your code. If you really don't want to have to avoid UBs as defined in C, your best bet is to use another language. The rest is not reasonable. In particular, insisting on coding with UBs and expecting a specific behavior from a specific compiler and a specific version of it is face-palming material.
 

Online peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 4161
  • Country: gb
  • Doing electronics since the 1960s...
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Offline NorthGuy

  • Super Contributor
  • ***
  • Posts: 3251
  • Country: ca
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #471 on: March 22, 2022, 07:54:32 pm »
My personal preference would be for the standard to define behaviour in all cases. The definition should make for a sensible implementation at least on the current platforms. But if not possible - then a bloated version should be implemented. But this goes back to the "high level assembler" vs a full programming language discussion.

"Undefined" situation occurs when you make an error or mistake. Regardless of what happens, your program will not work anyway. By the time you test everything, errors will be removed (to the best of your ability). Thus it doesn't matter what would happen if errors were still there.

If the error can be caught at compile time, it's a good idea to put up a warning. If it can only be determined at run time, the compiler needs to generate code to do this. If you like that idea, there are plenty of languages which do exactly that. Use them. There's no reason to insist that C must do the same.

I personally believe that the run-time checking is not capable of turning a poorly written software into a good one. Hence, I'd rather live without it. That's a matter of personal preference if you wish.
 

Online ataradov

  • Super Contributor
  • ***
  • Posts: 11780
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #472 on: March 22, 2022, 08:10:59 pm »
"Undefined" situation occurs when you make an error or mistake.
This depends on how you define a mistake.

As was mentioned before, if you write "a = x >> y" and the compiler can trace that "y" is always bigger than 32, it can remove the entire statement, leaving "a" with its previous value. This is nuts, but a perfectly legal optimization. And if this happens in a function where "y" depends on a parameter, the way you call the function may completely change the generated code. You wrote the function expecting big shifts to produce 0, but you don't have the right to expect this.

Again, most modern compilers will act reasonably in cases like this, so there are no real issues. But as this thread shows, pushing optimization levels will shift what is "reasonable" and you may experience issues in edge cases. This is not a big deal in a grand scheme of things.
Alex
 

Offline NorthGuy

  • Super Contributor
  • ***
  • Posts: 3251
  • Country: ca
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #473 on: March 22, 2022, 09:01:59 pm »
Again, most modern compilers will act reasonably in cases like this, so there are no real issues. But as this thread shows, pushing optimization levels will shift what is "reasonable" and you may experience issues in edge cases. This is not a big deal in a grand scheme of things.

I agree - the GCC maintainers are suckers for optimization. I think it's better to keep things reasonable rather than trying to squeeze out the last drop of optimization. GCC had already reached a reasonable level of optimization 15-20 years ago; after that it's all diminishing returns. But certainly not a big deal. Yet.
 

Offline newbrain

  • Super Contributor
  • ***
  • Posts: 1773
  • Country: se
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #474 on: March 22, 2022, 09:47:28 pm »
compiler can trace that "y" is always bigger than 32
[...]
 you don't have the right to expect this.
These are probably the two crucial points.

The compiler has limited knowledge (especially across translation units - we already had the discussion about link-time optimization), and only in some cases can it know whether a certain statement will end up in UB.
The standard was written with an eye not to impose a heavy burden on the generated code (e.g., integer overflow, array boundary checks etc.).
A compiler could implement all checks and be compliant (aborting the program, throwing an exception or error message, and when possible giving a warning at compile time are all acceptable behaviours for UB).

Other languages are stricter and will check a lot of things at runtime. Ariane 5, AFAICR, went down for having better checks*.

Static analysers are getting better and better at discovering "bad" code, but can't catch all UB yet.

Hence, about the rights, it's fundamental with C to know where your rights begin and end.
In this respect it requires a higher level of attention than most other languages, stricter coding standards, and a deeper understanding of the rules.

* If my memory does not fail me, a numeric overflow in an attitude sensor resulted in Ada exception printout data being sent on a control channel, wreaking havoc with guidance. Data from that sensor was not actually used in Ariane 5, but it was left there from Ariane 4 to simplify the design.
Nandemo wa shiranai wa yo, shitteru koto dake.
 

