Author Topic: Is ST Cube IDE a piece of buggy crap?  (Read 204971 times)


Online peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3832
  • Country: gb
  • Doing electronics since the 1960s...
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #475 on: March 22, 2022, 10:18:05 pm »
Quote
if you write "a = x >> y", and compiler can trace that "y" is always bigger than 32, it can remove the entire instruction leaving "a" to have its previous value

Surely a=0 in that case, no?

How can that be a valid optimisation?

Same with x << y.
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Online newbrain

  • Super Contributor
  • ***
  • Posts: 1742
  • Country: se
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #476 on: March 22, 2022, 10:40:15 pm »
Quote
if you write "a = x >> y", and compiler can trace that "y" is always bigger than 32, it can remove the entire instruction leaving "a" to have its previous value

Surely a=0 in that case, no?

How can that be a valid optimisation?

Same with x << y.
What I just said, know your rights (and when you don't have them).

From ISO/IEC 9899:2011, Chapter "6.5.7 Bitwise shift operators", §3:
Quote
[...]If the value of the right operand is negative or is greater than or equal to the width of the promoted left operand, the behavior is undefined.

Just for completeness, consider also §4 for shifts with signed integers:
Quote
[...]If E1 has a signed type and nonnegative value, and E1 × 2^E2 is representable in the result type, then that is the resulting value; otherwise, the behavior is undefined.

Why? Because not defining the behaviour makes C efficiently portable to architectures that would, e.g., throw when a shift greater than the word size is used.
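A minimal sketch of what §3 and §4 mean in practice (variable names are mine, not from the standard; this assumes a typical 32-bit int):

```c
#include <stdint.h>

/* Illustrative sketch of C11 6.5.7 §3/§4 (variable names are mine):
   both the shift count and, for signed left shifts, the magnitude
   of the result must be in range, or the behaviour is undefined.  */
static const uint32_t ok_unsigned = 1u << 31;  /* fine: count < 32 */
static const int      ok_signed   = 1 << 30;   /* fine: 2^30 fits a 32-bit int */

/* Both of the following would be UB, so they stay commented out:
   1u << 32   -- count == width of uint32_t (§3)
   1  << 31   -- 2^31 > INT_MAX on a 32-bit int (§4)               */
```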

EtA: I think the rationale (issued with C89, but updated for C99) should be compulsory reading (the standard, too). Then one can indulge in the extra guarantees and amenities of the common compilers, but it should be a reasoned choice (e.g., at work we have an explicit rule that allows gcc extensions).
« Last Edit: March 22, 2022, 10:49:55 pm by newbrain »
Nandemo wa shiranai wa yo, shitteru koto dake.
 

Online peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3832
  • Country: gb
  • Doing electronics since the 1960s...
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #477 on: March 22, 2022, 10:45:55 pm »
If x and y are uint32_t then a=0 has to be correct.

Anything else is just setting up traps.
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14943
  • Country: fr
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #478 on: March 22, 2022, 11:10:47 pm »
It's UB. So, you're setting your own trap.

Whether it's formally a "trap" depends on your POV. From a math POV, the result should be 0.
From a programming POV, shifting by more than the bit width of a number does not make sense.
So yeah, C definitely has many programming "quirks" regarding arithmetic. But C is not a math language. Matlab is. Kinda. =)
 
The following users thanked this post: newbrain

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11495
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #479 on: March 22, 2022, 11:23:43 pm »
If x and y are uint32_t then a=0 has to be correct.
Anything else is just setting up traps.
Well, we have what we have. The historical reason for this behaviour is that C was just a "high level assembly". And in that respect, ">>" would be translated to an underlying assembly instruction. And the result would be whatever the hardware does in that case. Most modern hardware would produce 0, but this was not the case in the 70s. So, C was originally specified to translate to the hardware as directly as possible. Nobody even thought about modern levels of optimization.

And this "reasonable" behaviour of the hardware is the result of the convergence between the hardware and the software. Nobody in their right mind would design hardware that is hard for the compilers to work with (Itanium, LOL). So, compilers and CPU architectures mostly came to an agreement, and many modern architectures include instructions to specifically match typical needs of a high level compiler. This is where defining new languages with less UB in them makes sense. But redefining that for C may mean abandoning some very old or very strange hardware, which is not inherently bad, but the C std committee will never go for that.
« Last Edit: March 22, 2022, 11:38:50 pm by ataradov »
Alex
 
The following users thanked this post: newbrain

Online newbrain

  • Super Contributor
  • ***
  • Posts: 1742
  • Country: se
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #480 on: March 22, 2022, 11:41:05 pm »
Anything else is just setting up traps.
Like, for example, the native assembler shift instruction of the PDP-11, where only the lower six bits of the shift value were taken into account, as a signed value.

So if x=0x10000 and y= 50,  x << y will give you 0x0004 (a 14 bit shift right) if implemented efficiently. Traps, traps everywhere!

I am pretty sure most modern architectures behave as you expect.
The standard also tries to cater for architectures that are not so common, leaving more leeway in corner and ill-defined cases.

EtA: Heck, even the x86 uses CL mod 32 as shift count for 32 bit shift instructions!
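If you actually want the hardware's mod-32 behaviour, you can write the mask out explicitly, which makes it well defined C (a sketch; the helper name is mine):

```c
#include <stdint.h>

/* Sketch (helper name is mine): the x86-style count masking made
   explicit. The AND keeps the count below the width, so the shift
   is well defined for any n; compilers for CPUs that mask anyway
   can typically drop the AND for free. */
static inline uint32_t shr_mod32(uint32_t x, uint32_t n)
{
    return x >> (n & 31u);
}
```

For example, shr_mod32(0xAB, 33) yields 0x55, matching what a masked hardware shift by 1 produces.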

Anyhow, you can rejoice, something is moving in a direction you may appreciate: the C2X draft (C23 in the making) mandates two's complement, so at least one's complement and sign magnitude will soon(ish) be out.
« Last Edit: March 22, 2022, 11:53:10 pm by newbrain »
Nandemo wa shiranai wa yo, shitteru koto dake.
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11495
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #481 on: March 22, 2022, 11:53:41 pm »
Although even on modern ARM the LSR instruction takes only the lower 8 bits of the second register into account, so if R1 = 0xab and R2 = 0x12340001, then the instruction LSR R1, R2 would result in R1 = 0x55, not 0 (equivalent of shifting by 1), so it is on you (or the compiler) to ensure that the shift amount is valid.

So yeah, that's why we can't have nice things.
« Last Edit: March 23, 2022, 12:06:53 am by ataradov »
Alex
 
The following users thanked this post: newbrain

Online peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3832
  • Country: gb
  • Doing electronics since the 1960s...
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #482 on: March 23, 2022, 03:47:39 pm »
Quote
Well, we have what we have. The historical reason for this behaviour is that C was just a "high level assembly". And in that respect, ">>" would be translated to an underlying assembly instruction. And the result would be whatever the hardware does in that case. Most modern hardware would produce 0, but this was not the case in the 70s. So, C was originally specified to translate to the hardware as directly as possible. Nobody even thought about modern levels of optimization.

In asm, >> is done with shift-right instructions. If you have x >> 32 then there won't be anything left in x. Simple. How can it be "optimised" to anything other than 0? With a barrel shifter it is exactly the same. The ARM32 has a 32 bit barrel shifter and that will produce 0 also.

It's just a useless compiler. How can anyone pretend this is ok, with a straight face?

And a great case for not "upgrading" the compiler version, ever. When my current devt is out there, we will never again change GCC, from v10.x.
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11495
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #483 on: March 23, 2022, 04:16:18 pm »
In asm, >> is done with shift-right instructions. If you have x >> 32 then there won't be anything left in x. Simple. How can it be "optimised" to anything other than 0? With a barrel shifter it is exactly the same. The ARM32 has a 32 bit barrel shifter and that will produce 0 also.
As shown above, the shift instruction does not use all the bits from the register. That's the reason for C to be specified like this - C just translates to the low level instruction and the actual behaviour is up to the hardware. Your options here are to either generate code that limits the value for each shift, or require the programmer to do so. C picked the latter.

It's just a useless compiler. How can anyone pretend this is ok, with a straight face?
And a great case for not "upgrading" the compiler version, ever. When my current devt is out there, we will never again change GCC, from v10.x.
It has nothing to do with the recent optimizations, this behaviour has been in the standard since the beginning.
Alex
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11495
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #484 on: March 23, 2022, 04:27:58 pm »
Here is a simple code for you:
Code: [Select]
volatile uint32_t a = 0xab;
volatile uint32_t b = 0x12340001;
volatile uint32_t c = a >> b;

Run this on any ARM MCU and look at the value of "c". It will not be 0. Volatile is just there so things don't get optimized; it has no bearing on the issue.
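For what it's worth, the "mathematical" result can be had portably with an explicit check (a sketch; the helper name is mine, not from the thread). The comparison is exactly the per-shift overhead the standard declines to impose on everyone:

```c
#include <stdint.h>

/* Sketch (helper name is mine): a guarded right shift that returns 0
   for any out-of-range count instead of invoking UB. The guard costs
   a compare-and-select (or a branch) on every call. */
static inline uint32_t shr32(uint32_t x, uint32_t n)
{
    return (n < 32u) ? (x >> n) : 0u;
}
```

With this, shr32(0xAB, 0x12340001) is 0 on every target, where the raw a >> b above depends on the hardware.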
Alex
 

Online peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3832
  • Country: gb
  • Doing electronics since the 1960s...
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #485 on: March 23, 2022, 04:39:32 pm »
OK; I get that. Thanks.

But still if a compiler was to optimise x >> 1000, it should set x=0, not leave x unchanged.
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11495
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #486 on: March 23, 2022, 04:42:21 pm »
But still if a compiler was to optimise x >> 1000, it should set x=0, not leave x unchanged.
No, it should not - the spec says so. It is not optimization, it is how the hardware behaves. In my example the compiler issues full code, nothing is optimized out.

Although in this specific case (shift by a constant), an optimizing compiler is better positioned to get things right, as it can set the result to 0 without additional overhead. A non-optimizing compiler will generate a shift instruction and the result would depend on the underlying hardware. And in the case of variables on both sides of the shift, there is nothing the compiler can do, it has to issue a shift instruction.
« Last Edit: March 23, 2022, 05:10:09 pm by ataradov »
Alex
 
The following users thanked this post: newbrain

Online peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3832
  • Country: gb
  • Doing electronics since the 1960s...
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #487 on: March 23, 2022, 06:44:27 pm »
OK; I guess if you are shifting by a constant and the constant is > 31 then the code is dumb anyway.

Probably in that case the # of shifts would be a result of a macro or some compile time calc because nobody will actually write x = 1 << 75.
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Offline nctnico

  • Super Contributor
  • ***
  • Posts: 27422
  • Country: nl
    • NCT Developments
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #488 on: March 23, 2022, 06:47:11 pm »
OK; I guess if you are shifting by a constant and the constant is > 31 then the code is dumb anyway.

Probably in that case the # of shifts would be a result of a macro or some compile time calc because nobody will actually write x = 1 << 75.
Yep. I have inherited a piece of code that throws such a warning from a bunch of macros, so I'll revisit that code some time soon.
There are small lies, big lies and then there is what is on the screen of your oscilloscope.
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14943
  • Country: fr
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #489 on: March 23, 2022, 06:50:05 pm »
OK; I get that. Thanks.

But still if a compiler was to optimise x >> 1000, it should set x=0, not leave x unchanged.

Well, you're stubborn (which is not necessarily a bad thing, but sometimes it doesn't help.) ;D

From a math standpoint, you're right. From a hardware standpoint, it depends on how the right shift is implemented. But I don't know of any CPU that can actually perform a right shift of 1000 bits in a single instruction. Now from a programming language standpoint, anything goes, and as we've said multiple times, C chose to make this UB.

As C favors efficiency, and since there's usually no low-level operation that allows such a shift, doing nothing at all is what it's allowed to do.

You may think (that's what your example seems to imply) that any compiler seeing this operation with a *literal* as the shift amount would have no difficulty optimizing this as 0. But what if the shift amount is a variable whose value is not known in advance? The compiler can't do anything then: it would have to add a run-time test on all targets that do not support shift amounts that big, which is the majority. And then you'd find run-time tests sprinkled every time you use a variable shift in your code.

So out of consistency, C's behavior for this is undefined in any case - whether the shift amount is a constant or a variable.

Note that if the shift amount is a literal, the compiler will give you a warning anyway. GCC gives: "warning: right shift count >= width of type [-Wshift-count-overflow]"
But if the amount is constant, but not a literal, GCC is silent. (even with -Wall -pedantic)

So very simple examples:
Code: [Select]
int foo(int n)
{
        const int shamt = 1000;
        return n >> shamt;
}
=> no warning. (But Clang does give a warning for this.)

Code: [Select]
int foo(int n)
{
        return n >> 1000;
}
=> warning.

cppcheck will spot the issue in both cases. I highly recommend using static analysis tools when you can. cppcheck does a decent job, it's lightweight and open-source.
« Last Edit: March 23, 2022, 06:58:17 pm by SiliconWizard »
 

Online peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3832
  • Country: gb
  • Doing electronics since the 1960s...
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #490 on: March 23, 2022, 07:27:55 pm »
Quote
So out of consistency, C's behavior for this is undefined in any case - whether the shift amount is a constant or a variable.

A program should produce a consistent result even if it has a bug.

This is why one pre-fills BSS with 0x00, for example: so uninitialised variables don't bite you. A perfect programmer would not need the pre-fill. And anyway you don't get the pre-fill on vars declared on the stack.

Beyond absolutely trivial code, it is impossible to be sure there isn't a bug somewhere, and it is impossible to test everything. So undefined ops are a bad thing, unless every version of a compiler does the same "undefined" thing.


Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11495
  • Country: us
    • Personal site
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #491 on: March 23, 2022, 07:57:44 pm »
It depends on the target hardware, not the compiler. If you don't want undefined behaviour in this case, the compiler would have to issue a value check before each shift. This would lead to some very inefficient code. So, at least in the case of C, it is on you to check that the value is correct where you believe it may not be. Most of the time normal program flow ensures correctness of the value, so no need to check anything.

Also, initializing BSS to 0 is a requirement, not a nice thing you do just in case. Even variables explicitly initialized with the value 0 will go to the BSS section.
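A minimal illustration of that guarantee (variable names are mine):

```c
#include <stdint.h>

/* C requires objects with static storage duration and no initializer
   to start out as zero; the startup code's BSS clear is what implements
   that guarantee. (Variable names are mine.) */
static uint32_t implicit_zero;       /* no initializer: placed in .bss */
static uint32_t explicit_zero = 0;   /* explicitly 0: typically .bss too */
```

Both reads are guaranteed to give 0 before anything else writes to them.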
Alex
 

Online newbrain

  • Super Contributor
  • ***
  • Posts: 1742
  • Country: se
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #492 on: March 23, 2022, 08:22:54 pm »
This is why one pre-fills BSS with 0x00, for example: so uninitialised variables don't bite you. A perfect programmer would not need the pre-fill. And anyway you don't get the pre-fill on vars declared on the stack.
What do you mean? That is just how the language is defined.
A perfect programmer writes good code in the boundaries set by the language definition.

I won't repeat explanations from other members or from me, you seem quite waterproof to any argument  :horse:
If you don't like C, there are so many usable languages that the only problem is the time to try them all; some are also good.

Standard Pascal does not even have shift operators, but all compilers support them, so implementations might differ.

I would suggest you try Rust, which has well defined behaviour for all shift cases.

When the shift generates an overflow, the program by default will panic (abort), unless you explicitly ask the compiler not to care for overflows.
Moreover, shift operations are well defined both for signed and unsigned types, another difference from C, where right shifting a negative (signed, of course) integer is implementation defined.
Implementation defined means that the compiler can do what it wants, but it must be documented; as opposed to UB, a program which includes IDB is compliant, though not strictly compliant.

But really, do read the rationale and learn the standard (C11 is nice, C17 does not change much and C2x is still in the making), if you don't want to get caught in the many pitfalls C admittedly has. Coding by expectations is going to bite you.

Quote
I highly recommend using static analysis tools when you can.
This.
Cppcheck is good, clang-analyzer is also very good and FOSS, and Ericsson has built a FOSS driver for both to facilitate their use for CI and large projects (with multi file checking) and to store results in a database, called CodeChecker.

EtA: I re-checked, and CodeChecker will not run cppcheck (only clang-analyzer and clang-tidy), but it is capable of ingesting and storing cppcheck reports, together with many other formats.
« Last Edit: March 23, 2022, 08:33:43 pm by newbrain »
Nandemo wa shiranai wa yo, shitteru koto dake.
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14943
  • Country: fr
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #493 on: March 23, 2022, 08:38:43 pm »
Standard Pascal does not even have shift operators, but all compilers support them, so implementations might differ.

I have used Pascal in the past, but I admit I don't know the standard as well as I know the C standard. Are you sure about not having shift operators? Are shl and shr just compiler extensions?
 

Online newbrain

  • Super Contributor
  • ***
  • Posts: 1742
  • Country: se
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #494 on: March 23, 2022, 09:14:41 pm »
I have used Pascal in the past, but I admit I don't know the standard as well as I know the C standard. Are you sure about not having shift operators? Are shl and shr just compiler extensions?
I have used Pascal (turbo) for my master thesis (a compiler, go figure) more than three decades ago - it had shift operators/keywords AFAICR, but the official standard (1990) does not.
I cannot be sure about the 1991 update, though.

I won't link to the online pdf, as I'm not sure they are legitimate copies, but your favourite search engine will find them if you search for ISO/IEC 7185:1990.
(I have all the official C and C++ standards since C99, but for them the drafts at open-std.org are absolutely OK both from the technical and property rights point of view)

Fortran90 has intrinsic shift functions, but look here (not the actual standard, bold mine):
Quote
A.49 ISHFT (I, SHIFT)
          Description. Performs a logical shift.
          Class. Elemental function.
          Arguments.
          I must be of type integer.
          SHIFT must be of type integer. The absolute value of SHIFT
          must be less than or equal to BIT_SIZE (I)

This is exactly what UB is: the standard (manual here) does not say what happens when the constraint is not fulfilled.
The whole UB hullabaloo is because the C standard makes an attempt to warn the reader of some cases of UB (albeit defining others through omission and a general rule).

Nandemo wa shiranai wa yo, shitteru koto dake.
 

Online SiliconWizard

  • Super Contributor
  • ***
  • Posts: 14943
  • Country: fr
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #495 on: March 23, 2022, 10:12:30 pm »
I have used Pascal in the past, but I admit I don't know the standard as well as I know the C standard. Are you sure about not having shift operators? Are shl and shr just compiler extensions?
I have used Pascal (turbo) for my master thesis (a compiler, go figure) more than three decades ago - it had shift operators/keywords AFAICR, but the official standard (1990) does not.
I cannot be sure about the 1991 update, though.

Just got the 1990 std, and you're right. Actually, there doesn't seem to be any bitwise operator at all (there were boolean operators, and "sets", which could more or less be used for bitwise operations, but no direct bitwise operation on integers as far as I can tell). Yes, all of them were in TP and other compilers. But that makes me think of the recent discussion about Pascal vs. C: that would definitely be a case "against" Pascal for any moderately low-level software, which wasn't raised in said discussion.

Extended Pascal (ISO 10206, that I could find in Postscript), which was a different standard, added a lot, but most things were already present in existing compilers as extensions in some form. I could not find any specific addition for bitwise operations, though? Looks like it was not a big concern for Pascal, or something.


 

Online newbrain

  • Super Contributor
  • ***
  • Posts: 1742
  • Country: se
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #496 on: March 23, 2022, 11:37:36 pm »
Quote
which wasn't raised in said discussion.
I honestly had the idea to raise it, but in the end did not.

The Pascal standard is useless, since when it came out it described a language that was already obsolete and too limited to be really useful: not even separate compilation was considered, relegated to a note about "Many processors provide, as an extension, the directive external".

Nobody cares (and I think ever cared) about standard Pascal - it's a dead, pointless, standard.
This means that the language has been in good measure "implementation defined" for three decades (and before, in fact).
The only certainty is the behaviour of the compiler du jour.

The C89 standard, on the contrary, was forward looking at the time.
It set the course of the language by diverging radically from K&R and bringing clarity to a lot of dubious (sequence  ;)) points, precisely defining the translation phases and the preprocessor, formalizing the modern function declarations, etc.

The fact that we still use it as a reference in its more modern revisions, and that there's a slow but certain bidirectional osmosis between the standard and the compilers, is a testament to its modernity, consistency and usefulness.

PS: C89 came out while I was in my last Uni year, and at the time I thought "That's bad, this will stifle the evolution of the language". How wrong I was.
Nandemo wa shiranai wa yo, shitteru koto dake.
 

Online peter-hTopic starter

  • Super Contributor
  • ***
  • Posts: 3832
  • Country: gb
  • Doing electronics since the 1960s...
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #497 on: March 24, 2022, 09:09:18 am »
Pascal was a teaching language at universities. I had to do it, in the 1970s. It was useless for doing anything practical, although Delphi was a good tool, before M$ VC++ took over the world of PC software (and multiplied the typical bug count 100x). Pascal was accordingly popular with those of my generation who found it handy to carry on with it.

I think C is a good language but I think anybody who has to rely on standard implementations of undefined behaviour is going to - at best - produce illegible code. It is already "mandatory" for programmers to use the absolute minimum of comments, rendering most code unreadable unless a lot of time is wasted first. I comment C almost like I comment asm.

In 1991 I designed a datacomms product which was user-programmable, with a built-in Pascal compiler. It was really quite slick, but only because I added loads of extensions, including bitwise operators. It was Z180 based and the guy who wrote the compiler c. 1975 was still around to help with the integration. It still sells!
« Last Edit: March 24, 2022, 09:36:55 am by peter-h »
Z80 Z180 Z280 Z8 S8 8031 8051 H8/300 H8/500 80x86 90S1200 32F417
 

Online newbrain

  • Super Contributor
  • ***
  • Posts: 1742
  • Country: se
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #498 on: March 24, 2022, 10:39:25 am »
I think C is a good language but I think anybody who has to rely on standard implementations of undefined behaviour is going to - at best - produce illegible code.
The crux with UB is that it's uncharted territory and should remain as such (we can split some hair here, e.g. some HW interfacing etc., but take it as a general principle).
If your code needs to be reliable, vaguely portable, and correct, just do not venture there; there is no "standard implementation"* - I'm always baffled by the threads you see of people trying to dissect the expected behaviour of i=i++-like stuff.
That kind of statement is outside the realm of C; compiler writers are known to exploit this to improve handling of well defined cases. Just live with it.

I did not get the comment about code legibility, especially in relation to UB? That depends a lot on the programmer, style guides and design can only do so much.

*E.g., real story: a friend came to me with "gcc in the new Ubuntu is broken" "fat chance, show me your code" "... a[j] = j++...".
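The fix for that kind of code is simply to make the sequencing explicit (a sketch; the function name is mine):

```c
/* a[j] = j++ reads and writes j with no sequencing between the two,
   which is UB (C11 6.5p2). The well-defined version orders the steps: */
static void fill_identity(int *a, int n)
{
    for (int j = 0; j < n; j++)
        a[j] = j;   /* the increment is sequenced by the for-clause */
}
```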
Nandemo wa shiranai wa yo, shitteru koto dake.
 

Online Siwastaja

  • Super Contributor
  • ***
  • Posts: 8353
  • Country: fi
Re: Is ST Cube IDE a piece of buggy crap?
« Reply #499 on: March 24, 2022, 01:27:12 pm »
I think anybody who has to rely on standard implementations of undefined behaviour is going to - at best - produce illegible code.

... just don't?

See, all languages have illegal constructs, you are just forbidden to do X.

C goes so far as to spell out clearly: don't do this. But it seems some people love to still do it, and use the magic words "UB" as some kind of excuse for doing it.

But in fact, it's better that forbidden things are explicitly called out, instead of you having to understand that if something is not mentioned, it's not OK to do. This choice also makes C a very well-specified, stable and robust language.

There certainly are special cases where we need to use UB. We just need to take care to only do it when actually needed, constrain the scope (for example, decide that this code is never meant to be ported across different platforms), and document these places. But this is all a tiny special corner case. It should not affect your workflow very much. If it does, it sounds like you are just using UB as an excuse for being lazy and writing buggy code. And that attitude problem affects code written in any language.
 

