So I tried a couple of compilers on:
#define triple(x) x+x+x
int foo(int n) {
    return triple(++n);
}
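Expanded, `triple(++n)` is `++n + ++n + ++n`: three unsequenced modifications of `n` in one expression, so the behavior is undefined and each compiler is free to pick its own answer. For contrast, here's a sketch of the well-defined version, using a real function (my hypothetical `triple_fn`/`foo_fn` names) so the increment is sequenced before the call and happens exactly once:

```c
/* A real function instead of a macro: the argument expression is
   evaluated (and n incremented) exactly once, before the call. */
static int triple_fn(int x) { return x + x + x; }

int foo_fn(int n) {
    return triple_fn(++n);  /* well-defined: 3 * (n + 1) */
}
```

For n = 10 this returns 33, a value none of the compilers below produce for the macro version.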
clang on my Mac gives:
0000000000000000 <ltmp0>:
0: 08 04 00 0b add w8, w0, w0, lsl #1
4: 00 19 00 11 add w0, w8, #6
8: c0 03 5f d6 ret
So, that's returning 3*n + 6: for n=10 it's 36, i.e. 11+12+13.
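That 11+12+13 is what you get if the three increments are fully sequenced left to right. A sketch of that interpretation (my `foo_seq` name), using declarations to force the ordering:

```c
/* Left-to-right sequencing made explicit: each declaration
   fully sequences its increment before the next one runs. */
int foo_seq(int n) {
    int a = ++n;  /* 11 for n == 10 */
    int b = ++n;  /* 12 */
    int c = ++n;  /* 13 */
    return a + b + c;
}
```

For n = 10 this returns 36 = 3*10 + 6, matching clang's output.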
gcc on x86 Linux gives:
0000000000000000 <foo>:
0: f3 0f 1e fa endbr64
4: 8d 44 7f 07 lea 0x7(%rdi,%rdi,2),%eax
8: c3 retq
That's 3*n + 7, so for n=10 it's 37, i.e. 12+12+13.
gcc for risc-v gives:
0000000000000000 <foo>:
0: 0025079b addiw a5,a0,2
4: 0017979b slliw a5,a5,0x1
8: 250d addiw a0,a0,3
a: 9d3d addw a0,a0,a5
c: 8082 ret
So that's 2*(n+2) + (n+3) = 3*n + 7, consistent with gcc on x86.
I'm glad I said 12+12+13 is arguable in my previous post! That reading comes from evaluating the side effects of each operand to a binary operator (or function) just before applying that operator (or function), while leaving side effects on the arguments of other operators until those operators are about to be evaluated.
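That model can be made concrete for `(++n + ++n) + ++n`: both increments belonging to the inner `+` fire just before it, so both inner reads see the same value, and the third increment fires just before the outer `+`. A sketch of that interpretation (my `foo_ops` name):

```c
/* The "side effects just before the operator" reading:
   both inner increments fire first, both inner reads see the
   same value, then the third increment fires for the outer +. */
int foo_ops(int n) {
    n += 2;             /* both increments of the inner + */
    int inner = n + n;  /* 12 + 12 for n == 10 */
    ++n;                /* increment for the outer + */
    return inner + n;   /* + 13 */
}
```

For n = 10 this returns 37 = 3*10 + 7, matching both gcc outputs.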