Author Topic: Google at it again... Carbon!  (Read 4572 times)


Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11752
  • Country: us
    • Personal site
Re: Google at it again... Carbon!
« Reply #25 on: October 25, 2022, 11:48:03 pm »
CPUs don't "speak".  But sure, believe whatever nonsense you want.

Also, I'm not really sure what your MOSFET musings contribute here at all. Go down to atoms, think bigger.
Alex
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: Google at it again... Carbon!
« Reply #26 on: October 25, 2022, 11:53:03 pm »
The design of a high-level language affects how efficient your final binary can be.

C does not do array bounds checking; it will just access elements outside the array. Really fast binary code, though. Swift has array bounds checking. Each array access has to compile into a sequence of instructions that first checks that the index is valid for the given array. Safe, but slow (relatively) code.
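To make that concrete, here is a minimal hand-written C sketch of the two behaviours (an illustration only, not the code either compiler actually emits):

Code: [Select]
#include <stdio.h>
#include <stdlib.h>

/* C-style access: no check, a single load. An out-of-range
   index is undefined behaviour. */
int unchecked(const int *a, size_t i) {
    return a[i];
}

/* Swift-style access, spelled out by hand: every access pays
   for a compare and a branch before the load. */
int checked(const int *a, size_t n, size_t i) {
    if (i >= n) {
        fprintf(stderr, "index out of range\n");
        abort();
    }
    return a[i];
}

int main(void) {
    int a[4] = {1, 2, 3, 4};
    printf("%d\n", checked(a, 4, 2));  /* prints 3 */
    return 0;
}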

So, "everything is binary" is a true, but completely meaningless sentence.

This is why Rust talks a lot about "zero-cost abstractions". They purposefully design high-level language concepts with an eye to how they will be implemented in the final binary code.

And if you really think that you can write assembly code that is as efficient as code generated by the compiler, then you really have no idea what you are talking about. Given the low-level architectural optimizations in modern CPUs, it is no longer possible to hand-write significant chunks of code that are faster than compiler-generated code. There is just too much low-level detail to keep in mind and adjust for across individual CPU architectures.

In reference to your edits:

"And if you really think that you can write assembly code that is as efficient as code generated by the compiler, then you really have no idea what you are talking about."


I don't recall having said that, and didn't.  The bottom line is, irrefutably, binary is the end product. That was my ONLY point. No ego or emotion involved here, just a fact. Nope, I cannot and would not write a compiler, let alone a better one :)
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: Google at it again... Carbon!
« Reply #27 on: October 25, 2022, 11:58:25 pm »
CPUs don't "speak".  But sure, believe whatever nonsense you want.

Also, I'm not really sure what your MOSFET musings contribute here at all. Go down to atoms, think bigger.

I feel, based on our past interactions, you are intent on being "right" and starting an argument here.

I humbly ask you to READ AGAIN and not read through the "he's WRONG!!" filter - I explicitly clarified that silicon DOES NOT "speak" - we are talking about arrangements of logic gates, where a particular "combination" of signals is what is "seen" and causes a given logic arrangement to execute its opcode: every part of it waits for the correct combination of "1"s and "0"s to line up.

You clearly show great disdain for those you deem "below" you; I know you are a clever chap, so, being intelligent, you will know that there is little to be achieved by "proving" someone else is "wrong" and you are "right" - we may BOTH be wrong... so please, find another use for your time.

 

Offline EEVblog

  • Administrator
  • *****
  • Posts: 38641
  • Country: au
    • EEVblog
Re: Google at it again... Carbon!
« Reply #28 on: October 26, 2022, 02:27:43 am »
I feel, based on our past interactions, you are intent on being "right" and starting an argument here.

I humbly ask you to READ AGAIN and not read through the "he's WRONG!!" filter - I explicitly clarified that silicon DOES NOT "speak" - we are talking about arrangements of logic gates, where a particular "combination" of signals is what is "seen" and causes a given logic arrangement to execute its opcode: every part of it waits for the correct combination of "1"s and "0"s to line up.

You clearly show great disdain for those you deem "below" you; I know you are a clever chap, so, being intelligent, you will know that there is little to be achieved by "proving" someone else is "wrong" and you are "right" - we may BOTH be wrong... so please, find another use for your time.

Eti, given your recent thread about problems you've been having, I'm going to have to recommend that you let this be and walk away from this thread.
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: Google at it again... Carbon!
« Reply #29 on: October 26, 2022, 03:42:31 am »
I feel, based on our past interactions, you are intent on being "right" and starting an argument here.

I humbly ask you to READ AGAIN and not read through the "he's WRONG!!" filter - I explicitly clarified that silicon DOES NOT "speak" - we are talking about arrangements of logic gates, where a particular "combination" of signals is what is "seen" and causes a given logic arrangement to execute its opcode: every part of it waits for the correct combination of "1"s and "0"s to line up.

You clearly show great disdain for those you deem "below" you; I know you are a clever chap, so, being intelligent, you will know that there is little to be achieved by "proving" someone else is "wrong" and you are "right" - we may BOTH be wrong... so please, find another use for your time.

Eti, given your recent thread about problems you've been having, I'm going to have to recommend that you let this be and walk away from this thread.

Yep. I’ve no intention of allowing something so meaningless to eat my lunch or spoil my week. Cheers Dave :)
 

Offline Fixpoint

  • Regular Contributor
  • *
  • Posts: 97
  • Country: de
Re: Google at it again... Carbon!
« Reply #30 on: October 26, 2022, 08:38:32 am »
I agree, upon reflection, that the majority of that post was a little (or a lot) based on my inexperience, as I am not a programmer.

I respect your insight.

Quote
However, your last quotation - that makes no sense - do you think the silicon speaks ENGLISH? Processors talk IN BINARY - whether you have an explicit compilation process, or an interpreted language silently compiles the source on the fly, it ALL compiles down to binary; it can't NOT do so, otherwise how do you propose that the transistors on the die work? There is NO other way except the base-level fundamental mechanism of transistors arranged into logic gates.

You are right that processors "talk in binary"; however, you are making a logical short circuit when concluding that this means your code has to be compiled into CPU-specific machine language -- or even compiled at all:

* code can be compiled into other formats (e.g., CPU-independent bytecode);
* code can be interpreted.

You probably already know languages that are not used in conjunction with compilers, for instance many dialects of BASIC (introduced in the 1960s) and Python (introduced in 1989).
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: Google at it again... Carbon!
« Reply #31 on: October 26, 2022, 10:55:27 pm »
I agree, upon reflection, that the majority of that post was a little (or a lot) based on my inexperience, as I am not a programmer.

I respect your insight.

Quote
However, your last quotation - that makes no sense - do you think the silicon speaks ENGLISH? Processors talk IN BINARY - whether you have an explicit compilation process, or an interpreted language silently compiles the source on the fly, it ALL compiles down to binary; it can't NOT do so, otherwise how do you propose that the transistors on the die work? There is NO other way except the base-level fundamental mechanism of transistors arranged into logic gates.

You are right that processors "talk in binary"; however, you are making a logical short circuit when concluding that this means your code has to be compiled into CPU-specific machine language -- or even compiled at all:

* code can be compiled into other formats (e.g., CPU-independent bytecode);
* code can be interpreted.

You probably already know languages that are not used in conjunction with compilers, for instance many dialects of BASIC (introduced in the 1960s) and Python (introduced in 1989).

My point is, and it cannot be ANY other way, that even if not actively done by the programmer, SOMETHING HAS to translate/compile the bytecode or whatever INTO BINARY, whether done consciously as a step by Mr Programmer, or done automagically and hidden from view.  It all HAS to be binary by the time it's the turn of the processor to process... in binary.

I am being pedantic, as pedantry in this case is factual precision. A silicon die literally cannot "speak" - it blindly executes various assortments of mix-and-match 0100101100010101 states, which are then decoded back into something for the system to act upon. Utter pedantry would be to bring up atoms (as another person did) - yeah, it's obvious atoms exist, ergo MOSFETs are formed from them, but my point is that, no matter WHAT does it, CPU = only binary code, so something, somewhere ALWAYS produces a binary stream FOR the CPU, and that is ALL it can process.

>>> http://www.differencebetween.net/technology/difference-between-bytecode-and-machine-code/
« Last Edit: October 26, 2022, 11:14:13 pm by eti »
 

Offline Fixpoint

  • Regular Contributor
  • *
  • Posts: 97
  • Country: de
Re: Google at it again... Carbon!
« Reply #32 on: October 27, 2022, 10:05:48 am »
I am being pedantic, as pedantry in this case is factual precision.

You are not at all being pedantic; you are just mixing up different categories that don't belong together. You are actually being imprecise, because you don't clearly differentiate between some important concepts.

Let's remember what question we are discussing. Your claim was that it doesn't matter what language we use because they all have to be translated to some sort of binary format. And THAT is incorrect in two ways at the same time:

1. On an abstract level, there are many ways in which languages may differ from each other that have nothing to do with whether they are compiled or not. You already agreed to this point.

2. On a technical level, when there is an interpreter, the code is NOT translated into some binary form. It just doesn't happen. Let's REALLY be pedantic and precise here: The CPU is executing the interpreter's code, not the application's code. The CPU never sees the application's code. Only the interpreter sees that code, and in a fundamentally different way than the CPU would see it.

Item 2 enables very deep manipulations and tricks at runtime that are not possible with statically compiled code (well, depending on the runtime environment and the CPU). So, this distinction has not just theoretical consequences but also very relevant practical ones.

For instance, most of the time Python code is not translated into binary form. What the interpreter does is not the same as compiling into binary. If we want to be pedantic and precise, we have to acknowledge this fact. (Actually, we have to acknowledge it even when we are not being pedantic.)
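If it helps, here is a minimal C sketch of that distinction - a toy interpreter with made-up one-letter commands, standing in for any real language. The CPU only ever executes the compiled loop; the "program" stays a plain string of data from start to finish:

Code: [Select]
#include <stdio.h>

int main(void) {
    /* The application "program" is DATA: made-up commands
       'i' = increment, 'd' = decrement, 'p' = print.
       It is never translated into machine code. */
    const char *program = "iiipdp";
    int acc = 0;

    /* The interpreter: this loop is what the CPU actually
       executes. It merely reads the program as data. */
    for (const char *pc = program; *pc; pc++) {
        switch (*pc) {
            case 'i': acc++; break;
            case 'd': acc--; break;
            case 'p': printf("%d\n", acc); break;
        }
    }
    return 0;  /* prints 3, then 2 */
}

And because the "program" is just a string, the interpreter can modify it at runtime without any recompilation - that is exactly the practical flexibility mentioned above.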
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: Google at it again... Carbon!
« Reply #33 on: October 27, 2022, 01:14:27 pm »
Again, ALL the CPU can understand is binary. No amount of "debates" or refutation will change that. Even as a non-programmer I know that. It literally can't NOT be so.

No matter how the code written ends up in that binary state, it still does.
« Last Edit: October 27, 2022, 02:39:55 pm by eti »
 

Offline Fixpoint

  • Regular Contributor
  • *
  • Posts: 97
  • Country: de
Re: Google at it again... Carbon!
« Reply #34 on: October 27, 2022, 06:46:59 pm »
Again, ALL the CPU can understand is binary. No amount of "debates" or refutation will change that. Even as a non-programmer I know that. It literally can't NOT be so.

No matter how the code written ends up in that binary state, it still does.

Now you are again in a different category. "Binary state", what's that supposed to mean? Honestly, all of this is just so imprecise that it's difficult to reach any conclusion.

Of course you can transform any data into some kind of binary *representation*. I repeat: *representation*, ok? But that doesn't mean anything. In our context, the only meaningful thing is the CPU instruction set. And the code does not need to be mapped onto this instruction set.
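A trivial C sketch of the difference, if it helps: the source text below certainly has a binary *representation* (its ASCII bytes), but none of those bytes is fetched by the CPU as an instruction; the CPU only executes the compiled loop that prints them.

Code: [Select]
#include <stdio.h>

int main(void) {
    /* A binary *representation*: the ASCII bytes of some
       source text. Data, not entries in any instruction set. */
    const char src[] = "print(42)";

    for (size_t i = 0; src[i] != '\0'; i++)
        printf("%02x ", (unsigned char)src[i]);
    putchar('\n');  /* prints: 70 72 69 6e 74 28 34 32 29 */
    return 0;
}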
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11752
  • Country: us
    • Personal site
Re: Google at it again... Carbon!
« Reply #35 on: October 27, 2022, 06:51:59 pm »
Fixpoint, don't waste your time, this is unfixable. As a "non-programmer" he has a very surface-level understanding of how things work.

By that logic, source code itself is also "binary".
« Last Edit: October 27, 2022, 06:53:41 pm by ataradov »
Alex
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: Google at it again... Carbon!
« Reply #36 on: October 27, 2022, 08:00:46 pm »
Fixpoint, don't waste your time, this is unfixable. As a "non-programmer" he has a very surface-level understanding of how things work.

By that logic, source code itself is also "binary".

I don't need to be a programmer to understand the FUNDAMENTALS of ones and zeroes influencing arrangements of transistors as logic gates. Would you care to condescend to me a little more?
 

Offline ataradov

  • Super Contributor
  • ***
  • Posts: 11752
  • Country: us
    • Personal site
Re: Google at it again... Carbon!
« Reply #37 on: October 27, 2022, 08:01:34 pm »
LOL.
Alex
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: Google at it again... Carbon!
« Reply #38 on: October 27, 2022, 08:09:10 pm »
Again, ALL the CPU can understand is binary. No amount of "debates" or refutation will change that. Even as a non-programmer I know that. It literally can't NOT be so.

No matter how the code written ends up in that binary state, it still does.

Now you are again in a different category. "Binary state", what's that supposed to mean? Honestly, all of this is just so imprecise that it's difficult to reach any conclusion.

Of course you can transform any data into some kind of binary *representation*. I repeat: *representation*, ok? But that doesn't mean anything. In our context, the only meaningful thing is the CPU instruction set. And the code does not need to be mapped onto this instruction set.

Tell me, old bean, how are instructions presented to the CPU? Yep, binary. My initial and current point is that the end result of ALL the layers is that a CPU *ONLY* understands binary code, and the *WHATEVER* you want to call it IS ALWAYS converted to binary - the only way one talks to the CPU - and unless you have some magical CPU in ubiquitous use which doesn't use binary, tell me, what else IS there?
« Last Edit: October 27, 2022, 08:14:52 pm by eti »
 

Offline Fixpoint

  • Regular Contributor
  • *
  • Posts: 97
  • Country: de
Re: Google at it again... Carbon!
« Reply #39 on: October 27, 2022, 08:47:47 pm »
Again, ALL the CPU can understand is binary. No amount of "debates" or refutation will change that. Even as a non-programmer I know that. It literally can't NOT be so.

No matter how the code written ends up in that binary state, it still does.

Now you are again in a different category. "Binary state", what's that supposed to mean? Honestly, all of this is just so imprecise that it's difficult to reach any conclusion.

Of course you can transform any data into some kind of binary *representation*. I repeat: *representation*, ok? But that doesn't mean anything. In our context, the only meaningful thing is the CPU instruction set. And the code does not need to be mapped onto this instruction set.

Tell me, old bean, how are instructions presented to the CPU? Yep, binary. My initial and current point is that the end result of ALL the layers is that a CPU *ONLY* understands binary code, and the *WHATEVER* you want to call it IS ALWAYS converted to binary - the only way one talks to the CPU - and unless you have some magical CPU in ubiquitous use which doesn't use binary, tell me, what else IS there?

Well, the thing is, I already answered exactly those questions. At this point, I really can only repeat myself. I'll try one last time, with even more detail:

1. The CPU executes instructions in a binary format called its machine language. Those machine codes, taken together, form its instruction set.
2. The program that is executed by the CPU is, in our case, an interpreter.
3. The application program is not executed by the CPU. It can't be, because it was never compiled to machine code, and that's the only thing the CPU understands.
4. The application program is, on a technical level, DATA, not CODE, and is executed by the interpreter, which in turn is executed by the CPU.
5. A classic interpreter (like the original BASIC and PHP interpreters) operates directly on the source code. There is no intermediate byte code compilation involved.
6. Of course, for performance reasons source code can be compiled into byte code (which, mind you, has nothing to do with the CPU's instruction set!), which is then executed by an interpreter called a virtual machine (sketched below).
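Here is a minimal sketch of item 6 in C, with made-up opcodes (deliberately nothing like any real VM or the host CPU's instruction set): the byte code is an array of data, and the only machine code the CPU ever runs is the dispatch loop.

Code: [Select]
#include <stdio.h>
#include <stdint.h>

/* Hypothetical opcodes for a toy stack machine. */
enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

int main(void) {
    /* "Compiled" byte code for: print(2 + 3).
       Data, not host machine instructions. */
    const uint8_t bytecode[] = {
        OP_PUSH, 2, OP_PUSH, 3, OP_ADD, OP_PRINT, OP_HALT
    };
    int stack[16], sp = 0;

    /* The virtual machine: the CPU executes this loop,
       which reads the byte code as data and acts on it. */
    for (size_t ip = 0; ; ) {
        switch (bytecode[ip++]) {
            case OP_PUSH:  stack[sp++] = bytecode[ip++]; break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[sp - 1]); break; /* prints 5 */
            case OP_HALT:  return 0;
        }
    }
}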

So, let me say it one last time: When working with interpreters, the source code is NOT translated into any kind of binary format. It is only REPRESENTED in some binary form, like ASCII or whatever. But this has nothing to do with the CPU, because we are talking about DATA here while the CPU only understands CODE (machine instructions).

In order to get these things right, you have to differentiate between fundamental concepts like syntax vs semantics, data vs representation of data, data vs code, compilation vs interpretation. If you mix all these things up, you can end up with statements like "everything is the same because in the end there are only 1s and 0s". Well, yes, in the end there are only 1s and 0s, but that DOES NOT MEAN EVERYTHING IS THE SAME. That's just a logical short.

PS: The screenshot you showed contains false statements. It claims that "the virtual machine converts bytecode into specific machine instructions". That is awful nonsense. Honestly, I get very annoyed when I see such incompetence. The people who wrote that obviously don't know what an interpreter is, what a VM is, and what a JIT compiler is, or they just don't care for correct language. Just awful.
« Last Edit: October 27, 2022, 08:57:57 pm by Fixpoint »
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: Google at it again... Carbon!
« Reply #40 on: October 27, 2022, 08:56:23 pm »
Again, ALL the CPU can understand is binary. No amount of "debates" or refutation will change that. Even as a non-programmer I know that. It literally can't NOT be so.

No matter how the code written ends up in that binary state, it still does.

Now you are again in a different category. "Binary state", what's that supposed to mean? Honestly, all of this is just so imprecise that it's difficult to reach any conclusion.

Of course you can transform any data into some kind of binary *representation*. I repeat: *representation*, ok? But that doesn't mean anything. In our context, the only meaningful thing is the CPU instruction set. And the code does not need to be mapped onto this instruction set.

Tell me, old bean, how are instructions presented to the CPU? Yep, binary. My initial and current point is that the end result of ALL the layers is that a CPU *ONLY* understands binary code, and the *WHATEVER* you want to call it IS ALWAYS converted to binary - the only way one talks to the CPU - and unless you have some magical CPU in ubiquitous use which doesn't use binary, tell me, what else IS there?

Well, the thing is, I already answered exactly those questions. At this point, I really can only repeat myself. I'll try one last time, with even more detail:

1. The CPU executes instructions in a binary format called its machine language. Those machine codes, taken together, form its instruction set.
2. The program that is executed by the CPU is, in our case, an interpreter.
3. The application program is not executed by the CPU. It can't be, because it was never compiled to machine code, and that's the only thing the CPU understands.
4. The application program is, on a technical level, DATA, not CODE, and is executed by the interpreter, which in turn is executed by the CPU.
5. A classic interpreter (like the original BASIC and PHP interpreters) operates directly on the source code. There is no intermediate byte code compilation involved.
6. Of course, for performance reasons source code can be compiled into byte code (which, mind you, has nothing to do with the CPU's instruction set!), which is then executed by an interpreter called a virtual machine.

So, let me say it one last time: When working with interpreters, the source code is NOT translated into any kind of binary format. It is only REPRESENTED in some binary form, like ASCII or whatever. But this has nothing to do with the CPU, because we are talking about DATA here while the CPU only understands CODE (machine instructions).

In order to get these things right, you have to differentiate between fundamental concepts like syntax vs semantics, data vs representation of data, data vs code, compilation vs interpretation. If you mix all these things up, you can end up with statements like "everything is the same because in the end there are only 1s and 0s". Well, yes, in the end there are only 1s and 0s, but that DOES NOT MEAN EVERYTHING IS THE SAME. That's just a logical short.

PS: The screenshot you showed contains false statements. It claims that "the virtual machine converts bytecode into specific machine instructions". That is awful nonsense. Honestly, I get very annoyed when I see such incompetence. The people who wrote that obviously don't know what an interpreter is, what a VM is, and what a JIT compiler is. Just awful.


*** FINALLY. That is the ONLY thing I am talking about - T___H___E________O___N____L____Y______T___H___I___N___G

I may have worded it ambiguously, but my initial point a few posts ago (again, probably worded poorly) is that, at the end of the day, and at the BARE METAL, all that matters to the CPU is seeing pins high and low, aka binary.

Sorry to confuse everyone - I am bailing from this thread, hah. I am sure I might have misunderstood some of this, and I apologise profusely if I appeared, or was, arrogant.
« Last Edit: October 27, 2022, 09:00:16 pm by eti »
 

Offline SiliconWizardTopic starter

  • Super Contributor
  • ***
  • Posts: 15340
  • Country: fr
Re: Google at it again... Carbon!
« Reply #41 on: October 27, 2022, 09:21:27 pm »
Again, ALL the CPU can understand is binary. No amount of "debates" or refutation will change that. Even as a non-programmer I know that. It literally can't NOT be so.

No matter how the code written ends up in that binary state, it still does.

I don't get what you are trying to say though.

Because if you are actually saying that programming languages don't matter, because they all end up as binary code anyway, you are clearly missing the whole point of programming.
 

Offline eti

  • Super Contributor
  • ***
  • !
  • Posts: 1801
  • Country: gb
  • MOD: a.k.a Unlokia, glossywhite, iamwhoiam etc
Re: Google at it again... Carbon!
« Reply #42 on: October 27, 2022, 09:29:32 pm »
Again, ALL the CPU can understand is binary. No amount of "debates" or refutation will change that. Even as a non-programmer I know that. It literally can't NOT be so.

No matter how the code written ends up in that binary state, it still does.

I don't get what you are trying to say though.

Because if you are actually saying that programming languages don't matter, because they all end up as binary code anyway, you are clearly missing the whole point of programming.

Yeah, I cocked it right up, the explanation. I think we might abandon it and put it down to human failure... I hope!
 

