Again, ALL the CPU can understand is binary. No amount of "debates" or refutation will change that. Even as a non-programmer I know that. It literally can't NOT be so.
No matter how the code you wrote gets there, it still ends up in that binary state.
Now you are again in a different category. "Binary state", what's that supposed to mean? Honestly, all of this is just so imprecise that it's difficult to reach any conclusion.
Of course you can transform any data into some kind of binary *representation*. I repeat: *representation*, ok? But that doesn't mean anything. In our context, the only meaningful thing is the CPU instruction set. And the code does not need to be mapped into this instruction set.
Tell me, old bean, how are instructions presented to the CPU? Yep, binary. My initial and current point is that the end result of ALL the layers is that a CPU *ONLY* understands binary code, and the *WHATEVER* you want to call it IS ALWAYS converted to binary - that is how one talks to the CPU. Unless you have some magical CPU (not currently in ubiquitous use) which doesn't use binary, tell me, what else IS there?
Well, the thing is, I already answered exactly those questions. At this point, I really can only repeat myself. I'll try one last time, with even more detail:
1. The CPU executes instructions in a binary format called its machine language. The whole of those machine codes is its instruction set.
2. The program that is executed by the CPU is, in our case, an interpreter.
3. The application program is not executed by the CPU. It can't be because it was never compiled to machine code, and that's the only thing the CPU understands.
4. The application program is, on a technical level, DATA, not CODE, and is executed by the interpreter, which in turn is executed by the CPU.
5. A classic interpreter (like the original BASIC and PHP interpreters) operates directly on the source code. There is no intermediate byte code compilation involved.
6. Of course, for performance reasons source code can be compiled into byte code (which, mind you, has nothing to do with the CPU's instruction set!) which is then executed by an interpreter called a virtual machine.
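To make points 2 through 6 concrete, here is a minimal sketch in Python (an illustration of the principle only, not how any real BASIC or PHP interpreter is implemented): a classic interpreter that walks its source text directly. The "application program" it runs is a plain string, i.e. DATA; the only thing the CPU ever executes is the interpreter itself.

```python
def interpret(source: str) -> int:
    """A classic interpreter for a tiny RPN calculator language.
    It operates directly on the SOURCE TEXT: no compilation, no
    byte code, and the program is never turned into machine code."""
    stack = []
    for token in source.split():
        if token.isdigit():
            stack.append(int(token))      # a number literal
        elif token == "+":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif token == "*":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown token: {token!r}")
    return stack.pop()

# The 'application program' is just a string -- pure data:
program = "2 3 4 * +"
print(interpret(program))  # -> 14
```

Note that nothing here depends on the CPU's instruction set; the same string and the same interpreter work unchanged on x86, ARM, or anything else that can run the interpreter.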
So, let me say it one last time: When working with interpreters, the source code is NOT translated to any kind of binary format. It is only REPRESENTED in some binary form, like ASCII or whatever. But this has nothing to do with the CPU because we are talking about DATA here while the CPU only understands CODE (machine instructions).
In order to get these things right, you have to differentiate between fundamental concepts like syntax vs semantics, data vs representation of data, data vs code, compilation vs interpretation. If you mix all these things up, you can end up with statements like "everything is the same because in the end there are only 1s and 0s". Well, yes, in the end there are only 1s and 0s, but that DOES NOT MEAN EVERYTHING IS THE SAME. That's just a logical short circuit.
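The data-vs-representation distinction in particular can be made concrete with a two-line sketch: the same datum can have several different binary representations, so "it's binary" tells you nothing about what it means.

```python
n = 1024

# Two DIFFERENT binary representations of the SAME datum:
little = n.to_bytes(2, "little")   # b'\x00\x04'
big    = n.to_bytes(2, "big")      # b'\x04\x00'
print(little, big)

# Decoding each with its own convention recovers the identical value:
assert int.from_bytes(little, "little") == int.from_bytes(big, "big") == 1024
```

Identical 1s and 0s mean nothing by themselves; the meaning lives in the convention used to interpret them, which is exactly why "everything is binary in the end" proves nothing.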
PS: The screenshot you showed contains false statements. It claims that "the virtual machine converts bytecode into specific machine instructions". That is awful nonsense. Honestly, I get very annoyed when I see such incompetence. The people who wrote that obviously don't know what an interpreter is, what a VM is, and what a JIT compiler is, or they just don't care for correct language. Just awful.