Is Minecraft a graphical programming language? I've heard it is Turing complete.
I had forgotten about PLC ladder logic, never having used it. Can it express unbounded iteration? ISTR there being some unpleasant hacks to sort-of enable advanced concepts like subroutines.
I've used some of the simulation languages, and they are top-level block diagrams where the processing is done in C/C++. To me that is syntactic sugar, like FSM diagramming languages. Nobody thinks Harel statecharts are a programming language.
It looks like you are right about LabView. It has changed a lot since I first glanced at it decades ago. I suspect HP's VEE is similar.
I'm completely unfamiliar with gaming environments, and it will stay that way.
I never said that these graphical programming languages are actually any good. I hate most of them in at least some way.
Nonetheless, they are languages that sell for the price of a decent used car and/or have user bases numbering in the millions. So I'd argue that they do count as "successful languages"; apparently there are a lot of people out there who do like them.
Those PLC ladder diagrams are particularly high on my hate list. They have hacky ways of making them do all sorts of things by allowing a sort of "assembler instruction inlining". That turns them into even more of a horrible, unreadable mess, but again, pretty much every industrial automation technician knows them, and many of them use them to program stuff.
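For anyone who hasn't touched a PLC: the reason unbounded iteration is awkward in ladder logic is the scan-cycle model. A rough Python sketch (hypothetical, not any vendor's API) of what the runtime does every cycle:

```python
# Minimal sketch of the PLC scan-cycle model underlying ladder logic:
# every rung is re-evaluated once per scan, top to bottom, so "iteration"
# is really just state carried from one scan to the next.

def scan(inputs, state):
    """One PLC scan: evaluate all rungs against a snapshot of the inputs."""
    # Rung 1: the classic motor seal-in (start/stop latch) circuit.
    state["motor"] = (inputs["start"] or state["motor"]) and not inputs["stop"]
    # Rung 2: a counter "loop" -- it increments once per scan while the
    # motor runs; looping *within* a scan needs jump-instruction hacks.
    if state["motor"]:
        state["count"] += 1
    return state

state = {"motor": False, "count": 0}
state = scan({"start": True, "stop": False}, state)   # motor latches on
state = scan({"start": False, "stop": False}, state)  # stays latched
state = scan({"start": False, "stop": True}, state)   # stop drops it out
```

Every "loop" is smeared across scan cycles like this, which is exactly why anything beyond latches and counters turns into the hacks mentioned above.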
Minecraft's redstone mechanic could indeed be seen as a primitive graphical HDL. But let's be honest: when have you ever seen a fully graphical HDL that was actually useful and productive? (Altium Designer's FPGA tools were pretty close, actually, until new management came along and shitcanned the feature.)
That being said, some of the languages on my list I actually see as being more productive than classical coding. For example, Matlab Simulink does make sense in visualizing how signals flow from one thing to the next, since it is tailored to work very much like signal processing. I also see shader graphs as a sensible language for programming GPUs, since those tend to have a very signal-processing-like structure (like Simulink), so a node graph is clearer than just passing things in chains from function to function. Combine that with the typical Blender (or other artistic 3D tool) user, who might just be a visual effects artist working on a movie CGI render, with no idea what coding even is and nobody on the team who knows it either (as, say, a game development studio might have).
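To make the node-graph point concrete, here is a toy dataflow evaluator in Python (hypothetical, not any real shader-graph API): each node declares its inputs, and evaluation is just a memoized walk in dependency order, rather than nested function calls.

```python
# A toy node-graph evaluator: nodes are (function, input-names) pairs,
# and shared upstream nodes are computed once and reused via the cache.

def evaluate(graph, node, cache=None):
    """Recursively evaluate `node`, memoizing shared upstream results."""
    cache = {} if cache is None else cache
    if node not in cache:
        fn, inputs = graph[node]
        cache[node] = fn(*(evaluate(graph, i, cache) for i in inputs))
    return cache[node]

# The same pipeline a chain of function calls would hide: a pretend
# texture sample is brightened, then mixed with a constant tint.
graph = {
    "uv":       (lambda: 0.25, ()),
    "tex":      (lambda uv: uv * 2.0, ("uv",)),        # stand-in texture sample
    "brighten": (lambda c: min(c + 0.1, 1.0), ("tex",)),
    "tint":     (lambda: 0.5, ()),
    "mix":      (lambda a, b: (a + b) / 2, ("brighten", "tint")),
}
result = evaluate(graph, "mix")
```

The `graph` dict is exactly what the boxes and wires on screen encode; an artist rearranging nodes is editing that structure without ever seeing a function signature.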
I find it hard to argue with that
(I ought to declare that I became beguiled by the possibilities of graphical programming back in 1985, after playing with the first Macs. I even tried to find a way of creating a product which could have ended up like Visio or perhaps LabView, but I couldn't find a way to finance it. Oh well.
I've continued to keep an eye out for graphical languages since then, with little success)
Certainly some problems are inherently dataflow oriented, and those ought to map well onto "schematics". By "schematics" I mean top-level blocks connected by wires along which signals flow.
I'm trying to remember which comms simulation system I looked at >25 years ago. It might have been Comdisco or Matlab. In the end it was faster for me to simply code up my problems without the learning curve associated with the Big Tools.
I do remember that any "interesting" special-purpose algorithms relevant to your problem domain had to be coded as C functions inside a wrapper/block. Conceptually obvious, but a sensible indication that graphics were the wrong tool. The LabView technique seems too low level for all but simple automation (shades of COBOL's "add a to b giving c"!)
PLC ladder diagrams are a good example of a hammer: excellent at their specialised job, but bloody awkward when you need a screwdriver. Most languages of all types end up becoming foul messes over the decades.
In the early 90s I watched a team try to automate going from SDL specifications to code. They spent far too much time farting around with the pretty pixels in C++, when they could easily have accomplished that task in Smalltalk. The pretty pixels were also annotated with the usual foul C++ required to implement the actions and event processing.