Who else would be better than ChatGPT at answering this question?
Is ChatGPT any good at writing code?
This is how ChatGPT replied:
As of my last update in September 2021, ChatGPT is more suited for generating human-like text based on the prompts given to it. While it can provide some basic code snippets, it is not specifically designed or trained to write complex or functional code. For coding tasks, you would be better off using specialized tools like code editors or IDEs.
If there have been updates or improvements to ChatGPT's capabilities since my last update, I recommend checking the OpenAI website or official documentation for the most up-to-date information.
Phew! I might get to retire before I'm made redundant.
OpenAI themselves have always stated this clearly. That doesn't stop people from pretending they haven't heard it.
Beyond the "correctness" of the models' output, which may improve arbitrarily over time, one interesting question is not so much about AI as about languages in general: it looks like we're once again trying to reinvent the language wheel. Here's why.
Natural languages are notoriously poor, on their own, at expressing technical matters accurately. That's why we invented mathematical notation and the various programming languages.
The idea that one can accurately define, with natural language, a complete system is a fallacy.
And, as I often say, in programming (and engineering in general), the hard part is specifying things clearly and unambiguously, not implementing them. But implementation has the benefit of forcing us to confront the aspects of the spec that are incomplete, incorrect, or inconsistent. Without that step, we're just flying blind.
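To make the ambiguity concrete, here's a hypothetical sketch (the spec and function names are mine, purely for illustration): the informal requirement "return the three largest values" admits at least two defensible implementations, and only writing one forces the real questions out. What about duplicates? Fewer than three items? What order should the result be in?

```python
import heapq

def top_three_v1(values):
    # Reading 1: the three largest elements, duplicates included,
    # returned in descending order.
    return sorted(values, reverse=True)[:3]

def top_three_v2(values):
    # Reading 2: the three largest *distinct* values -- equally
    # consistent with the informal spec, yet a different function.
    return heapq.nlargest(3, set(values))

data = [5, 5, 3, 1]
print(top_three_v1(data))  # [5, 5, 3]
print(top_three_v2(data))  # [5, 3, 1]
```

Both functions satisfy the natural-language spec; they just don't compute the same thing. An LLM handed that sentence has to silently pick one reading, and nothing in the prompt tells you which.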
So, yeah, people get all excited because they can ask for relatively trivial stuff and get a (sometimes) working piece of code, but try something more elaborate and you'll quickly find that both the LLM and you will fail miserably.
One key starting point, IMO, would be to improve the way we express ourselves in "natural language" so that it is completely consistent and unambiguous. Good luck with that.
So, no, you won't be made redundant. But you are likely to get frustrated as more and more people expect these LLMs to do your job, then come back to you to help them out of the mess they've put themselves in. Prepare for this to get past the frustration (the "I told you so" syndrome), and you'll be fine, with probably no shortage of work for years to come.