After conducting my own research, it became clear that ChatGPT and similar language models are powerful tools that can help developers with a wide range of tasks. Their ability to solve coding problems, refine their own output, and find alternatives to given solutions is particularly impressive. Additionally, ChatGPT can introduce developers to libraries and tools they may not have been aware of before.
However, these tools have severe limitations. The model sometimes generates subtle errors, such as comparisons that have no effect or attempts to access non-existent parts of data structures. It may also offer pointless improvements or duplicate steps, like applying compression twice.
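The duplicate-compression mistake is easy to reproduce by hand. The snippet below is a minimal illustration (my own example, not actual model output) of why the second step is worse than pointless: compressed data is close to incompressible, so a second pass only adds header and framing overhead.

```python
import gzip

data = b"example payload " * 100  # highly repetitive, compresses well

once = gzip.compress(data)
# The redundant second pass: gzip output looks near-random, so this
# cannot shrink it further and only adds gzip's own overhead.
twice = gzip.compress(once)

print(len(data), len(once), len(twice))
assert len(twice) > len(once)  # double compression made it bigger
```

A reviewer skimming generated code can easily miss this, because each call is individually correct; only the combination is wrong.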
Will the rapid pace of machine learning advances solve these issues soon? Maybe, maybe not. Self-driving cars, for example, have been just around the corner for many years. Replacing human capabilities is not a linear problem: car manufacturers solved most of it swiftly, but the last stretch has proven intractable and requires further research and development on an unknown timeframe.
Similarly, ChatGPT's code generation and other capabilities have matured impressively quickly. But the last stretch of maturity needed to make it reliable and trustworthy could be many, many years away. Meanwhile, as developers explore the technology, it could find its way into critical systems in finance, infrastructure, defence, or healthcare. This is why it is important to be cautious and aware of the limitations of these tools.
Developers need to treat output from ChatGPT with caution. While it can be intriguing and helpful, they should always double-check and adapt the code. This is why colleagues, or an interactive community like Stack Overflow where developers validate code, share experiences, and discuss its pros and cons, remain superior.
In conclusion, ChatGPT and similar language models are powerful tools that can be valuable for developers. However, they come with limitations that negate many of their benefits. They can be a great way to inspire, explore, and learn, but they require experienced developers and good practices to be used safely. They will not replace developers and make them unnecessary, but rather serve as a means to help experienced developers do their work more efficiently in the future.
Christian Prokopp, Bold Data, Founder
Note: I used ChatGPT with some bullet points to generate the first draft of this post. That was helpful, but the number of edits needed to get to something worth posting was surprisingly high. The experience feels similar to using it as a code copilot: it looks good at first glance, but the devil is in the detail.