Researchers from Stanford have found that code-generating AI systems like GitHub Copilot may introduce more security vulnerabilities than previously understood.
Elon Musk is a co-founder of OpenAI, the artificial intelligence company whose Codex model was the focus of this research.
Codex is the engine behind Microsoft’s GitHub Copilot; it translates natural language into code and makes contextually relevant recommendations, simplifying and popularising the development process.
Problems with Artificial Intelligence Coding
Co-author Neil Perry states that “code-generating technologies are presently not a substitute for human developers.”
Perry elaborated, “developers using [coding tools] to complete tasks outside of their own areas of expertise should be concerned, and those using them to speed up tasks that they are already skilled at should carefully double-check the outputs and the context that they are used in, in the overall project.”
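As a concrete illustration of the kind of flaw such double-checking can catch, consider a common pattern in AI-suggested database code. This is a hypothetical sketch, not an example taken from the study itself: a query built by string formatting is vulnerable to SQL injection, while the parameterized version a careful reviewer would substitute is not.

```python
import sqlite3

def find_user_insecure(conn, username):
    # UNSAFE: attacker-controlled input is spliced directly into the query,
    # the sort of shortcut an assistant might suggest
    query = f"SELECT id FROM users WHERE name = '{username}'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # The reviewed version passes the value as a bound parameter instead
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")

payload = "x' OR '1'='1"                    # classic injection payload
leaked = find_user_insecure(conn, payload)  # matches every row in the table
safe = find_user_safe(conn, payload)        # matches nothing
```

With the injection payload, the insecure function returns every user in the table, while the parameterized function correctly returns no matches.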
The use of artificial intelligence in coding tools has been called into question before. GitHub, which is owned by Microsoft, was sued over claims that Copilot reproduced other developers’ code without proper attribution; the class-action suit sought $9 billion over 3.6 million alleged violations of Section 1202 of the Digital Millennium Copyright Act.
Artificial intelligence (AI) driven code-generating tools are currently best seen as assistants that speed up programming rather than as a complete replacement for developers; nevertheless, if recent advances are any indication, they may soon take over much of conventional coding.