I think this summarizes in one conversation what is so fucking irritating about this thing: I am supposed to believe that it wrote that code.
No siree, no RAG, no trickery with training a model to transform code while maintaining an identical expression graph: it just goes from word-salading all over the place on a natural language task to outputting 100 lines of coherent code.
Although that does suggest a new dunk on computer touchers of the AI-enthusiast kind: you can point at it and say that coding clearly does not require any logical reasoning.
(Also, as usual with AI, it is not always that good. Sometimes it fucks up the code, too.)
To give an example, Warner Brothers got sued by Bethesda for stealing code from Fallout Shelter when making their Westworld mobile game, with Bethesda pointing to a bug that appeared in early versions of FO Shelter as evidence of stolen code.
On the other hand, they blatantly reskinned an entire existing game, and there's a whole breach-of-contract aspect there, since apparently they were reusing code they had written while working for Bethesda, who I doubt would have cared as much if this were only about an LLM-snippet length of code.
LLM snippets are so 2024. Coding agents, baby.
Ah yes, the supreme technological miracle of automating the ctrl+c/ctrl+v parts when applying the LLM snippet into your codebase.
Yeah, that’s a great example.
The other thing is that unlike art, source code is already made to be consumed by a machine. Converting source code into equivalent source code is no more transformative than re-encoding a video.
The only "transformative" thing they do with source code is use it not for compiling but for defrauding investors.