Here's the prompt immediately before that, writing the code to a file. (Which I'm now realizing wasn't quite formatted right, but it worked anyway.)
Interestingly, it didn't like the include, so I had to manually add the declaration for printf.
Explain the implications of this like I'm an idiot. I'm obviously not, but just in case there are any dumb guys reading.
Depends. There's a possibility that it's internalized how a compiler works and so it's emulating the function of one. Unlikely, but not impossible, as GPT-3 does seem capable of outputting small amounts of stuff in formats like base64. However, it's much more likely that it's seen enough examples of tutorial code that it's simulating what the output should be for those specific functions, like a parrot that's been trained to finish songs it has heard over and over.
I also got it to compile and run quick sort. When I get the time I'm going to try and see if it will give me a hexdump of the compiled file then try to run it in an actual terminal.
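For reference, a minimal sketch of the kind of quicksort one might feed it (hypothetical, not the exact code from the session; uses a Lomuto partition):

```c
/* Swap two ints through pointers. */
void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

/* Sort arr[lo..hi] inclusive with a Lomuto-partition quicksort. */
void quicksort(int *arr, int lo, int hi) {
    if (lo >= hi) return;
    int pivot = arr[hi];          /* last element as pivot */
    int i = lo;                   /* boundary of the "< pivot" region */
    for (int j = lo; j < hi; j++)
        if (arr[j] < pivot) swap(&arr[i++], &arr[j]);
    swap(&arr[i], &arr[hi]);      /* put pivot in its final slot */
    quicksort(arr, lo, i - 1);
    quicksort(arr, i + 1, hi);
}
```

If the model is really tracking state rather than parroting, it should be able to trace the array contents through each recursive call of something like this.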
I've gotten it to give me parts of a JSON as an incomplete zip file before, but once it reached a point that required repetition, like runs of null bytes or whatever, it got stuck in a loop. It's complicated, but also fascinating to try and visualize the different levels of, like, meta that need to occur for all this to happen, based on what is essentially a super long and highly contextualized Markov chain.
Yeah it'll absolutely lie at random
I don't really know, but I'm thinking it probably has less of an impact than its ability to write code. It's technically more impressive, though.