Calling this a "study" is being a bit too generous. But there is something interesting in it: it seems to use two layers of "reasoning" or interaction (is this how GPT works anyway? Seems like a silly thing to have a chatbot inside a chatbot) — the one exposed to the user, and the "internal reasoning" behind it. I have a solution: just expose the internal layer to the user. It will tell you it's going to do insider trading in the simplest terms. I'll take that UK government contract now, 50% off.
This is all equivalent to placing two mirrors facing each other, looking into one, saying "don't do insider trading, wink wink", and being surprised at the outcome.