Except that when a law firm actually did this, the model generated a completely fake precedent in which it accused (and "convicted") a real person of a crime they didn't commit, fabricating the details in its summary of the case.

LLMs are designed to always give you what you ask for. If the model doesn't find any information, it will fabricate some, because it can't respond with a negative.
That's what I said - generating text meant to be looked at. LLMs are bad at that.