I had to try this, and it doesn't seem to think that anymore, so I guess it got fixed.
To clarify, you can still overwrite old knowledge, so maybe one of their fine-tune runs included data that reinforced that he's still alive. It's just that if a model goes a long time without being trained on something specific, it won't forget it outright, though the knowledge can get "noisy". I don't know exactly how this plays out for text models; I mainly work with image models. For example, if you train a model on a dataset like LAION, which is a broad collection of random images from the internet, and then train it for a while exclusively on something specific like anime pictures, the resulting model can still produce "photorealistic" content, or content of subjects absent from the more recent dataset, though the results may be somewhat degraded.
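For what it's worth, here's a rough sketch of the kind of continued fine-tuning I mean. It's just a generic supervised training loop in PyTorch, not any specific diffusion training script, and names like `pretrained_model` and `narrow_loader` are hypothetical stand-ins. The point it illustrates: nothing in the loop explicitly deletes old knowledge; the weights only drift toward whatever fits the new data, and a small learning rate keeps that drift gradual, which is why capabilities outside the new dataset degrade slowly rather than vanish.

```python
# Minimal sketch of continued fine-tuning on a narrow dataset.
# `pretrained_model` and `narrow_loader` are hypothetical placeholders.
import torch
import torch.nn.functional as F

def finetune(model, loader, epochs=1, lr=1e-5):
    """Keep training an already-pretrained model on narrow new data.

    Old knowledge isn't erased by anything here; weights just drift
    toward the new data. A small `lr` limits how far they drift per
    step, so concepts absent from `loader` get "noisy" gradually.
    """
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for inputs, targets in loader:
            loss = F.mse_loss(model(inputs), targets)  # generic loss stand-in
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model

# e.g. finetune(pretrained_model, narrow_loader, epochs=3)
```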
Ah ok, thanks for the info.