trained appropriately and generalizing well -> overfitted and only working well on the training data, with poor performance everywhere else -> somehow working again and generalizing, even better than before
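If anyone wants to watch this happen, here's a minimal sketch of the kind of long-training experiment where that overfit-then-recover pattern is commonly reported (in the spirit of the grokking / epoch-wise double descent setups): a small MLP on modular addition, trained with weight decay far past the point of perfect train accuracy. Everything here is an illustrative assumption on my part, not from the thread: the modulus, architecture, learning rate, weight decay, and epoch count all affect whether and when the validation curve recovers.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
P = 97  # modulus for the toy task: predict (a + b) mod P (an assumed choice)

# Build every (a, b) pair as concatenated one-hot inputs, split 50/50 train/val.
pairs = torch.cartesian_prod(torch.arange(P), torch.arange(P))
labels = (pairs[:, 0] + pairs[:, 1]) % P
x = torch.cat([nn.functional.one_hot(pairs[:, 0], P),
               nn.functional.one_hot(pairs[:, 1], P)], dim=1).float()
perm = torch.randperm(len(x))
split = len(x) // 2
train_idx, val_idx = perm[:split], perm[split:]

model = nn.Sequential(nn.Linear(2 * P, 256), nn.ReLU(), nn.Linear(256, P))
# Weight decay is reported to matter a lot for the late recovery in these setups.
opt = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1.0)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(20000):  # deliberately train far past 100% train accuracy
    opt.zero_grad()
    loss = loss_fn(model(x[train_idx]), labels[train_idx])
    loss.backward()
    opt.step()
    if epoch % 500 == 0:
        with torch.no_grad():
            tr = (model(x[train_idx]).argmax(1) == labels[train_idx]).float().mean().item()
            va = (model(x[val_idx]).argmax(1) == labels[val_idx]).float().mean().item()
        # Typical trace: train acc saturates near 1.0 early while val acc stays
        # low (the overfitted middle phase), then val acc climbs much later.
        print(f"epoch {epoch:6d}  train acc {tr:.3f}  val acc {va:.3f}")
```

No guarantee this exact configuration reproduces the full curve, but logging train and val accuracy over a very long run like this is how the pattern is usually observed.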
That's fascinating, I've never heard of that before.