I didn't think the facts are in the training set; the training set was to get it to reasonably parse text, and the facts it "knows" are whatever it finds online... which of course is going to be dumb bullshit half the time.
I didn't know that, so it's basically like those Google suggested answers for questions, but combined with a natural language text generator? I assumed it was a purely predictive model, like a souped-up Markov chain.
I think there's a lot going on under the hood in the NLP portion, because it has to group stuff into concepts so that it brings in conceptually similar results. But I don't believe it's pretrained on all the stuff it can answer.
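For what it's worth, a "souped-up Markov chain" in the plain sense is just a next-word sampler over observed n-gram counts. Here's a toy sketch of that idea in Python (the corpus string and function names are made up for illustration; a real LLM is far more than this, but it shows what "purely predictive" means):

```python
import random
from collections import defaultdict

def build_model(text, order=2):
    """Map each n-gram of words to the list of words seen after it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        model[key].append(words[i + order])
    return model

def generate(model, order=2, length=30):
    """Random-walk the chain: sample each next word from the counts."""
    state = random.choice(list(model.keys()))
    out = list(state)
    for _ in range(length):
        choices = model.get(tuple(out[-order:]))
        if not choices:
            break
        out.append(random.choice(choices))
    return " ".join(out)

corpus = "the cat sat on the mat and the cat ran off the mat"
model = build_model(corpus, order=2)
print(generate(model, order=2))
```

Everything such a model "knows" lives entirely in those learned next-word statistics, with no lookup of live web results, which is the distinction being argued over above.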