I think it's the difference between having a model that explains why a thing happens vs a model that simply predicts what will happen without knowing why. Approximating the output of a black box vs looking inside and understanding its mechanisms.
There's a level at which you can say you understand a specific thing, though. Like if gravity is explained by the interaction of quantum particles through spacetime, we could say we understand gravity, but the particles themselves might still be a black box. Of course, you can always go one layer deeper.