ios actually does this, i can search for generic terms like "cat" or "panda" to find exclusively images/videos of those animals, but i can also search for a specific person (if their face is visible in the picture i want to search for) and they're also categorized like that
I'm curious, the iPhone might be powerful enough to run the models locally given enough time, but do you know if that's actually what happens, or does Apple run the models on their servers and then send back the searchable tags to attach to each respective photo's metadata?
there are also third party apps that use AI classification, and afaik it's a local feature exposed by an ios api
oh shit, I've never used iOS but this looks cool af https://developer.apple.com/machine-learning/
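fwiw, the on-device side of that page is mostly the Vision framework, and a basic image classification call is pretty short to wire up. Rough sketch below, assuming iOS 13+ and a local file URL (`imageURL` and the 0.5 confidence cutoff are just placeholders, and this is only an illustration of the local API, not how Photos itself is implemented):

```swift
import Foundation
import Vision

// Sketch: classify an image on-device with Apple's Vision framework.
// Assumes iOS 13+ / macOS 10.15+; imageURL is a placeholder for a local file.
func classifyImage(at imageURL: URL) {
    let request = VNClassifyImageRequest()            // built-in on-device classifier
    let handler = VNImageRequestHandler(url: imageURL, options: [:])

    do {
        try handler.perform([request])                // runs entirely on the device
        let observations = (request.results as? [VNClassificationObservation]) ?? []
        // Keep only reasonably confident labels, e.g. "cat", "panda", etc.
        for obs in observations where obs.confidence > 0.5 {
            print("\(obs.identifier): \(obs.confidence)")
        }
    } catch {
        print("classification failed: \(error)")
    }
}
```

the per-person face search in Photos goes through a different (face detection + grouping) pipeline, so treat this purely as a taste of what the local classification API looks like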