One of the most interesting things Apple announced at WWDC was Core ML, a new framework designed to let developers embed machine-learning capabilities in their apps. Among other things, Core ML models can identify and caption real-world objects in real time, which Apple demonstrated via a sample app trained to recognise 1,000 objects.
Developer Paul Haddad posted an amusing tweet testing the capability. While the screen recording shows the iPhone recognising a screwdriver, a keyboard and a carton, it didn’t seem to recognise one familiar object: a first-generation Mac Pro.
The Core ML model mostly identified it as a speaker, though from one particular angle it decided it was a space heater instead. You can see the recording below.
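For developers curious how a demo like this is wired up, Apple's Vision framework can wrap a Core ML model and run it over camera frames. The following is a minimal sketch, not Apple's sample code; the `MobileNet` model class is an assumption (Xcode generates a similarly named class for whichever `.mlmodel` file you add to the project).

```swift
import Vision
import CoreML
import CoreGraphics

// Sketch: classify a single image with a bundled Core ML model.
// `MobileNet` is a placeholder for any image-classification model
// compiled into the app; Xcode generates the wrapper class.
func classify(image: CGImage) throws {
    // Wrap the Core ML model for use with the Vision framework.
    let model = try VNCoreMLModel(for: MobileNet().model)

    // The completion handler receives ranked classification results.
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("\(top.identifier): \(top.confidence)")
    }

    // Run the request against the supplied image.
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
}
```

A live-captioning app like the one in the tweet would feed each camera frame (via `AVCaptureVideoDataOutput`) through the same request and overlay the top label on screen.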
Read the full article: https://goo.gl/BVTuk9