First, we imported the .mlmodel file into the Xcode project. Then, we generated the Swift module file for the .mlmodel via Target Membership to use it for further processing. If we take this image recognition app as an example, it first requires uploading the images to a server and passing them to Imagga, an image recognition technology, in order to identify whether an image contains any adult content.
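As a sketch of what that generated interface looks like: assuming Xcode names the wrapper class `NSFWDetector` after the model file (the class name and the commented prediction call below are placeholders, not the article's actual code):

```swift
import CoreML

// Once the .mlmodel is a member of the app target, Xcode generates a
// typed Swift wrapper named after the model. "NSFWDetector" is a
// placeholder; the real name matches the .mlmodel file.
let model = NSFWDetector()

// The wrapper exposes a typed prediction method, roughly:
// let output = try model.prediction(data: pixelBuffer)
// where `output` carries the model's labeled probabilities.
```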
After that, we set up the Storyboard by adding an image view, which allows the user to select an image from the library, and a label to show the image detection status.
Next, we imported the Core ML framework in the ViewController.swift file and added an image picker method to let users import images from the photo library.
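A minimal version of that picker wiring might look like the following. This is a sketch, not the article's original listing; the outlet names `imageView` and `statusLabel` are assumed from the storyboard setup described above:

```swift
import UIKit

class ViewController: UIViewController,
                      UIImagePickerControllerDelegate,
                      UINavigationControllerDelegate {

    @IBOutlet weak var imageView: UIImageView!
    @IBOutlet weak var statusLabel: UILabel!

    // Present the photo library when the user taps the image view.
    @IBAction func pickImage(_ sender: Any) {
        let picker = UIImagePickerController()
        picker.delegate = self
        picker.sourceType = .photoLibrary
        present(picker, animated: true)
    }

    // Show the chosen image; resizing and classification happen next.
    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        if let image = info[.originalImage] as? UIImage {
            imageView.image = image
        }
        picker.dismiss(animated: true)
    }
}
```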
If the prediction score is higher than 0.8, the image is considered explicit. As you saw in the code above, integrating a Core ML data model into an iOS app isn't rocket science.
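The thresholding step itself is essentially a one-liner. In the sketch below, only the 0.8 cutoff comes from the article; the probability dictionary and the "NSFW" label are assumptions:

```swift
// Only the 0.8 threshold comes from the article; the output
// dictionary shape and the "NSFW" label are assumptions.
func isExplicit(_ probabilities: [String: Double],
                threshold: Double = 0.8) -> Bool {
    return (probabilities["NSFW"] ?? 0) > threshold
}

// Usage:
// statusLabel.text = isExplicit(output.prob) ? "Explicit content" : "Image is safe"
```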
Moreover, this .mlmodel accepts images at a size of 224×224, so we first had to resize the user-selected images to 224×224 with the following code. Though it requires experience, it's achievable if you have the knowledge and skills.
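The article's original resizing snippet is not reproduced here; a common UIKit approach, sketched under that assumption, is:

```swift
import UIKit

extension UIImage {
    // Redraw the image into a 224×224 context so it matches the
    // model's expected input size.
    func resized(to size: CGSize) -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(size, false, 1.0)
        defer { UIGraphicsEndImageContext() }
        draw(in: CGRect(origin: .zero, size: size))
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}

// Usage:
// let input = userImage.resized(to: CGSize(width: 224, height: 224))
```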
Now, being one of the augmented reality app development companies, our main concern immediately after the WWDC 2017 event was to figure out how we could apply the Core ML framework in an iOS app to do interesting things. Our image recognition app lets the user upload a picture; the image recognition algorithm then predicts, through an image recognition API, whether the picture is explicit or not.
Luckily, our iOS developers figured out the way. In other words, the app warns users if the picture contains any adult content.
All you have to do is get a ready-made Python script, convert it into a Core ML data model, and integrate it into your iOS 11 app. Here's how we integrated the Core ML data model into our image recognition app. We converted this file from a Caffe model with which we can detect inappropriate images.
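The conversion step is typically done with Apple's coremltools package. The sketch below uses placeholder file names and assumes an older coremltools release (the Caffe converter was removed in coremltools 4), so treat it as a template rather than the article's exact script:

```python
# Sketch: converting a Caffe model to a Core ML .mlmodel.
# File names below are placeholders; requires coremltools 3.x or earlier.
import coremltools

coreml_model = coremltools.converters.caffe.convert(
    ('nsfw_model.caffemodel', 'deploy.prototxt'),
    image_input_names='data',    # treat the 'data' blob as an image input
    class_labels='labels.txt',   # one class label per line
)
coreml_model.save('NSFWDetector.mlmodel')
```

Dragging the resulting .mlmodel file into Xcode then produces the generated Swift class used earlier in the article.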