How We Developed an Image Recognition App in iOS 11 That Detects Inappropriate Images to Warn Users About Adult Content


During WWDC 2017, Apple launched many new APIs and exciting frameworks for iOS 11 app development. And among all the new frameworks, Core ML is undoubtedly the most popular.

What’s Core ML?

As you probably know, artificial intelligence and machine learning are among the most trending subjects right now, and the Core ML framework is what lets you integrate machine learning models into iOS mobile apps.

Core ML lets you integrate a wide variety of machine learning model types into your iOS app. In addition to supporting deep learning with over 30 layer types, it also supports standard models such as SVMs, tree ensembles, and generalized linear models.

And because it’s built on top of low-level technologies such as Accelerate and Metal, Core ML seamlessly takes advantage of the CPU and GPU to provide maximum performance and efficiency.

In a nutshell, Core ML is a brand-new machine learning framework, announced during this year’s WWDC event, that comes along with iOS 11 and lets you integrate trained machine learning models into your app.

Now, being an augmented reality app development company, our main concern immediately after the WWDC 2017 event was to figure out how we could apply the Core ML framework in an iOS app to do interesting things.

Luckily, our iOS developers figured out a way, and we successfully developed an image recognition mobile app that detects explicit images containing adult content.

Cool, huh?

Let’s see what our app is about.

App Overview

The app we developed is fairly simple. Our image recognition application lets the user pick a picture, and the image recognition model then predicts whether the picture is explicit or not. In other words, it warns users if the picture contains any adult content.

Now, a promising next step would be to add more features and create a sext-spying app, which could be useful for today’s parents. A common modern fear among parents is that their children are exchanging inappropriate pictures and selfies with friends or even strangers online.

But with this app as a base, it’s possible to add features that would notify parents if their children are taking part in such activities.


How We Built Our Image Recognition App

As you probably know, before the iOS 11 release it was not possible to integrate machine learning models directly into iOS apps. For an image recognition app like this one, you first had to upload the images to a server and pass them to Imagga, a third-party image recognition service, in order to identify whether an image contains any adult content.

But with iOS 11 app development, it is possible to achieve the same without leveraging any third-party API. All you have to do is take a ready-made Python-trained model, convert it into a Core ML data model, and integrate it into your iOS 11 app.

Here’s how we integrated the Core ML data model into our image recognition app.

First, we imported the .mlmodel file into the Xcode project.


Then, we checked the .mlmodel file’s Target Membership so that Xcode generated a Swift model class for it, which we could use for further processing.


After that, we set up the storyboard by adding an image view, which displays the image the user selects from the library, and a label to show the image detection status.


Next, we imported the Core ML framework in the ViewController.swift file and added an image picker method to let users pick images from the photo library.

Picker Controller:

let pickerController = UIImagePickerController()
pickerController.delegate = self
pickerController.sourceType = .savedPhotosAlbum
present(pickerController, animated: true)

To handle the image in the image picker delegate:

func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    dismiss(animated: true)

    guard let image = info[UIImagePickerControllerOriginalImage] as? UIImage else {
        fatalError("couldn't load image from Photos")
    }

    self.imgPicker.image = image

    guard let ciImage = CIImage(image: image) else {
        fatalError("couldn't convert UIImage to CIImage")
    }

    detectScene(image: ciImage)
}

The detectScene(image:) method called above is where we recognize the image using Core ML.

Now, before diving into the code, let us first describe our .mlmodel file.

We converted this file from a Caffe model that can detect inappropriate images. The model accepts an image as input and produces a prediction score between 0 and 1.
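
That score is turned into a verdict by simple thresholding. As a self-contained sketch, the decision rule can be isolated into a small pure function (this is our own hypothetical refactoring, not code from the original app; the 0.8 cutoff is the one we used):

```swift
// Pure thresholding rule: a score above the cutoff is flagged as nude.
// classify(score:threshold:) is a hypothetical helper name.
func classify(score: Double, threshold: Double = 0.8) -> String {
    return score > threshold ? "Image is nude" : "Image is normal"
}
```

Keeping the rule in one place like this makes it trivial to tune the cutoff later without touching the view controller.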

If the prediction score is higher than 0.8, the image is considered nude. Moreover, this .mlmodel accepts images of size 224×224, so we first had to resize the user-selected image to 224×224 with the following code.

let nudeImage = resizeImage(image: UIImage(ciImage: image), newWidth: 224)

func resizeImage(image: UIImage, newWidth: CGFloat) -> UIImage? {
    // The model expects a square input, so height equals width.
    let newHeight = newWidth
    UIGraphicsBeginImageContext(CGSize(width: newWidth, height: newHeight))
    image.draw(in: CGRect(x: 0, y: 0, width: newWidth, height: newHeight))

    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    return newImage
}
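
The model’s prediction method takes a CVPixelBuffer rather than a UIImage, so the resized image still has to be converted. A minimal sketch of such a conversion helper (the function name pixelBuffer(from:size:) and the 32ARGB format are our assumptions; the original post does not show this step):

```swift
import UIKit
import CoreVideo

// Hypothetical helper: renders a UIImage into a 224×224 CVPixelBuffer
// suitable for passing to the Core ML model's prediction method.
func pixelBuffer(from image: UIImage, size: Int = 224) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, size, size,
                                     kCVPixelFormatType_32ARGB,
                                     attrs as CFDictionary, &buffer)
    guard status == kCVReturnSuccess, let pixelBuffer = buffer else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    // Draw the image into the buffer's backing memory via a CGContext.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                  width: size, height: size,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    else { return nil }

    UIGraphicsPushContext(context)
    // Flip the coordinate system so UIKit drawing lands right side up.
    context.translateBy(x: 0, y: CGFloat(size))
    context.scaleBy(x: 1, y: -1)
    image.draw(in: CGRect(x: 0, y: 0, width: size, height: size))
    UIGraphicsPopContext()

    return pixelBuffer
}
```
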

Once done, we initialized our model as follows.

let model = NudePredict()

After that, we added the prediction code. (Here, pixelBuffer is a CVPixelBuffer created from the resized 224×224 image.)

if let prediction = try? model.prediction(data: pixelBuffer) {
    print(prediction.prob)

    // prob is an MLMultiArray; index 1 holds the "nude" class probability.
    let output = prediction.prob
    let nudeValue = output[1].doubleValue

    if nudeValue > 0.8 {
        DispatchQueue.main.async { [weak self] in
            self?.lblStatus.text = "Image is nude"
        }
    } else {
        DispatchQueue.main.async { [weak self] in
            self?.lblStatus.text = "Image is normal"
        }
    }
} else {
    DispatchQueue.main.async { [weak self] in
        let alert = UIAlertController(title: "Error",
                                      message: "Unable to detect image",
                                      preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
        self?.present(alert, animated: true, completion: nil)
    }
}
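
One refinement worth considering: Core ML predictions can take noticeable time on older devices, so it may be better to run the call off the main thread. A minimal sketch of that pattern, assuming the same model, pixelBuffer, and lblStatus names as above (the background queue is our addition, not part of the original app):

```swift
// Run the prediction on a background queue, then hop back to the
// main queue for the UI update.
DispatchQueue.global(qos: .userInitiated).async { [weak self] in
    guard let prediction = try? model.prediction(data: pixelBuffer) else { return }
    // Index 1 of the MLMultiArray holds the "nude" class probability.
    let nudeValue = prediction.prob[1].doubleValue
    DispatchQueue.main.async {
        self?.lblStatus.text = nudeValue > 0.8 ? "Image is nude" : "Image is normal"
    }
}
```
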

And done!

The app is ready to launch!

What Problems Did Our iOS Developers Face While Developing This App?

As you saw in the code above, integrating a Core ML data model into an iOS app isn’t rocket science. Though it requires experience, it’s achievable if you have the knowledge and skills.

However, the most difficult part of this iOS 11 app development was converting the Python-trained model into a Core ML data model. If you’ve researched the topic, you probably know how tricky that conversion can be.

But as Howard Schultz once said:

“When you’re surrounded by people who share a passionate commitment around a common purpose, anything is possible.” – Howard Schultz

That’s what we have here at Space-O.

We have a team of experienced iOS developers who are passionate about learning new technologies.

In fact, we’re now focusing on how we can leverage the iOS 11 features to develop innovative mobile apps.

Which Similar Apps Can We Develop Using the Latest Technologies?

Apart from integrating machine learning data models into iOS 11 apps, we’re also focusing on augmented reality app development. If you’re interested in developing an AR app, or want a free quote for your idea, contact us.


Get Your Free Quote to Make an Image Recognition App