NsfwImgDetector is a Core ML model for scanning images for nudity. It was converted from Yahoo's open-source Caffe NSFW model.
To run the example project, clone the repo, and run pod install
from the Example directory first.
Requires Swift 5.0 and iOS 11.0 or later.
@available(iOS 11.0, *)
func filter(image: UIImage) {
    let detector = NsfwImgDetector()
    detector.check(image: image) { (isNoSafe, confidence) in
        print("isNoSafe: \(isNoSafe) confidence: \(confidence)")
        if isNoSafe == false {
            // Safe image
        } else if confidence > 0.9 {
            // Almost certainly pornographic
        } else if confidence > 0.5 {
            // Suggestive, possibly pornographic
        } else {
            // Suggestive but likely not pornographic
        }
    }
}
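If the check callback can arrive off the main thread, push UI updates back onto it. The sketch below wires the detector into a hypothetical view controller; the PhotoPreviewController name, the 0.5 threshold, and the main-queue dispatch are assumptions layered on the check(image:completion:) call shown above.

import UIKit
import NsfwImgDetector // assumed module name, matching the pod name

@available(iOS 11.0, *)
final class PhotoPreviewController: UIViewController {
    private let detector = NsfwImgDetector()
    private let imageView = UIImageView()

    // Hide the image until it has been checked, then reveal it only if it looks safe.
    func show(_ image: UIImage) {
        imageView.isHidden = true
        detector.check(image: image) { [weak self] (isNoSafe, confidence) in
            // 0.5 mirrors the threshold used in the example above; tune it for your use case.
            let looksSafe = !isNoSafe || confidence < 0.5
            DispatchQueue.main.async {
                // Dispatch to main in case the callback arrives on a background queue (an assumption).
                guard looksSafe else { return }
                self?.imageView.image = image
                self?.imageView.isHidden = false
            }
        }
    }
}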
NsfwImgDetector is available through CocoaPods. To install it, simply add the following line to your Podfile:
pod 'NsfwImgDetector'
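After pod install, the module can be imported directly. Below is a minimal sketch for filtering a batch of images down to the safe ones, assuming the module name matches the pod name and reusing the check(image:completion:) API from the example above; the DispatchGroup wiring and the filterSafeImages name are illustrative.

import UIKit
import NsfwImgDetector // assumed module name, matching the pod name

@available(iOS 11.0, *)
func filterSafeImages(_ images: [UIImage], completion: @escaping ([UIImage]) -> Void) {
    let detector = NsfwImgDetector()
    let group = DispatchGroup()
    let lock = NSLock() // guards safeImages in case callbacks arrive concurrently (an assumption)
    var safeImages: [UIImage] = []

    for image in images {
        group.enter()
        detector.check(image: image) { (isNoSafe, confidence) in
            if !isNoSafe || confidence < 0.5 {
                lock.lock()
                safeImages.append(image)
                lock.unlock()
            }
            group.leave()
        }
    }

    // Deliver the surviving images on the main queue once every check has finished.
    group.notify(queue: .main) {
        completion(safeImages)
    }
}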
NsfwImgDetector is available under the BSD license. See the LICENSE file for more info.