
NsfwImgDetector


NsfwImgDetector is a CoreML model that scans images for nudity. It was converted from the Caffe model of Yahoo's Open NSFW project.

If CocoaPods fails to load the mlmodel, drag the NsfwImgDetector folder directly into your project instead (it contains only the Swift file and the mlmodel file).


Example

To run the example project, clone the repo, and run pod install from the Example directory first.

Requirements

Swift 5.0, iOS 11.0+

Usage

import UIKit

@available(iOS 11.0, *)
func filter(image: UIImage) {
    let detector = NsfwImgDetector()

    detector.check(image: image) { (isNotSafe, confidence) in
        print("isNotSafe: \(isNotSafe) confidence: \(confidence)")
        if !isNotSafe {
            // safe image
        } else if confidence > 0.9 {
            // explicit (pornographic) content
        } else if confidence > 0.5 {
            // suggestive image, possibly explicit
        } else {
            // suggestive image
        }
    }
}
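
In practice you may want to fold the flag and the confidence into a single rating. The sketch below does that with an illustrative NsfwRating enum and a rate(image:completion:) helper (both names are mine, not part of the library); it assumes the check callback may fire asynchronously, hence the @escaping completion:

import UIKit

enum NsfwRating {
    case safe       // isNotSafe == false
    case suggestive // flagged, low-to-medium confidence
    case explicit   // flagged, confidence above 0.9
}

@available(iOS 11.0, *)
func rate(image: UIImage, completion: @escaping (NsfwRating) -> Void) {
    let detector = NsfwImgDetector()
    detector.check(image: image) { (isNotSafe, confidence) in
        guard isNotSafe else {
            completion(.safe)
            return
        }
        // The 0.9 cutoff mirrors the example above; tune it to your content policy.
        completion(confidence > 0.9 ? .explicit : .suggestive)
    }
}

Call it as rate(image: someImage) { rating in ... } and switch on the rating to decide whether to show, blur, or hide the image.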

Installation

NsfwImgDetector is available through CocoaPods. To install it, simply add the following line to your Podfile:

pod 'NsfwImgDetector'
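
If your project does not have a Podfile yet, a minimal one might look like this (the target name MyApp is a placeholder):

platform :ios, '11.0'
use_frameworks!

target 'MyApp' do
  pod 'NsfwImgDetector'
end

Then run pod install and open the generated .xcworkspace.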

Author

zhulmin, zhulmin1458@gmail.com

License

NsfwImgDetector is available under the BSD license. See the LICENSE file for more info.
