This sample demonstrates how to automatically moderate offensive images uploaded to Firebase Storage. It uses the Google Cloud Vision API to detect whether an image contains adult or violent content and, if so, uses ImageMagick to blur it.
See the file `functions/index.js` for the moderation code.
The detection of adult and violent content in an image is done using the Google Cloud Vision API. The image blurring is performed using ImageMagick, which is installed by default on all Cloud Functions instances. The image is first downloaded locally from the Firebase Storage bucket to the `tmp` folder using the google-cloud SDK.
The dependencies are listed in `functions/package.json`.
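For illustration, the two building blocks might look roughly like this. This is a minimal sketch, not the sample's actual code: the helper names `isOffensive` and `blurLocalImage`, the `VERY_LIKELY` threshold, and the blur radius are all assumptions.

```js
const vision = require('@google-cloud/vision');
const {promisify} = require('util');
const execFile = promisify(require('child_process').execFile);

// Hypothetical helpers, for illustration only; see functions/index.js for the real code.
const client = new vision.ImageAnnotatorClient();

// Returns true when Cloud Vision's SafeSearch annotation marks the image
// as very likely containing adult or violent content.
async function isOffensive(bucketName, filePath) {
  const [result] = await client.safeSearchDetection(`gs://${bucketName}/${filePath}`);
  const annotation = result.safeSearchAnnotation;
  return annotation.adult === 'VERY_LIKELY' || annotation.violence === 'VERY_LIKELY';
}

// Blurs a local image in place with ImageMagick's `convert` tool, which is
// preinstalled on Cloud Functions instances.
async function blurLocalImage(localPath) {
  await execFile('convert', [localPath, '-channel', 'RGBA', '-blur', '0x24', localPath]);
}
```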
The function triggers on upload of any file to your Firebase project's default Cloud Storage bucket.
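In code, such a trigger might be wired up as follows. This is a sketch assuming the first-generation `firebase-functions` API; it reuses the hypothetical helpers above and writes the blurred copy under an illustrative `blurred/` prefix.

```js
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const path = require('path');
const os = require('os');

admin.initializeApp();

// Runs on every finalized upload to the project's default Storage bucket.
exports.blurOffensiveImages = functions.storage.object().onFinalize(async (object) => {
  // Skip images this function has already produced, so blurred copies
  // don't re-trigger the moderation flow. (The `blurred/` prefix is an assumption.)
  if (object.name.startsWith('blurred/')) return null;
  if (!(await isOffensive(object.bucket, object.name))) return null;

  // Download to the local /tmp folder, blur, and upload the blurred copy.
  const tempPath = path.join(os.tmpdir(), path.basename(object.name));
  const bucket = admin.storage().bucket(object.bucket);
  await bucket.file(object.name).download({destination: tempPath});
  await blurLocalImage(tempPath);
  await bucket.upload(tempPath, {destination: `blurred/${object.name}`});
  return null;
});
```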
- Create a Firebase project on the Firebase Console.
- In the Google Cloud Console, enable the Google Cloud Vision API. Note: billing is required to enable the Cloud Vision API, so enable billing on your Firebase project by switching to the Blaze plan. For more information, see the pricing page.
- Clone or download this repo and open the `moderate-image` directory.
- You must have the Firebase CLI installed. If you don't have it, install it with `npm install -g firebase-tools` and then configure it with `firebase login`.
- Configure the CLI locally by using `firebase use --add` and selecting your project in the list.
- Install dependencies locally by running: `cd functions; npm install; cd -`
To test the sample:
- Deploy your Cloud Functions using `firebase deploy`.
- In the Firebase Console, go to the Storage tab and upload an image that contains adult or violent content. After a short time, refresh the page. You'll see a new folder containing a blurred version of your uploaded image.