NSFWJS not working in react native #24

Open
mowaisch opened this issue Sep 20, 2021 · 5 comments

Comments

@mowaisch

I installed nsfwjs in React Native.
After installation the app runs fine as long as I don't import or use the library, but after adding:
import * as nsfwjs from 'nsfwjs';
and
await nsfwjs.load()
this error is shown.

Error: Unable to resolve module path from /Users/muhammadowais/workspace/datingapp/node_modules/get-pixels-frame-info-update/dom-pixels.js: path could not be found within the project or in these directories:
node_modules/get-pixels-frame-info-update/node_modules
node_modules
../../node_modules

If you are sure the module exists, try these steps:

Clear watchman watches: watchman watch-del-all
Delete node_modules and run yarn install
Reset Metro's cache: yarn start --reset-cache
Remove the cache: rm -rf /tmp/metro-*
1 | 'use strict'
2 |
3 | var path = require('path')
| ^
4 | var ndarray = require('ndarray')
5 | var GifReader = require('omggif').GifReader
6 | var pack = require('ndarray-pack')

[1] Then I just installed:
npm install path

Error: Unable to resolve module assert from /Users/muhammadowais/workspace/datingapp/node_modules/gif-encoder/lib/GIFEncoder.js: assert could not be found within the project or in these directories:
node_modules/gif-encoder/node_modules
node_modules
../../node_modules

If you are sure the module exists, try these steps:

Clear watchman watches: watchman watch-del-all
Delete node_modules and run yarn install
Reset Metro's cache: yarn start --reset-cache
Remove the cache: rm -rf /tmp/metro-*
9 | */
10 |
11 | var assert = require('assert');
| ^
12 | var EventEmitter = require('events').EventEmitter;
13 | var ReadableStream = require('readable-stream');
14 | var util = require('util');
at ModuleResolver.resolveDependency (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/node-haste/DependencyGraph/ModuleResolution.js:234:15)
at DependencyGraph.resolveDependency (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/node-haste/DependencyGraph.js:413:43)
at Object.resolve (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/lib/transformHelpers.js:317:42)
at resolve (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/DeltaBundler/traverseDependencies.js:629:33)
at /Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/DeltaBundler/traverseDependencies.js:645:26
at Array.reduce ()
at resolveDependencies (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/DeltaBundler/traverseDependencies.js:644:33)
at /Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/DeltaBundler/traverseDependencies.js:329:33
at Generator.next ()
at asyncGeneratorStep (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/DeltaBundler/traverseDependencies.js:137:24)

[2] Then I installed:
npm install assert
This error popped up:
error: Error: Unable to resolve module stream from /Users/muhammadowais/workspace/datingapp/node_modules/through/index.js: stream could not be found within the project or in these directories:
node_modules
../../node_modules

If you are sure the module exists, try these steps:

Clear watchman watches: watchman watch-del-all
Delete node_modules and run yarn install
Reset Metro's cache: yarn start --reset-cache
Remove the cache: rm -rf /tmp/metro-*
1 | var Stream = require('stream')
| ^
2 |
3 | // through
4 | //
at ModuleResolver.resolveDependency (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/node-haste/DependencyGraph/ModuleResolution.js:234:15)
at DependencyGraph.resolveDependency (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/node-haste/DependencyGraph.js:413:43)
at Object.resolve (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/lib/transformHelpers.js:317:42)
at resolve (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/DeltaBundler/traverseDependencies.js:629:33)
at /Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/DeltaBundler/traverseDependencies.js:645:26
at Array.reduce ()
at resolveDependencies (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/DeltaBundler/traverseDependencies.js:644:33)
at /Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/DeltaBundler/traverseDependencies.js:329:33
at Generator.next ()
at asyncGeneratorStep (/Users/muhammadowais/workspace/datingapp/node_modules/@react-native-community/cli/node_modules/metro/src/DeltaBundler/traverseDependencies.js:137:24)

[3] Then I installed:
npm install stream

error: Error: Unable to resolve module events from /Users/muhammadowais/workspace/datingapp/node_modules/gif-encoder/lib/GIFEncoder.js: events could not be found within the project or in these directories:
node_modules/gif-encoder/node_modules
node_modules
../../node_modules

If you are sure the module exists, try these steps:

Clear watchman watches: watchman watch-del-all
Delete node_modules and run yarn install
Reset Metro's cache: yarn start --reset-cache
Remove the cache: rm -rf /tmp/metro-*
10 |
11 | var assert = require('assert');
12 | var EventEmitter = require('events').EventEmitter;
| ^
13 | var ReadableStream = require('readable-stream');
14 | var util = require('util');

[4] Then I installed:
npm install events

Error:
error: Error: Unable to resolve module zlib from /Users/muhammadowais/workspace/datingapp/node_modules/pngjs-nozlib/lib/parser-async.js: zlib could not be found within the project or in these directories:
node_modules
../../node_modules

If you are sure the module exists, try these steps:

Clear watchman watches: watchman watch-del-all
Delete node_modules and run yarn install
Reset Metro's cache: yarn start --reset-cache
Remove the cache: rm -rf /tmp/metro-*
2 |
3 | var util = require('util');
4 | var zlib = require('zlib');
| ^
5 | var ChunkStream = require('./chunkstream');
6 | var FilterAsync = require('./filter-parse-async');
7 | var Parser = require('./parser');

[5] Then I installed:
npm install zlib

Error:
error: Error: Unable to resolve module ./zlib_bindings from /Users/muhammadowais/workspace/datingapp/node_modules/zlib/lib/zlib.js:

None of these files exist:

node_modules/zlib/lib/zlib_bindings(.native|.ios.js|.native.js|.js|.ios.json|.native.json|.json|.ios.ts|.native.ts|.ts|.ios.tsx|.native.tsx|.tsx|.ios.svg|.native.svg|.svg|.ios.bin|.native.bin|.bin)
node_modules/zlib/lib/zlib_bindings/index(.native|.ios.js|.native.js|.js|.ios.json|.native.json|.json|.ios.ts|.native.ts|.ts|.ios.tsx|.native.tsx|.tsx|.ios.svg|.native.svg|.svg|.ios.bin|.native.bin|.bin)
1 | module.exports = require('./zlib_bindings');
| ^
2 |

Now I tried installing zlib and adding zlib_bindings in metro.config.js, but no luck. There is no file at './zlib_bindings'; I checked in node_modules.
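
A note on the errors above: Metro does not ship Node's built-in modules (path, assert, stream, events, zlib), and installing npm packages with the same names does not always help (the zlib package, for example, expects native bindings, hence the missing ./zlib_bindings). A common workaround, sketched below and not verified against this project, is to point Metro at browser-compatible shims via resolver.extraNodeModules. The sketch assumes the shim packages path-browserify, assert, stream-browserify, events and browserify-zlib are installed; packages such as node-libs-react-native bundle a similar set.

// metro.config.js (sketch): map Node core module names to browser-style
// shims so Metro can resolve require('path'), require('zlib'), etc.
module.exports = {
  resolver: {
    extraNodeModules: {
      path: require.resolve('path-browserify'),
      assert: require.resolve('assert'),
      stream: require.resolve('stream-browserify'),
      events: require.resolve('events'),
      zlib: require.resolve('browserify-zlib'),
    },
  },
};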

If I try to load the model directly using:

import * as tf from '@tensorflow/tfjs';
import { bundleResourceIO } from '@tensorflow/tfjs-react-native';

const modelJson = require('../../../../../../assets/model/model.json');
const modelWeights = require('../../../../../../assets/model/weight.bin');
const model = await tf.loadLayersModel(bundleResourceIO(modelJson, modelWeights));

The model and weights were downloaded from here:
https://github.com/infinitered/nsfwjs-mobile/blob/master/nsfw-model.json
https://github.com/infinitered/nsfwjs-mobile/blob/master/nsfw-weights.bin

When the app starts, it gives a syntax error in "weight.bin".
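
A hedged guess about the syntax error in weight.bin: Metro has to treat .bin files as assets rather than source. The resolution list in the zlib error above shows .bin among the source extensions being tried, which suggests 'bin' was added to sourceExts; in that case Metro parses the weights file as JavaScript and reports a syntax error. A minimal metro.config.js sketch, assuming a standard React Native CLI project with metro-config available:

// metro.config.js (sketch): register .bin as an asset extension so the
// model weights are bundled as a binary asset, not parsed as JS source.
const { getDefaultConfig } = require('metro-config');

module.exports = (async () => {
  const {
    resolver: { assetExts, sourceExts },
  } = await getDefaultConfig();
  return {
    resolver: {
      assetExts: [...assetExts, 'bin'], // weights go here
      sourceExts,                       // do NOT add 'bin' here
    },
  };
})();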

@firem-4

firem-4 commented Jul 25, 2022

Same problem, did you fix it?

@minhchienwikipedia

Same issue

@shamshirsalams

Any update?

@FranciscoOssian

FranciscoOssian commented Jan 25, 2023

I finally created a project with Expo that integrates TensorFlow and this model. However, I ran into issues with the nsfwjs library (its GIF handling and many web-specific references) and had to modify it to work minimally with JPEGs. Despite these efforts, the model's predictions were incorrect or unexpected. I may have made an error when adapting methods from the library.

  "dependencies": {
    "@react-native-async-storage/async-storage": "^1.17.11",
    "@tensorflow/tfjs": "4.1.0",
    "@tensorflow/tfjs-react-native": "^0.8.0",
    "expo": "~47.0.12",
    "expo-camera": "~13.1.0",
    "expo-gl": "~12.0.1",
    "expo-image-manipulator": "~11.0.0",
    "expo-image-picker": "~14.0.2",
    "expo-splash-screen": "~0.17.5",
    "expo-status-bar": "~1.4.2",
    "jpeg-js": "^0.4.4",
    "nsfwjs": "^2.4.2",
    "react": "18.1.0",
    "react-native": "0.70.5",
    "react-native-fs": "^2.20.0"
  },
import React, { useState, useEffect } from "react";
import { View, Text, Image, Button } from "react-native";
import * as tf from "@tensorflow/tfjs";
import { bundleResourceIO } from "@tensorflow/tfjs-react-native";
import * as ImagePicker from "expo-image-picker";
import { decode as atob } from "base-64";
import * as jpeg from "jpeg-js";
import { manipulateAsync } from "expo-image-manipulator";
import { classify } from "./src/nsfw";

const modelJson = require("./assets/nsfw-model.json");
const modelWeights = require("./assets/nsfw-weights.bin");

const picInputShapeSize = {
  width: 224,
  height: 224,
};

function imageToTensor(rawImageData) {
  // jpeg-js 0.4.x takes an options object; useTArray: true returns a Uint8Array
  const { width, height, data } = jpeg.decode(rawImageData, { useTArray: true });
  // Drop the alpha channel info for mobilenet
  const buffer = new Uint8Array(width * height * 3);
  let offset = 0; // offset into original data
  for (let i = 0; i < buffer.length; i += 3) {
    buffer[i] = data[offset];
    buffer[i + 1] = data[offset + 1];
    buffer[i + 2] = data[offset + 2];

    offset += 4;
  }

  return tf.tensor4d(buffer, [1, height, width, 3]);
}

const App = () => {
  const [model, setModel] = useState(null);
  const [predictions, setPredictions] = useState(null);
  const [image, setImage] = useState();

  useEffect(() => {
    const loadModel = async () => {
      // Load the model
      await tf.ready();
      let model;
      try {
        model = await tf.loadLayersModel(
          bundleResourceIO(modelJson, modelWeights)
        );
      } catch (e) {
        console.log(e);
      }

      setModel(model);
    };
    loadModel();
  }, []);

  const classifyImage = async (uri) => {
    if (!uri) return;
    try {
      const resizedPhoto = await manipulateAsync(
        uri,
        [
          {
            resize: {
              width: picInputShapeSize.width,
              height: picInputShapeSize.height,
            },
          },
        ],
        { format: "jpeg", base64: true }
      );
      const base64 = resizedPhoto.base64;
      const arrayBuffer = Uint8Array.from(atob(base64), (c) => c.charCodeAt(0));
      const imageData = arrayBuffer;
      const imageTensor = imageToTensor(imageData); //decodeJpeg(imageData);
      const p = await classify(model, imageTensor);
      //setPredictions(p[0]);
      console.log(p);
    } catch (e) {
      console.log(e);
    }
  };

  const onHandlePick = async () => {
    let result = await ImagePicker.launchImageLibraryAsync({
      mediaTypes: ImagePicker.MediaTypeOptions.All,
      allowsEditing: true,
      quality: 1,
    });

    if (!result.canceled) {
      setImage(result.assets[0].uri);
    }
  };

  return (
    <View>
      <Image
        source={{ uri: image }}
        style={{ width: 200, height: 200 }}
        onLoad={() => classifyImage(image)}
      />
      <Button title="pick and predict" onPress={onHandlePick} />
      {predictions &&
        predictions.map((prediction, index) => (
          <Text key={index}>{prediction}</Text>
        ))}
    </View>
  );
};

export default App;
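
// src/nsfw.js (the module imported above as './src/nsfw')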
const NSFW_CLASSES = {
  0: "Drawing",
  1: "Hentai",
  2: "Neutral",
  3: "Porn",
  4: "Sexy",
};

export async function getTopKClasses(logits, topK) {
  const values = await logits.data();

  const valuesAndIndices = [];
  for (let i = 0; i < values.length; i++) {
    valuesAndIndices.push({ value: values[i], index: i });
  }
  valuesAndIndices.sort((a, b) => {
    return b.value - a.value;
  });
  const topkValues = new Float32Array(topK);
  const topkIndices = new Int32Array(topK);
  for (let i = 0; i < topK; i++) {
    topkValues[i] = valuesAndIndices[i].value;
    topkIndices[i] = valuesAndIndices[i].index;
  }

  const topClassesAndProbs = [];
  for (let i = 0; i < topkIndices.length; i++) {
    topClassesAndProbs.push({
      className: NSFW_CLASSES[topkIndices[i]],
      probability: topkValues[i],
    });
  }
  return topClassesAndProbs;
}

export const classify = async (model, img, topk = 5) => {
  const logits = model.predict(img);
  const classes = await getTopKClasses(logits, topk);
  logits.dispose();
  return classes;
};

This is the predict response when I tested with an image that is 1000000% NSFW:

[{"className": "Drawing", "probability": 0.5907529592514038}, {"className": "Hentai", "probability": 0.223214790225029}, {"className": "Neutral", "probability": 0.12099546194076538}, {"className": "Porn", "probability": 0.055417921394109726}, {"className": "Sexy", "probability": 0.009618803858757019}]

I also had to downgrade tfjs-core; see the explanation here (the current version is 4.2.0 and I moved to 4.1.0):

tensorflow/tfjs#7273

Edit: I can't confirm it, but I was also having a problem with TensorFlow and worked around it with a yarn resolution. But I could be wrong.

@net777infamous

Can you create a repo or drop a zipped download link for that?
