NsfwSpy.js is a nudity/pornography image classifier for Node.js and browsers, written in TypeScript and based on our parent .NET project, built to help moderate user-generated content across a range of application types. The machine learning model was trained on the MobileNetV2 neural network architecture with 537,000 images (186GB) from 4 categories:
| Label | Description | Files |
|---|---|---|
| Pornography | Images that depict sexual acts and nudity. | 108,000 |
| Sexy | Images of people in their underwear and men who are topless. | 76,000 |
| Hentai | Drawings or animations of sexual acts and nudity. | 83,000 |
| Neutral | Images that are not sexual in nature. | 268,000 |
NsfwSpy.js isn't perfect, but its accuracy should be good enough to detect approximately 96% of Nsfw images, that is, images classed as pornography, sexy or hentai.
| | Pornography | Sexy | Hentai | Neutral |
|---|---|---|---|---|
| Is Nsfw (pornography + sexy + hentai >= 0.5) | 95.0% | 97.3% | 93.3% | 3.7% |
| Correctly Predicted Label | 85.0% | 81.0% | 89.8% | 96.4% |
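The "Is Nsfw" decision rule from the table above can be sketched in plain JavaScript. The result shape used here (one score per label, summing to roughly 1) is an assumption for illustration, not the library's documented return type:

```js
// Hypothetical classification result: one probability per label.
// Mirrors the table's rule: pornography + sexy + hentai >= 0.5
function isNsfw(result) {
  return result.pornography + result.sexy + result.hentai >= 0.5;
}

console.log(isNsfw({ pornography: 0.40, sexy: 0.15, hentai: 0.05, neutral: 0.40 })); // true
console.log(isNsfw({ pornography: 0.05, sexy: 0.10, hentai: 0.02, neutral: 0.83 })); // false
```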
Want to see how NsfwSpy.js performs? Try it now on our test site.
This project is available as two separate packages, one for browsers and one for Node.js.
Before using NsfwSpy, the model must be loaded, either from files hosted on your site or from local files on your system.
Hosted files for Browsers:

```js
const nsfwSpy = new NsfwSpy("./model/model.json");
```

Local files for Node.js:

```js
const nsfwSpy = new NsfwSpy("file://./model/model.json");
```
Include this in your HTML page:
```html
<script src="https://unpkg.com/@nsfwspy/[email protected]/dist/nsfwspy-browser.min.js"></script>
```
Or install the package via npm:

```shell
npm install @nsfwspy/browser
```
Import NsfwSpy at the top of your JavaScript or TypeScript file:

```js
import { NsfwSpy } from '@nsfwspy/browser';
```

```js
const img = document.getElementById("img");
const nsfwSpy = new NsfwSpy("./model/model.json");
await nsfwSpy.load();
const result = await nsfwSpy.classifyImage(img);
```
```shell
npm install @nsfwspy/node
```
Import NsfwSpy at the top of your JavaScript or TypeScript file:

JavaScript

```js
const { NsfwSpy } = require('@nsfwspy/node');
```

TypeScript

```ts
import { NsfwSpy } from '@nsfwspy/node';
```
```js
const filePath = "C:\\Users\\username\\Documents\\flower.jpg";
const nsfwSpy = new NsfwSpy("file://./model/model.json");
await nsfwSpy.load();
const result = await nsfwSpy.classifyImageFile(filePath);
```
```js
const fs = require('fs');

// readFileSync is synchronous and returns a Buffer directly, so no await is needed
const imageBuffer = fs.readFileSync(filePath);
const nsfwSpy = new NsfwSpy("file://./model/model.json");
await nsfwSpy.load();
const result = await nsfwSpy.classifyImageFromByteArray(imageBuffer);
```
NSFWJS is the most popular JavaScript pornographic image classifier. It works differently from NsfwSpy in that its Sexy category includes women with exposed breasts as well as people in their underwear, whereas NsfwSpy classifies any nudity as pornography.
Private Detector is the Python image classifier from the team at Bumble. It has been designed specifically to detect nude images, not hentai or sexy images.
| Is Nsfw | Pornography | Sexy | Hentai | Neutral |
|---|---|---|---|---|
| NsfwSpy.js | 94.3% | 96.3% | 94.6% | 3.4% |
| NSFWJS | 93.7% | 92.2% | 90.5% | 7.3% |
| Correctly Predicted Label | Pornography | Sexy | Hentai | Neutral |
|---|---|---|---|---|
| NsfwSpy.js | 83.1% | 82.2% | 90.7% | 97.2% |
| NSFWJS | 92.6%* | 69.0% | 88.2% | 92.7%* |
| Private Detector | 76.2% | - | - | 99.2% |
*NSFWJS's Pornography numbers are the sum of its Pornography + Sexy scores, and its Neutral numbers are the sum of its Neutral + Drawing scores.
Interested in getting involved in the project? Whether you fancy adding features, providing images to train NsfwSpy with, or something else, feel free to contact us via email at [email protected] or find us on Twitter at @nsfw_spy.
Using NsfwSpy? Let us know! We're keen to hear how the technology is being used and improving the safety of applications.
Got a feature request or found something not quite right? Report it here on GitHub and we'll help as best we can.