Collection of scripts to aggregate image data for the purposes of training an NSFW Image Classifier
A free, open-source, privacy-focused browser extension to block “not safe for work” content, built using TypeScript and TensorFlow.js.
Keras implementation of the Yahoo Open-NSFW model
A .NET image and video classifier, written in C#, used to identify explicit/pornographic content.
NudeNet: NSFW Object Detection for TFJS and NodeJS
✅ CODAR is a framework built using PyTorch to analyze posts (text + media) and predict cyberbullying and offensive content.
An NSFW Image Classification REST API for effortless content moderation, built with Node.js, TensorFlow, and Parse Server
Anti-spam/NSFW Telegram bot written in Python with Pyrogram.
REST API written in Python to classify NSFW images.
This repo contains a deep learning implementation for identifying NSFW images.
[Android] NSFW(Nude Content) Detector using Firebase AutoML and TensorFlow Lite
A JavaScript image classifier used to identify explicit/pornographic content, written in TypeScript.
This repository is dedicated to building a classifier to detect NSFW images and videos.
An NSFW Image Classifier including an Automation Engine for fast deletion & moderation, built with Node.js, TensorFlow, and Parse Server
Group Guardian is a Telegram bot for admins to maintain a safe community.
Many common vision tasks solved using on-device machine learning.
Sample for running the NSFW image detection model (open_nsfw_android) on Colaboratory.
A tool for detecting viruses and NSFW material in WARC files
Simple drop in API to determine if image is NSFW using TensorFlow
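Most of the classifiers listed above (the Keras Open-NSFW port, the deep-learning implementations, the drop-in TensorFlow API) share the same inference pattern: load a trained model, resize the image to the model's input size, and read off the SFW/NSFW probabilities. A minimal sketch of that pattern in Keras; the file name `nsfw_model.h5`, the 224×224 input size, and the two-class output order are assumptions, not details of any particular repository:

```python
import numpy as np
from tensorflow import keras

# Assumed artifact: a trained Keras classifier saved as nsfw_model.h5
model = keras.models.load_model("nsfw_model.h5")

def classify(path, size=(224, 224)):
    # Load and resize the image, scale pixel values to [0, 1]
    img = keras.utils.load_img(path, target_size=size)
    x = keras.utils.img_to_array(img) / 255.0
    # Add a batch dimension and run inference
    probs = model.predict(np.expand_dims(x, axis=0))[0]
    # Assumed output order: [sfw, nsfw]
    return {"sfw": float(probs[0]), "nsfw": float(probs[1])}

print(classify("example.jpg"))
```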
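The REST-API entries (the Node.js + Parse Server services and the Python one) wrap the same call behind an HTTP endpoint that accepts an image upload and returns the scores. A minimal Flask sketch, reusing the hypothetical `classify` helper from the previous block; the `/classify` route and the multipart field name `image` are also assumptions:

```python
import tempfile
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/classify", methods=["POST"])
def classify_endpoint():
    # Expect a multipart upload under the "image" field
    upload = request.files.get("image")
    if upload is None:
        return jsonify({"error": "no image provided"}), 400
    # Persist to a temporary file so the classifier can read it from disk
    with tempfile.NamedTemporaryFile(suffix=".jpg") as tmp:
        upload.save(tmp.name)
        scores = classify(tmp.name)  # hypothetical helper from the sketch above
    return jsonify(scores)

if __name__ == "__main__":
    app.run(port=5000)
```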
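The Telegram moderation bots (the Pyrogram anti-spam/NSFW bot, Group Guardian) typically add an automation loop on top: watch group photos, download each one, score it, and delete anything flagged. A sketch using Pyrogram; the credentials come from environment variables, and the `classify` helper and the 0.8 threshold are assumptions:

```python
import os
from pyrogram import Client, filters

# Credentials are read from the environment; none are hard-coded here
app = Client(
    "moderator_bot",
    api_id=int(os.environ["API_ID"]),
    api_hash=os.environ["API_HASH"],
    bot_token=os.environ["BOT_TOKEN"],
)

@app.on_message(filters.group & filters.photo)
async def moderate_photo(client, message):
    # Download the photo, score it, and remove the message if it looks explicit
    path = await message.download()
    scores = classify(path)  # hypothetical helper from the first sketch
    os.remove(path)
    if scores["nsfw"] > 0.8:  # assumed moderation threshold
        await message.delete()

app.run()
```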