Keras implementation of the Yahoo Open-NSFW model
Anti-spam/NSFW Telegram bot written in Python with Pyrogram.
REST API written in Python to classify NSFW images.
Group Guardian is a Telegram bot for admins to maintain a safe community.
A tool for detecting viruses and NSFW material in WARC files
A Neural Net for Nudity Detection. Classifier only.
Containerized self-hosted REST API for vision classification, utilizing Hugging Face transformers.
Anti-NSFW project in Python using a pre-trained model.
Python package to apply the Safety Checker from Stable Diffusion.
The ultimate open-source AI tagging tool for image galleries, using metadata or .txt files for AI training. Uses the newest wd-vit-tagger-v3 model by SmilingWolf.
Remove adult content in Discord channels more effectively with artificial intelligence.
vit-mini-explicit-content is an image classification vision-language model fine-tuned from vit-base-patch16-224-in21k for a single-label classification task. It categorizes images based on their explicitness using the ViTForImageClassification architecture.
nsfw-image-detection is a vision-language encoder model fine-tuned from siglip2-base-patch16-256 for multi-class image classification. Built on the SiglipForImageClassification architecture, the model is trained to identify and categorize content types in images, especially for explicit, suggestive, or safe media filtering.
siglip2-mini-explicit-content is an image classification vision-language encoder model fine-tuned from siglip2-base-patch16-512 for a single-label classification task. It is designed to classify images into categories related to explicit, sensual, or safe-for-work content using the SiglipForImageClassification architecture.
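The single-label classifiers described above all share the same inference step: the model emits one logit per class, a softmax turns the logits into probabilities, and the top class is chosen. A minimal sketch in plain Python (the label names here are assumptions for illustration; each model defines its own id2label mapping):

```python
import math

# Hypothetical label set for illustration only; the actual models
# above ship their own id2label mappings in their configs.
LABELS = ["safe", "suggestive", "explicit"]

def softmax(logits):
    """Convert raw classifier logits into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits, labels=LABELS):
    """Single-label classification: return the highest-probability class."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]
```

For example, `classify([2.0, 0.5, -1.0])` picks the first label, since its logit dominates after the softmax.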
A browser interface for NudeNet classifier.
Recursively filters (removes) NSFW (not safe for work) images in a given directory, backing up every image beforehand.
A machine learning-based NSFW filter specifically designed to detect underage content in Tavern V1/V2 cards.
Catgirl Sorting AI