image classification

nsfw-image-detection-384

A fine-tuned image classifier from Marqo that flags adult or explicit content in images at a 384×384 input resolution. It outputs probability scores for NSFW versus safe content and is commonly used as a pre-filter in content-moderation pipelines before user uploads are stored or served. The Apache 2.0 license permits commercial deployment.

Use cases

  • Automated content moderation for user-generated image uploads
  • Pre-screening images before routing to manual review queues
  • Platform compliance checks against adult content policies
  • Dataset curation to remove explicit images before model training

Pros

  • Apache 2.0 license allows commercial content moderation deployment
  • 384px input balances classification accuracy and inference throughput
  • Binary probability output is straightforward to threshold per use case
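The last point above can be illustrated with a minimal sketch. The action names ("allow"/"review"/"block") and the cutoff values are hypothetical, chosen only to show how a single NSFW probability is thresholded per use case:

```python
def moderate(nsfw_prob: float, threshold: float = 0.5) -> str:
    """Map the model's NSFW probability to a moderation action.

    The 0.5 default and the action names are illustrative,
    not part of the model's output.
    """
    return "block" if nsfw_prob >= threshold else "allow"


def moderate_with_review(nsfw_prob: float, block_at: float = 0.8,
                         review_at: float = 0.3) -> str:
    """Stricter variant with a middle band routed to manual review."""
    if nsfw_prob >= block_at:
        return "block"
    if nsfw_prob >= review_at:
        return "review"  # send to a manual review queue
    return "allow"
```

A pre-screening pipeline would typically auto-block only high-confidence scores and send the ambiguous middle band to human reviewers.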

Cons

  • Binary classification misses nuanced content categories like violence or gore
  • Susceptible to adversarial cropping or low-resolution obfuscation techniques
  • May require threshold calibration for cultural or platform-specific policies
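The threshold calibration mentioned in the last point can be sketched as a small search over scored validation images. The data format and the target recall here are assumptions for illustration, not part of the model:

```python
import math


def calibrate_threshold(scored: list[tuple[float, bool]],
                        min_recall: float = 0.99) -> float:
    """Pick the highest threshold that still catches at least
    `min_recall` of the known-NSFW validation images.

    `scored` holds (nsfw_probability, is_actually_nsfw) pairs from a
    labeled validation set. A higher threshold means fewer false
    positives on safe images, at the cost of recall on NSFW ones.
    """
    positives = sorted((p for p, y in scored if y), reverse=True)
    if not positives:
        raise ValueError("need at least one NSFW example to calibrate")
    # Keep the top ceil(min_recall * n) positive scores above the cut.
    keep = math.ceil(min_recall * len(positives))
    return positives[keep - 1]
```

Running this per region or per platform policy, each with its own validation set and `min_recall`, is one way to adapt a single model to differing content rules.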

FAQ

What is nsfw-image-detection-384 used for?

Typical uses include automated content moderation for user-generated image uploads, pre-screening images before routing them to manual review queues, platform compliance checks against adult-content policies, and dataset curation to remove explicit images before model training.

Is nsfw-image-detection-384 free to use?

nsfw-image-detection-384 is an open-source model published on HuggingFace under the Apache 2.0 license, which permits free commercial use. Confirm the license on the model card before deploying.

How do I run nsfw-image-detection-384 locally?

The model ships as a timm checkpoint with safetensors weights (see the tags), so it can be loaded with the timm library or any framework that supports timm Hub models. See the model card for exact loading code and hardware requirements.
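As a concrete starting point, a timm-based loading pattern looks roughly like the sketch below (assumes `pip install timm torch pillow`; the label names come from the checkpoint's config, and the `decide` helper with its 0.5 cutoff is illustrative, not part of the model):

```python
def load_classifier():
    """Fetch Marqo/nsfw-image-detection-384 from the HuggingFace Hub.

    Deferred imports keep this module importable without timm/torch.
    Downloads the weights on first call.
    """
    import timm
    model = timm.create_model(
        "hf_hub:Marqo/nsfw-image-detection-384", pretrained=True
    ).eval()
    config = timm.data.resolve_model_data_config(model)
    transform = timm.data.create_transform(**config, is_training=False)
    return model, transform


def score_image(model, transform, path: str) -> dict:
    """Return {label: probability} for one image file."""
    import torch
    from PIL import Image
    with torch.no_grad():
        logits = model(transform(Image.open(path).convert("RGB")).unsqueeze(0))
        probs = logits.softmax(dim=-1)[0]
    labels = model.pretrained_cfg["label_names"]  # e.g. ("NSFW", "SFW")
    return {label: float(p) for label, p in zip(labels, probs)}


def decide(scores: dict, threshold: float = 0.5) -> str:
    """Threshold helper; the 0.5 cutoff is an illustrative default."""
    return "flag" if scores.get("NSFW", 0.0) >= threshold else "pass"
```

Usage would be `model, transform = load_classifier()` followed by `decide(score_image(model, transform, "upload.jpg"))`; batch the forward passes for throughput in production.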

Tags

timm · safetensors · image-classification · license:apache-2.0 · region:us