
NSFW Image Classification

Model by Open Source

This model classifies imagery as either Suitable for Work (SFW) or Not Suitable for Work (NSFW) based on the presence of pornographic content. It takes an image as input and returns a JSON output containing floating-point scores for the model's estimated SFW and NSFW probabilities. The model can be used forensically across an IT system to hunt for unauthorized media, to moderate incoming data flows, or to segregate data when an end user's job requires viewing potentially objectionable content.
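
To illustrate the input/output contract described above, here is a minimal sketch of submitting an image and reading back the SFW/NSFW scores. The endpoint URL, request layout, and the "sfw"/"nsfw" response keys are assumptions for illustration, not the documented Modzy API; consult the platform's API reference for the actual job submission format.

```python
import requests

# Hypothetical inference endpoint; the real Modzy API paths,
# authentication, and job submission format may differ.
INFERENCE_URL = "https://app.modzy.example/api/models/nsfw-classifier"

def classify_image(image_path: str) -> dict:
    """Submit an image and return its SFW/NSFW probability scores."""
    with open(image_path, "rb") as f:
        response = requests.post(INFERENCE_URL, files={"image": f})
    response.raise_for_status()
    # Assumed response shape based on the model description, e.g.:
    # {"sfw": 0.93, "nsfw": 0.07}
    return response.json()

if __name__ == "__main__":
    scores = classify_image("sample.jpg")
    print(f"SFW: {scores['sfw']:.3f}, NSFW: {scores['nsfw']:.3f}")
```

In practice, a caller would compare the NSFW score against a threshold tuned to the use case: a low threshold for aggressive filtering of unauthorized media, a higher one when flagged items are routed to human review.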

