NSFW Image Classification

Model by Open Source

This model classifies an image as either Suitable For Work (SFW) or Not Suitable For Work (NSFW) based on the presence of pornographic content. It takes an image as its input and returns a JSON output containing floating-point probability scores for the SFW and NSFW classes. The model can be used forensically across an IT system to hunt for unauthorized media, to moderate incoming data flows, or to segregate data when an end user's job requires viewing potentially objectionable content. A minimal sketch of consuming the model's output follows below.
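As an illustration of how the model's output might be consumed, the sketch below assumes a JSON result with `sfw` and `nsfw` fields holding the floating-point scores; the exact field names, the threshold value, and the helper function are assumptions for illustration, not part of the model's documented interface. It flags an image for review when the NSFW score exceeds a chosen threshold.

```python
import json

# Hypothetical example of the model's JSON output for one image.
# The field names "sfw" and "nsfw" are assumptions; the model returns
# floating-point scores for the SFW and NSFW classes.
example_result = json.loads('{"sfw": 0.12, "nsfw": 0.88}')

NSFW_THRESHOLD = 0.5  # tune per deployment; higher values flag fewer images


def is_nsfw(result: dict, threshold: float = NSFW_THRESHOLD) -> bool:
    """Return True when the model's NSFW probability meets or exceeds the threshold."""
    return result["nsfw"] >= threshold


# Example: route an image for review if it is flagged as NSFW.
if is_nsfw(example_result):
    print(f"Flagged for review (NSFW score: {example_result['nsfw']:.2f})")
else:
    print(f"Cleared as SFW (SFW score: {example_result['sfw']:.2f})")
```

In a moderation pipeline, the threshold would typically be set per use case, trading off missed detections against false flags.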


Many models are available for limited use in the free Modzy Basic account.