How to Add Image Content Moderation to Your App in 5 Minutes

Step-by-step guide to integrating NSFW image detection into your web or mobile application using a free API.

NSFW Checker Team
· 4 min read

Why content moderation matters

If your app accepts user-uploaded images — profile photos, chat messages, forum posts, or any user-generated content — you need content moderation. Without it, a single explicit image can damage your brand, violate app store policies, and drive away users.

Manual moderation does not scale. A moderator can review a few hundred images per day. An API can process thousands per second.

Step 1: Choose your approach

You have two options for when to moderate:

  • Before publishing — check the image before it goes live. Users see a brief delay, but explicit content never appears on your platform.
  • After publishing — show the image immediately, check it in the background, and remove it if flagged. Faster UX, but explicit content may be briefly visible.

For most apps, moderating before publishing is the safer choice.
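If you choose the after-publishing route instead, the background check can be a small fire-and-forget function. This is a sketch, not the guide's official pattern: `removeImage` and `imageId` are hypothetical placeholders for your own storage layer, and the endpoint is the one used throughout this guide.

```javascript
// Post-publish moderation sketch: the image is already live;
// check it in the background and take it down if flagged.
// Requires Node.js 18+ for the built-in fetch, FormData, and Blob globals.
async function moderateInBackground(imageBuffer, imageId, removeImage) {
  const fd = new FormData();
  fd.append("image", new Blob([imageBuffer]));

  const res = await fetch("https://api.nsfwcheckers.workers.dev", {
    method: "POST",
    body: fd,
  });
  const result = await res.json();

  if (result.nsfw) {
    await removeImage(imageId); // hypothetical helper: unpublish the image
  }
}
```

Call it right after publishing, without awaiting it, so the user is never blocked on the moderation round trip.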

Step 2: Integrate the API

Here is how to add NSFW detection to a Node.js/Express backend:

    // Requires Node.js 18+ for the built-in fetch, FormData, and Blob globals.
    const express = require("express");
    const multer = require("multer");
    const upload = multer();
    
    const app = express();
    
    app.post("/upload", upload.single("image"), async (req, res) => {
      if (!req.file) {
        return res.status(400).json({ error: "No image provided" });
      }
    
      // Send image to NSFW Checker API
      const formData = new FormData();
      formData.append(
        "image",
        new Blob([req.file.buffer], { type: req.file.mimetype }),
        req.file.originalname
      );
    
      const check = await fetch("https://api.nsfwcheckers.workers.dev", {
        method: "POST",
        body: formData,
      });
      const result = await check.json();
    
      if (result.nsfw) {
        return res.status(400).json({ error: "Image contains inappropriate content" });
      }
    
      // Image is safe — save it, process it, etc.
      res.json({ success: true, message: "Image uploaded successfully" });
    });
    
    app.listen(3000);

Step 3: Handle the result

The API returns a simple JSON response:

  • `nsfw: true` — Block the image, show an error to the user
  • `nsfw: false` — Allow the image to be published
  • `score` — A confidence value from 0 to 1. You can set your own threshold (default is 0.5)
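If the default cutoff does not fit your audience, you can apply your own threshold to `score` instead of relying on the boolean `nsfw` flag. A minimal sketch; the 0.3 value is just an illustrative choice, not a recommendation:

```javascript
// Apply a custom threshold to the API's confidence score.
// A lower threshold blocks more aggressively.
const THRESHOLD = 0.3; // stricter than the API default of 0.5

function isBlocked(result) {
  return result.score >= THRESHOLD;
}

// A result the default threshold would allow, but ours blocks:
console.log(isBlocked({ nsfw: false, score: 0.4 })); // true
```

Family-friendly apps typically want a lower threshold (more false positives, fewer misses); adult-adjacent platforms may prefer a higher one.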
Step 4: Add to your frontend (optional)

You can also check images client-side before uploading to your server:

    async function checkImage(file) {
      const fd = new FormData();
      fd.append("image", file);
      const res = await fetch("https://api.nsfwcheckers.workers.dev", {
        method: "POST",
        body: fd,
      });
      const data = await res.json();
      return data.nsfw;
    }

This gives users instant feedback and saves server bandwidth on rejected images. Keep in mind that client-side checks can be bypassed, so always keep the server-side check as well.
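Wired to a file input, the `checkImage` helper above might be used like this. The element ID and wiring function are illustrative, not part of the API:

```javascript
// Hypothetical wiring: run the moderation check when a file is chosen,
// before the app uploads it. `checkFn` is expected to resolve to true
// when the image is flagged (like checkImage above).
function wireModeration(input, checkFn) {
  input.addEventListener("change", async (e) => {
    const file = e.target.files[0];
    if (!file) return;

    if (await checkFn(file)) {
      alert("This image was flagged as inappropriate.");
      return;
    }

    // Safe to upload to your own server here.
  });
}
```

In the browser: `wireModeration(document.getElementById("avatar-input"), checkImage);`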

Rate limits and pricing

The free tier gives you 100 requests per day — enough for development, testing, and small apps. For higher volume, Pro (5,000/day) and Business (50,000/day) plans are coming soon.

Summary

Adding content moderation to your app takes 5 minutes with the NSFW Checker API. No signup, no API key, no complex setup. Just send an image and get a result.
