Thursday, 5 October, 2017 UTC


Many apps enable users to generate content en masse, including blogs, social networks, forums and chat bots. Within these apps, there’s a need to deal with users who post inappropriate content.
Traditionally, human moderators had to review all user-submitted content to ensure it was appropriate for the platform before it could be approved and published. That’s a time-consuming process, but one that we can actually automate.
In this article, I’ll show you how to automatically moderate content - specifically Image Moderation - in four steps with Cloudinary.
A good use case for this is parental control filters for images.
Cloudinary is a cloud platform that provides solutions for image and video management, including server or client-side upload, a huge range of on-the-fly image and video manipulation options, quick content delivery network (CDN) delivery and powerful asset management options.
Cloudinary enables web and mobile developers to address all of their media management needs with simple bits of code in their favorite programming languages or frameworks, leaving them free to focus primarily on their own product's value proposition.
Step 1: Create a Cloudinary Account
Sign up for a free Cloudinary account.
Once you are signed up, you will be redirected to the dashboard, where you can find your credentials.
Take note of your Cloud name, API Key and API Secret.
Step 2: Set Up A Node Server
Initialize a package.json file:
 npm init
Install the following modules:
 npm install express connect-multiparty cloudinary cors body-parser --save
  • express: We need this module for our API routes
  • connect-multiparty: Needed for parsing HTTP requests with content-type multipart/form-data
  • cloudinary: Node SDK for Cloudinary
  • body-parser: Needed for attaching the request body on express’s req object
  • cors: Needed for enabling CORS
Step 3: Activate Rekognition AI Moderation Add-on
Go to the dashboard add-ons section. Click on Rekognition AI Moderation Add-on and select the Free Plan.
Note: You can change to other plans as your usage increases.
Step 4: Moderate Image Content
Create a server.js file in your root directory. Require the dependencies we installed:
const express = require('express');
const app = express();
const multipart = require('connect-multiparty');
const cloudinary = require('cloudinary');
const cors = require('cors');
const bodyParser = require('body-parser');

app.use(bodyParser.urlencoded({ extended: true }));

const multipartMiddleware = multipart();
Next, configure Cloudinary:

cloudinary.config({
    cloud_name: 'xxxxxxxx',
    api_key: 'xxxxxxxx',
    api_secret: 'xxxxxxx'
});
Replace xxxxxx with the real values from your dashboard.
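To keep credentials out of source control, the same configuration can be read from environment variables instead of hardcoded strings. This is a sketch, not something the article requires; the variable names below are our own convention (Cloudinary’s SDK can also pick up a single CLOUDINARY_URL environment variable automatically):

```javascript
// Alternative: pull credentials from environment variables so they never
// live in your code. The variable names here are our own choice.
cloudinary.config({
  cloud_name: process.env.CLOUDINARY_CLOUD_NAME,
  api_key: process.env.CLOUDINARY_API_KEY,
  api_secret: process.env.CLOUDINARY_API_SECRET
});
```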
Add the route for uploading. Let’s make the route /upload:

app.post('/upload', multipartMiddleware, function(req, res) {
  // Upload to Cloudinary ("image" is the name of the multipart form field
  // holding the file)
  cloudinary.v2.uploader.upload(req.files.image.path, {
    // Specify Moderation
    moderation: "aws_rek"
  }, function(error, result) {
    if (error) {
      return res.status(500).send(error);
    }
    res.json(result);
  });
});

app.listen(3000);
Once a user makes a POST request to the /upload route, the route grabs the image file from the HTTP request, uploads it to Cloudinary, sends it to Amazon Rekognition for moderation, and returns a response indicating whether the image was approved or rejected.
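To act on that verdict in your own code, you can read it straight off the upload result. The sketch below assumes the documented response shape, where result.moderation is an array of entries like { kind: "aws_rek", status: "approved" | "rejected" }; the sample objects are trimmed to just the fields we use:

```javascript
// Helper that reads the moderation verdict from a Cloudinary upload result.
// The shape of the `moderation` array follows Cloudinary's documented
// response format for the Rekognition add-on.
function isApproved(result) {
  const entry = (result.moderation || []).find(function (m) {
    return m.kind === 'aws_rek';
  });
  return Boolean(entry) && entry.status === 'approved';
}

// Example results (trimmed, for illustration only):
const rejected = { moderation: [{ kind: 'aws_rek', status: 'rejected' }] };
const approved = { moderation: [{ kind: 'aws_rek', status: 'approved' }] };

console.log(isApproved(rejected)); // false
console.log(isApproved(approved)); // true
```

In the route above you could call isApproved(result) inside the upload callback and, say, delete or hide the asset when it returns false.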
It’s really that simple! Let’s quickly test this functionality with Postman.
Make sure your server is running:
nodemon server.js
Make the POST request. Ensure you attach an adult/NSFW image file.
JSON Response returned
Check out the JSON response returned. The image was rejected because it was flagged as a nude image. Look at the confidence level; it’s very high.
Users don’t get to post this type of content in our applications anymore. Note: There are two major types of categorizations: Explicit Nudity and Suggestive. Check out the docs for more information.
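If you want finer-grained handling than a plain approve/reject, the add-on’s response carries per-category labels. As a hedged sketch, assuming labels shaped like Rekognition’s moderation output ({ name, parent_name, confidence }; verify the exact field names against the docs), you could surface only the high-confidence categories:

```javascript
// Sketch: keep only moderation labels above a confidence threshold.
// The { name, parent_name, confidence } shape mirrors Rekognition's
// moderation labels; the sample data below is made up for illustration.
function highConfidenceLabels(labels, threshold) {
  return labels
    .filter(function (label) { return label.confidence >= threshold; })
    .map(function (label) { return label.name; });
}

const labels = [
  { name: 'Explicit Nudity', parent_name: '', confidence: 97.8 },
  { name: 'Suggestive', parent_name: '', confidence: 52.1 }
];

console.log(highConfidenceLabels(labels, 80)); // [ 'Explicit Nudity' ]
```

A filter like this would let you, for example, reject Explicit Nudity outright while merely flagging Suggestive content for manual review.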
It’s really that simple to automate the process of moderating image content in your apps. And it’s not limited to Node.js apps: PHP, Python, Java, Ruby and other SDKs are available as well.