Image Moderation

Our workforce moderates images submitted by your users in real time, protecting your site’s reputation while allowing your community the freedom to post and share. We offer extremely fast turnaround, accuracy above 99%, and lower cost than traditional outsourced image moderation.

  • High-Efficiency Moderation UI – Our image moderation user interface lets moderators review thousands of photos per hour and provides them with efficient tools to identify rejection reasons, keeping your costs low.
  • Smart Quality Control – Our proprietary platform tracks all moderators and continually optimizes the workforce for quality. Periodically, we test our moderators by submitting images for moderation that violate your policies. We remove workers who do not correctly reject these “known violators.”
  • Image Value Enhancement – Our workforce can add metadata to your images to help you better leverage the assets your community creates. You can choose to enhance the moderation process with image categorization and tagging.

Protect Your Brand’s Reputation

  • Ensure compliance with company policies and legal standards
  • Monitor for child-related issues, including COPPA compliance and cyberbullying
  • Protect community users and your reputation 24/7
  • Provide a safe environment for your customers and users

Benefits of Managed Moderation

Custom Policies

We can apply image moderation using both standard moderation guidelines and custom policies. If you need images moderated for policies that extend beyond basic guidelines, we'll work with you to implement custom rules and rejection reasons.

API Integration

Our API provides a seamless way to submit images and receive moderation results. This hands-off integration keeps turnaround times extremely fast.


Reporting

We provide you with monthly reports that outline the number of images moderated and the action taken on each image.


Standard Moderation Guidelines

Our highly qualified moderators provide consistent, objective compliance judgments in the following categories:

  • Violence and threats
  • Illegal drug use
  • Obscenity
  • Nudity and pornography
  • Graphic content
  • Self-harm
  • Child-related issues (COPPA compliance and cyberbullying)
  • Identity and privacy
  • Hate speech
  • Phishing and spam