If your business relies on user-generated content, you're probably already using some type of content moderation. But are you sure you have the best moderation tool for your product and your bottom line?
Content moderation can have a hidden impact on your business. Let's look at how you can optimize your moderation process for better product metrics.
Content moderation isn't just for social media. Here are some typical use cases where moderation plays a role in product success.
Online streaming platforms benefit from increased user engagement when new features let viewers share opinions and review the shows they've watched. But sometimes anonymity brings out the worst in people. You need fast, reliable moderation to prevent a couple of toxic users from ruining the experience for everyone.
One of the biggest problems of the gaming industry is toxic content in multiplayer environments. Socializing is an important part of the game for most players, but voice chat is especially hard to monitor for toxic language and insults. The negative behavior of a few players can discourage others from playing. Protect customers on your gaming platform by moderating player communication as it happens.
Even a single toxic encounter can make a user leave a dating platform. Detecting and preventing bad behavior by removing offenders can make the community healthier, ultimately improving metrics with better overall user retention.
Sometimes you need to watch for problems more subtle than toxic content. For instance, if you allow comments on a company blog or specialized forum, you don't want people praising your competitors. Regular moderation services don't cover cases like this, so you need a custom model to detect specific content. You can run the custom model on your website for pre-moderation, meaning comments won't be posted without “approval”, or post-moderation, meaning irrelevant comments will be deleted soon after posting.
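The difference between the two modes comes down to where the model sits in the publishing flow. Here is a minimal sketch in Python, using a hypothetical `classify()` stub in place of a real trained model (the function name and toy keyword rule are illustrative assumptions, not an actual moderation API):

```python
def classify(comment: str) -> str:
    """Stand-in for a custom ML model. A real classifier would return
    a predicted label; here a toy keyword rule plays that role."""
    if "competitor" in comment.lower():
        return "competitor_praise"
    return "ok"

def pre_moderate(comment: str) -> bool:
    """Pre-moderation: a comment is published only if the model approves it."""
    return classify(comment) == "ok"

def post_moderate(published: list[str]) -> list[str]:
    """Post-moderation: comments go live immediately and flagged ones
    are removed soon after posting."""
    return [c for c in published if classify(c) == "ok"]
```

With pre-moderation nothing objectionable is ever shown, at the cost of a delay before publishing; post-moderation publishes instantly but tolerates a short window where flagged content is visible.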
Modern e-commerce websites are mostly user-generated, from product pages to reviews and comments. All text, images, and videos need to pass moderation across multiple categories to prevent liability for anything offensive, inappropriate, or illegal on the site.
There are three ways to protect your platform from undesirable content: hire a team of moderators, use automated moderation tools or services, or combine automated tools with manual moderation on edge cases.
Hiring an in-house moderation team isn't efficient for several reasons: it's expensive, it doesn't scale without additional hires, and it exposes moderators to harmful content.
Creating your own ML-based solution is even more costly, especially if you need to monitor both texts and images, or even videos and voice chats. Text moderation is straightforward because models are pre-trained to detect certain words and expressions. Images, voice, and videos are much more challenging, and only state-of-the-art models can recognize patterns in visual content.
This challenge deters many product teams from introducing collaboration features, while their competitors race ahead.
Toloka Moderation automates the process with an AI model tuned to your needs. Instead of employing a team of moderators and managers, our clients usually keep just 2 or 3 moderators on board to check the model's performance and run quality control. The overall cost of automation plus quality control is much lower than the average spending on manual moderation on a large scale.
An additional advantage is the opportunity for multi-language moderation. If your business is a startup operating in 15 languages, hiring a team of moderation experts for every language could be cost-prohibitive, especially for grammatically complex languages like Arabic that tend to cost more. With Toloka you get an automated solution with the skills of a multilingual team for the price of a single moderator.
In terms of ethics, automated solutions also do a better job of protecting moderators from harmful content. Global content-sharing platforms usually limit moderator contracts to a span of a few months, precisely to shield workers from the effects of repeated exposure to violent or toxic content. An AI-based solution prevents that harm at the source.
| | Manual moderation | AI solution |
|---|---|---|
| Number of human moderators | Minimum 10-15, depending on the number of languages | 2-3 for quality control |
| Exposure of moderators to violent/illegal content | Constant | Limited |
| Grammatically complicated languages | Need to hire additional moderators | Included |
| Costs | Depends on number of moderators | $1199/month for startups |
| Scalability | Not scalable without additional hires | Adapts to any volume |
Some content categories require special approaches. For instance, dating apps have prevalent problems with illegal activity like blackmail or secret money transfers. These cases aren't detected by standard moderation services. Toloka Moderation can handle any custom classes with minimal training time for fast deployment.
Each of the business cases above illustrates how companies introduce user-generated content in their products and potentially grow the retention rate. Why? Because collaboration between users always creates additional value.
All the important digital metrics are about customer engagement. Retention is a key metric — successful products always keep customers on the platform long-term. But frequency of use is also important.
Retention and frequency both contribute to lifetime value, which reflects how much money a user brings to the platform during the retention period. For instance, let's say we spend $10 on customer acquisition for a user who generates $7 of monthly revenue for 3 months, giving us $11 of profit (we can ignore operational costs in this equation because they are almost constant). If we can make positive changes to moderation that increase average retention to 4 months, we earn $18 profit per user, which is a 64% growth in lifetime value.
Multiplied by thousands of users, small improvements can make a huge difference!
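The arithmetic above can be written out as a short sketch (the figures are the same illustrative ones from the example, not real benchmarks):

```python
def profit_per_user(cac: float, monthly_revenue: float, retention_months: int) -> float:
    """Profit per user: revenue over the retention period minus acquisition cost."""
    return monthly_revenue * retention_months - cac

# $10 acquisition cost, $7/month revenue, 3 months of retention -> $11 profit
baseline = profit_per_user(cac=10, monthly_revenue=7, retention_months=3)

# Better moderation lifts retention to 4 months -> $18 profit
improved = profit_per_user(cac=10, monthly_revenue=7, retention_months=4)

growth = (improved - baseline) / baseline
print(f"Baseline: ${baseline}, improved: ${improved}, growth: {growth:.0%}")
```

Operational costs are left out of the function, matching the simplification in the text that they stay roughly constant per user.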
Grow your business and achieve the metrics it deserves. Get automated moderation with human insight from Toloka.