
Why User-Generated Content Moderation is Critical for Small & Medium-Sized Platforms

Automated content moderation used to be an enterprise-only issue. But upcoming changes to the law mean that any content-sharing platform needs to start paying closer attention to user-generated content.

Unitary

Why is UGC moderation critical?

Content moderation can sound like an issue that only concerns the very largest social media platforms. Often, small and medium-sized enterprises (SMEs) only come to consider content moderation after something has gone wrong. But the landscape is changing – and businesses of all sizes have a duty to keep their online users safe.

The European Union’s Digital Services Act and the UK’s forthcoming Online Safety Bill both place a heavy burden on online service providers. All businesses will be expected to protect users against content that could be classified as ‘harmful’ or ‘disinformation’, which means that SMEs need to get to grips with moderating user-generated content (UGC) sooner rather than later.

What is the best way to begin moderating UGC?

For most SMEs, developing a moderation process can start small, scaling up as demand and requirements change. The typical evolution follows three distinct stages:

1. Build a moderation team

UGC submissions tend to grow in volume and frequency over time. Initially, the amount of content is usually low enough to be reviewed manually by an individual or a small team. Appointing a community manager to oversee these reviews is generally sufficient while submissions remain manageable.

2. Invest in moderation tools

As submissions escalate, businesses begin to invest in tools that reduce the workload on their moderation teams. Often these take the form of content filters that detect and block the most obvious examples of unsafe content, freeing the team to focus its resources on the more subtle rule-breaking that has been flagged for manual review. A minimal sketch of such a filter follows.
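To make the idea concrete, here is a rough sketch of a static, rule-based filter, assuming a text-only pipeline. The word lists are hypothetical; a production filter would use far larger, regularly updated rule sets tied to the platform's own policies.

```python
import re
from enum import Enum

class Verdict(Enum):
    ALLOW = "allow"    # publish immediately
    REVIEW = "review"  # route to the human moderation queue
    BLOCK = "block"    # reject outright

# Hypothetical rule lists for illustration only.
BLOCK_PATTERN = re.compile(r"\b(buy[- ]followers|free[- ]crypto)\b", re.IGNORECASE)
REVIEW_PATTERN = re.compile(r"\b(hate|attack|scam)\b", re.IGNORECASE)

def moderate(text: str) -> Verdict:
    """Apply static rules: block the obvious, flag the ambiguous."""
    if BLOCK_PATTERN.search(text):
        return Verdict.BLOCK
    if REVIEW_PATTERN.search(text):
        return Verdict.REVIEW
    return Verdict.ALLOW

if __name__ == "__main__":
    for post in ("Loved this video!",
                 "Click here for free-crypto giveaways",
                 "This looks like a scam to me"):
        print(f"{moderate(post).value:>6}: {post!r}")
```

Note that the third example is harmless but still gets flagged for review – exactly the kind of false positive that, at scale, starts to swamp a manual queue (see stage 3 below).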

3. Adopt artificial intelligence

Eventually, the queue of content grows to the point where it overwhelms the moderation team. There is also an increasing number of false positives, where static filters wrongly flag harmless content as unsafe.

At this point, manual moderation and traditional tools cannot keep pace. Instead, SMEs must upgrade to artificial intelligence systems like Unitary's, which can ingest, process, classify and moderate vast amounts of data automatically – with accuracy comparable to that of human moderators.

These next-generation AI tools are also far better at assessing UGC in context, grading submissions on multiple signals simultaneously – video, text and audio – and on the way those signals relate to each other, as the sketch below illustrates.
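To illustrate why cross-modal context matters, the toy snippet below (not Unitary's actual method – the per-modality classifiers and thresholds are stand-ins) combines harm scores from hypothetical video, text and audio models. The key design choice is to keep the strongest single signal rather than average: a benign caption should not dilute a violent clip's video score.

```python
from dataclasses import dataclass

@dataclass
class ModalityScores:
    """Harm probabilities in [0, 1] from per-modality classifiers.
    In practice these would come from vision, text and audio models
    run over the same submission; here they are supplied directly."""
    video: float
    text: float
    audio: float

def combined_risk(s: ModalityScores) -> float:
    """Keep the strongest single signal, boosted slightly when several
    modalities agree. Averaging would be wrong here: one strongly
    harmful modality should dominate benign signals elsewhere."""
    scores = (s.video, s.text, s.audio)
    strongest = max(scores)
    agreeing = sum(score > 0.5 for score in scores)
    return min(1.0, strongest + 0.1 * max(0, agreeing - 1))

def verdict(s: ModalityScores, block_at: float = 0.9, review_at: float = 0.5) -> str:
    risk = combined_risk(s)
    if risk >= block_at:
        return "block"
    if risk >= review_at:
        return "review"
    return "allow"

if __name__ == "__main__":
    # Dummy scores standing in for real model outputs.
    print(verdict(ModalityScores(video=0.95, text=0.10, audio=0.20)))  # block
    print(verdict(ModalityScores(video=0.60, text=0.60, audio=0.10)))  # review
    print(verdict(ModalityScores(video=0.20, text=0.30, audio=0.10)))  # allow
```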

Making smarter choices now

Because of their historical focus on enterprise customers, content moderation tools and techniques have been expensive and complex. Many businesses have been unable to justify the outlay until moderation became unmanageable.

In a more positive development, AI-powered UGC moderation tools are now far more affordable, lowering the barrier to entry. Indeed, most SMEs can probably skip the second stage of the strategy outlined above, moving directly from manual moderation to AI-enabled tools.

For users, this is a clear win: service providers can block harmful content accurately from the outset. For businesses, AI automation is fast, accurate, scalable and far more cost-effective than relying on an overloaded community moderation team equipped with outdated technology. Making the switch to AI tools early will help SMEs reduce their ongoing moderation costs and ensure they can comply with updated regulations for protecting users against harmful content.

Read more about Unitary's approach to supporting safe online spaces.