
Social Apps: User-generated Content, Misinformation, and the Need for Moderation

BLOG by CitrusBits
June 18, 2021

From July 1 through December 31 of last year (2020), TikTok removed 89,132,938 videos for policy violations. (source: Tubefilter)

Over the last decade, the rise of social media apps like TikTok and their easy accessibility have given everyday users the power to ‘create’ content. User-generated content is the feature that makes these apps hard to put down. And with more and more users flocking to such platforms each year – 3.78 billion in 2021 (source: Oberlo) – a plethora of content is created and shared every day. The same ease of creation, however, also means that harmful content gets created and spread.

History is filled with instances of misinformation and policy violations, and with their consequences. Recent cases, including the coronavirus ‘infodemic’ (the spread of false information), Trump’s suspension, and Parler’s removal from app stores (over ‘free speech’), demonstrate why content moderation is crucial to mainstream social media’s existence.

So, what is content moderation, and does it really take away free speech?

In simple words, content moderation is the process of weeding out objectionable, harmful, offensive, and inappropriate content from social media platforms: content that is unhealthy or harmful to people. One example is the hate speech and misinformation that circulated on social media platforms during the ‘Black Lives Matter’ movement.

Speech that is ‘too free’ on social media, without moderation, more often than not results in spam and paves the way for profanity, misinformation, extremism, and all kinds of toxic rhetoric.

Using various moderation techniques, content moderators prevent users from posting content that violates a platform’s terms, conditions, and policies or that harms a particular community (as was the case with Trump’s suspension).

The burning question: Does content moderation take away ‘free speech’?

Some people argue that content moderation violates ‘free speech,’ and platforms like Facebook and Twitter are often at the center of accusations of biased censorship. In many cases, though, the issue is simply a failure to comply with the guidelines users agreed to in the terms and conditions when they signed up to use these private social media platforms.

Besides, the First Amendment is meant to keep the government, not private companies, from restricting free speech. So regardless of your take on the First Amendment, content moderation remains critical for social media apps, above all to maintain a safe environment for all communities.

How some social platforms are moderating content

Social media operators use a few different approaches to moderate the content posted on their sites, allowing certain posts and not others.

Major platforms like Facebook, TikTok, Twitter, YouTube, and Twitch use automated content moderation as well as manual content moderation, outsourcing most of the grueling work to thousands of workers at third-party companies.

Exceptions are sometimes made. For instance, during the coronavirus pandemic last year, Facebook relied mostly on its automated systems and said it was improving its artificial intelligence technology.

On Twitch, streamers can use filters so that when a user types something inappropriate, the backend identifies and censors the indecent remark. (Although users can be quite inventive about how they type.)
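To illustrate the idea, here is a minimal sketch of how such a chat filter might work. The blocklist and the normalization step (which tries to catch ‘inventive’ spellings) are assumptions for illustration, not Twitch’s actual implementation:

```python
import re

# Hypothetical blocklist; a real deployment would use the streamer's configured terms.
BLOCKED_TERMS = {"badword", "slur"}

# Map common character substitutions back to letters to catch "inventive" spellings.
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "7": "t", "@": "a", "$": "s"})

def censor(message: str) -> str:
    """Replace blocklisted words in a chat message with asterisks."""
    def check(word: str) -> str:
        # Lowercase, undo substitutions, and collapse repeated letters before matching.
        normalized = re.sub(r"(.)\1+", r"\1", word.lower().translate(LEET_MAP))
        return "*" * len(word) if normalized in BLOCKED_TERMS else word
    return " ".join(check(word) for word in message.split())

print(censor("that was a b4dword stream"))   # -> "that was a ******* stream"
```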

With live audio or video, however, such filtering is not a viable option. As a result, moderation in live communities centers on what to do after an incident of abuse or harassment occurs. The ability to mute, kick, block, and blacklist users who break the rules gives chat-room moderators a set of ad hoc options.

Let’s discuss the different content moderation approaches.

Content moderation techniques for social media

How content is moderated depends entirely on the platform and the types of content it hosts. In voice-based communities such as Clubhouse, as soon as something is said, it’s gone. In text-based chats, by contrast, moderation rests on the premise that there is something persistent to remove.

Let’s have a look at a few popular social media content moderation techniques listed below.

Automated moderation

AI-based content moderation services are very popular with social media platforms these days. These services can train AI, based on your custom rules engine, to help autonomously identify content that might violate your app’s guidelines. This approach is often used in combination with manual moderation to improve reliability and overall response time.
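As a sketch of how a custom rules engine might sit on top of such a service, the example below maps per-category scores from a hypothetical classify() call to an app’s own guideline thresholds. The category names and numbers are illustrative assumptions, not any particular vendor’s API:

```python
# Hypothetical per-category thresholds expressing an app's own guidelines.
RULES = {
    "hate": 0.5,       # stricter: flag even moderately confident detections
    "violence": 0.7,
    "nudity": 0.8,
}

def classify(text: str) -> dict[str, float]:
    """Stand-in for an AI moderation service that scores content per category."""
    raise NotImplementedError

def violated_categories(text: str) -> list[str]:
    """Return the guideline categories whose scores exceed the app's thresholds."""
    scores = classify(text)
    return [category for category, limit in RULES.items() if scores.get(category, 0.0) >= limit]
```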

Social apps like Facebook and dating apps such as Tinder, where the risk of unwanted or harmful content is ever-present, are using user data to train their AI and improve automated moderation.

Despite significant improvements, AI-based automation still has a long way to go. At CitrusBits, we have worked with third-party AI-based content moderation tools like Hive and live streaming solutions like Agora to build robust and scalable social audio and video apps. Contact us if you’d like to discuss your social audio or video app venture.

Human-based approach

Voice is trickier to moderate than text or visuals, and real-time audio and video streaming can be trickier still. Social audio streaming apps like Clubhouse are generally complex in nature and therefore have to rely on human-based moderation. For instance, Erupt, the video news and debate app CitrusBits created for a team headed by Emmy-winning film producer Edward Walson and former ABC News/GMA producer Bryan Keinz, required a human-based moderation approach. This approach suits these types of social apps well because of their live-streaming nature.

This approach is further segmented into pre-moderation and post-moderation. With pre-moderation, all user-submitted content is reviewed before it goes live on a site or app. A moderator using the content management system (CMS) evaluates each piece of content and decides whether to publish, reject, or amend it based on the site’s criteria.

This also means that pre-moderation does not allow real-time posting, a major reason why some companies are reluctant to use this approach.

The post-moderation technique, on the other hand, lets content go live on your app or website right away while duplicating it in a queue so that a moderator can examine it after publication. In response to consumers’ demand for prompt posting, dating apps, some social sites, and other platforms frequently use post-moderation.

This approach comes with significant risks. For example, dating sites receive tens of millions of photographs every day. Companies take a major risk by allowing these photographs to go live before vetting them. Even if offending content is later removed, it is sometimes too late: users may already have screenshotted and shared it.
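As a rough illustration of the difference between the two flows, here is a minimal sketch using a simple in-memory queue. The Post class, the submit functions, and moderator_review are hypothetical names, and a real system would use a database or message broker rather than in-memory lists:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    status: str = "pending"            # pending -> published / rejected

# In-memory stand-ins for the moderation queue and the public feed.
review_queue: deque = deque()
live_feed: list = []

def submit_pre_moderated(post: Post) -> None:
    """Pre-moderation: nothing goes live until a moderator approves it."""
    review_queue.append(post)          # held here, not yet visible to users

def submit_post_moderated(post: Post) -> None:
    """Post-moderation: publish immediately, but queue a copy for later review."""
    post.status = "published"
    live_feed.append(post)             # visible to users right away
    review_queue.append(post)          # a moderator reviews it after the fact

def moderator_review(approve: bool) -> None:
    """A moderator works the queue; rejected posts are pulled from the feed."""
    post = review_queue.popleft()
    if approve:
        post.status = "published"
        if post not in live_feed:
            live_feed.append(post)
    else:
        post.status = "rejected"
        if post in live_feed:
            live_feed.remove(post)     # may already be too late if users screenshotted it
```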

The hybrid approach

This is the approach that most mainstream social media platforms like Facebook, Twitter, and TikTok use at the moment. AI is employed to detect and reject blatantly objectionable content (such as hate symbols or obscene gestures) before it can go live, while content with a low probability of being unsuitable is published right away. Content the AI is unsure about is held back for a human team to examine a few minutes later, allowing a more subtle and thorough check.
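A minimal sketch of this kind of hybrid routing is shown below. The model_score() function and the two thresholds are assumptions standing in for whatever AI model and tuned cut-offs a platform actually uses:

```python
# Assumed cut-offs; real platforms tune these per policy and per content type.
REJECT_THRESHOLD = 0.9
PUBLISH_THRESHOLD = 0.2

human_review_queue: list = []

def model_score(text: str) -> float:
    """Stand-in for an AI moderation model (a trained classifier or a vendor API)."""
    raise NotImplementedError

def route(text: str) -> str:
    """Route content three ways: auto-reject, auto-publish, or hold for human review."""
    score = model_score(text)
    if score >= REJECT_THRESHOLD:
        return "rejected"               # blatantly objectionable content never goes live
    if score <= PUBLISH_THRESHOLD:
        return "published"              # clearly safe content goes live right away
    human_review_queue.append(text)     # uncertain content waits for a human check
    return "held_for_review"
```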

Fancy Building the Next Big Social App?

Apps like TikTok and Clubhouse have sent the demand for social media apps soaring. However, building a successful social video app like TikTok calls for an expert social app development partner with plenty of experience, excellent reviews, and all the resources and tools required for development.

At CitrusBits, we’ve built real-time audio and video streaming social apps like Erupt for renowned businesses and names such as Emmy-winning film producer Edward Walson and former ABC News/GMA producer Bryan Keinz. Our extensive experience, combined with our arsenal of popular third-party tools like Hive and Agora, places us among the top mobile app developers in the US.

So, if you’re looking to create a social audio or video app for your business, or upgrade an existing one, we invite you to reach out to CitrusBits for a free consultation.

About the Author

CitrusBits

Content Writer

