Sunday, May 22, 2022

Content Moderation: Kenyan sues Meta (Facebook)

A former Facebook content moderator has sued the company's owner, Meta Platforms, claiming that the terrible working conditions endured by contracted content moderators violate Kenyan law.

According to the petition, workers moderating Facebook posts in Kenya have been subjected to unfair working conditions, including irregular pay, inadequate mental health support, union-busting, and abuses of their privacy and dignity.

The lawsuit, which was filed on behalf of a group, seeks monetary compensation; an order that outsourced moderators receive the same health care and pay scale as Meta personnel; protection of unionization rights; and an independent human rights audit of the office.

But what is content moderation? Why is it so important? And what makes content moderators an irreplaceable part of keeping the internet safe? Keep reading to find out.

What is content moderation?

Content moderation is the process of monitoring and reviewing the content posted on social media channels. It's how social platforms make sure the content they show people is safe, appropriate, and in line with their user guidelines.

Content moderation aims to ensure that users are not exposed to things that could upset or harm them. For example, Facebook doesn't allow nudity or hate speech because it doesn't want that kind of content making people feel uncomfortable or unsafe while using its services.

Why content moderation is important

Content moderation is important for several reasons. First, it helps prevent people from seeing offensive content. Second, it helps keep content that is illegal in some countries off the platform. Third, it can help curb the spread of misinformation and scams on Facebook (or any other platform).

What it's like to be a content moderator

Being a content moderator is not all sunshine and rainbows. The job can be stressful, depressing, dangerous, and boring. Sometimes you'll see some pretty gross material, or something that leaves you feeling deeply uncomfortable. Your life might even be threatened by people who don't like what you're doing (they call it "virtual harassment" for a reason). And if you work in this field long enough, it starts to feel more like a grind than something fun and exciting.

However, there are many rewards too! You'll have the satisfaction of helping keep the internet safe for everyone else around the world. Every time your team stops a potential victim from being exposed to inappropriate content or cyberbullying, that's one less person traumatized, and that knowledge alone can help keep morale high among moderation teams worldwide!

The workload of content moderators

Because of the sheer volume of content on Facebook, human moderators can't review everything. Instead, they rely on a combination of artificial intelligence and machine-learning algorithms to identify posts that are likely to violate the company's policies. These systems automatically flag posts as they're uploaded or as they spread across the platform.

This is how it works:

When a user uploads a post, Facebook's systems automatically scan it for words or phrases commonly found in abusive content, such as hate speech, violence, or terrorism. The software then assigns a score based on how likely the post is to contain abusive content. Posts with higher scores are prioritized by Facebook's moderation team for manual review before they are shown to users who would otherwise see them (for example, people who follow the person who posted).
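To make that triage step concrete, here is a minimal sketch in Python of keyword-based scoring feeding a review queue. The keywords, weights, and threshold below are hypothetical, and production systems rely on trained classifiers rather than literal keyword matching; treat this purely as an illustration of how scoring and prioritization could fit together.

```python
# Minimal sketch of automated triage ahead of human review.
# The keyword list, weights, and threshold are hypothetical; real systems
# use trained classifiers, not literal keyword matching.
from dataclasses import dataclass, field
import heapq

ABUSE_KEYWORDS = {      # hypothetical per-category weights
    "hate speech": 0.6,
    "violence": 0.5,
    "terrorism": 0.8,
}
REVIEW_THRESHOLD = 0.5  # posts scoring at or above this go to human review

@dataclass(order=True)
class QueuedPost:
    priority: float                       # negative score, so higher scores pop first
    post_id: str = field(compare=False)
    text: str = field(compare=False)

def score_post(text: str) -> float:
    """Assign a crude abuse-likelihood score based on keyword hits."""
    lowered = text.lower()
    score = sum(w for phrase, w in ABUSE_KEYWORDS.items() if phrase in lowered)
    return min(score, 1.0)

def triage(posts: list[tuple[str, str]]) -> list[QueuedPost]:
    """Build a priority queue of posts that need manual review first."""
    queue: list[QueuedPost] = []
    for post_id, text in posts:
        score = score_post(text)
        if score >= REVIEW_THRESHOLD:
            heapq.heappush(queue, QueuedPost(-score, post_id, text))
    return queue

if __name__ == "__main__":
    pending = triage([
        ("p1", "Photos from our picnic"),
        ("p2", "This group promotes terrorism and violence"),
    ])
    while pending:
        item = heapq.heappop(pending)
        print(f"Review {item.post_id} first (score={-item.priority:.2f})")
```

Running the sketch would surface only the second post, since the first never crosses the review threshold.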

Risks of being a content moderator

Content moderators are routinely exposed to graphic violence, such as videos showing people being killed or injured. They also see sexual content, including nudity, sexual acts, and pornography, and may have to deal with distressing images or videos of self-harm and suicide attempts.

Moderators can develop PTSD from the things they see on Facebook, which can lead to depression and other mental health issues. If you want to reduce the chances of this happening (or if someone on your team has already developed PTSD), it's important to talk about how you're feeling rather than internalizing those feelings alone.

Facebook outsources content moderation

Facebook outsources much of its content moderation to third parties.

While the platform handles part of its moderation in-house, it relies heavily on outside firms to help monitor what's being posted. Facebook maintains a list of approved companies for this purpose, and the more than 2,000 moderators working for them review millions of posts every day. This helps ensure consistency in how content is moderated across its various platforms.

Where do content moderators work from?

Content moderators do Facebook's work, but they are not typically housed in Facebook offices. Instead, they are located around the world and employed by third-party contractors. Some work from home; others from coffee shops or other public spaces.

Automations in content moderation

Facebook's content moderators manually review flagged posts, but they also rely on automated systems. Those systems aren't perfect, and it takes a human to make sure they aren't catching too much or too little content.

The automated systems are designed to flag certain types of content, like nudity, violence, and hate speech. If you come across a post that seems questionable but not clearly in violation of those rules, such as political satire, you can flag it for review by a moderator.

Moderators don’t have access to all posts

You may have seen a post that's been removed by Facebook and wondered how it got flagged in the first place. Moderators don't see posts that Facebook's automated systems remove at upload, so those posts never reach the review queue. Instead, Facebook relies on users reporting content as offensive so moderators can review it and decide whether it violates the platform's standards.
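As a rough sketch of that routing, the snippet below shows a hypothetical report intake: posts that automation already removed at upload never reach a human, while user reports are queued for manual review. The function and field names are made up for illustration and are not Facebook's real API.

```python
# Hypothetical sketch of the user-reporting path; not Facebook's real API.
from collections import deque
from dataclasses import dataclass

@dataclass
class Report:
    post_id: str
    reporter_id: str
    reason: str                            # e.g. "harassment", "misinformation"

review_queue: deque[Report] = deque()      # reports awaiting a human moderator
auto_removed: set[str] = {"p99"}           # posts automation removed at upload

def submit_report(report: Report) -> str:
    """Route a user report, skipping posts automation already removed."""
    if report.post_id in auto_removed:
        return "already removed"           # never reaches a human moderator
    review_queue.append(report)
    return "queued for human review"

print(submit_report(Report("p42", "u1", "harassment")))  # queued for human review
print(submit_report(Report("p99", "u2", "violence")))    # already removed
```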

Content moderators' salaries

According to Glassdoor, content moderators earn between $15 and $20 per hour. They are not paid per post reviewed: they earn a standard hourly wage regardless of how many posts they get through in that time, so working overtime on a weekend or night shift won't change the rate a content moderator is paid.

Content is evaluated differently depending on local laws

Facebook has global standards for content moderation, but some content is evaluated differently depending on local laws. Firearm sales are one example: they are legal under certain conditions in the United States, while Canada and Australia have stricter gun control laws, so posts about selling firearms may be treated differently in each country. This layered approach lets the company keep a consistent global baseline while still complying with local laws.
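One simple way to picture this layering is a global baseline with per-country overrides. The sketch below is illustrative only; the countries, categories, and actions are invented for the example and do not reflect Meta's actual policies or any country's real regulations.

```python
# Illustrative jurisdiction-aware policy lookup; rules are invented examples,
# not Meta's actual policies or real law.
GLOBAL_RULES = {"hate_speech": "remove", "nudity": "remove"}

# Per-country overrides layered on top of the global baseline.
LOCAL_OVERRIDES = {
    "US": {"firearm_sale": "restricted"},   # allowed only under certain conditions
    "CA": {"firearm_sale": "remove"},
    "AU": {"firearm_sale": "remove"},
}

def decide(category: str, country: str) -> str:
    """Return the moderation action for a post category in a given country."""
    local = LOCAL_OVERRIDES.get(country, {})
    return local.get(category, GLOBAL_RULES.get(category, "allow"))

print(decide("firearm_sale", "US"))  # restricted
print(decide("firearm_sale", "CA"))  # remove
print(decide("hate_speech", "AU"))   # remove (global baseline applies)
```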

Content moderation remains an integral part of maintaining internet safety

The work of content moderators is crucial to the future of Facebook and all social media platforms.

It’s not just a matter of spotting illegal material like child pornography or violent threats; it’s also about deciding which posts are offensive, which ones are fake news, and so on.

This task is huge, and it keeps growing as more people join Facebook every minute. It's no wonder that many people think computers, rather than humans, should do this type of work. But while artificial intelligence (AI) has made great strides in recent years, it still lacks key abilities that we expect from human moderators, such as understanding context and making nuanced judgments about complex situations.

Conclusion

In conclusion, Facebook is an incredibly powerful platform that can be used to build community, engage with friends and family, and even drive sales. However, it's also a place where hate speech and misinformation thrive. To avoid having your content flagged as inappropriate by users or moderators, make sure you're familiar with the platform's rules and guidelines and keep them in mind when crafting posts and ads.



source https://www.jbklutse.com/content-moderation-kenyan-sues-metafacebook/
