
Social media sites censor a range of content, new report says

In light of criticism of 'fake news,' Facebook CEO Mark Zuckerberg has said the company 'must be extremely cautious about becoming arbiters of truth ourselves.' But a new report says Facebook already censors and removes a wide variety of content, as do other social media platforms.

Policies around what can and can't be posted online are often unclear, according to the report's co-author

Tina Spenst, a B.C. mom, complained in 2013 after she said a photo she posted of herself breastfeeding her daughter was blocked from Facebook. Its policies now state such photos are permissible, but nudity remains one of the main reasons sites remove content, according to a new report. (Tina Spenst/Facebook)

Even as social media companies like Facebook struggle to filter out fake news, a new report says their rules on what can and can't be posted online are often unclear and lead to the controversial removal of some content.

Facebook CEO Mark Zuckerberg has announced steps to tackle the issue of misinformation, but has also said the company "must be extremely cautious about becoming arbiters of truth ourselves."

However, a new report says Facebook already censors and removes a wide variety of content, as do other social media platforms.

CBC Radio technology columnist Dan Misener has looked at what the report says about online censorship.

What does this new report say?

It's from the Electronic Frontier Foundation, a non-profit digital rights group based in California, and is part of a project called Onlinecensorship.org, which aims to better understand how social media companies decide what is and isn't allowed on their platforms.

Of course, all these sites have policies, but they're not always easy to understand, and they're not always consistently enforced. As we've seen with the ongoing conversation around fake news, these companies' approaches change over time.

So Onlinecensorship.org tries to better understand these policies by collecting reports from people whose content has been taken down, or censored. For instance, if a mother posts a photo of herself breastfeeding and that gets taken down, she can file a report with Onlinecensorship.org.

Onlinecensorship.org lets users file reports when their content has been censored by social media sites. (OnlineCensorship.org)
Right now, they have about a year's worth of user-generated data, so they've been able to identify some patterns in what gets taken down and what results in accounts being suspended.

What types of content are being censored?

Jillian York is the director for international freedom of expression with the Electronic Frontier Foundation. She said a lot of the content takedowns they've seen involved nudity or sexual content, citing a couple of famous examples from earlier this year.

"A photo of the Little Mermaid statue in Copenhagen was censored from Facebook. It's a mermaid, so not really human nudity," she noted.

"But then later in the year, a more serious example of that is when Nick Ut's Pulitzer Prize-winning photo 'The Terror of War' was also censored by Facebook a very famousimage of a young nude girl fleeing a napalm attack."

Nick Ut's famous 1972 photo of Kim Phuc, centre, fleeing after a napalm attack was removed by Facebook earlier this year because it showed nudity. Facebook later reversed its decision. (Nick Ut/Associated Press)
Facebook later reversed its decision with regard to the Nick Ut photo. But nudity was one of the biggest reasons content was censored, according to the EFF's research.

Of course, social media sites also remove content like hate speech. But the precise definitions of hate speech, acceptable nudity or "fake news" aren't always clearly spelled out.

And then there are truly bizarre examples, like the Facebook user in India who was banned because he posted a picture of a cat in a business suit. It's unclear exactly what about that image brought it to the attention of Facebook's moderators.

Who actually moderates the content?

We don't know much about the people who do content moderation work. We do know they work in offices all around the world, to deal with the global nature of social media. York said there's evidence some of these offices are in places where labour is relatively inexpensive, like the Philippines, Bangladesh and India. Many of them don't work directly with the social media companies themselves. Instead, the content moderators often work for third parties, so it's often outsourced labour.

And as an end-user of Facebook, Twitter or YouTube, it's almost impossible to know who's enforcing their content rules.

In a recent interview with TVO's The Agenda, Sarah Roberts, who is an assistant professor in the faculty of information studies at Western University, pointed out content moderation is a "hidden practice" and the workers are "faceless and invisible."

What should companies do to improve?

The Electronic Frontier Foundation's Jillian York says social media companies need to do a better job of clarifying their rules on what can and can't be posted. (EFF.org/Matthew Stender)
York says policies on issues like fake news, nudity and hate speech can be opaque.

It's not always clear what the rules are, or how they're enforced. So she wants to see more transparency from social media companies.

"We want users to understand why their content's being taken down, how they violated the rules not just that they violated the rules," she said.

"And then also maybea little bit more clarity from the companies in terms of what violations look like more broadly."

She's also pushing for better transparency reporting, which would see companies share statistics on things like how much content is removed under their own guidelines, how many accounts are suspended and why they're suspended.

Why do social media bans matter?

For me, this is fundamentally about freedom of speech: what you're allowed to say and what you're not allowed to say.

We have existing laws around this stuff. But through their content moderation policies, social media companies are writing their own rules around what can or can't be said.

It's worth remembering that, for a while, people thought of the internet as a kind of digital public square, with the emphasis on "public."

This report is a great reminder that when we spend time on social networks, we're not really in a public space. We're spending time on private servers, owned by private companies, which have the ability to make their own rules and regulations around speech.

We forget that at our peril.