
3 ways for Facebook users to handle offensive or abusive content

By Amina Elahi
March 18, 2015

Facebook is a platform for communication but not necessarily for free speech.

So said Monika Bickert, Facebook's global policy manager, speaking recently at the 1871 tech hub.

“When you’re dealing with this global community where 85 percent of the people are using Facebook from outside the U.S. and Canada, talking about the First Amendment isn’t particularly useful,” Bickert said.

Bickert took questions about how the company handles hate speech and online abuse in a panel moderated by 1871 CEO Howard Tullman. Also on the panel was Steve Freeman, director of legal affairs for the Anti-Defamation League, which hosted the event and fights bigotry with a focus on anti-Semitism.

Freeman said Facebook is not bound by free speech laws the way government entities are. The social media platform has the right to control the types of speech it allows, he said.

“In our community standards, we are focused on promoting expression and sharing,” Bickert said, “and that necessarily means that if something is dangerous or attacking a person, we want to take that seriously because it’s going to chill speech and sharing.”

Facebook policy prohibits harmful or hateful speech, including that which glorifies violence or threatens others, Bickert said. She said the company relies on community members to report abuse, which staffers review and deal with accordingly.

“We want to give people a variety of weapons,” Bickert said.

She outlined the different ways Facebook users can handle offensive or abusive content:
