Social networking site Facebook has recently updated its Community Standards, a user-facing set of guidelines that explicitly restricts the spread of violence, nudity, hate speech and criminal activity on its website, including removing terrorist accounts and preventing terrorism from spreading through social media.
Restricting the spread of terrorist content
According to British media reports, Facebook had already been limiting content from users identified as terrorist groups. Now, the site says it will broaden that scope and also remove expressions of support or praise for terrorist groups and their leaders.
The statement said Facebook would not allow organizations that "promote terrorism, organized crime or hatred" to have a presence on the site, and that images depicting torture or glorifying violence would be removed.
According to U.S. media, the updated community standards cover four areas: safety, respectful behavior, intellectual property and account security. The revision gives Facebook's clearest definitions yet of what counts as violent, nude, abusive or criminal content.
Facebook said in a statement that nude images shared "for revenge or without authorization" would be removed, but that nudity depicted in paintings, sculptures and other works of art is not prohibited.
In addition, hate speech targeting race, religion, gender, sexual orientation or physical disability will be banned.
Controversy over content decisions
With the popularity of social networking sites and the proliferation of inappropriate content, deciding how to classify violent, pornographic and even terrorist material has become a headache for sites with large user bases, such as Facebook and Twitter.
Mark Zuckerberg, Facebook's chief executive, said in a post on his personal page that the update did not change any policy or guideline but offered more guidance on how the existing rules are applied.
"People want to know what controversial content we're going to take away or keep, and what the reasons are." "said Zuckerberg.
"These specifications are designed to create a positive environment for users to respect and treat each other with empathy," he said. "Monika Bickert, the Facebook global chief policy Officer Monica Bicote, and deputy legal advisor Chris Sandby (Chris Sonderby) said in a joint post.
"It is not easy to use the same system to meet the needs of different social groups across the globe." On the one hand, people from different backgrounds have different views on what is suitable for sharing, and a network video is interesting to one person, but may be a hurt to another. "said the blog.
Increased interaction between governments and social networks
In addition to removing illegal content, Facebook said it would clean up accounts that break its rules and would work with law enforcement agencies when certain posts appear to pose a "direct threat to public safety".
In addition, the guidelines stressed that Facebook users must use their real names. Last October, gay and drag communities launched a protest against the "real name policy".
Facebook said government requests for account information were on the rise, climbing from 34,946 in the first half of 2014 to 35,051 in the second half.
According to U.S. media, interaction between governments and social media is increasing because social media companies recognize that they occupy a pivotal position in major international political affairs such as counterterrorism.
Facebook says its policies cannot resolve every piece of problematic content perfectly, but it will continue to handle such cases as best it can, while also relying on the community itself to flag inappropriate material. Users can alert the site by clicking the "Report" button on the web page or in the app.
Author: Direction Ming