Facebook has responded to an ongoing Change.org petition claiming that the leading social networking company is not doing enough to combat the serious threat of terrorist content on its platform.
In a statement posted in response to the petition, which was started by Julie Guilbault, Monika Bickert, the head of Facebook’s Global Product Policy, said that “there is no place for terrorists, terror propaganda or even praising terror attacks on Facebook.”
The petition argues that Facebook needs to do much better when it comes to fighting ISIS and other terror groups, faulting the company for waiting several hours after the Paris attacks to respond to jihadi accounts that were rallying support behind the terrorists who struck the French capital.
ISIS-related messages kept flowing through Facebook not only after the Paris attacks, a massacre that left up to 129 people dead, but also after subsequent attacks. Even though the social network eventually removed the posts, Guilbault maintains that it did not act quickly enough.
According to her, Facebook uses automated tools that can quickly detect pornographic material on an account and delete the offending account, but it takes far longer to deal with terrorist content and posted execution videos.
“When it comes to advocating terror activities and publishing of decapitation videos, no worries,” she says. “They enjoy a comfortable wait before they delete the content or the account.”
That delay is what frustrates the petition’s author and the nearly 140,000 people who have already signed it. According to Bickert, however, Facebook works to ensure that no terror groups are using the site, and she says the process of identifying possible terrorists, terror groups, and associated content relies largely on user reports.
Bickert also added that users sometimes share upsetting content with good intentions, for instance to promote awareness of an issue, and in such cases the company will not block the content being shared.