CST Blog

Committee criticises social media companies for failure to tackle hate online

3 May 2017

In a new report, the Home Affairs Select Committee has called on the Government to consider whether it should be a criminal offence for social media companies to allow “illegal and dangerous” content on their platforms. The Committee launched the inquiry following the murder of Jo Cox MP by the far right extremist Thomas Mair. CST submitted substantial evidence to the Committee, focusing on antisemitism on social media. In its report, the Committee recommends that the Government introduce stronger penalties for failing to act on online hate, that social media platforms contribute towards the cost of policing their platforms, and that social media companies publish reports on their safeguarding activity.

The Committee outlined several key failings by the social media platforms, notably their failure to remove terrorist recruitment videos from both jihadi and neo-Nazi groups, as well as the continued antisemitic harassment of several MPs even after the companies had been made aware of it. The report condemned the companies’ lack of action on hate, stating that they are among “the biggest, richest and cleverest companies in the world, and…they continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe”.

CST’s written evidence to the Committee detailed the rise in recorded antisemitic incidents on social media, which are often only the tip of the iceberg, as many incidents go unreported. Specifically, CST noted that social media is “being used as a tool harnessed by perpetrators to specifically target victims with hate crime”. Harassment on social media, and on Twitter in particular, targets both high profile public figures and ordinary users. CST has previously worked closely with Luciana Berger MP, who has been the victim of several antisemitic campaigns on social media, some of them spearheaded by neo-Nazis in America. CST recommended to the Committee that more training be given to police and prosecutors so that they understand antisemitism online and its impact. CST also called on the Government to consult CST and other community groups with regard to hate crime.

The Chair of the Committee, Rt Hon Yvette Cooper, commented on the failure of social media companies to respond adequately to illegal content online:

“In this inquiry, it has been far too easy to find examples of illegal content from proscribed organisations – like National Action or jihadist groups – left online. And while we know that companies for the most part take action when Select Committees or newspapers raise issues, it should not take MPs and journalists to get involved for urgent changes to be made. They have been far too slow in dealing with complaints from their users – and it is blindingly obvious that they have a responsibility to proactively search their platforms for illegal content, particularly when it comes to terrorist organisations. Given their continued failure to sort this, we need a new system including fines and penalties if they don't swiftly remove illegal content. Social media companies need to start being transparent about what they do. The idea that they can't tell us what resources they put into public safety for commercial reasons is clearly ludicrous.”

CST’s Director of Communications, Mark Gardner, stated:

“We welcome the Home Affairs Select Committee’s report and robust recommendations on online hate. CST has worked hard to ensure social media companies take responsibility. Our work with Google, YouTube, Facebook and Twitter is increasingly constructive. We welcome their improving efforts, but it is a huge task”.

CST, a trusted flagger for YouTube, Facebook and Twitter, and a recognised expert for Twitter, continues to work closely with several social media platforms to ensure that they understand what constitutes antisemitism and that illegal content is flagged, either through the platforms’ regular reporting mechanisms or through various escalation processes. CST welcomes reports that social media companies are introducing initiatives to counter hate speech and remove antisemitic and illegal content online, many of which are encouraging. For example, Google recently announced that it will update its search and autocomplete algorithms in an attempt to tackle content that spreads hatred, such as antisemitism and Holocaust denial. Additionally, following work with CST and other community groups, Twitter announced a raft of changes to its community guidelines and security policies to ensure users feel safe online. These changes included additional education on antisemitism for Twitter support teams and the ability to report tweets as hateful content, including content that discriminates on religious grounds. CST also supports the Online Civil Courage Initiative, a new initiative developed by Facebook to counter hate speech online.

CST has also been working with the European Union High Level Group on combating racism, xenophobia and other forms of intolerance to continue holding social media companies to account and to ensure that they abide by the Code of Conduct on Countering Illegal Hate Speech Online, through various Europe-wide monitoring exercises.

If you see antisemitic posts on social media or online, please report them to CST through our online incident form, or through Facebook or Twitter.
