Man calling for ‘mosques to be burnt’ and for “British murderers and serial killers” to “concentrate on Muslim community” convicted
Categories: Latest News
Wednesday July 04 2018
Andrew John Emery, 45, was convicted at Stoke-on-Trent Crown Court of inciting religious hatred against Muslims after posting a series of chilling Islamophobic messages on Facebook.
The defendant advocated violence against, and the murder of, members of the British Muslim community, claiming that the public had to “fight back”. Mr Emery posted the vile messages on 4 June 2017, the date of the ‘One Love Manchester’ tribute concert organised by Ariana Grande in the wake of the Manchester Arena attack.
One post read: “It is time we started to fight back. The Government won’t do **** because of the PC brigade. Every time we have a terrorist attack we should burn a mosque, preferably when it is full”.
In another post, Mr Emery wrote: “To all the British murderers and serial killers out there, do us all a favour and concentrate on the Muslim community”.
He also wrote: “Burn a mosque today and feel better”.
Prosecutor Harpreet Sandhu noted that the posts were not restricted to Mr Emery’s Facebook friends but were visible to the wider public.
Brian Williams, mitigating, acknowledged that the defendant’s comments were “abhorrent” but maintained that Mr Emery was not a racist. Mr Williams stated: “At the time, he drank too much and his father had just been diagnosed as terminally ill…Without thinking rationally he allowed these appalling comments to pour out. He would not have gone in a pub or stood on a street corner and said such things…His fingers ran away with him. They were faster than his brain”.
However, it was quickly pointed out that Mr Emery had previously posted messages expressing similar hostility. One earlier post read: “Trump had the right idea trying to stop Muslims entering his country. Maybe we should do it so we would only have to worry about the scum already here”.
Andrew John Emery pleaded guilty to three charges of publishing or distributing written material intending to stir up religious hatred.
Recorder Butterfield QC highlighted that the offences were aggravated by the severity of the violence advocated in the messages, their wide public accessibility and the fact that they were posted at a particularly tense time.
He noted: “They were hot on the heels of the London Bridge/Borough Market incident on June 3, the day before the tribute concert”.
The role of social media platforms in acting as an echo chamber for hate rhetoric has been highlighted by various Parliamentary select committees including the Petitions Committee, which took evidence from Karim Palant, Facebook’s UK public policy chief, on the 19th of June 2018.
Ms Helen Jones MP, chair of the Petitions Committee, speaking to Mr Palant about Facebook’s hesitance in engaging with Parliamentary committees, noted: “You [Facebook] have given the impression that your company does not feel it has to be scrutinised and, frankly, that it has something to hide, and in doing so you have done them no service at all”.
Mr Palant, giving evidence, defended Facebook’s approach to tackling hate speech on its platform and said: “We have long had very clear policies around abuse and hate speech on our platform and the abuse of individuals using language that may degrade, may dehumanise and may abuse individuals and attack them for what we describe as protected categories”.
Earlier this year, the platform stated that between January and March 2018 it had taken enforcement action on a vast scale, permanently removing more than 583 million accounts, 837 million pieces of spam and 28.8 million pieces of malicious content.
Whilst it is commendable that social media platforms are tackling content flagged as hate speech, stronger action is required to develop the tools necessary to expedite this process, and changes have yet to be introduced that prevent the platforms from acting as echo chambers.
Currently, Facebook displays stories and posts in a person’s newsfeed that reflect that person’s ideology and activity on the platform, reinforcing particular ideas. This is of significant concern when a person inadvertently subscribes to fringe narratives, resulting in their newsfeed being populated with hate rhetoric. Only by introducing mechanisms that prevent this from happening, and that expose Facebook members to a broad pool of ideas rather than accommodating political bubbles, can the platform stop amplifying hateful content.