In the video – pulled quickly by Facebook – the Destiny Church leader made offensive remarks about Sharia law
A video posted by controversial Destiny Church leader Brian Tamaki was investigated by police and pulled by Facebook in the aftermath of the LynnMall terror attack.
It’s not the first time Tamaki’s social media has come under scrutiny.
Following the Christchurch mosque terror attacks, Facebook took down a post in which Tamaki, a fundamentalist Christian, described Islam as a "fast creeping social invasion".
The latest incident took place earlier this month, when Tamaki posted a video to his Facebook page referring to a mosque in West Auckland and making offensive remarks about Sharia law.
Ahamed Aathill Mohamed Samsudeen, a Tamil Muslim, injured seven people at the Countdown supermarket in LynnMall, attacking his victims with a knife taken from the store.
He was shot and killed within minutes by officers from the police Special Tactics Group, who had been keeping him under 24-hour surveillance in Auckland.
Police received a complaint about Tamaki’s video, which was posted on Sunday, September 5 – two days after the terror attack.
An investigation found the video did not meet the threshold for a criminal offence.
A police spokesperson told Newsroom the contents of the video were reviewed and "while many of the comments made were factually wrong and very hurtful to many, there was no criminal offence identified".
Police had been in the process of having the video taken down, but Facebook stepped in and proactively removed it first.
It’s understood the video was removed on the same day it was posted.
Facebook's transparency policies define hate speech as "a direct attack against people – rather than concepts or institutions – on the basis of what we call protected characteristics".
Those characteristics are race, ethnicity, national origin, disability, religious affiliation, caste, sexual orientation, sex, gender identity and serious disease.
"We define attacks as violent or dehumanising speech, harmful stereotypes, statements of inferiority, expressions of contempt, disgust or dismissal, cursing and calls for exclusion or segregation," according to Facebook.
Between April and June this year (the most recent statistics available), 97.6 percent of the content breaching hate speech rules was found and flagged by Facebook itself, while the remaining 2.4 percent was reported by users.
In the days after the LynnMall attack, Police Commissioner Andrew Coster put out a public statement urging people to "exercise some caution when receiving unverified information" about the attack through social media platforms.
"Police have been made aware of some false information being shared on social platforms. I would urge anyone who comes across this type of information to be aware that much of what is circulating on social media platforms is either false or inaccurate," Coster said.
The minister responsible for the Security Intelligence Service, Andrew Little, told Newsroom on Friday that agencies, including police, were always looking out for threats.
"There are a lot of people who will say things, express conspiracy theories and what have you, which are difficult to intervene in on a lawful basis," he said.
Deputy Prime Minister Grant Robertson told Newsroom a number of new protocols and tools were now available to, and used by, police and social media companies as a result of the Christchurch mosque attacks.
Tamaki has not responded to Newsroom’s request for comment.