
Online content creators make money from hate and misinformation, MPs say

Creators of hateful content and misinformation earn millions from social media, the head of an international non-profit group told MPs studying ideologically motivated violent extremism on Thursday.

Imran Ahmed is chief executive of the Center for Countering Digital Hate, which has been tracking online hate for six years. He told members of the House of Commons public safety and national security committee that a profitable online economy has emerged around hate and misinformation.

“There are commercial hate and misinformation actors who make a lot of money by sowing discord and peddling lies,” Ahmed said.

“There is a network of commercial actors, from platforms to payment processors to people who provide the advertising technology embedded in hateful content, giving the authors of this hateful content money for every eyeball they can lure in, who benefit from hate and misinformation.

“It generates revenues of millions, tens of millions, hundreds of millions of dollars. It has made some entrepreneurs extremely wealthy.”

Online platforms and search engines “benefit commercially from this system,” Ahmed said.

“Fringe actors, from anti-vaxxers to misogynistic incels to racists such as white supremacists and jihadists, are able to easily exploit digital platforms that promote their content,” he said.

Ahmed said that while a small number of highly motivated and talented disinformation spreaders are capable of doing a lot of damage, social media companies do little to stop them or enforce their own platform rules.

“Super-spreaders of hate”

“What we’ve seen is piecemeal enforcement, even when there are identifiable super-spreaders of hate who, of course, aren’t just super-spreaders of hate, they’re super-violators of their own community standards,” he said. “And it just goes to show that they’re more dependent on the profits that come with attention than on doing the right thing.”

Ahmed said his group did a study on Instagram and documented how its algorithms pushed people deeper into conspiracy theories.

Imran Ahmed, founder of the Center for Countering Digital Hate, says creators of online hate and disinformation make money from it, as do social media companies. (Jason Burles/CBC)

“It showed that if you follow ‘wellness,’ the algorithm feeds you anti-vaxx content,” he said. “If you follow anti-vaxx content, it feeds you antisemitic content and QAnon content. It knows some people are vulnerable to misinformation and conspiracy theories.”

Ahmed recommended several remedies, including design changes to online platforms, more transparency about the algorithms social media companies use, and measures to hold companies and their executives accountable.

He also defended the practice of social media companies kicking those who promote hate or misinformation off their platforms.

“Anti-Semites, anti-vaxxers and general lunatics”

“De-platforming these people and putting them in their own little hole, a little hole of anti-Semites, anti-vaxxers and general lunatics, that’s a good thing, because you’re actually limiting their ability to infect other people, but also limiting trends such as the convergence and hybridization of ideologies,” he said.

But other witnesses warned that if extremists are kicked off major social media platforms, they will simply move to other platforms where there is less moderation.

Garth Davies, associate director of the Institute on Violence, Terrorism and Security at Simon Fraser University, said de-platforming is fueling support for far-right groups.

“If we look at this from the perspective of the far right, all of these attempts basically feed into their narrative,” Davies said, adding that the issue calls for more tolerance.

“We basically provide them with the fuel they need,” he said. “Every attempt to de-platform or identify content that needs to be shut down actually allows them to say, ‘See, look, they’re scared of us. They do not want these ideas to be disseminated.’”

Government lacks tools, says expert

Davies said far-right supporters view groups like Black Lives Matter as extremist and have called for such groups to be removed as well.

Davies said the government was not doing enough to monitor extremism in Canada, hadn’t devoted enough resources to it, and lacked tools like a central database to track extremists.

Appearing before the committee, Tony McAleer, a former extremist and co-founder of the group Life After Hate, called for a nuanced approach and more training for people like school counsellors who can help prevent young people from gravitating toward extremist groups.

Marvin Rotrand, national director of B’nai Brith Canada’s League for Human Rights, says reports of online hate incidents have skyrocketed during the pandemic. (Radio-Canada)

Marvin Rotrand, national director of B’nai Brith Canada’s League for Human Rights, said there has been less in-person harassment during the pandemic, but a spike in online hate.

“Online hate has exploded,” Rotrand told MPs. He said his organization tracked 2,799 online incidents in 2021.

Rotrand called on the Liberal government to keep its election promise to hold social media platforms accountable for the content they host and urged the government to update its anti-racism strategy to better define hate.