
Online content creators are making money from hate, misinformation, MPs told


B'nai Brith says there has been an explosion of hate online

Anti-mask protesters, one holding a QAnon sign, meet in Calgary on Sept. 27, 2020. (Helen Pike/CBC)

Creators of hateful content and misinformation are making millions of dollars through social media, the head of an international non-profit group told MPs studying ideologically motivated violent extremism Thursday.

Imran Ahmed is chief executive officer of the Center for Countering Digital Hate, which has been tracking online hate for the past six years. He told members of the House of Commons public safety and national security committee that a profitable online economy has emerged around hate and misinformation.

"There are commercial hate and disinformation actors who are making a lot of money from spreading discord and peddling lies," Ahmed said.

"There is a web of commercial actors, from platforms to payment processors to people who provide advertising technology that is embedded on hateful content, giving the authors of that hateful content money for every eyeball they can attract to it, that benefit from hate and misinformation.

"It's got revenues in the millions, the high millions, tens of millions, hundreds of millions of dollars. That has made some entrepreneurs in this space extremely wealthy."

Online platforms and search engines "benefit commercially from this system," Ahmed said.

"Fringe actors, from anti-vaxxers to misogynist Incels to racists such as white supremacists and jihadists, are able to easily exploit the digital platforms who promote their content," he said.

Ahmed said that while a small number of highly motivated, talented spreaders of misinformation are able to do a lot of damage, social media companies are doing little to stop them or to enforce their own platform rules.

'Super-spreaders of harm'

"What we have seen is piecemeal enforcement, even when there are identifiable super-spreaders of harm who, of course, are not just super-spreaders of harm, they are super-violators of their own community standards," he said."And it just goes to show they're more addicted to the profits that come with attention than they are to doing the right thing."

Ahmed said his group did a study of Instagram and documented how its algorithms were driving people deeper into conspiracy theories.

Imran Ahmed, founder of the Center for Countering Digital Hate, says creators of online hate and misinformation are making money from it, as are social media companies. (CBC)

"It showed that if you follow 'wellness', the algorithm was feeding you anti-vaxx content," hesaid. "If you follow anti-vaxx content, it was feeding you antisemitic content and QAnon content. It knows that some people are vulnerable to misinformation and conspiracy theories," he said.

Ahmed recommended several changes, including changes to the design of online platforms, more transparency about the algorithms used by social media companies and measures to hold companies and their executives accountable.

He also defended social media companies that kick those promoting hate or misinformation off their platforms.

'Antisemites, anti-vaxxers and general lunatics'

"De-platforming these people and putting them into their own little hole, a little hole of antisemites, anti-vaxxers and general lunatics, is a good thing because actually you limit their ability to infect other people, but also for trends such and convergence and hybridization of ideologies," he said.

But some other witnesses warned that if extremists are kicked off large social media platforms, they will just move to other platforms where there is less moderation.

Garth Davies, associate director of the Institute on Violence, Terrorism and Security at Simon Fraser University, said de-platforming fuels support for far-right groups.

"If we look at it from the perspective of the extreme right, all of these attempts essentially feed their narrative," Davies said, addingthe problem calls for more tolerance.

"We are essentially providing them with the fuel that they need," he said. "Every attempt to try to de-platform or to identify content that needs to be shut down actually allows them to say, 'See, look, they're afraid of us. They don't want these ideas out there."

Government lacks tools, expert says

Davies said far-right supporters consider groups like Black Lives Matter to be extremist and have called for those groups to be de-platformed.

Davies said the government isn't doing enough to monitor extremism in Canada, hasn't devoted enough resources to it and lacks tools like a central database to track extremists.

Appearing before the committee, Tony McAleer, a former extremist and co-founder of the group Life After Hate, called for a nuanced approach and more training for people like school counsellors who can help keep young people from gravitating to extremist groups.

Marvin Rotrand, national director of B'nai Brith Canada's League for Human Rights, says reports of online hate incidents have exploded during the pandemic. (CBC)

Marvin Rotrand, national director of B'nai Brith Canada's League for Human Rights, said there has been less in-person harassment during the pandemic but a spike in online hate.

"Online hate has exploded," Rotrand told MPs, saying his organizationtracked 2,799 online incidents in 2021.

Rotrand called on the Liberal government to fulfil its election promise to hold social media platforms accountable for the content they host and urged the government to update its anti-racism strategy to better define hate.