Facebook officially has a problem with QAnon.
An internal investigation by the social networking giant found that far-right conspiracy theory Facebook groups related to QAnon are racking up millions of members on the platform.
According to documents provided to NBC News by a Facebook employee, the company’s investigation found thousands of QAnon pages and groups on the site. Combined, these groups and pages have over 3 million members and followers. Facebook’s ten most popular QAnon groups alone account for more than 1 million of those members.
Facebook’s investigation also revealed 185 ads that “praise, support or represent” the QAnon conspiracy running on the platform, NBC reports. The company earned about $12,000 from the advertisements. In the last 30 days, those ads garnered 4 million impressions on the platform.
The internal findings will be used to guide any QAnon-related policy decisions Facebook may be working on, according to two company employees who spoke anonymously to NBC News.
Facebook may decide to treat QAnon the same way it treats other extremist content. The company outright banned white supremacist and white nationalist content from its platform in early 2019.
In the same year it also issued rules specifically targeting anti-vaccination conspiracy pages. Facebook excludes anti-vaxxer accounts from its search results and recommendation engine, making the content harder to find. It also disallows advertisements that promote anti-vaccination messages.
Last week, Facebook banned one of its largest QAnon groups, “Official Q/QAnon,” for repeatedly breaking its rules on misinformation, harassment, and hate speech. At the time of its removal, the group had approximately 200,000 members.
Facebook took its first major action against QAnon content in May, when it removed a network of groups, pages, and accounts devoted to the conspiracy theory from its platform. However, that removal was due to the pages violating Facebook’s policies on coordinated inauthentic behavior: the network was creating fake accounts to promote its content.
While Facebook figures out what to do about QAnon, competing social media platforms have already taken action. Twitter announced in late July that it would block QAnon from appearing in its Trends and Recommendations sections and remove links related to the conspiracy theory. TikTok followed shortly after, blocking QAnon-related terms from its Search feature.
QAnon has been particularly popular with the baby boomer generation, and Facebook, with its older user demographics, has provided the perfect place for the conspiracy theory to expand and grow to where it is today.