Meta launched its Widely Viewed Content report last August as part of an effort to counter the pervading narrative that Facebook facilitates the amplification of divisive political content, with political misinformation, in particular, gaining significant traction in Facebook News Feeds.
Facebook says that’s simply not the case, and the report is intended to clarify this by sharing data on what’s actually gaining the most traction in the app – which, according to Facebook, is mostly jokes, memes and other harmless junk.
That could be problematic in itself, depending on your interpretation – and even so, the report has been riddled with controversies around disclosures, misinformation, removed content, spam, and more.
Basically, it’s not a great endorsement of the ‘value’ that Facebook provides, nor is it a validation of Facebook as an impartial political reflection.
So what does Facebook say gained the most traction in the app in Q2?
Here’s its overview of the most widely viewed links in the app:
Oh, that’s gotta burn Zuck and Co. to see TikTok heading this list.
Jokes aside, as you can see, five of the top 20 most viewed links in the app in Q2, reaching a cumulative 73.5 million viewers throughout the period, have since been removed by Meta for various policy violations relating to ‘spam tactics’.
That’s not great. Meta is self-reporting that its platform is responsible for junk content, designed to mislead and dupe users, reaching 70 million people in a three-month period.
But that’s not all – Meta also notes that its platform has been used to amplify this type of content in the past too:
“In our previous WVCR report, we shared that the ninth-most viewed link on Facebook with over 33 million views in the first quarter of this year was alltrendytees[.]com. After the Integrity Institute had flagged it to us, we investigated and blocked this domain for violating our IB policy. Our investigation linked this domain to GearLaunch, a Bangladesh-based e-commerce firm.”
So on one hand, Meta’s saying that it doesn’t amplify divisive political content as much as it may seem, and here’s the proof. On the other, it’s acknowledging that it does amplify scams and rubbish, some of it in violation of its own rules, at massive scale.
100 million+ cumulative viewers is pretty big. Removal seems almost irrelevant at that point.
This remains the conundrum of Meta’s Widely Viewed Content report, with the data once again highlighting significant concerns with its systems and processes, which may be just as damaging as the narrative that it initially set out to refute.
The political concerns also haven’t gone away – digging into the other links listed, there are also:
- A YouTube video where a woman spouts right-wing conspiracy theories
- A Fox News clip on YouTube about ‘Disney’s woke queer agenda’
- A Newsmax article which attributes rising petrol prices to US President Joe Biden
That list skews a little political – it does appear that these types of divisive political posts are gaining a fair bit of traction on Facebook (37.9 million collective Facebook viewers), in spite of the company’s past claims.
That seems like a problem, right?
Ah, but most people don’t actually see any of these posts, with Meta also reporting that:
“90.2% of the views in the US during Q2 2022 did not include a link to a source outside of Facebook.”
So even if 100 million or more people are being scammed via Facebook, most of what most people see in the app isn’t links to politically divisive sources – so Facebook can’t be blamed for amplifying related conflicts, at least by this measure.
That’s a questionable defense, because again, millions of people are still being exposed to at least some of this material, based on Meta’s own reporting – and it doesn’t account for the fact that many posts without links are still political in nature.
In other words, at best, this is an inconclusive account of not much in particular, which Meta is trying to frame as vindicating proof that it’s not the source of evil that people say it is.
It doesn’t prove that. It doesn’t really prove anything, other than the fact that a lot of people are engaging with a lot of rubbish in the app.
You can check out Meta’s ‘Widely Viewed Content Report’ for Q2 here.