
News algorithms with a human touch

Three weeks ago, I wrote about how the explosion in news content, combined with the difficulty of fact-checking it, is leading to widespread misinformation.

The timing of the post turned out to be particularly apt. Following the results of the US presidential election, reports emerged drawing attention to people who produce articles with false information designed to attract clicks and generate revenue, and who spread these articles on social media sites like Facebook. The argument is that these factually incorrect articles helped Trump win the election.

I don’t agree with this assessment. Articles containing false information are just one of several factors that contributed to the election outcome, and a small one at that. And, although perhaps to different extents, they affected both candidates.

However, the widespread distribution of these factually incorrect articles does lead to an important question. What responsibility, if any, do social media sites like Facebook have to monitor the factual accuracy of the content that they’re helping spread?

Facebook argues that it doesn’t have this responsibility because it is simply a distributor of content. It is not a media company that produces the content.

That is true, but distributors of online news content play a very different role from offline distributors of newspapers. Online content is effectively infinite while offline content isn’t. This gives online content distributors the ability to influence what readers consume to a far greater extent than offline newsstands can. While a newsstand can display every newspaper available in the country, Facebook has to choose what content to display in your news feed. It’s physically impossible to show it all.

So far, Facebook has chosen to prioritize the content it displays using a black-box algorithm that appears designed to maximize user engagement, and hence Facebook’s revenue. The problem with this approach is that few users care about the facts. Most are just looking for the next adrenaline rush. So content that meets this demand gets clicks and is pushed to the top of the news feed, where it gets more clicks, irrespective of factual accuracy.

But if this isn’t the right approach, because factually correct content is intrinsically valuable and this approach often directs our attention to factually incorrect content, then what is the right approach?

One possibility is for Facebook to have a fact-checking team, or to work with a third-party fact-checking team, and only surface content that it deems factually correct. The problem with this approach is that, whenever humans have absolute power like this, it’s open to abuse. One of social media’s greatest advantages over traditional media is that it doesn’t exercise editorial influence (at least in most cases). Allowing Facebook to be the arbiter of factual accuracy would give it much greater editorial power. This is dangerous and should be avoided.

I believe the solution lies between these two extremes. An engagement-optimizing algorithm isn’t the answer, but neither is a human fact-checking team that can override the algorithm whenever it wants.

Instead, Facebook’s algorithm needs to evolve to include factual accuracy as one of the important variables it uses to determine which articles to surface in its users’ news feeds. This is similar to how Google’s search results reflect not only the number of links to a specific page but also the quality of the sites providing those links. I don’t know which variables Facebook’s algorithm takes into account, but I doubt that factual accuracy carries significant weight, if it’s a variable at all.
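To make the idea concrete, here is a minimal sketch of what a ranking that weighs factual accuracy alongside engagement might look like. Everything in it — the field names, the 0-to-1 scores, and the weights — is an illustrative assumption, not Facebook’s actual algorithm.

```python
from dataclasses import dataclass


@dataclass
class Article:
    title: str
    engagement: float  # hypothetical engagement signal, normalized to 0..1
    accuracy: float    # hypothetical fact-check score, 0 (false) to 1 (accurate)


# Illustrative weights only -- the real weights (and whether accuracy is a
# variable at all) are not public.
W_ENGAGEMENT = 0.6
W_ACCURACY = 0.4


def feed_score(article: Article) -> float:
    """Blend engagement with factual accuracy so that inaccurate articles are
    penalized in the ranking rather than simply removed from it."""
    return W_ENGAGEMENT * article.engagement + W_ACCURACY * article.accuracy


def rank_feed(articles: list[Article]) -> list[Article]:
    """Return articles ordered from highest to lowest feed score."""
    return sorted(articles, key=feed_score, reverse=True)
```

Because accuracy is only one weighted term rather than a hard filter, a highly engaging false article can still outrank an accurate one — which is exactly the limitation acknowledged below.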

I recognize that what I’m proposing isn’t a perfect solution.

The factual accuracy variable will be subject to human bias, at least until we get machines to perform fact-checking for us. But even then, these machines will initially be designed by humans so they’ll also continue to reflect our biases.

And false articles that get clicks may still surface at the top of news feeds if their engagement outweighs the penalty they receive for factual inaccuracy.

However, a perfect solution doesn’t exist. The best we can hope for is to reward factual accuracy as much as we can without giving the humans responsible for deciding on this factual accuracy limitless power. Enhancing an algorithm with a human touch (not a human override) is the best option available.

This leaves three questions outstanding.

First, what weight will be assigned to the factual accuracy variable?

Second, how and by whom will the variable be measured?

Third, what motivation (or regulation) will ensure that Facebook adopts a factual accuracy variable that lowers its user engagement and revenue?

Fact checking

Yesterday, I read a Webrazzi article on the launch of Teyit.org (“teyit” is Turkish for “confirmation”). Teyit.org’s goal is to fact-check Turkish news articles for accuracy.

Since fact-checking is still a largely human activity, it isn’t easily scalable: it isn’t possible to fact-check every article. To address this, Teyit.org first ranks articles by the value at stake in the accuracy of their content. Although this is a subjective assessment, it’s the best approach available today.

Teyit.org prioritizes the articles with the most at stake and has its research team check the accuracy of their content. Based on what they find, researchers mark articles as accurate, inaccurate, confusing, or suspicious, and publish a detailed write-up supporting the mark they’ve given.
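Here is a minimal sketch of that triage-then-verdict workflow. The four verdict labels come from the description above; the Claim fields, the 0-to-1 value-at-stake score, and the function names are my own assumptions, not Teyit.org’s actual system.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Verdict(Enum):
    # The four labels researchers assign, per the description above.
    ACCURATE = "accurate"
    INACCURATE = "inaccurate"
    CONFUSING = "confusing"
    SUSPICIOUS = "suspicious"


@dataclass
class Claim:
    headline: str
    value_at_stake: float  # subjective 0..1 estimate of what's at stake
    verdict: Optional[Verdict] = None
    write_up: str = ""


def triage(claims: list[Claim]) -> list[Claim]:
    """Order claims so the highest value-at-stake items get checked first."""
    return sorted(claims, key=lambda c: c.value_at_stake, reverse=True)


def record_finding(claim: Claim, verdict: Verdict, write_up: str) -> None:
    """Attach a researcher's verdict and the supporting write-up to a claim."""
    claim.verdict = verdict
    claim.write_up = write_up
```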

The internet has reduced the cost of distributing content to effectively zero, and social media makes that content available to a widespread audience almost instantaneously. The result is an ever-increasing pace of content consumption and a falling cost of spreading misinformation (whether intentional or the by-product of racing to publish first), because by the time the misinformation in an article comes to light, most people have already jumped to the next headline.

We need to slow down. That means not only separating fact from fiction, but also keeping people’s attention on what we’ve identified long enough for the facts to sink in. This isn’t easy in a world where pleasant fiction is easier to sell than brutal facts.

Teyit.org is a great attempt to distinguish fact from fiction. Its bigger challenge will be to get people to care about the facts. I hope it succeeds.