How Facebook is Bad for Democracy

On Facebook I recently noticed an increase in the following message in the comments on political articles: “Top Comments is selected, so some replies may have been filtered out.” Facebook is so intent on feeding us only what its algorithm thinks we want to hear that it won’t even let us see opposing viewpoints.

Facebook obviously knows I’m liberal, and not just from my clicking “Like” on some posts and not others. There are a dozen other ways to infer this, including the views of the people in my social network (especially those I interact with most frequently), which articles I click on, which ones I linger on, and what I comment on, independent of the contents of those comments.

What bothers me is that Facebook not only predominantly shows me liberal comments (even on Fox News articles), but on articles from certain sites it won’t even let me see the conservative comments anymore. The first is a nudge toward confirmation bias. The second is an iron curtain preventing me (and presumably millions of other readers) from even knowing the details of opposing points of view.

Probably the simplest workaround is to create a second, fake identity tied to a separate Facebook account, in my case as a raving conservative; then I’d see everything that is shown to conservatives. But that is not terribly convenient, nor does it address the bigger problem I’m concerned about: structurally, Facebook is implementing a policy that is good for Facebook and bad for democracy.
