Another Way Facebook and Google May Be Undermining Democracy
By Tom Jacobs

Google and Facebook have, in recent months, belatedly begun to engage in the battle against fake news. But the fact that so much misinformation has proliferated on their platforms is only one of the ways these technology giants may be endangering democracy. Newly published research points to another: it finds that the tools these companies offer to customize our news feeds leave users with less and less exposure to viewpoints that challenge their own.

“Originally conceived by computer and information scientists as a way to help users cope with increasing information overload, customizability technology appears to have a dark side,” writes a research team led by Ivan Dylko of the University at Buffalo. “It enables individuals to surround themselves with information supporting their preexisting political attitudes.”


The researchers report this effect was strongest for “ideologically moderate individuals,” potentially pulling them in a polarized direction. Such a dynamic “can undermine important foundations of deliberative democracy,” they write in the journal Computers in Human Behavior.

The study, one of the first to address this issue, featured 93 students from a university in the southwestern United States. All began by filling out a questionnaire measuring their political attitudes and ideology. Four weeks later, they were asked to provide feedback on “a new political news website.”

The students were randomly assigned to try one of several versions of the site, including one in which they could select the ideology of their information sources, and another in which computer software performed similar sorting. The software's choices of what to include and exclude were based on the information participants had provided in their questionnaires; participants were not told that this automated customization had taken place. The researchers measured how often participants clicked on articles that supported or opposed their own political preferences, and how much time they spent reading each.

Not surprisingly, use of either customization technology decreased the number of clicks on, and the time spent reading, articles espousing viewpoints that differed from the user's own. More insidiously, the automated form of customization produced a stronger effect than the version in which users consciously chose what sorts of articles they wanted to read. Why would that be? The researchers note that “actively and intentionally avoiding counter-attitudinal political information” diminishes our ability to see ourselves as fair-minded. Holding onto a positive self-image, which is a priority for virtually everyone, may inspire us to at least occasionally check out what the other side is saying... read more:
https://psmag.com/another-way-facebook-and-google-may-be-undermining-democracy-b00a58142cc9

see also
Jacques Camatte: The Wandering of Humanity

