The revelation that Russian agents placed ads on Facebook in an attempt to influence the 2016 US elections raises a disturbing question: is Facebook bad for democracy? As a scholar of the social and political implications of technology, I believe the problem is not Facebook alone but something much broader: social media is weakening some of the conditions that have historically made democratic nation-states possible.
I realize this is a dramatic claim, and I don't expect anyone to believe it right away. But given that nearly half of all eligible voters were exposed to Russian-promoted fake news on Facebook, it is an argument that needs to be debated.

How we create a shared reality
Let's start with two concepts: the "imagined community" and the "filter bubble".
The late political scientist Benedict Anderson famously argued that the modern nation-state is best understood as an "imagined community", made possible in part by the rise of mass media such as newspapers.
Anderson meant that the sense of cohesion that citizens of modern nations feel with one another, the degree to which they can consider themselves part of a national community, is artificial, and at the same time facilitated by the media.
There are, of course, many things that hold nation-states like the United States together. We all learn (more or less) the same national history in school, for example.
Even so, the typical Maine lobsterman doesn't have much in common with the typical South Dakota schoolteacher, but the mass media helps them see themselves as part of something larger: the "nation".
Democratic systems of government depend on this shared sense of community. It makes possible what we call "national" politics, the idea that citizens see their interests as aligned on at least some issues. The jurist Cass Sunstein explains this idea by taking us back to the days when there were only three television news broadcasts and they all said more or less the same thing. As Sunstein argues, we have historically relied on these "general-interest intermediaries" to frame and articulate our sense of shared reality.
The term "filter bubble" comes from a book published in 2010 by the activist Eli Pariser and characterizes an Internet phenomenon.
The legal scholar Lawrence Lessig had, like Sunstein, identified this phenomenon of group isolation on the Internet as early as the late 1990s. Within a filter bubble, individuals essentially receive only the kind of information that they have selected in advance or, more dangerously, that third parties have decided they want to know.
Facebook as a source of fake news
Facebook remains, by a significant margin, the most prominent source of fake news.
The targeted advertising embedded in Facebook's news feed helps create these filter bubbles. The advertising works by inferring users' interests from the data collected on their searches, their likes, their clicks and so on. It is a very complex operation.
Facebook does not reveal its algorithms. However, studies by Michael Kosinski, a psychologist and data scientist working at Stanford University, have shown that automated analysis of the "likes" people post on the network can determine their demographic information and basic political beliefs.
Such targeting can also, it appears, be extremely precise. There is evidence, for example, that anti-Clinton ads placed from Russia managed to reach individually targeted voters in Michigan specifically.

Confirmation bias
The trouble is that, inside a filter bubble, a person never receives news they disagree with. This raises two problems: first, there is never any independent verification of that news. Anyone who wants independent confirmation has to seek it out actively.
Second, psychologists have long been familiar with "confirmation bias", the tendency of people to seek out only information they agree with. That bias also limits each person's ability to question information that confirms or supports their own beliefs.
What's more, research at Yale University's Cultural Cognition Project indicates that people are inclined to interpret new evidence in light of the beliefs associated with their social groups. This can tend to polarize those groups.
All of this means that if you are inclined to disapprove of President Donald Trump, any negative information about him is likely to reinforce your belief; conversely, you will probably discount information favorable to Trump, or simply ignore it. These two features of filter bubbles, pre-selection and confirmation bias, are exactly what fake news exploits.

Are polarized groups being created?
All of these features are also built into the business model of social networks like Facebook, which rests precisely on the idea of letting you create a group of "friends" with whom to share information. This group is largely insular, walled off from other groups.
The software carefully curates the flow of information through these networks, and Facebook strives as hard as it can to be the main portal through which its roughly two billion users access the Internet.
Facebook's revenue depends on advertising, and that advertising is easy to exploit: a recent ProPublica investigation showed how easy it is to place targeted ads aimed at anti-Semites. More generally, the site also wants to keep users engaged, and it knows it can manipulate the emotions of those users, who are more satisfied when they see things they agree with.
Polarization in social networks
As The Washington Post has documented, these are precisely the features that the Russian ads exploited. As one Wired editor observed in a disturbingly prescient comment, he had never seen a pro-Trump post shared 1.5 million times, and neither had any of his progressive friends: their social media feeds showed them only progressive-leaning news.
In this environment, a recent Pew Research Center poll should come as no surprise. It shows that the US electorate is deeply divided along partisan lines, even on fundamental political issues, and is growing more so.
All of this combines to mean that the world of social media tends to create small, deeply polarized groups that will tend to believe anything they hear, however far from reality. The filter bubble leaves us vulnerable to polarized fake news and isolates us further.

Eli Pariser, author of The Filter Bubble.
The end of the imagined community?
Today, two-thirds of Americans get at least part of their news from opaque, highly refined, personalized algorithms.
Not unlike the forced and false confessions of witchcraft in the Middle Ages, these stories are repeated often enough to appear legitimate.
What we are witnessing, in other words, is the potential collapse of a significant part of the imagined community that is the American polity. Although the United States is also demographically divided, with strong demographic differences between regions of the country, partisan differences are now eclipsing the other divisions in society.
This is a recent trend: in the mid-1990s, partisan divisions were similar in size to demographic ones. For example, both then and now, women and men stood a moderate distance apart on certain political issues, such as whether public authorities should do more to help the poor. In the 1990s, the same was true of Democrats and Republicans. In other words, partisan divisions were no better than demographic factors at predicting people's political opinions. Today, if you want to know someone's opinions, your best bet is to find out their political affiliation first.
The reality of social networks
To be sure, it would be far too simplistic to blame social networks alone for all of this. The structure of the American political system, which tends to polarize the parties in primary elections, certainly plays a large role.
And it is true that many of us get news from other sources, outside our Facebook filter bubbles.
But I would argue that Facebook and social media add a further layer: not only do they tend to create their own filter bubbles, they also offer a paid medium to anyone who wants to increase polarization.
Communities share and create social realities. In their current role, social networks risk fostering a social reality in which different groups disagree not only about what to do, but about what reality itself is.
Author: Gordon Hull, Associate Professor of Philosophy and Director of the Center for Professional and Applied Ethics, University of North Carolina-Charlotte
This article was originally published in The Conversation. Read the original.