Facebook, Twitter, and Google testify during the Senate Intelligence Hearing on fake news, free speech, and Russia. Our Political Analyst Antone Christianson-Galina takes a close look at the algorithms that influence politics through social networks.
In the context of Russian interference in the 2016 presidential election, the Senate Intelligence Hearing brought to Capitol Hill the tech company executives of Google, Facebook, and Twitter.
Senator Richard Burr: “I’m hopeful that we can provide an informed and credible assessment on how you used your platforms during last year’s elections. I am also hopeful you will share what your companies are doing to make it harder for foreign actors to use your platforms and automated accounts to falsify stories here in the United States.”
Colin Stretch, General Counsel of Facebook, and Sean Edgett, Twitter’s Acting General Counsel, faced a row of senators on October 31st and November 1st; Google sent Richard Salgado, its Law Enforcement and Information Security Director, on October 31st and General Counsel Kent Walker on November 1st. These three tech companies have found themselves without friends after the 2016 election. Democratic senators are furious that the companies allowed the Russian government to buy ads to influence the American election. Senators Amy Klobuchar (D), Mark Warner (D), and John McCain (R) are currently building support for an “Honest Ads Act” that would force social media companies to make political ads publicly available. Republican senators, on the other hand, argue that these three companies repress right-wing views. Senator Cruz made the following statement: “Last month, Twitter banned Marsha Blackburn from advertising her campaign launch video because it deemed a line about her efforts to investigate Planned Parenthood to be inflammatory.” During the hearing, the tech companies were asked how they were able to screen out hate speech while not infringing on free speech.
Senator Dick Durbin, during the Senate Intelligence Hearing, raised the following question: “When we start with the word Russian, fake, trolls and so forth, we know the starting point is a trigger. Something needs to be done. The second thing is if it includes a reference to a political candidate or a party. Then it’s a category, too, of electioneering. And then the third question gets into what you characterized in this case of vile content. How are you going to sort this out, consistent with the basic values of this country when it comes to freedom of expression?”
During the Senate Intelligence Hearing, Colin Stretch of Facebook repeatedly defended his company by arguing that algorithms would solve, or help solve, complex issues such as filtering out hate speech or stopping bots. Senator Mark Warner, however, raised a pointed counter-question: were these algorithms actually part of the problem?
Colin Stretch of Facebook said that with respect to the algorithm, their goal is “to provide the most relevant information. It’s primarily driven by friends and families.”
To understand how these mysterious algorithms work, we need to understand basic network theory. When allowed to choose our own groups, we choose members who are like us. This tendency rests on robust psychological theory, Tajfel and Turner’s social identity theory (1974), and it has been replicated many times since.
A 2012 study in Psychological Science, “Not Like Me = Bad: Infants Prefer Those Who Harm Dissimilar Others,” found that even at the tender age of nine months, humans prefer people like themselves. This principle has an important implication: we use social media to surround ourselves with people more like us than ever before, since we can now group by interest rather than by geography. How does information spread in these homogeneous groups?
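This preference for the similar, which network scientists call homophily, can be illustrated with a toy simulation. The model below is a hypothetical sketch, not anyone's actual platform code: agents with a political leaning form ties, and similar pairs connect far more often than dissimilar ones, so the network splits into insular clusters.

```python
import random

# Toy homophily model (illustrative only): each agent has a leaning, and a
# tie forms between a similar pair with high probability, a dissimilar pair
# with low probability. The probabilities are invented for the sketch.
random.seed(42)

def form_network(leanings, p_same=0.9, p_diff=0.1):
    """Return the set of ties formed among agents, favoring similar pairs."""
    ties = set()
    for i in range(len(leanings)):
        for j in range(i + 1, len(leanings)):
            p = p_same if leanings[i] == leanings[j] else p_diff
            if random.random() < p:
                ties.add((i, j))
    return ties

leanings = ["blue"] * 10 + ["red"] * 10
ties = form_network(leanings)
within = sum(1 for i, j in ties if leanings[i] == leanings[j])
across = len(ties) - within
print(f"ties within groups: {within}, ties across groups: {across}")
```

Even with only a modest preference for the similar, ties within each group heavily outnumber ties across groups, which is exactly the insularity the next sections build on.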
One of the most important phenomena for understanding how social networks change politics is the information cascade. An information cascade occurs when people make decisions one after another rather than all at once: later people watch the actions of earlier people and base their own actions on them. An example would be someone who picks a new song by looking at what other people have listened to and choosing the most popular track.
A cascade can form on very little information, and once it starts, people often ignore new information. A cascade can therefore easily rest on faulty information, and the credibility of crowds can quickly surpass that of traditional experts and authorities. The structure of the network determines how effectively a cascade spreads: the more people close to you that you see doing something, the more likely you are to do it yourself. Different people and groups have different tendencies to conform, but at the macro level, people conform to their network.
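The dynamic described above can be sketched with a classic sequential-choice model (a simplified version of the standard economics treatment of cascades; the numbers are invented for illustration). Each person privately learns which option is better, but also sees every earlier person's public choice; once one option takes a clear lead, people follow the crowd and their private information is discarded.

```python
import random

# Toy information-cascade model (illustrative only). Each person receives a
# private signal that is correct with probability p_correct, but follows the
# crowd whenever earlier choices show a lead of two or more; the private
# signal is used only when the crowd is roughly split.
random.seed(1)

def run_cascade(n_people, p_correct=0.7, truth="A"):
    choices = []
    for _ in range(n_people):
        correct = random.random() < p_correct
        signal = truth if correct else ("B" if truth == "A" else "A")
        a_votes = choices.count("A")
        b_votes = choices.count("B")
        if a_votes > b_votes + 1:
            choices.append("A")       # cascade: own signal ignored
        elif b_votes > a_votes + 1:
            choices.append("B")       # cascade: own signal ignored
        else:
            choices.append(signal)    # no cascade yet: follow own signal
    return choices

choices = run_cascade(20)
print(choices)
```

The key feature of the model is that after the first few choices, everyone's public behavior stops carrying information: two unlucky early signals can send the entire crowd down the wrong path, which is why a cascade "can easily be based on faulty information."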
The structure of social networks not only creates information cascades but also leads to echo chambers.
Social media allows us to connect more effectively than ever before, but our love for the familiar leads us to connect to what we already know. Information spreads quickly through our insular online groups, but not across them. How are social media sites taking advantage of this property?
In the hearing, Stretch repeatedly argued that this tailored newsfeed technology exists to give people news that interests them. Colin Stretch of Facebook told Senator Durbin: “we want to provide a platform for political discourse but not one that inflames and divides.”
Facebook's mission statement states that they aim to “…give people the power to share and make the world more open and connected…”
While the mission may seem noble, at the end of the day Facebook makes money from ads, and it makes more money the longer it can keep you on the site. Facebook’s engineers manage people’s newsfeeds so that voters see things they like and stay longer. For example, if a voter is a likely Democrat, the newsfeed will show posts by their friends that support their political views and suppress those that would clash.
You can see this in practice through a project by the Wall Street Journal, “Blue Feed, Red Feed,” which compares the Facebook feeds of accounts with different political affiliations.
By only showing views that voters are likely to agree with, these feeds shelter them from new ideas. Voters end up trapped in an echo chamber, listening only to the like-minded. These echo chambers inflame crises and divisions by showing one set of opinions and blocking out the rest, killing any chance for compromise.
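The economic logic above can be made concrete with a tiny ranking sketch. This is a hypothetical model, not Facebook's actual ranking code: every name and weight here is invented. If predicted engagement is modeled simply as agreement with the user's inferred leaning, an engagement-maximizing sort produces an echo chamber automatically, with no one ever deciding to censor anything.

```python
# Hypothetical feed-ranking sketch (not any platform's real code): score each
# post by predicted engagement, modeled here as agreement with the user's
# inferred political leaning, then surface only the top-scoring posts.

def rank_feed(posts, user_leaning, top_k=3):
    """Return the top_k posts the user is predicted to engage with most."""
    def score(post):
        # Agreeable posts are assumed to keep the user on-site longer.
        return 1.0 if post["leaning"] == user_leaning else 0.1
    return sorted(posts, key=score, reverse=True)[:top_k]

posts = [
    {"id": 1, "leaning": "blue"},
    {"id": 2, "leaning": "red"},
    {"id": 3, "leaning": "blue"},
    {"id": 4, "leaning": "red"},
    {"id": 5, "leaning": "blue"},
]
feed = rank_feed(posts, user_leaning="blue")
print([p["id"] for p in feed])  # → [1, 3, 5]: only the agreeable posts surface
```

The point of the sketch is that the filtering is a side effect of the objective: nothing in the code mentions politics or censorship, yet optimizing for time-on-site quietly removes every clashing view from the feed.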
If the world seems to be going mad, part of the reason is algorithms meant to keep us looking at advertisements. Tech companies should not be given a free pass when they mumble something about algorithms. Sometimes algorithms aren’t the solution; they’re the problem.