Search algorithms can change people’s minds without their informed consent and the press must do more to combat this, Jonathan Compo argues.
That our social media diets create microbiomes in which our opinions are never challenged is widely accepted. It is assumed by many, however, that we have some agency in creating these echo chambers. This underestimates the threat. Search algorithms do not just confirm our pre-existing beliefs. They instill new beliefs. The press must combat this.
People do not use the internet only for politics, but politics permeates the internet. Watching a TikTok compilation can lead to watching a video about immigration policy. Trawling Twitter for memes can lead to getting caught in a charged discussion about gun control. These linkages are apparently random, and they do not necessarily correspond with one’s preexisting views. Those without stable political identities are susceptible to being snared by these chance encounters. Then, repeated exposure can instill a viewpoint.
For many of my peers and me, our political consciousnesses were formed by our interactions with the web. We constructed our echo chambers as our echo chambers constructed us.
Value formation has always been contingent. One’s first values are almost wholly dependent on one’s upbringing and culture. One can only believe in what one knows about. Frameworks for viewing the world are acquired willy-nilly from guardians and teachers and peers and, now, the internet. The internet, though, compared to these other influences, is especially potent and vicious.
It is more potent because, unlike human instructors of the past, it claims to have all the answers. The complete reservoir of human thought is supposedly accessible. The quantity of knowledge available, however, poses an issue. No person could sort through that much information.
So, algorithms do the sorting. These algorithms, though, are built by corporations with profit motives. And, as of December 2009, they are personalized to each individual user. Our apparently all-knowing guru is only feeding us the information it believes will maximize our interaction with advertising, and thereby maximize profits for its creators and their shareholders.
Therefore, the links between entertainment and political content online are not random. The information to which one is exposed while browsing is that with the best search engine optimization, not that which is most true. This is a problem.
Freedom has many definitions, but central to all of them is a concept of meaningful choice. To be free is to choose meaningfully. A privately-owned internet governed by profit-maximizing algorithms makes choices mean less. Forces of a complexity and reach beyond our comprehension are conspiring towards our misinformation.
The prognosis is bleak. The subjection of all information to sorting by the profit motive weakens the binary between truth and falsity. Consumers are no longer equipped to be rational.
If consumers cannot pursue the truth themselves, then the burden falls on the producers to provide it reliably. And who are the producers of truth? Journalists. It is up to journalists, with direct access to information unmediated by corporations, to assist in moral decisions.
Finding the truth through the algorithm is hard enough before questions of fake news are even brought to the table. Increasing the competency of journalists and the complexity and depth of coverage seems to be an absolute requirement to assist in this search. Journalism must never be silenced.
Today is World Press Freedom Day. This day was proclaimed in 1993 by the United Nations to commemorate the Declaration of Windhoek on Promoting an Independent and Pluralistic African Press. Though it was written before our current truth crisis, this document suggests some further solutions.
The document’s definition of a free press goes further than the American conception of the term. It suggests that the truly independent press should consist of “as many outlets as possible.” Crucially, it also calls for these outlets to be accessible, that the means of distribution be free from government or economic influence.
In 1993, that still meant printing presses. But the spirit of the UN declaration would extend today to the online platforms through which news is disseminated: Google and YouTube and any of the other monopolistic, digital purveyors of knowledge.
Humans are fallible in a way algorithms are not. There is an extraordinary moral potential in computation. Imagine a world in which computers are capable of sorting through petabytes of information to provide an end-user with the most reliable and nuanced result to a search query, not the most profitable result. That world is possible.
Algorithms could be modified to reduce the negative effects of personalization. There was an internet before echo chambers and we could return to it.
These are not impossible suggestions. They are not a technological challenge; they are a shift in priorities. Nor are they radical suggestions. They are based on a UN declaration more than twenty years old. This declaration should be universalized and news freed from the economic control of Google. Until then, the algorithmic attack on freedom will persist, and it is the task of human journalists to resist it.