Intelligent Machines, Broken Dialogues
Consistently effective fully automatic indexing and retrieval is not possible... It is hardly imaginable that a mechanism other than a human could acquire such self-knowledge, be given it, or do the job without it.
— Don Swanson, 19881
We have developed a global ranking of Web pages called PageRank based on the link structure of the Web that has properties that are useful for search and navigation... We have used PageRank to develop a novel search engine called Google, which also makes heavy use of anchor text.
— Sergey Brin, 19982
Only a decade after Don Swanson, an information scientist at the University of Chicago, proclaimed that fully automated information retrieval would be impossible, future Google co-founder Sergey Brin published his paper about a “novel search engine” that would prove Swanson wrong.
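The link-structure ranking Brin describes can be sketched in a few lines of Python. This is not Google’s production algorithm, just the core idea from the 1998 paper, computed by power iteration with the damping factor of 0.85 the paper suggests; the four-page link graph is invented for illustration.

```python
# Minimal sketch of PageRank by power iteration; a toy model of the
# idea in the 1998 paper, not Google's production algorithm.
def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # Each page q shares its rank equally among its outlinks.
            inbound = sum(rank[q] / len(links[q])
                          for q in pages if p in links[q])
            new_rank[p] = (1 - d) / n + d * inbound
        rank = new_rank
    return rank

# Hypothetical link graph: both "b" and "d" point at "c",
# so "c" accumulates the most rank.
graph = {"a": ["b"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
```

The scores sum to one and can be read as a probability distribution over pages. Anchor text, which the epigraph also mentions, is a separate signal layered on top of this ranking.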
The history of the Internet is rife with parables of doubt and subsequent achievement. This has led many futurists to take positive technological progress as self-evident. Some techno-optimists claim computational innovations march society inexorably toward utopia. The Internet, they suggest, exemplifies tomorrow’s mode of communication: universal, uncensored, decentralized, free.
Their confidence is not unwarranted. The Web has proved to be the most significant advance in communication since Gutenberg’s printing revolution. Vast quantities of information now pass instantaneously across continents. Internet platforms have empowered dissidents and incited democratic revolutions. Educational resources shared online have brought valuable skills and knowledge to those without access to traditional schooling. Web technologies underpin virtually all major industries and support the modern economic system.
Few could persuasively deny the transformative impact of the Internet. However, the Panglossian belief in an inevitable technological utopia ignores serious barriers to this vision.
The same features that have made the Internet a powerful social force also threaten the prospect of society-wide dialogue. Faster, more convenient communication channels may discourage meaningful conversation. More powerful, more intelligent machines could make dialectic processes more difficult. A connected citizenry may be a fragmented one.
Although counterintuitive, certain fundamental features of the Internet and strong forces within Silicon Valley cast grave doubts on the techno-optimist’s dream. The challenges are numerous. Nascent biases may be amplified through personalized filtering. The decentralized structure of the Web that has empowered so many can also incite division within a population. Small Internet communities tend to adopt language differences and incompatible frames, deepening polarization.
More intelligent machines may lead to broken dialogues.
Online experiences are increasingly tailored to user preferences.
Enter a phrase into most search engines and the results will be personalized. Query “pizza,” as a businessman in New York City, and the records returned will contain the locations of local New York style pizzerias. A resident in Chicago, using the same search term, may see local restaurants serving deep-dish style pizza. These personalization systems construct user profiles and develop models along several dimensions: age, location, country, education, interests.3
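A toy sketch of the location-aware re-ranking described above can make the mechanism concrete. The `personalize` function, the profile fields, and the listings data are all hypothetical, not any real engine’s API.

```python
# Hypothetical personalization: rank listings that match the query,
# preferring those in the user's own city. All data below is invented.
def personalize(query, profile, listings):
    matches = [l for l in listings if query in l["tags"]]
    # Stable sort: listings in the profile's city first, others after.
    return sorted(matches, key=lambda l: l["city"] != profile["city"])

listings = [
    {"name": "Deep Dish Den", "city": "Chicago", "tags": ["pizza"]},
    {"name": "NY Slice Co.", "city": "New York", "tags": ["pizza"]},
]
ny_user = {"age": 45, "city": "New York", "interests": ["business"]}
chi_user = {"age": 31, "city": "Chicago", "interests": ["food"]}

ny_results = personalize("pizza", ny_user, listings)
chi_results = personalize("pizza", chi_user, listings)
```

The same query yields differently ordered results for the two profiles, which is all “personalization” means at this level; real systems simply score along many more dimensions at once.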
While search engines pioneered personalization, the practice is now common. In his book, The Filter Bubble, writer Eli Pariser describes several companies that now rely on the technique.
Facebook uses “like” history to display only posts one will probably “like” again. Amazon aggressively mines user viewing history to populate the homepage with only the most relevant products. News organizations have developed front-pages that filter out stories that do not match a user’s preference profile. Netflix has developed algorithms that can predict an individual’s movie rating within half a star.4
Personalization appears benign, but serious consequences may follow from algorithms that work too well.
The term “filter bubble,” coined by Pariser, describes the result of excess personalization. When results match a user’s pre-existing biases too well, an individual may not be exposed to differing or dissenting viewpoints.5
A left-leaning environmentalist may search “GMO” and discover the Greenpeace campaign against GMO food. Conservatives may instead find scientific articles claiming GM crops are perfectly safe. A similar dynamic may apply to vital topics such as climate change: the liberal searching for climate change may find the scientific consensus, while the conservative may find climate-skeptic blogs.
Filtering matters because latent biases in one’s browsing history may be amplified. Small indicators of political affiliation may cause only confirmatory information to be tagged relevant and dissenting voices to be silenced. Articles corroborating pre-existing biases will only strengthen those biases.
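This amplification loop can be made concrete with a toy simulation. Assume a hypothetical filter that over-serves whichever side the user’s click history already favors; the boost factor and article counts are invented for illustration.

```python
# Toy model of bias amplification: a hypothetical filter that
# over-serves whichever side the click history already favors.
def filtered_feed(p_left, n=10, boost=1.2):
    # Serve the majority side slightly more often than history warrants.
    shown_left = round(n * min(1.0, p_left * boost))
    return shown_left, n - shown_left

left_clicks, right_clicks = 6, 4   # a slight initial lean
for _ in range(20):
    p_left = left_clicks / (left_clicks + right_clicks)
    shown_left, shown_right = filtered_feed(p_left)
    # The user reads what is shown, which feeds the next round.
    left_clicks += shown_left
    right_clicks += shown_right

final_lean = left_clicks / (left_clicks + right_clicks)
```

After twenty rounds the 60/40 click history has drifted past 90% one-sided, without any change in the user’s underlying preferences: a small modeling choice inside the filter does the amplifying.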
Humans already have irrational tendencies to ignore knowledge conflicting with their existing perspective. Confirmation bias, the tendency to recall only evidence that supports pre-existing beliefs, is already a major obstacle to rational inquiry.6 The filter bubble may exacerbate these tendencies and render truth-seeking even more elusive.
Some experts, like Harvard law professor Jonathan Zittrain, claim that personalization is minimal and not terribly pernicious.7 True to Google’s motto “don’t be evil,” the company claims to have “algorithms in place designed specifically to limit personalization and promote variety in the results page.”8 No corporation, including Google, can be trusted to keep such a promise.
In its early days, when Google was still an academic research project, Brin and Page stated: “we expect that advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.”9 The allure of massive profits proved too strong. Google’s advertising revenue now constitutes 89% of the company’s total earnings.10
As a publicly traded entity, Google is beholden to shareholders. If future markets dictate that the company must implement aggressive personalization without restraint, no doubt the engineers will comply. Personalized results increase click-through rates, directly boosting advertising revenue.11 Corporate responsibility policies are not robust enough to resist the market incentive for ever-tighter filter bubbles.
From the beginning, digital communication systems have been imagined as decentralized entities.12 The Internet made this vision a reality.
With minimal resources, anyone in the world could share their content without permission from a central authority. The combination of low-cost publishing and uninhibited communication provided ample opportunity for subcultures to proliferate. Within a few years, everyone with enough funds to purchase a computer was granted the equivalent of their own personal pirate radio station, except with a potential audience of millions.
Bulletin board systems enabled small niche communities to exchange messages. Internet relay chat allowed real-time communication between computers and the protocol remains a frequently used technology. Newsgroups were well known for highly specific subcultures with passionate, dedicated, and often monomaniacal denizens. These were all precursors to the current World Wide Web, a technology that preserves many of the same distributed characteristics.13
Although many of the platforms themselves have become centralized and monetized, the subculture zeitgeist is alive and well. The modern poster-child of this would be the discussion site Reddit. The self-proclaimed “front page of the internet” allows users to carve out disjoint communities known as subreddits. Each community enacts its own rules for governance and content is as varied as the Internet itself. The site has hosted discussions with Barack Obama, but also contains an official government-sanctioned North Korean subreddit.14 Vibrant subcultures are empowering and generally positive, but they have serious implications for public discourse.
As legal scholar Cass Sunstein describes in Republic.com, the community becomes an echo chamber. Information is shared primarily through insular groups. This confirming information, which readily flows between members, further reinforces collective bias.15
Without exposure to groups with radically differing views, subcultures become dogmatic. Members share a repository of unquestioned common knowledge that forms the basis of their bond. The information they share may not be factually accurate. Once a small community has formed, the strength of its relationships makes changing underlying beliefs difficult.
Members become intransigent.
Close communities tend to adopt similar frames. A frame is a perspective through which one understands and interprets reality.16 Parochial groups organize themselves around common frames. As society becomes more atomized, differences in frame will make collective conversations more fraught.
Language itself evolves and diverges between communities. Fundamentally different frames can obstruct communication even without any technological or physical limitations on information transmission.
George Lakoff, a cognitive linguist at the University of California, Berkeley, has analyzed this idea with respect to U.S. politics. He has described the differing frames conservatives and liberals use to interpret reality.
Conservatives use the “strict father” model of the world, requiring obedience, punishment, protection, and discipline. Progressives understand the world through the “nurturant parent” frame, emphasizing empathy, responsibility, cooperation, and trust. These conceptual differences contribute to the difficulties that occur when speaking across the aisle.
Importantly, these frames significantly influence language itself. Conservatives understand a reduction in taxes as “tax relief”; their conceptual framework associates taxes with a burden or affliction. Liberals understand taxes as a civic duty, and are enraged by those who dodge taxes or exploit loopholes.17 Productive conversation between two groups with opposing frames and distinct parlances is difficult.
The Internet is a perfect environment for developing conflicting frames. Insular subcultures coalesce around and reinforce a shared perspective. Conversations between members cement shared language conventions. Without significant and frequent interaction between groups whose frames differ dramatically, productive dialogue becomes improbable. Even when the technological barriers to conversation are eliminated, conceptual obstacles remain. Facilitating this public conversation should be a major goal.
Dialogue and democracy
No one is born knowing the truth. This is why the dialectic method holds a prominent place in philosophy. Dialogue between thinkers whose perspectives differ offers a chance to evaluate arguments rationally. In practice, this requires dedication. Despite its difficulty, reasoned discussion has always played, and should always play, a role in democracies.
In John Dewey’s conception of democracy, conversation between informed citizens, politicians, journalists, and specialists is essential.18 Shared information lies at the heart of the First Amendment to the U.S. Constitution. However, legally permissible dialogue is not sufficient for productive public conversation.
The aforementioned forces of filter bubbles, subcultures, and framed language all contribute to fragmented dialogue. Filter bubbles amplify latent biases by displaying confirmatory information. This may funnel users into particularly insular pockets of the Web. These Internet subcultures cloister themselves around common knowledge and beliefs. Members adopt common frames and language idiosyncrasies. Independently, these phenomena are not fatal to inter-group dialogue. Together, they may prove disastrous.
Better technology, yielding easier and more convenient conversation, will not automatically improve the situation. If anything, smarter machines will disrupt conversation further. Advanced machine learning techniques will be able to construct tighter filter bubbles. Platforms will be able to direct users into subcultures whose ideology more perfectly reflects their own. Within these groups, highly specific frames may shut out opposing perspectives.
More technology is not the solution. More technology contributes to the problem.
Limitations and solutions
This analysis does not suggest that the Internet will put an end to rational dialogue, nor should we abandon the technology altogether. Three forces are at the heart of the issue: personalization, subculture proliferation, and frames. They are not inherently evil. The assumption that technological innovation will automatically lead to rational public dialogue, however, is gravely mistaken.
The core challenges to public conversation must be confronted by citizens and companies alike. Citizens have a responsibility to seek truthful and accurate information. They must be willing to interact with others whose frames and perspectives differ radically from their own. Crucially, they have to be willing to understand others and change themselves. In the Internet age, this may be difficult.
Companies too have an obligation. They have corporate responsibilities to ensure such dialogue is possible, and must put these obligations at the heart of their business model. Personalization should not allow for self-propaganda. Even if the market demands a tighter bubble, sites must resist.
One search engine, DuckDuckGo, has committed itself to private, non-personalized search.19 Guided by this principle, DuckDuckGo has attracted users away from the search engine giants while maintaining ethical practices. Though the stance may seem radical, other companies should commit themselves to similar principles.
As Internet users, we must recognize that technological progress does not lead inevitably to productive society-wide conversations. The struggle for truth-seeking and empathetic dialogue will not simply disappear with convenient and intelligent communication systems.
A broken, fragmented, insular public cannot progress. The Internet provides a great opportunity for rational discourse, but we still must fight for it.