
InfoTribes, Reality Brokers

It seems harder than ever to agree with others on basic facts, let alone to develop shared values and goals: we even claim to live in a post-truth era.[1] With anti-vaxxers, QAnon, Bernie Bros, flat earthers, the intellectual dark web, and worldwide disagreement over the seriousness of COVID-19 and the effectiveness of masks, have we lost our shared reality? For every piece of information X somewhere, you can likely find “not X” elsewhere. There is growing disbelief and distrust in basic science and government. All too often, conversations on social media descend rapidly to questions such as “What planet are you from?”

Reality Decentralized

What has happened? Reality has once again become decentralized. Before the advent of broadcast media and mass culture, individuals’ mental models of the world were generated locally, along with their sense of reality and what they considered ground truth. With broadcast media and the culture industries came the ability to forge top-down, national identities that could be pushed into the living rooms of families at prime time, completing the project of the press and newspapers in nation-forming.[2] The creation of the TV dinner was perhaps one of the most effective tools in carving out a sense of shared reality at a national level (did the TV dinner mean fewer people said Grace?).


The rise of the Internet, Search, social media, apps, and platforms has resulted in an information landscape that bypasses the centralized knowledge/reality-generation machine of broadcast media. It is, however, driven by the incentives (both visible and hidden) of significant power structures, such as Big Tech companies. With the degradation of top-down knowledge, we’ve seen the return of locally generated shared realities, where local now refers to proximity in cyberspace. Content creators and consumers connect, share information, and develop mental models of the world, along with shared or distinct realities, based on the information they consume. They form communities accordingly, and all of these interactions are mediated by the incentive systems of the platforms they connect on.

As a result, the number of possible realities has proliferated and the ability to find people to share any given reality with has increased. This InfoLandscape we all increasingly occupy is both novel and shifting rapidly. In it, we are currently finding people we can share some semblance of ground truth with: we’re forming our own InfoTribes, and shared reality is splintering around the globe.

To understand this paradigm shift, we need to comprehend:

  • the initial vision behind the internet and the InfoLandscapes that have emerged,
  • how we are forming InfoTribes and how reality is splintering,
  • that large-scale shared reality has merely occupied a blip in human history, ushered in by the advent of broadcast media, and
  • who we look to for information and knowledge in an InfoLandscape that we haven’t evolved to comprehend.

The InfoLandscapes

“Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts… A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding.”

Neuromancer, William Gibson (1984)

There are several ways to frame the origin story of the internet. One is how it gave rise to new forms of information flow: the vision of a novel space in which anybody could publish anything and everyone could find it. Much of the philosophy of early internet pioneers was couched in terms of the potential to “flatten organizations, globalize society, decentralize control, and help harmonize people” (Nicholas Negroponte, MIT).[3]

As John Perry Barlow (of Grateful Dead fame) wrote in A Declaration of the Independence of Cyberspace (1996):

We are creating a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth. We are creating a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity. Your legal concepts of property, expression, identity, movement, and context do not apply to us. They are all based on matter, and there is no matter here.

This may have been the world we wanted, but it is not the one we got. We are veering closer to an online and app-mediated environment similar to Deleuze’s Societies of Control, in which we are increasingly treated as our data, as what Deleuze calls “dividuals”: collections of behaviors and characteristics, associated with online interactions, passwords, spending, clicks, cursor movements, and personal algorithms, that can be fed into statistical and predictive models and guided and incentivized to behave and spend in particular ways. Put simply, we are reduced to the inputs of an algorithm. On top of this, pre-existing societal biases are being reinforced and promulgated at previously unheard-of scales as we increasingly integrate machine learning models into our daily lives.

Prescient visions of society along these lines were provided by William Gibson’s Neuromancer and Neal Stephenson’s 1992 Snow Crash: societies increasingly interacting in virtual reality environments and computational spaces, in which the landscapes were defined by information flows.[4] Not only this, but both authors envisioned such spaces being turned into marketplaces, segmented and demarcated by large corporations, only a stone’s throw from where we find ourselves today. How did we get here?

Information Creation

In the early days of the internet, you needed to be a coder to create a website. The ability to publish material was restricted to the technically skilled. It was only in walled gardens such as CompuServe and AOL, or after the introduction of tools like Blogger, that regular punters were able to create their own websites with relative ease. The participatory culture and user-generated content of Web 2.0 opened up the creative space, allowing anyone and everyone to create content, as well as respond to, rate, and review it. Over the last decade, two new dynamics have drastically increased the amount of information creation and, therefore, the “raw material” with which the landscape can be molded:

  1. Smartphones with high-resolution video cameras and
  2. The transformation of the attention economy by “social media” platforms, which incentivize individuals to digitize more of their experiences and broadcast as much as possible.

And it isn’t only the generation of novel content or the speed at which information travels. It is also the vast archives of human information and knowledge that are being unearthed, digitized, and made available online. This is the space of content creation.

Information Retrieval

The other necessary side of information flow is discoverability: how information is organized and where it is surfaced. When so much of the world’s information is available, what is the method for retrieval? Once the realm of chat rooms and bulletin boards, this question eventually gave rise to search engines, social media platforms, streaming sites, apps, and platforms.

Platforms that automate the organizing and surfacing of online content are necessary, given the amount of content already out there and how much is being generated daily. They also require interrogating, because we humans base our mental models of how the world works on the information we receive, as we do our senses of reality, the ways we make decisions, and the communities we form. Platforms such as Facebook have erected walled gardens in our new InfoLandscape and locked many of us into them, as predicted by both Gibson and Stephenson. Do we want such corporatized and closed structures in our networked commons?

InfoTribes, Shared Reality


Online spaces are novel forms of community: people who haven’t met and may never meet in real life interacting in cyberspace. As scholars such as danah boyd have made clear, “social network sites like MySpace and Facebook are networked publics, just like parks and other outdoor spaces can be understood as publics.”

One key characteristic of any community is a sense of shared reality: something agreed upon, alongside shared values and/or shared goals. Historically, communities have required geographical proximity to coalesce, whereas online communities have been able to form outside the constraints of meatspace. Let’s not make the mistake of assuming online community formation doesn’t have constraints. The constraints are perhaps more hidden, but they exist: they’re both technological and the result of how the InfoLandscapes have been carved out by the platforms, along with their technological and economic incentives.[5] Landscapes and communities have co-evolved, although, for most of history, on different timescales: mountain ranges can separate parts of a community and, conversely, we build tunnels through mountains; rivers connect communities, cities, and commerce, and humans alter the nature of rivers (an extreme example being the reversal of the Chicago River!).

The past two decades have seen the formation of several new, rapidly and constantly shifting landscapes that we all increasingly interact with, along with the formation of new information communities, consolidated by emergent phenomena such as filter bubbles and echo chambers, themselves driven by the platforms’ push for engagement. What the constituents of each of these communities share are mental models of how the world works, senses of reality, that are, for the most part, reinforced by the algorithms that surface content, either by 1) showing content you agree with to promote engagement or 2) showing content you totally disagree with to the same end.

Just as the newspaper page has historically been a mish-mash collection of movie ads, obituaries, and opinions stitched together in a way that made the most business and economic sense for any given publisher, your Facebook feed is driven by a collection of algorithms that, in the end, are optimizing for growth and revenue.[6] These incentives define the InfoLandscape and determine the constraints under which communities form. It just so happens that dividing people increases engagement and makes economic sense. As Karen Hao wrote recently in the MIT Technology Review, framing it as a result of “Zuckerberg’s relentless desire for growth,” which is directly correlated with economic incentives:

The algorithms that underpin Facebook’s business weren’t created to filter out what was false or inflammatory; they were designed to make people share and engage with as much content as possible by showing them things they were most likely to be outraged or titillated by.

The consequence? As groups of people turn inward, agreeing more amongst their in-group, and disagreeing more fervently with those outside of it, the common ground in between, the shared reality, which is where perhaps the truth lies, is slowly lost. Put another way, a by-product of algorithmic polarization and fragmentation has been the formation of more groups that agree within their own groups and disagree far more with other groups, not only on what they value but on ground truth, about reality.

We’ve witnessed the genesis of information tribes or InfoTribes and, as these new ideological territories are carved up, those who occupy InfoLandscapes hold that ground as a part of an InfoTribe.[7] Viewed in this way, the online flame wars we’ve become all too accustomed to form part of the initial staking out of territory in these new InfoLandscapes. Anthropologists have long talked about tribes as being formed around symbols of group membership, symbols that unite a people, like totem animals, flags, or… online content.

Reality Brokers, Reality Splintering


Arguably, many people aren’t particularly interested in the ground truth per se, they’re interested in narratives that support their pre-existing mental models of the world, narratives that help them sleep at night. This is something that 45 brilliantly, and perhaps unwittingly, played into and made starkly apparent, by continually sowing seeds of confusion, gaslighting the global community, and questioning the reality of anything that didn’t serve his own purposes.

This trend isn’t confined to the US. The rise of populism more generally in the West can be seen as the result of diverging senses of reality, the first slice splitting people across ideological and party lines. Why are these divergences in a sense of shared reality becoming so exacerbated and apparent now? The unparalleled velocity at which we receive information is one reason, particularly as we likely haven’t evolved to even begin to process the vast amounts we consume. But it isn’t only the speed and amount; it’s also the structure. The current media landscape is highly non-linear, as opposed to print and television. Our sense-making and reality-forming faculties are overwhelmed daily by the fractal-like nature of (social) media platforms and environments that are full of overlapping phenomena and patterns that occur at many different frequencies.[8] Moreover, the information we’re served is generally driven by the opaque and obscure economic incentives of platforms, which are protected by even more obscure legislation in the form of Section 230 in the US (there are other incentives at play, themselves rarely surfaced, in the name of “trade secrets”).

But let’s be careful here: it isn’t tech all the way down. We’re also deep in a several-decades-long erosion of institutional knowledge, with mistrust of science and of government being the two most obvious examples. Neoliberalism has hollowed out the middle class, while the fruits of top-down knowledge have left many people unserved and behind. On top of this, ignorance has been actively cultivated and produced. Look no further than the recent manufacturing of ignorance from the top down with the goals of creating chaos, sowing the seeds of doubt, and delegitimizing the scientific method and data reporting (the study of culturally induced ignorance is known as agnotology, and Proctor and Schiebinger’s book Agnotology: The Making and Unmaking of Ignorance is canonical). We’ve also seen the impact of bad actors and foreign influence (not mutually exclusive) on the dismantling of shared reality, such as Russian interference around the 2016 US election.

This has left reality up for grabs and, in an InfoLandscape exacerbated by a global pandemic, those who control and guide the flow of information also control the building of InfoTribes, along with their shared realities. Viewed from another perspective, the internet is a space in which information is created and consumed, a many-sided marketplace of supply and demand in which the dominant currency is information, albeit driven by a shadow market of data, marketing collateral, clicks, cash, and crypto. The platforms that “decide” what we see and when we see it are reality brokers in a serious sense: they guide how individuals construct their sense of the world, their own identities, what they consider ground truth, and the communities they become a part of. In some cases, these reality brokers may be doing it completely by accident. They don’t necessarily care about the ground truth, just about engagement, attention, and profit: the breakdown of shared reality as collateral damage of a globalized, industrial-scale incentive system.

In this framework, the rise of conspiracy theories is an artefact of this process: the reality brokered and formed, whether it be a flat earth or a cabal of Satan-worshipping pedophiles plotting against 45, is a direct result of bottom-up sense-making amid top-down reality splintering, the dissolution of ground truth and the implosion of a more general shared reality. Web 2.0 has had a serious part to play in this reality splintering, but the current retreat into higher-signal, private platforms such as newsletters, Slack, Discord, WhatsApp, and Signal groups could be more harmful in many ways.

Shared reality is breaking down. But was it even real in the first place?

Shared Reality as Historical Quirk

Being born after World War Two could lead one to believe that shared reality is foundational for the functioning of the world and that it’s something that always existed. But there’s an argument that shared reality, on national levels, was really ushered in by the advent of broadcast media, first the radio, which was in over 50% of US households by the mid-1930s, and then the television, nuclear suburban families, and TV dinners. The hegemonic consolidation of the American dream was directly related to the projection of ABC, CBS, and NBC into each and every household. When cable opened up TV to more than three major networks, we began to witness the fragmentation and polarization of broadcast media into more camps, including those split along party lines, modern exemplars being Fox News and CNN. It is key to recognize that there were distinct and differing realities in this period, split along national lines (USA and Soviet Russia), ideological lines (pro- and anti-Vietnam), and scientific lines (the impact of smoking and asbestos). Even then, it was a large number of people with a small number of shared realities.

The spread of national identity via broadcast media didn’t come out of the blue. It was a natural continuation of similar impacts of “The Printed Word,” which Marshall McLuhan refers to as an “Architect of Nationalism” in Understanding Media:

Socially, the typographic extension of man brought in nationalism, industrialism, mass markets, and universal literacy and education. For print presented an image of repeatable precision that inspired totally new forms of extending social energies.

Note that the shared realities generated in the US in the 20th century weren’t generated only by national and governmental interests but also by commercial and corporate ones: mass culture, the culture industries, culture at scale as a function of the rise of the corporation. There were strong incentives for commercial interests to create shared realities at scale across the nation because it’s easier to market and sell consumer goods to a homogeneous mass: one size fits all, one shape fits all. This was achieved through the convergence of mass media, modern marketing, and PR tactics.

Look no further than Edward Bernays, a double nephew of Freud who was referred to in his obituary as “the Father of Public Relations.” Bernays famously “used his Uncle Sigmund Freud’s ideas to help convince the public, among other things, that bacon and eggs was the true all-American breakfast.” In the abstract of his 1928 paper “Manipulating Public Opinion: The Why and the How,” Bernays wrote:

If the general principles of swaying public opinion are understood, a technique can be developed which, with the correct appraisal of the specific problem and the specific audience, can and has been used effectively in such widely different situations as changing the attitudes of whites toward Negroes in America, changing the buying habits of American women from felt hats to velvet, silk, and straw hats, changing the impression which the American electorate has of its President, introducing new musical instruments, and a variety of others.

The Century of Marketing began, in some ways, with psychoanalytical tools: marketing as a mode of reality generation, societal homogenization, and behavioral modification. A paradigmatic example is how De Beers convinced the West to adopt diamonds as the necessary gem for engagement rings. A horrifying and still-relevant example is Purdue Pharma and the Sackler dynasty’s marketing of OxyContin.

The channels used by marketers were all of the culture industries, including broadcast media, a theme most evident in the work of the Frankfurt School, notably in that of Theodor Adorno and Max Horkheimer. Look no further than Adorno’s 1954 essay “How to Look at Television”:

The old cultured elite does not exist any more; the modern intelligentsia only partially corresponds to it. At the same time, huge strata of the population formerly unacquainted with art have become cultural “consumers.”

Although it was all the culture industries of the 20th century that worked to homogenize society at the behest of corporate interests, television was the one that we brought into our living rooms and that we eventually watched with family over dinner. Top-down reality-generation was centralized and projected into nuclear suburban homes.

Fast forward to today, the post-broadcast era. Information travels close to the speed of light, as pulses along fiber-optic cables; it is both multi-platformed and personalized; and everyone is a potential creator: reality, once again, is decentralized. In this frame, the age of shared reality was the anomaly, the exception rather than the rule. It’s perhaps ironic that one of the final throes of the age of shared reality was the advent of reality TV, a hyper-simulation of reality filtered through broadcast media. So now, in a fractured and fractal InfoLandscape, who do we look to in our efforts to establish some semblance of ground truth?

Verified Checkmarks and Village Elders


When COVID-19 hit, we were all scrambling around for information about reality in order to make decisions, and not only were the stakes a matter of life and death but, for every piece of information somewhere, you could find the opposite somewhere else. The majority of information, for many, came through social media feeds. Even when the source was broadcast media, much of the time it would be surfaced in a social media feed. Who did I pay attention to? Who did I believe? How about you? For better or for worse, I looked to my local (in an online sense) community, those whom I considered closest to me in terms of shared values and shared reality. On top of this, I looked to those respected in my communities. On Twitter, for example, I paid attention to Dr Eleanor Murray and Professor Nicholas Christakis, among many others. And why? They’re both leaders in their fields with track records of deep expertise, for one. But they also have a lot of Twitter followers and the coveted blue verified checkmarks: in an InfoLandscape of such increasing velocity, we use rules of thumb and heuristics to decide what to believe and what not to, including the validity and verifiability of the content creator, signaled by the number of followers, who the followers are (do I follow any of them? And what do I think of them?), and whether or not the platform has verified them.

If our online communities are our InfoTribes, then the people we look to for ground truth are our village elders, those who tell stories around the campfire. Insofar as they have insight into the nature of reality, we look to them as our illiterate ancestors looked to those who could read, or as pre-Reformation Christians looked to the priests who could read Biblical Latin. With the emergence of these decentralized and fractured realities, we are seeing, hand in hand, the rise of those who define the realities of each InfoTribe. It’s no wonder the term Thought Leader rose to prominence as this landscape clarified itself. We are also arguably in the midst of a paradigm shift from content being the main object of verification online to content creators themselves being the ones verified. As Robyn Caplan points out astutely in Pornhub Is Just the Latest Example of the Move Toward a Verified Internet:

It is often said that pornography drives innovation in technology, so perhaps that’s why many outlets have framed Pornhub’s verification move as “unprecedented.” However, what is happening on Pornhub is part of a broader shift online: Many, even most, platforms are using “verification” as a way to distinguish between sources, often framing these efforts within concerns about safety or trustworthiness.

But mainstream journalists are more likely to be verified than independent journalists, men more likely than women, and, as Caplan points out, “there is a dearth of publicly available information about the demographics of verification in general—for instance, whether BIPOC users are verified at the same rates as white users.” And it is key to note that many platforms are increasingly verifying and surfacing content created by “platform partners,” an approach also driven by business incentives. Who decides who we listen to? And, as Shoshana Zuboff continually asks, Who decides who decides?

This isn’t likely to get better anytime soon, with the retreat to private, higher-signal communication channels, the next generation of personalized products, the advent of deepfakes, the increasing amount of information we’ll be getting from voice assistants over the coming 5-10 years, the proportion of information consumed via ephemeral voice-only apps such as Clubhouse, and the possibility of augmented reality playing an increasing role in our daily lives.

So what to do? Perhaps instead of trying to convince people of what we believe to be true, we need to stop asking “What planet are you from?” and start looking for shared foundations in our conversations, a sense of shared reality. We also have a public awareness crisis on our hands, as the old methods of media literacy and education have stopped working. We need to construct new methods for people to build awareness, educate themselves, and develop the ability to dissent. Public education will need to bring to light the true contours of the emergent InfoLandscapes, some key aspects of which I have attempted to highlight in this essay. It will also likely involve developing awareness of all our information platforms as multi-sided marketplaces, building a growing compendium of the informational dark patterns at play, developing informational diets and new ways to count InfoCalories, and bringing antitrust suits against the largest reality brokers. Watch these spaces.


Many thanks to Angela Bowne, Anthony Gee, Katharine Jarmul, Jamie Joyce, Mike Loukides, Emanuel Moss, and Peter Wang for their valuable and critical feedback on drafts of this essay along the way.


Footnotes

1. A term coined in 1992 by the playwright Steve Tesich; it was the Oxford Dictionaries 2016 Word of the Year (source: Post-Truth and Its Consequences: What a 25-Year-Old Essay Tells Us About the Current Moment)
2. See Benedict Anderson’s Imagined Communities for more about the making of nations through shared reading of print media and newspapers.
3. I discovered this reference in Fred Turner’s startling book From Counterculture to Cyberculture, which traces the countercultural roots of the internet to movements such as the New Communalists; many tech pioneers thus had a vision of the web as “a collaborative and digital utopia modeled on the communal ideals” and “reimagined computers as tools for personal [and societal] liberation.”
4. There is a growing movement recognizing the importance of information flows in society. See, for example, OpenMined’s free online courses which are framed around the theme that “Society runs on information flows.”
5. Think Twitter, for example, which builds communities by surfacing specific tweets for specific groups of people, a surfacing that’s driven by economic incentives, among others; although do note that TweetDeck, owned by Twitter, does not show ads, surface tweets, or recommend follows: perhaps the demographic that mostly uses TweetDeck doesn’t click on ads?
6. Having said this, there are some ethical constraints in the physical publishing business; for example, you can’t run an ad for a product across from an article or review of that product. There are also forms of transparency and accountability in physical publishing: we can all see what any given broadsheet publishes, discuss it, and interrogate it collectively.
7. Related concepts are the digital tribe, a group of people who share common interests online, and the memetic tribe, “a group of agents with a meme complex, or memeplex, that directly or indirectly seeks to impose its distinct map of reality—along with its moral imperatives—on others.”
8. Is it a coincidence that we’re also currently seeing the rise of non-linear note-taking, knowledge base, and networked thought tools, such as Roam Research and Obsidian?
