Two-Facebook: Why a new mission won’t escape the business plan.

In the early media tumult of the Trump administration, New York Times columnist Frank Bruni gave a lecture entitled “Media in the Age of Misinformation” for the March 21 Westminster Town Hall Forum in Minneapolis (a regular series sponsored by Westminster Presbyterian Church and aired by Minnesota Public Radio).


Bruni spoke emphatically as a journalist about the sad state of political discourse and the hyper-polarized and increasingly disinformed news audience. Echoing what many others have said on the topic, Bruni expressed dismay over the concurrent crises in journalism and democracy in America, pointing out the technological forces that gave them rise:

[Fake news and alternative facts] only matter and only have currency because our changed media landscape is the soil in which they grow. Fake news wouldn’t be able to lay down roots and alternative facts wouldn’t flower if there weren’t all these tiny, ideologically peculiar patches of land that Americans have created for themselves and fenced off from countervailing influences.

Bruni decidedly pointed a finger at human consumer behavior and the kind of information technology marketplace it has created: “Instead of taking advantage of the limitless variety these advances can make available, we use them to collapse our worlds into a single manner of feeling, a single mode of being, and often a single method of thinking. What is happening with culture is happening with the news. You pick what suits your taste, and in this case that means what validates and echoes and amplifies your existing beliefs and established biases.”

This is the bias of the medium. Your digital profile in social media is designed to deliver the most easily targetable market of one. There’s no desire for nuance, ambiguity, or dialectical tension. There’s very little use for the qualitative and subjective nature of a human being. You are a record in a database, and you keep updating your file with quantitative data with every like, emoji, share, click, comment, post, etc. “On Facebook, what they like and share today shapes what they see tomorrow, which means more of the same.”
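The "record in a database" idea can be made concrete with a minimal sketch. All class and field names here are hypothetical illustrations, not Facebook's actual schema: every action is stored only as a count, and the accumulated counts drive what is shown next.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class UserRecord:
    """A hypothetical profile row: a person reduced to counters."""
    user_id: int
    engagement: Counter = field(default_factory=Counter)      # e.g. {"like": 12, "share": 3}
    topic_affinity: Counter = field(default_factory=Counter)  # e.g. {"politics": 9}

    def log_action(self, action: str, topic: str) -> None:
        # Each act of agency is stored as a number, not as nuance or context.
        self.engagement[action] += 1
        self.topic_affinity[topic] += 1

    def top_topics(self, n: int = 3) -> list:
        # "More of the same": the feed favors whatever was engaged with most.
        return [topic for topic, _ in self.topic_affinity.most_common(n)]

u = UserRecord(user_id=1)
u.log_action("like", "politics")
u.log_action("share", "politics")
u.log_action("like", "sports")
print(u.top_topics())  # politics ranks first
```

Nothing in such a record can hold ambivalence or a change of heart; it can only increment.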

Bruni rightly identified the dangers of a media landscape (social media foremost) that is designed to give you more and more of what it thinks you want. If that were the problem by itself, I’m confident we could, in consumer-like fashion, take steps to improve the situation by patronizing competitors that address it (ultimately this was Bruni’s call to action). But the underlying commercial economy of social media, the very essence of what is tapped to produce value, now works against this.

Bruni continued: “…where you’ll most quickly lose the ability to relate to or see the possible validity of someone else’s perspective, because that perspective is thriving in its own, separate cocoon. There’s no overlap between yours and theirs.” He concluded, “…democracy depends on this overlap.”

The Socio-Commercial Media Platform as a Controlling Matrix

The scene from 1999’s The Matrix most memorable to me is when Neo takes the red pill, is expelled from his power-producing pod covered in gelatinous goop, and is rescued by the crew of the Nebuchadnezzar as it cruises the bleak “desert of the real.”

Have you ever had a dream, Neo, that you were so sure was real? What if you were unable to wake from that dream? How would you know the difference between the dream world and the real world? – Morpheus in The Matrix

The matrix is a computer-generated dream world built to keep us under control in order to change a human being into [a power source]. – Morpheus in The Matrix

One of the plot holes never fully resolved is the description of the matrix as a shared collective reality constructed by the machines. Nothing is real in the matrix except the consciousness of other humans experiencing it. More to the point, the matrix is a singular virtual reality (referred to as “a harmony of mathematical precision”) designed to control the human race and keep it blissfully unaware that its real function is to generate electricity for the machines after a war that made solar energy impossible.

Choice. The problem is choice. – Neo in The Matrix Reloaded

…nearly 99% of all test subjects accepted the [matrix] program, as long as they were given a choice…even if they were only aware of the choice at a near unconscious level. While this answer functioned, it was obviously fundamentally flawed…. – The Architect in The Matrix Reloaded

Of course, the problems with the integrity of this matrix become the heart of the story, raising obvious questions: Why bother allowing humans any conscious contact with one another? Why not firewall each mind in its own personalized mental matrix?

That would solve the central problem that fuels the conflict narrative in the film. It is also essentially what social media platforms are doing to cultivate and harness the value of their users: cocooning them in realities of their own making. And it is why resolving the political maladies we’re now confronting will require challenging the commercial economy that underpins social media, Facebook’s above all.

Man on a Mission

Bruni noted in his talk that even the New York Times is tinkering with how it delivers its news in deference to this habit of filtering the world and picking and choosing information sources. He quoted a March 18, 2017 column by New York Times public editor, Liz Spayd, stressing the importance of maintaining a shared experience of the news in the midst of tailoring content for the reader: “Scholars of mass media long ago established the theory that part of a society’s bond comes from the shared experience of consuming the same news. We shape our worldview, our opinions — however different they are from one another — after reading about and watching many of the same things. We gain a sense of community, however false or fleeting.”

The sense of community has become of utmost importance to Facebook CEO Mark Zuckerberg, who announced with great fanfare in a June interview that the company’s responsibility has “expanded.” Yet what Zuckerberg is now making the mission of Facebook, to “bring the world closer together” by building community, is simply not present in the source code of Facebook. Quite the opposite, in fact, now that the profit engine of the platform is the collection and sale of user profile data. Its original mission to “make the world more open and connected” was the diplomatic and high-minded way to describe the platform as it was, and still is, designed.

Overall, Bruni piercingly outlined the effects of how we now mediate the news (and our democracy, for that matter) but didn’t expose (or see) the underlying driver beneath the human behaviors: the commercialization of social software platforms and the mediation of news media across those platforms. This becomes clear when he ends his talk by emphasizing the thought he has put into what the news media must do to confront this environment, then places the real hope for change in the hands of his audience, which he calls “consumers of news.” Essentially washing his hands of responsibility, he says that he can’t tell people what to consume or where to get it, but he can warn them about the outcomes of fueling this increasing tribalism. He firmly believes the news media, driven by profit motivations, will respond with a better diet of news once the audience starts to seek healthy alternatives. Yet that accounts for only two dimensions of a three-dimensional problem. The consumer and the news media economy now interact within a platform with its own economic dynamic, one that has essentially swallowed the old media economy whole. Journalist and consumer alike find themselves in the belly of the new beast.

Facebook most recently celebrated its two-billion-user milestone, corresponding with the new mission, by creating customized videos incorporating photos and user like/reaction data that users could “share” with their friends. It’s fascinating to break down the messages in this video:

“It all adds up.”

“Whether sharing a moment, being part of something, or giving some love.”

“Loves you’ve given: ###”

Right up front you get the core of Facebook’s user-side value creation: proactive data contribution (sharing content), sociability (online group/event involvement), and the reactive data of processing and adjudicating other users’ content in the newsfeed (using reaction features such as likes, loves, etc.). It even displays the tally to make the user feel good.

“The little things become not so little.”

“Today you’re a part of 2 billion people on Facebook.”

“But it’s not really about the number.”

“It’s about what all of us can do together.”

While the second part of the video seems to discount the importance of the first, the push toward online community doesn’t fundamentally change anything. None of those objectifying features have been eliminated or changed. Instead, any new community-building features will simply be additive (driving new categories of profitable data). Clearly Zuckerberg is convinced Facebook can create an online experience that both drives meaningful community building and creates even more dynamic data profiles for greater profit.

But online community does not equate with real community (even in VR), just like Facebook friends do not equate with real friends (nor online church with real church for that matter).

“Thanks for being here.”

“From all of us at Facebook.”

At several points over the hours and days after the custom video was created, Facebook prompted me to share it with my friends. Clearly that was the preferred behavior: exercise my agency and thereby add another entry to the database.

What’s in It for Facebook?

Since it is defined as “social media,” it’s prudent to ask how our sociability is mediated, and to what end. Answering these questions brings a better understanding of why Facebook can never truly deliver on the promise to build a meaningful “global community that works for everyone.”

Let’s begin by noticing the obvious fact that nobody pays a fee to use social media. People are essentially granted free software and network storage to store and share words, images, and videos and to manage their network of human connections, while increasingly consuming news and information within a single platform (as opposed to visiting a number of separate websites). Nobody pays, and no one needs to, because, as Apple CEO Tim Cook pointed out in 2014 of free Internet service models like Google and Facebook, the user is the product.

At the heart of Facebook’s value generation is a database with an insatiable appetite for more data. The software, first designed simply to connect people and open them up for online sociability (see Facebook’s first mission statement), has largely evolved to serve that appetite in the enormously successful effort to monetize the platform. Features are always developed to promote quantifiable user behaviors, the juicy stuff of consumer judgment that can drive more granular data profiles. The filtering and isolating bias of the platform emerges not from the client side (the users) but from the server side (the code and the database of users). Lest one forget, the first iteration of Zuckerberg’s social software, Facemash, was simply a Harvard version of “Am I Hot or Not” that invited people to compare photos of students and competitively rate them.

Derek Schuurman, computer science professor at Redeemer University College in Ontario, notes that all technological artifacts have embedded values that can push human beings in less obvious directions. In his 2013 book, Shaping a Digital World, he quotes George Grant’s Technology and Justice from 1986 on the specific way computer technology does this:

It is clear that the ways computers can be used for storing and transmitting information can only be the ways that increase the tempo of the homogenizing process. Abstracting facts so they can be stored as information is achieved by classification, and it is the very nature of classifying to homogenize. Where classification rules, identities and differences can appear only in its terms.

Facebook comprises computer software, a database, and networking technology as the means of social mediation. At the same time, its database primarily functions as a means to commodify human data, generating economic value by selling this data to interested clients (businesses, political entities, etc.). As a public company, Facebook has this as its singular profit orientation.

Today there are two billion users of a computer platform that converts human subjectivity into a quantifiably objective and saleable product. The side effects, as noted by Frank Bruni, and which I contend are the strange fruits of this process, are increasingly insular, polarized, and tribal users, and an increasingly ineffectual democracy in America.

Schuurman notes that computers, and by extension databases, must convert user input into a form that can be catalogued, classified and stored: “That very process limits the range of possibilities for information that is stored…. Storing data in a computer requires quantification, and one issue with quantification is that it reduces things to ‘what can be counted, measured and weighed,’” (Charles Adams, as quoted in Schuurman).

The primary feature set of Facebook compels the classification and homogenizing of human relational communication. First, we establish a network of friends by “friending” others, or not doing so. We “follow” the activities of others, or can “unfollow.” While our newsfeeds are now curated by algorithms that simultaneously quantify us and serve us paid messages, we can “like” or react in one of five preset ways (producing stunningly usable data for Facebook’s database clientele). Every click is a choice, and the choice making feeds the quantification process. Whether we proactively post our own content, or reactively comment on or “share” the content of others, every action is a human judgment producing more data.
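Grant's point about classification can be sketched in a few lines. This is an illustration under assumptions, not Facebook's code: a subjective human response becomes storable only once it is forced into a small preset vocabulary, and anything outside that vocabulary simply has no column to live in.

```python
# The preset reaction vocabulary assumed for this sketch.
REACTIONS = {"like", "love", "haha", "wow", "sad", "angry"}

def record_reaction(db_row: dict, reaction: str) -> dict:
    """Store a reaction as a count; reject anything unclassifiable."""
    if reaction not in REACTIONS:
        # Ambivalence, irony, or a shrug cannot be stored at all.
        raise ValueError(f"unclassifiable response: {reaction!r}")
    db_row[reaction] = db_row.get(reaction, 0) + 1
    return db_row

row = {}
record_reaction(row, "love")
record_reaction(row, "love")
record_reaction(row, "angry")
# row is now {"love": 2, "angry": 1}; where classification rules,
# identities and differences can appear only in its terms.
```

The homogenizing step is the `if reaction not in REACTIONS` gate: the database never sees what a person actually felt, only which bucket the feeling was pressed into.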

The Profits of Hacking Pride & Envy

The human propensity to compare and judge others objectively over and above encountering others subjectively has deep existential and spiritual roots. In Repenting of Religion, Greg Boyd writes, “The one doing the judging is separating himself or herself from and placing himself or herself above the one being judged.” Perhaps for some, Facebook’s addictive nature lies in how it provides a platform, “to experience worth for oneself by detracting it from others.”

Boyd’s text draws heavily on Dietrich Bonhoeffer, who wrote in Ethics, “Judgment passed on another man always presupposes disunion with him.” In Cost of Discipleship Bonhoeffer takes this idea a step further: “When we judge other people, we confront them in a spirit of detachment, observing and reflecting as it were from the outside.” Think: Facemash.

This mode of human behavior does indeed drive the maladies that Bruni and so many others in the news business now decry. For example, the research of Harvard law professor and author Cass Sunstein shows that when people discuss contentious issues with like-minded people, their views become more homogeneous and amplified.

David Simas, assistant to President Obama in charge of 2016 campaign outreach, speaking to David Remnick for a November 28, 2016 article in the New Yorker (and cited by Bruni in his talk), pinned the sad state of political journalism and discourse in general on the rise of the Internet and the decline of institutions invested in binding people together rather than splintering them into interest groups, confirming Sunstein’s findings:

Until recently, religious institutions, academia, and media set out the parameters of acceptable discourse, and it ranged from the unthinkable to the radical to the acceptable to policy.

The continuum has changed [and through] social media, you can find people who agree with you, who validate these thoughts and opinions. This creates a whole new permission structure, a sense of social affirmation for what was once thought unthinkable.

This points to how socio-commercial media, without any specific political or cultural bias, shapes and directs user attitudes and behaviors.

“At Facebook’s scale, behavioral targeting doesn’t just reflect our behavior, it actually influences it,” writes columnist and software engineer Jon Evans for TechCrunch. “Over time, a service which was supposed to connect humanity is actually partitioning us into fractal disconnected bubbles.”

Everything about Facebook as a medium (and this is where we get to the heart of its embedded value system) reinforces, cultivates, and corresponds to the judgmental tendency in human behavior in order to generate more data and profit. Through the lens of social database software, what we judge is either someone other than ourselves or the content they mediate using the platform. It’s this very activity that generates value for Facebook and its shareholders.

For Zuckerberg, this is where mission must defer to a business plan that works against the kind of positive community he envisions. Evans continues: “This eventually constructs a small ‘in-group’ cluster of Facebook friends and topics that dominate your feed; and as you grow accustomed to interacting with them, this causes your behavior to change, and you interact with them even more, reinforcing their in-group status … and (relatively) isolating you from the rest of your friends, the out-group.”
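The in-group dynamic Evans describes is a reinforcement loop, and a toy simulation can show how it concentrates a feed. Every parameter here is invented for illustration: the feed shows friends in proportion to past engagement, and each appearance raises that friend's future weight.

```python
import random

random.seed(0)
friends = list(range(20))
affinity = {f: 1.0 for f in friends}  # everyone starts with equal exposure

for day in range(200):
    # The feed samples friends in proportion to accumulated affinity...
    weights = [affinity[f] for f in friends]
    shown = random.choices(friends, weights=weights, k=5)
    # ...and each interaction reinforces that friend's future weight,
    # the rich-get-richer step that builds the in-group.
    for f in shown:
        affinity[f] += 0.5

top5 = sum(sorted(affinity.values(), reverse=True)[:5])
share = top5 / sum(affinity.values())
print(f"Top 5 of 20 friends now receive {share:.0%} of total affinity")
```

Under uniform exposure the top five friends would hold exactly 25% of the weight; with reinforcement the loop drifts well past that, relatively isolating the rest, the out-group, from the feed.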

Blissful Ignorance Proves Too Costly

At a pivotal moment in The Matrix, the character Cypher cuts a deal with agent Smith, playing Judas by offering up his captain Morpheus in exchange for re-entry into the matrix, his real existence too bleak and desperate to endure any longer.

Cypher: “You know, I know this steak doesn’t exist. I know that when I put it in my mouth, the Matrix is telling my brain that it is juicy and delicious. After nine years, you know what I realize? Ignorance is bliss.”

Agent Smith: “Then we have a deal?”

Cypher: “I don’t want to remember nothing. NOTHING. You understand? And I wanna be rich. You know, someone important. Like an actor.”

Unlike Cypher, Facebook users at present are in no position to negotiate a better deal in exchange for the colonization of their relational lives, their “engagement” on the platform, and the commensurate societal side effects.

Clearly the “free” price of socio-commercial media is proving to be too costly for a vibrant democracy and most conducive to the plutocracy that has slowly replaced it.

Author: toddwold

Todd Wold is an Assistant Professor of Communication at Asbury University School of Communication Arts in Wilmore, Kentucky, and a Ph.D. candidate (ABD) at Regent University at Virginia Beach, Virginia. His research interests include the political economy of social media and crowd patronage platforms, the digital displacement of faith practices and authority in church communities, and transcendence in filmmaking.
