Preface: This post is full of links to material far superior to my own. The reader is encouraged to follow these rabbit trails to appraise more of the context underlying my assertions.
In 2008 Clay Shirky gave a much-vaunted talk (transcript available here) on his concept of cognitive surplus. In setting up the premise of his eventual book, Shirky described decades of watching television as a way to manage the excess of free time that came with the post-WWII American economy and culture.
And what did we do with that free time? Well, mostly we spent it watching TV. We did that for decades. We watched I Love Lucy. We watched Gilligan’s Island. … We watched Desperate Housewives. Desperate Housewives essentially functioned as a kind of cognitive heat sink, dissipating thinking that might otherwise have built up and caused society to overheat. – Clay Shirky
It’s a novel description, and not without critics. Still, I think we are at the point where we can observe a shift in this “heat sink” or time-cooling role (literally, the “chill” time) from television to social media, and to Facebook specifically. While user activity on the two platforms differs profoundly, the overall effect, the dissipation of the cognitive surplus rather than its productive use, is becoming increasingly similar.
If you define this cognitive surplus in purely economic terms, you arrive back at the one-dimensional value of TV’s audience commodification: audience attention cultivated by media companies and sold to advertisers. Yet this surplus was never simply dissipated. Rather, it was cultivated by parties willing to pay to send mass-mediated messages to a growing pool of consumers. The sitcom and other content served to assemble the “factory in the living room.”
The single most valuable commodity in the media environment is attention—that set of intellectual processes that converts raw data into something useful. – Ed Shane
Today Facebook increasingly funnels excess audience attention and activity toward value-generating “features,” cultivating a far more valuable commodity for advertisers than television is capable of. It’s one reason why television is slowly shifting from a commercial platform to a paid-content platform (see Netflix, Amazon, Hulu, Apple, HBO, and others entering the content business).
In many ways, Facebook consumes and dissipates more than leisure time alone. It also colonizes formerly offline social and cultural activities (arts and culture, journalism, entertainment, civic and political engagement, personal relationships), shifting them into its controlled digital context, where they can be cultivated and more thoroughly commodified.
Nearly ten years on from Shirky’s cognitive surplus hypothesis, the rosy promises of Web 2.0 and user-generated content have given way to a social media reality in which the dissipation (or depletion) of that surplus is far greater and more complete.
While this may not be true for everyone, a subjective appraisal of most people’s Internet behavior seems to point in this direction. For many, being on the Internet now equals being on Facebook. Generating content equals sharing content on Facebook.
No doubt, Facebook provides features that encourage productive kinds of usage (Notes, Instagram, Facebook Live, groups, and so on), but it then controls the newsfeed algorithmically to better cultivate its mass user commodity. These features tend to shift the value generated by users toward Facebook’s primary goal of profitability. All other goals, the public good included, are secondary. Any instance where profitability and the public good are at odds presents a dilemma.
We’ve recently seen lauded moves by Facebook to eliminate fake news and spam on the billion-strong platform, but only after a backlash and the risk that the phenomenon would cause users to mistrust the platform as a whole. There’s been no talk of barring the mercenary, paid use of the platform as a political profiling and propaganda tool. Recent stories here and here discuss the data-profiling techniques that marketing and political analytics firms are employing, techniques that have been largely unknown to users. For example, would you willingly take a fun quiz posted to Facebook if you knew it was being used to build a political and psychographic profile of you, and subsequently to target you with propaganda custom-made to influence you?
Not only can psychological profiles be created from your data, but your data can also be used the other way ‘round to search for specific profiles: all anxious fathers, all angry introverts, for example—or maybe even all undecided Democrats? Essentially, what Kosinski had invented was sort of a people search engine. – Hannes Grassegger and Mikael Krogerus.
In one sense, the cognitive surplus of Facebook’s user base is being offered up to a kind of dark market of big-data analytics firms. Some may use it to sell you stuff. Some may use it to sway elections. The key here is that we’re not talking about the profiteering purveyors of fake news. We’re talking about those who pay to access the data and the advertising platform: Facebook’s real customers.
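To make the idea of a “people search engine” a little more concrete, here is a minimal, purely illustrative Python sketch. Everything in it is my own assumption: the trait names borrow the familiar “Big Five” personality model, and the scores, thresholds, and profiles are invented, not drawn from any actual analytics system.

```python
from dataclasses import dataclass

# Hypothetical psychographic profile inferred from quiz answers, likes, and shares.
# The scores and thresholds below are invented for illustration only.
@dataclass
class Profile:
    user_id: str
    neuroticism: float      # 0.0 to 1.0
    extraversion: float     # 0.0 to 1.0
    is_parent: bool
    party_affiliation: str  # e.g. "undecided", "democrat", "republican"

def find_segment(profiles, predicate):
    """Return the subset of profiles matching an arbitrary targeting rule."""
    return [p for p in profiles if predicate(p)]

if __name__ == "__main__":
    pool = [
        Profile("u1", neuroticism=0.82, extraversion=0.30, is_parent=True,  party_affiliation="undecided"),
        Profile("u2", neuroticism=0.35, extraversion=0.75, is_parent=False, party_affiliation="democrat"),
        Profile("u3", neuroticism=0.71, extraversion=0.22, is_parent=False, party_affiliation="undecided"),
    ]

    # "All anxious parents": high neuroticism plus parenthood.
    anxious_parents = find_segment(pool, lambda p: p.neuroticism > 0.7 and p.is_parent)

    # "All anxious, introverted undecideds": the kind of narrowly sliced audience
    # a political buyer could then target with tailored messaging.
    anxious_introverted_undecideds = find_segment(
        pool,
        lambda p: p.neuroticism > 0.7
        and p.extraversion < 0.3
        and p.party_affiliation == "undecided",
    )

    print([p.user_id for p in anxious_parents])                 # ['u1']
    print([p.user_id for p in anxious_introverted_undecideds])  # ['u3']
```

The point is not the code itself, which is trivial, but how trivially a pool of profiles, once extracted, can be sliced into narrowly targetable audiences.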
Facebook founder and CEO Mark Zuckerberg recently released what he called a “manifesto.” Claiming that “Facebook stands for bringing us closer together and building a global community,” he rightly expressed dismay at current global trends toward isolation and xenophobia. His answer is to make Facebook a positive force in unifying communities and weaving a stronger social fabric. It is an inspiring and hopeful piece of writing.
For the past decade, Facebook has focused on connecting friends and families. With that foundation, our next focus will be developing the social infrastructure for community—for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all. – Mark Zuckerberg
I have experienced the positive and empowering value of Facebook’s features. In a dramatic way, they allowed my family to share our experience and draw on community support during my son’s life-threatening illness and bone marrow transplant. But I also encountered the data value-extraction process at work. The visibility of our posts was subject to the algorithmic gatekeepers, and topics mentioned in our posts inevitably shaped the sponsored content shown to us and to others following our son’s support page. While Facebook was an amazing social networking tool for us, it was also frustrating to learn that our content, and our access to others on the network, was controlled and exploited outside of our awareness.
Zuckerberg repeatedly mentions the “social fabric” in his treatise. This implies a social contract. The thing is, extracting financial value from social fabric is not without ramifications. For Facebook to meaningfully embrace its manifesto, Zuckerberg must be willing to shape the platform in a way that empowers citizens rather than further commodifying them without their awareness or consent. Every business has a right to make money, Facebook included. But when your “product” is the cumulative psychographic profiles and social graphs of more than 1 billion people, how you make your money matters. There has to be an ethical framework beyond Pollyannaish rhetoric.
While regarded as a misnomer in many ways, the “Dark Ages” that followed the fall of the Roman Empire were characterized by a decline in writing, general illiteracy, and the loss, for a time, of the ancient learning that had formerly been centralized and accessible within the empire. The medieval church filled this vacuum, instituting power and control over people’s spiritual lives, and became increasingly corrupt as it found new ways to exploit the faith of a largely illiterate population for profit. Similarly, social media has ushered in a kind of Dark Age following the wide-open promise of the Internet’s early days and the post-2000 epiphany of social software. To be clear, the whole of the Internet is still there, but people’s experience of it has dramatically shifted toward something biased by commercial and partisan interests, fostering a similar user illiteracy. Those interests are most concerned with commodifying a social media discourse that sorts, separates, and quantifies people, not one that “brings humanity together,” as Zuckerberg opines.
Zuckerberg writes at length about two critical topics that are directly impacted by the commodification of Facebook’s users: an informed citizenry (read: strong journalism) and civic engagement (read: participating in democracy and voting).
…even if the received opinion be not only true, but the whole truth; unless it is suffered to be…vigorously and earnestly contested, it will, by most of those who receive it, be held in the manner of a prejudice, with little comprehension or feeling of its rational grounds. – John Stuart Mill
This may be a gross oversimplification, but I don’t believe Facebook will ever manage to serve two masters equally: an advertising and market-analytics marketplace generating profit for shareholders (making money hand over fist) and a marketplace of ideas serving the public good.
The very nature of democracy is bound up in human deliberation and choice-making. People sort through the available options for the betterment of themselves and society. This is also the very nature of the data extracted from users: profiles representing the sum of human choices. As that data is extracted and manipulated, the very nature of democratic discourse is influenced and altered. In one sense, the “social graph” is being gerrymandered and exploited.
Is this being done equally by all sides, political parties and other interests alike? Who knows? And that’s the point: it happens in the dark. There is no activity page a user can visit to see who is mining their cumulative data (posts, shares, likes, reactions, comments) on an ongoing basis. There are no required identifiers beyond the “sponsored” label on newsfeed posts. Opt-ins, opt-outs, and privacy settings are selectable to a point, but blindly so. The “free lunch” of the platform requires that users remain illiterate about how their data is being commodified, and for what purposes or interests. Since Facebook’s IPO, the social contract has rapidly evolved from tolerating simple ad inserts in the newsfeed to this vast and hidden data-extraction marketplace.
If anything, Facebook’s transparency is available only to paid interests, as is made evident in this recent article in the NY Times.
“Facebook’s actions on media transparency are a positive step forward, particularly coming from one of the largest media players in the industry,” Mr. Pritchard said in an emailed statement. Procter & Gamble was “encouraged by the responsiveness and leadership Facebook is demonstrating, and we hope it builds more momentum to create a clean and productive digital media supply chain.”
According to the article, Facebook raked in $27.6 billion in revenue in 2016, an increase of more than 50% over the prior year.
For Facebook to truly commit to its manifesto, it must rethink where profitability ends and the public interest begins in social media discourse. For a marketplace of ideas to truly flourish, there must be a free and open market. We don’t have that when a dark market serving paid political and profit-driven interests operates beneath the surface of our social fabric.
Perhaps Facebook needs a users’ bill of rights with an establishment clause of sorts: “[Facebook] shall make [or allow] no [user data feature] respecting an establishment of [private business or political party], or prohibiting the free exercise thereof….” In other words, transparency with regard to all sponsored content and data mining.
If it truly wishes to build an informed and civically engaged community, Facebook needs to literally encode this ethic into the platform itself, and that will require some serious soul-searching about its conflicting motivations.
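What “encoding this ethic into the platform” might look like in practice is open to debate. As one purely hypothetical sketch (every class, field, and name below is my own invention, not an existing Facebook API), imagine a per-user audit log that records each third party’s access to a user’s data, so that the transparency argued for above becomes a queryable feature rather than a promise:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

# Hypothetical audit record: who accessed which categories of a user's data, and why.
@dataclass
class AccessRecord:
    accessor: str                 # e.g. an advertiser or analytics firm (invented names below)
    purpose: str                  # e.g. "ad targeting", "psychographic modeling"
    data_categories: List[str]    # which parts of the user's data were used
    timestamp: datetime = field(default_factory=datetime.utcnow)

# Hypothetical per-user audit log, exposed as an "activity page" the user can read.
@dataclass
class DataAccessLog:
    user_id: str
    records: List[AccessRecord] = field(default_factory=list)

    def record_access(self, accessor, purpose, data_categories):
        self.records.append(AccessRecord(accessor, purpose, list(data_categories)))

    def report(self):
        """Plain-language summary a user could review at any time."""
        return [
            f"{r.timestamp:%Y-%m-%d}: {r.accessor} used your "
            f"{', '.join(r.data_categories)} for {r.purpose}"
            for r in self.records
        ]

if __name__ == "__main__":
    log = DataAccessLog("user-123")
    log.record_access("ExampleAnalyticsCo", "psychographic modeling",
                      ["likes", "shares", "quiz answers"])
    log.record_access("ExampleAdvertiser", "ad targeting", ["posts", "reactions"])
    for line in log.report():
        print(line)
```

Whether anything like this would ever ship is, of course, exactly the question of conflicting motivations raised above.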