A New Social Contract

On Christmas Eve of 2014, Eric Meyer wrote a blog post on his personal website detailing his Facebook Year in Review video. Like most Years in Review, it displayed pictures of loved ones, algorithmically selected, over upbeat music; unlike most, it featured pictures of his deceased daughter, Rebecca, who had died of brain cancer on June 7 of that year, her sixth birthday. In his essay, Meyer wrote: “I know, of course, that this is not a deliberate assault.” Still, he warned of the importance of investigating ‘edge cases’ when companies decide to create highly personalized content such as this.

Meyer’s piece made the rounds on the Internet of 2014, two years after Facebook’s IPO and the same year as its acquisitions of WhatsApp and Oculus. Facebook in 2014 was just beginning the massive growth toward the number-eight spot in market capitalization it enjoys today; on April 30 of that year, Facebook hosted its fifth f8, a roughly annual developer conference held around the Bay Area. Meyer’s essay became so popular that he wrote a follow-up post clarifying his original intent: the people at Facebook are not heartless ‘brogrammers’, Facebook is far from the only company that makes such data-driven mistakes, and so on. Rather ominously, at that year’s f8 in San Francisco, Facebook acknowledged the steady unease among consumers regarding the proliferation of their data: “We’ve heard from people that they are worried about sharing information with apps, and they want more control over their data. We are giving people more control over these experiences so they can be confident pressing the blue button.” Misuse of Facebook information by third-party apps is by now a familiar refrain; the problem of the Year in Review video, however, persists. Meyer dubs it ‘inadvertent algorithmic cruelty’, suggesting that the main issue with the video is the lack of an option to opt out. But what about the option to opt in?

To retroactively focus on Facebook is partly an exercise in recency bias, as the company only now seems to be recovering from the major drop in stock price that followed the Cambridge Analytica disaster. Facebook is by no means the only large tech firm to have had major PR issues regarding data abuse; Google CEO Sundar Pichai was summoned to a legislative hearing of his own in December of last year, where he was grilled on a broad yet focused, at times seemingly common-sense, array of questions by a mostly boomer House Judiciary Committee. The hearing was expectedly partisan: much of the concern on the right was about possible bias in Google search results, while much of the concern on the left went toward battling that assertion, insisting that search engines are democratic, that a majority information vote decides your popular standing. The hearing was also expectedly frustrating for a relatively tech-savvy audience to watch; NowThis News posted an edit of the hearing titled “Congress Was Confused by the Internet During Hearing With Google CEO”. Rep. Ted Poe (R-TX) is a main feature, asking Pichai whether Google can track his movements across the congressional hall. Pichai struggles to explain, between Poe’s interjections, that it depends on whether Poe has opted in to certain Google software. Other features include Rep. Lamar Smith (R-TX), who insists that there is human manipulation of search results, and Rep. Steve King (R-IA), who confuses Apple and Android mobile devices.

Toward the end of the hearing, however, Rep. Karen Handel (R-GA) asked the following:

“For years, the FTC on a bipartisan basis has affirmed that precise geolocation information is considered highly, highly sensitive, and that consumers must opt in to that… Do you think there is other information, privacy information, of consumers that should also be required to have opt in vs. opt out?”

Pichai responds to Handel: “In general, I think a framework for privacy in which users have a sense of transparency, control, and choice, and a clear understanding of the tradeoffs they need to make I think is very good for consumers, and we would support that.” Presumably, Pichai is concerned for the consumer, and Poe simply does not understand how privacy settings work. Fatally, the same issue of consumer choice arises. Did you opt out? Equally important: did you opt in? Increasingly, the choices consumers are presented with regarding data are contingent on their knowledge of what their data signifies and how it is collected. Rep. Steve Cohen (D-TN) admits that he uses Google services often but is so confused about how to turn off location services that he wants Pichai to consider an online help center for users like him. Even if Representative Cohen’s best efforts were fruitful, however, this congressional hearing was called in part because of an Associated Press report released in August of the same year, which found that, contrary to common understanding, Google still collected location data even with the Location History privacy setting turned off. Is confusion, then, so unexpected?

Pichai’s hearing and Zuckerberg’s own Senate hearing centered on the services provided by their respective companies, and the questions asked of both men were not existentially threatening; instead, they mirrored a familiar situation from 2008. Though the conditions are not quite as dire as those of the Great Recession, there is a similar significant reckoning: our fabric, then financial and now also social, is too dependent on too few private, unregulated companies. Sen. Bill Nelson (D-FL) asks, incredulously: “So your chief operating officer, Ms. Sandberg, suggested…that Facebook users who do not want their personal information used for advertising might have to pay for that protection. Pay for it. Are you actually considering having Facebook users pay for you not to use the information?” Though Facebook had no such system in place, the very idea of a paid Facebook seemed impossible, antithetical even. It appears that the House and the Senate were not ‘confused’ about the Internet at all; in fact, this group of boomers arrived at the same realization as their younger counterparts: increasingly, many technological services exist no longer as luxuries but as necessities. Computer literacy classes in schools, investment by big tech in lower-income communities, and the extensive PR material released by companies such as Facebook and Google detailing how their products help small businesses all point to the importance of technological fluency for successful financial and social lives. Lost among all the talk of services provided and data protection needed is the recognition that companies such as Facebook and Google, like JPMorgan and Bank of America, may have become too big to fail. The minimum cost of living in a connected, ever-progressing world is your data, and consumers are faced with an increasing number of options to opt out of; one should hope that opting in is part of the equation as well.

So we are faced with a dilemma: to survive, let alone thrive, one must surrender some amount of personal data. The trouble with data is that it is equalizing: though certain demographic groups are more highly valued than others, the disaggregation of demand curves has allowed for almost infinite customization. As technology has advanced, media has advanced in tandem, at first allowing advertising to reach mass audiences, then allowing limited targeting using Nielsen statistics. As the Internet and personal computers moved media closer to the individual than the collective, targeting demographic subsections of mass audiences lost value; highly specific ads could now be served instantly, and invariably, to those most likely to find them relevant. It does not matter whether one understands what can be collected for which purposes; the data is uniquely valuable. Mark Andrejevic, in his seminal 2002 essay “The Work of Being Watched: Interactive Media and the Exploitation of Self-Disclosure”, describes such data production as the ‘work of being watched’. Initially used by Andrejevic to describe users of services such as TiVo, the phrase refers to the self-disclosure of personal information in exchange for convenience or customization. This personal information is labor produced (the consumer stacks shows on the TiVo hard drive) and also labor saved (the consumer’s work reduces the need for demographic surveys, capital expenditures, and more). The work of being watched has only grown exponentially more efficient; we carry media-streaming, geolocating devices in our pockets at every moment of the day. Many have suggested paying consumers for their data. Pay or no pay, our legislators have understood that we have undergone a contractual exchange of service for service, that the service received is becoming ever more crucial to life, and that the service given is becoming ever more expensive. As Andrejevic writes: “A discussion of surveillance might… be couched in terms of conditions of power that compel entry into the digital enclosure and submission to comprehensive monitoring as a means of stimulating and rationalizing consumption.” One can make a strong case that such conditions of power exist. Perhaps, as Andrejevic and Ibarra, Goff, Hernandez, Lanier, and Weyl suggest, we are working multiple jobs in multiple capacities.

Still, there remains an insistence that this sort of deeply personal data can be used for good. Zuckerberg, in his testimony, states that though consumers hate advertisements, they hate irrelevant advertisements even more. The argument has been made that targeted advertisements are mutually beneficial for advertisers and consumers; indeed, as Zuckerberg indicates in his response to Senator Nelson, the majority of Facebook users choose not to turn off advertisement personalization. Perhaps this is, as Representatives Poe, Handel, and Cohen suggest, due to a lack of knowledge. Again, one must choose to opt out, but not to opt in. Or perhaps the data exchange really is worth the services. Facebook has certainly played its part in trying to convince its users that it is: like a corpse reanimated, our own data is used to conjure a cobbled, digitized self in the form of Memories or Years in Review. For a moment, it works. We are dragged by the ever-pulling power of the world’s largest archive of persona, our past selves weaponized against our current ones. We need only recall Rebecca Meyer to break the siren call and remember that products of our data, no matter how highly personalized, are almost always highly decontextualized. “The design is for the ideal user, the happy, upbeat, good-life user. It doesn’t take other use cases into account,” Meyer writes. Perhaps the exchange is not worth the services; it seems it will persist nonetheless.