What Would You Pay to Keep Your Digital Footprint 100% Private?

Last year, the rather ironic rumor that Mark Zuckerberg covers his laptop camera and mic made the rounds. As Antonio Garcia Martinez points out in Chaos Monkeys, his provocative insider account of Facebook and Silicon Valley, “it’s not the rats who first abandon a sinking ship. It’s the crew members who know how to swim.” Or, in the more famous words of Andrew Grove, the longtime Intel CEO and Silicon Valley pioneer, “only the paranoid survive.”

In an age when we spend a bigger proportion of our waking lives online than offline, we are producing a sea of personal data, and hacking and cybersecurity have become bigger obsessions than ever. “Personal” once meant “private,” but that distinction no longer holds: so many of our personal and intimate interactions (and the data they generate) now exist online, an increasingly public domain. Even when nobody is watching, we are rarely alone.

What if you were given the chance to buy back all the data you left behind, from the first minute you ever spent online to this very moment? What would you pay for it? How much would you give to keep it 100% secure? Before you think of the answer, you may want to consider the following costs:

(1) No more “free” services. We all know the trope “If you are not paying for it, you’re not the customer; you’re the product being sold” (a quote attributed to Andrew Lewis on MetaFilter). But what would actually happen if the email providers, social media platforms, and even search engines we use every day were no longer free? We may prize our privacy and resent these providers’ rapid encroachment into our personal space, yet few mainstream alternatives exist.

It’s not for lack of ideas. Alternative business models have been proposed by tech pioneers such as Jaron Lanier, who suggested person-to-person micropayments as a way to remunerate users for the data they contribute. Imagine having to deposit a small payment every time you sent an email or WhatsApp message (a toy version of this model is sketched below). Could that be the solution to information overload? Another option would be subscription-based versions of the services we use, such as Facebook, in which the fee paid by users secures the privacy of their data, replacing the for-profit sale of that data for advertising purposes.
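
To make the micropayment idea concrete, here is a minimal sketch in Python. It is a hypothetical illustration under our own assumptions: the MicropaymentLedger class, the one-cent fee, and the user names are all invented, and this is one possible reading of Lanier’s proposal rather than anything he has specified.

```python
# A minimal, hypothetical sketch of person-to-person micropayments:
# every message a user sends transfers a small fee to the recipient,
# so generating data (and claiming attention) carries a visible price.
# All names and amounts are invented for illustration.

from decimal import Decimal


class MicropaymentLedger:
    """Tracks user balances and charges a small per-message fee."""

    def __init__(self, fee_per_message: Decimal = Decimal("0.01")):
        self.fee = fee_per_message
        self.balances: dict[str, Decimal] = {}

    def deposit(self, user: str, amount: Decimal) -> None:
        self.balances[user] = self.balances.get(user, Decimal("0")) + amount

    def send_message(self, sender: str, recipient: str, text: str) -> bool:
        """Deliver the message only if the sender can pay the recipient the fee."""
        if self.balances.get(sender, Decimal("0")) < self.fee:
            return False  # insufficient balance: a built-in brake on message volume
        self.balances[sender] -= self.fee
        self.balances[recipient] = self.balances.get(recipient, Decimal("0")) + self.fee
        print(f"{sender} -> {recipient}: {text}")
        return True


ledger = MicropaymentLedger()
ledger.deposit("alice", Decimal("0.05"))
ledger.send_message("alice", "bob", "Lunch tomorrow?")  # alice pays bob $0.01
print(ledger.balances)  # {'alice': Decimal('0.04'), 'bob': Decimal('0.01')}
```

Even a one-cent fee changes the economics of attention: sending a million unsolicited emails would cost $10,000 up front, which is one way micropayments might double as a brake on information overload.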

The challenge, therefore, isn’t so much logistical as psychological: our data are intangible, and their value is hard to perceive. For most of us, the “free” services we use online fulfill a specific need, namely to connect with others. Whether produced through email, group chats, or social media, the data our interactions generate are not only intangible; their value is also hard for the average consumer to grasp and quantify. While most people would shudder at the idea of a stranger coming into their house to rummage through their photo albums, bank statements, and personal diaries, many of us don’t bat an eyelid when comparable infringements take place online, perhaps because the violation and its consequences are less visible. It may only be when we are confronted with a concrete scenario, such as a data breach that leaks credit card details or private photos, that we recognize the real-world implications of our online activities.

When it comes to social media platforms, even when people are made aware of the risks, their privacy concerns rarely translate into protective behaviors. People increasingly bemoan the creeping infringement of their privacy, yet they still engage in uncensored public self-disclosure and allow companies unprecedented access to their data, a disconnect known as the “privacy paradox.” In short, people’s behaviors suggest they don’t care as much about privacy as they say: although they like the idea of privacy, they don’t seem to value their data enough to take concrete steps to protect it. To be fair, this gap between attitudes and behaviors shows up in almost every area of life. Most people value their health and relationships a great deal, yet they regularly engage in behaviors that put both at risk, even when they are conscious of the potential consequences.

(2) No more personalization. When it comes to online interactions, we’ve grown to expect a degree of personal and contextual relevance, especially in search results and in the services and products we are offered. Eric Schmidt, executive chairman of Alphabet, has noted for years that unlike traditional broadcast media, the internet is a “narrowcasting” medium. This expectation is largely implicit; we may only realize how tailored our services have become (in our searches on Google Maps, say, or the ads we see on YouTube or Facebook) when we change our privacy settings or wipe our cookies and search history.

It is the ease and convenience to which we have become so accustomed that keep us from choosing more private (but clunkier) user experiences. Sure, you could turn off geolocation for a taxi app so that it stops tracking your every move, but having to enter your location manually every time you hail a ride may feel like too much effort to sustain.

There is also a flip side, of course: for instance, when you start receiving ads across different platforms (from social media feeds to on-demand TV and digital radio) for something you searched for, without realizing you were being tracked. This “creepy factor,” the feeling that your every move, both public and private, is being watched, tracked, analyzed, and capitalized on, carries a significant cost known as “psychological reactance”: when we feel that our freedom has been lost or threatened, the motivation to regain it leads us to resist the social influence of others. The result is a sense of violation in the user and a tarnished reputation for companies hoping to use personalization to increase their revenues.

(3) No more instant gratification. The data we generate through everyday interactions can also be used to design services that better leverage our desire for immediacy and convenience. We appear to live in an age when only instant gratification is fast enough. Given our limited attentional capacity in a market that is already time- and attention-poor, services that provide immediate solutions not only feel more rewarding; they also more effectively bypass considered thought. When was the last time you read the terms and conditions of an app you downloaded? When it comes to signing up for a new service, our desire for convenience often trumps rational consideration, and we end up agreeing to have our data used in ways we would never have conceived of. In our eagerness to expend as little time and mental effort as possible, we may be setting ourselves up for considerable risk down the road, whether through the leaking of personal data or through the denial of medical or financial services based on covert inferences about our character collated from our digital footprints.

In our view, we should expect significantly more in exchange for our privacy. There should be more symmetry between the amount of data we give away and the services we receive in return. For example, recommendation engines should actually help us discover what we need, even when we didn’t know we wanted it. To some degree this happens with music and movies, but it is by no means the case when it comes to shopping. By the same token, the same data that are mined for retail advertising could be used to match people with more relevant jobs, since they are indicative of broad psychological traits that have been linked to career outcomes for decades. If our Netflix or Spotify preferences reveal that we are intellectually curious or creative, algorithms could present us with job openings that call for curious and creative employees (a toy version of this idea is sketched below). But the key is that none of these services should be offered without consumer consent (and true awareness), and they should give us back some understanding of how our data are being used, and why. We deserve the right to know what inferences are being made from our data, whether those inferences are accurate or not.
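
To illustrate (not endorse) how such trait-based matching might work, here is a toy sketch in Python. Everything in it is assumed for the sake of illustration: the genre-to-openness weights, the threshold attached to each job, and the job list itself are invented, and a real system would rest on validated psychometric models rather than hand-picked numbers.

```python
# A toy, hypothetical sketch of trait inference from media preferences:
# estimate an "openness to experience" score from the genres a user
# consumes, then surface jobs whose (invented) threshold that score meets.
# Weights, thresholds, and jobs are illustrative, not validated psychometrics.

# Invented mapping from genres to an openness-to-experience signal.
GENRE_OPENNESS = {
    "documentary": 0.9,
    "jazz": 0.8,
    "indie": 0.7,
    "reality-tv": 0.2,
    "top-40": 0.3,
}

# Invented job openings, each tagged with a minimum openness score.
JOBS = [
    {"title": "Data Entry Clerk", "min_openness": 0.0},
    {"title": "UX Researcher", "min_openness": 0.6},
    {"title": "Creative Director", "min_openness": 0.8},
]


def infer_openness(history: list[str]) -> float:
    """Average the openness weights of the genres in a viewing/listening history."""
    scores = [GENRE_OPENNESS[g] for g in history if g in GENRE_OPENNESS]
    return sum(scores) / len(scores) if scores else 0.5  # neutral default


def match_jobs(history: list[str]) -> list[str]:
    """Return job titles whose openness threshold the inferred score meets."""
    openness = infer_openness(history)
    return [job["title"] for job in JOBS if openness >= job["min_openness"]]


print(match_jobs(["documentary", "jazz", "indie"]))
# openness = (0.9 + 0.8 + 0.7) / 3 = 0.8, so all three jobs match
```

Note that the inference here is fully inspectable: a user can see exactly which preferences drove which match. That transparency is precisely what the consent-and-awareness standard described above would demand of real systems.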