Privacy is not a commercial commodity. It’s a fundamental human right. A fundamental human right by definition can’t be bought or sold.
These comments are inspired by a post written a few months ago by Chris Messina, a former Google employee, about privacy and Google+, a post I’ve obviously been thinking about for a while. In that post Messina refers to privacy as a “4-letter word,” and offers a few novel ideas about how privacy might be implemented as a commercial commodity in the 21st century.
Here’s a relevant sample from Messina’s post:
***
This word, privacy — it’s a problem.
It’s one of those words that puts a stop to useful conversations and prevents us from actually engaging with what’s going on in our digital lives. It obscures and glosses over.
WAKE UP!
Maintaining your privacy doesn’t strictly mean keeping people from having data or information about you. Certainly not preventing yourself from having access to data about yourself. Privacy is about the ability to be left alone, or about not being watched, if you don’t want to be. Which is fine. Turn on Do Not Disturb. There — you’ve got a bit of your privacy back. But that has nothing to do with the huge amounts of data you’re still producing and that is being tracked.
So, given that expectations of privacy are changing (or being changed), I challenge you: what if you want to be watched? What if you were offered an outsize amount of value in exchange for allowing someone else to watch you? What would you do? Who would you want to watch over you? Who would you want to look after you and your best interests? Who would you trust? Do you feel like you have reasonable choices in today’s marketplace?
***
Privacy is about the ability to be left alone, or about not being watched, if you don’t want to be: these words echo Brandeis and Warren’s influential paper on privacy from the late 19th century, in which privacy was defined as the right to be left alone. The word ability instead of right is an interesting substitution, implying the real modern issue in privacy, which is control: the individual’s ability to control others’ access to himself or herself (or to information about the individual, which in many ways amounts to the same thing). In other words, privacy is a kind of power: the power of the individual to resist intrusion, and hence external control.
Rights are intrinsic, not purchased. Individuals can lose abilities for all sorts of reasons, but rights don’t go away. To define privacy as a right means that it is not and cannot be in any sense a commercial transaction. Imagine some other fundamental right—say, freedom of speech or worship, or protection against self-incrimination—as the commercial commodity described in the quotes above, and ask yourself whether those paragraphs make sense in those terms.
Is it possible that the idea of a human right as a commercial entity is the natural consequence of a philosophy that seems widespread lately in Silicon Valley: the idea that human beings are essentially a resource to be exploited? In the 19th and 20th centuries industrialists exploited natural resources; one result was rampant destruction of the natural environment (which continues, of course, in many places to this day). The natural resource being exploited by techno-industrialists in the 21st century is humanity and all its works. The exploitation may be direct (as in the case of certain “sharing economy” companies that treat their opportunistically acquired workers as disposable parts) or secondary, as in the case of Google and Facebook (which use personal data to sell advertising rather than extracting value from the individual customer directly), or YouTube (which offers a range of intellectual property, legitimately sourced or not, to the same end).
To say that privacy is “one of those words that puts a stop to useful conversations and prevents us from actually engaging with what’s going on in our digital lives. It obscures and glosses over” simply disparages privacy as a legitimate human interest—a right—just as more-traditional industrialists have argued for decades that it’s not their concern if acid rain from coal-fired plants in Ohio damages New York’s population and environment. Traditional industrialists created toxic chemical waste; information-age industrial waste includes identity theft, omnipresent surveillance by governmental agencies, and the increasing inability of content creators in many fields to make a living from their content. As with traditional industrial waste, this waste is left to individuals and societies to deal with. We took your privacy? We polluted your lake? Who cares? Grow up. Stuff happens. Not my problem. I’m only responsible to my bottom line.
Yes, conflicting interests impede progress (though “progress towards what?” is the unanswered question). Some of the competing interests in this case are complex and difficult to reconcile. That doesn’t make those interests illegitimate or irrelevant. The rights of individuals are always relevant, not least because they can’t be bought, sold, or bartered away, ever.
The proposed remedy to that conflict—“What if you were offered an outsize amount of value in exchange for allowing someone else to watch you?… Who would you trust? Do you feel like you have reasonable choices in today’s marketplace?”—is simply astounding, and not in a good way. The ideas that a) the only choice available is who will exploit you, and b) this is a marketplace issue as opposed to a fundamental human rights issue are both spurious. In particular, the idea that privacy can only be exercised in terms of marketplace choices—and will, by implication, be protected by marketplace providers from violation by other actors, such as governments and criminals—is astoundingly naïve, the kind of thing you’d expect from someone who has never heard of the 20th century or read “1984.” Further, the idea that any marketplace provider—any at all—will manage personal information first and foremost on behalf of the person(s) the data represents is utterly contrary to actual business practice, not to mention history.
Finally, the idea of privacy as a marketplace choice ignores something utterly fundamental to any ethical commercial transaction: the awareness of both parties to the transaction of the real value exchanged. In the 1620s the Dutch “bought” Manhattan Island from the local Native Americans for about $24 worth of baubles and beads (or so the legend goes). Assuming for the moment that the legend is true, was this transaction ethical? I don’t think so. The Native Americans involved didn’t have a concept of land as something that any individual could own; they were literally unaware of what they were selling, and of course the Dutch did little to enlighten them. (“After we give you these beads, you all have to stay off the island forever unless we say you can come here, okay? That’s what this deal means” might have been an appropriate caveat.)
The typical citizen now has about as much awareness of the ultimate value of their privacy (and the information that said privacy is supposed to protect) as those Native Americans had of the concept of “owning” land. If you don’t know what you’re selling, how can you make an informed choice to sell? The mismatch in this case between the respective powers of the buyers and sellers is too extreme to make the transaction ethical; it is inherently exploitative.
It remains to be seen how privacy for individuals will be made viable in the 21st century, if indeed it can be. As I said above, it’s a complex problem, and one that technologists have barely addressed. Starting from the point of view that privacy is a transaction, as opposed to a fundamental right, is not a viable approach.
I expect plenty of disagreement from plenty of places on this point of view, and I’m delighted to hear it. I’d be even more delighted to hear a proposal for how privacy-as-control can effectively be implemented in the 21st century, but I’m not expecting that, at least not right away, because no simple solution (short of a return to the 19th century, which is of course no simple thing either) is likely to work.
2 Comments
Richard
Thanks for a thought-provoking blog. I’m an IBMer. We are creating and integrating complex systems of engagement and record where personal data can be mined to deliver “sense and respond” goods and services tailored to an individual’s digital footprint.
As a consumer, I enjoy the benefits, so I’m prepared to “trade” my privacy in exchange. But I have no idea who’s looking at the data behind my transactions and no way of finding out.
So I don’t even know who’s watching me – let alone whether I trust them to. We need a wake-up call like yours to Google / IBM / Microsoft / the whole industry. This is MY identity. I may be prepared to lend you bits of it to make my life easier. But you’d better guard those bits and keep my secrets like a good friend would. Are you up to the job?
Thanks — yes, Richard, you frame privacy as a fundamental human right, and it is addressed both in the Bill of Rights in the US and in the EU’s Charter of Fundamental Rights. But human rights are not absolute — you trade away part of your right to free speech when you’re in the workplace, and you set your right to life aside when you join the military or law enforcement, in which you can knowingly be ordered to your death. So the question is not whether you have a right to privacy — you do — but, in the digital age, what is “due process” for trading part or all of that right away, or for infringing on it?