Ian Glazer
Research Vice President and Agenda Manager
4 years at Gartner
16 years IT industry

Ian Glazer is a research vice president and agenda manager on the Identity and Privacy Strategies team. He leads IdPS' coverage for authorization and privacy. Topics within these two main areas include externalized authorization management, XACML, federated authorization, privacy by design, and privacy programs.

I “like” you, but I hate your apps – Part 3: Controls and a look at the market

by Ian Glazer  |  January 28, 2011  |  6 Comments

Happy Data Privacy Day! Or if you are in the EU – Happy Data Protection Day!

In my last post I talked about the desires of all the parties involved in this new style of relationship, one that involves not only you and me but also your apps. One thing is clear – better controls are needed.

Controls for me

To attain the kind of control that I want, there are two things I need. First, I need a language to express my wishes with respect to information about me. I must be able to express myself easily, even though the expressions may be complex. The manner in which I build these expressions must be miraculously usable. This is not a problem for developers to solve, nor is it a problem for privacy professionals to solve. This is a problem of big “D” design. Here, designers and design firms such as IDEO, Jonathan Ive, and Charlie Lazor must be enlisted to tackle the problem.
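
To make the idea a bit more concrete, here is a minimal sketch in Python of what one machine-readable rendering of such wishes might look like. Everything in it – the attributes, audiences, purposes, and the helper function – is invented purely for illustration; it is not a proposal for an actual policy language.

# A hypothetical, machine-readable rendering of one person's sharing wishes.
# The vocabulary (attributes, audiences, purposes) is invented for illustration only.
my_sharing_wishes = [
    {"attribute": "email",    "audience": "friends",            "purpose": "event-invites", "allow": True},
    {"attribute": "email",    "audience": "apps",               "purpose": "marketing",     "allow": False},
    {"attribute": "birthday", "audience": "friends-of-friends", "purpose": "any",           "allow": False},
]

def is_use_permitted(attribute, audience, purpose, wishes=my_sharing_wishes):
    """Return True only when a wish explicitly allows this attribute/audience/purpose."""
    for wish in wishes:
        if (wish["attribute"] == attribute
                and wish["audience"] == audience
                and wish["purpose"] in (purpose, "any")):
            return wish["allow"]
    return False  # default-deny: anything I have not expressed a wish about stays private

The hard part is not this logic; it is the design work that lets a person author something equivalent without ever having to look at anything like it.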

Second, the language and methods used to express how I want information about me to be handled must not only be simple, they must also be applicable in multiple contexts. A person will not dream up every way that information about them can be used in just one sitting. Thus, an expression of how someone wants their information used must be malleable enough to apply in new, unforeseen situations.

Controls for you

You want to obey social norms, if for no other reason than that thousands of years of societal development have constructed a reward system for compliance. Simply put – the more trustworthy you are, the more trusted you become. And you want your apps to be as trustworthy as you are.

To accomplish this you need usable controls that govern how your apps use information about your relationships. You need to understand how apps use this information. You also need to know how apps build information through relationship-based interactions. Lastly, you need a means to control both relationship information use and construction. These controls must be usable, complete, granular, and non-invasive. Again, this is a problem of big “D” design.

One approach – trustworthy agents

Copies of information are bad. Copies of information about me lead to ownership issues. They lead to authenticity issues. They lead to freshness issues and they lead to disclosure issues. Instead of copying information about me, it would be far better for all parties to reference information about me.

But the question is: from where should this information be referenced?

I believe the answer is – from a trustworthy agent. I provide the trustworthy agent with information about me and the terms under which this information can be shared. The agent then acts on my behalf. This agent is persistent and available. It can be consulted regardless of whether I am at my computer, on a bus in Bhutan, or on the operating table. This agent is a polyglot and knows how to express my information-usage wishes to all parties. This agent is portable and can be consulted anywhere, in both online and real-world situations.

In a world of trustworthy agents, you still express your preferences about information sharing to your apps. When an app wants to use information about me, it first checks your preferences to see if that use is okay. Then the app is directed to my trustworthy agent, which in turn determines if that use is okay. Then, and only then, does my trustworthy agent share the relevant piece of information. In this way your wishes are respected, my wishes are respected, and the app developer’s need for information is respected, all while minimizing the amount of information copied from place to place.
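
As a rough sketch of that decision chain – assuming hypothetical your_preferences and my_agent interfaces, since nothing like this API exists today – the flow might look like this:

# Hypothetical sketch of the two-stage check described above.
# your_preferences and my_agent stand in for interfaces that do not exist yet.
def fetch_for_app(app, your_preferences, my_agent, attribute, purpose):
    """Share a piece of information about me only if both parties' wishes allow it."""
    # Stage 1: do your preferences allow this app to use relationship information this way?
    if not your_preferences.permits(app=app, attribute=attribute, purpose=purpose):
        return None

    # Stage 2: the app is directed to my trustworthy agent, which applies my wishes.
    if not my_agent.permits(requester=app, attribute=attribute, purpose=purpose):
        return None

    # Only then is the relevant piece of information shared -- by reference, not by copy.
    return my_agent.reference_to(attribute)

Note that the app never receives a copy to hoard; it gets an answer, or a reference, good for this one use – which is exactly the minimization argued for above.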

What is out there

So does such a fantastical trustworthy agent exist? … Not exactly, but technology is beginning to catch up with our collective need for controls. First, User-Managed Access (UMA) aims to provide a framework in which an individual can control access to and use of information about them. Leveraging OAuth 2.0, UMA provides the kinds of controls that are needed. Second, the idea of a trustworthy agent, known more commonly as a personal data service, is starting to gain real traction. Projects stemming from ProjectVRM, such as Personal Data Ecosystem and Mydex, have begun to explore further how such data services could work. XDI has shown potential as the underlying protocol for such services. Again, a personal data service would be a way for your apps to respect my preferences: I’m happy, and you and your apps appear trustworthy.
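
To show the “reference, don’t copy” pattern these efforts point toward, here is an illustrative sketch of an app consulting a personal data service at the moment of use. The URL, token handling, and JSON shape are placeholders – this is not UMA’s actual protocol or any real personal data service’s API, just the general OAuth 2.0-style shape of the interaction.

# Illustrative only: consult my personal data service at time of use rather than keeping a copy.
# The URL, token, and response shape are placeholders, not a real service's API.
import requests

PDS_URL = "https://pds.example.org/claims/email"  # hypothetical resource held by my data service
ACCESS_TOKEN = "granted-earlier-via-an-oauth-2.0-style-flow"  # placeholder, not a real token

def current_email():
    """Ask my personal data service for the value each time it is needed."""
    resp = requests.get(PDS_URL,
                        headers={"Authorization": "Bearer " + ACCESS_TOKEN},
                        timeout=5)
    if resp.status_code == 200:
        return resp.json().get("email")  # fresh and authoritative, and I can revoke the grant
    return None  # access denied or revoked: the app simply gets nothing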

What isn’t out there

Unfortunately, your needs and the needs of the app developers aren’t fully addressed by either UMA or personal data services. In order to meet these needs, device and platform makers must build “concern for the other” into their products. This is a big “D” design problem that requires not just user-experience intelligence but also classically trained design expertise. Baking “concern for the other” into products can be a source of competitive advantage in a market. By acknowledging that referencing information, pulling it from the source when needed, is superior to copying it, app developers have an opportunity both to mitigate their risks and to provide better controls.

Because societal norms are formed at social speeds, it will take some time before society can provide guidance when your app does me wrong. To assist us all, technology and design must be enlisted and the needs of all parties must be considered.

Furthermore, both you and I are at the mercy of platform providers. Your apps run on a social network platform whose provider may have different goals, values, and principles than either of us. Similarly, your devices are likely connected to a communication platform over which neither of us has a great deal of leverage.

In summary…

While social and legal norms play catch-up, we need to make our apps as trustworthy as we are. Until concepts such as UMA and personal data services are more widespread, we are at the mercy of the paltry controls that app developers provide. Apps are not uniformly trustworthy, and their use of information far exceeds what is visible to, and expected by, their users. The reality, for the foreseeable future, is this: I like you, but I hate your apps.

Category: Privacy

6 responses so far ↓

  • 1 Tweets that mention I “like” you, but I hate your apps – Part 3: Controls and a look at the market -- Topsy.com   January 28, 2011 at 10:49 am

    [...] This post was mentioned on Twitter by Ian Glazer, Robin Wilton and UK Technology News, Bromley Stone. Bromley Stone said: I “like” you, but I hate your apps – Part 3: Controls and a look at the market: Happy Data Privacy Day! Or if yo… http://bit.ly/hbw69c [...]

  • 2 Robin Wilton   January 28, 2011 at 11:09 am

    Great post, Ian – you’re absolutely right to highlight the “big D” nature of the problem: there’s ample evidence that ‘single-stakeholder’ designs cannot and will never fix this problem.

    I only have one nit to pick, and it’s on the thorny issue of “ownership”. I’ve seen enough discussions of this to convince me that describing the problem in terms of data ownership is just a dead end.

    It’s far more promising, I think, to try and define the rights which I should be entitled to exercise over data about me. Some of those may be similar to the rights associated with ownership of something, but others will be quite different – and are never likely to be revealed by a discussion of ‘ownership’ alone.

  • 3 Ian Glazer   February 10, 2011 at 6:10 pm

    Just in case you thought I was over-hyping this issue, check out this great talk from Black Hat on building a rogue iPhone app.

  • 4 Susan Morrow   February 16, 2011 at 12:16 pm

    I was pretty excited to read this post, not just because it makes reference to the Kantara working group UMA, which I’m privileged to be part of, but also because of the wider picture of software development you have described – which happens to be one of my bugbears. I want to particularly reference your mention of the thousands of years of social evolution that we have gone through to develop our own real-world ‘trust framework’, and how the big ‘D’ must be used in combination with the efforts of developers to create systems that work for us.

    I’d like to expand on this thinking by arguing that it is not just about design, but more fundamentally about our deep understanding of the societal developments that underpin our real-world trust framework. Without a thorough understanding of this, and of how to apply the concepts to a virtual world, I don’t think we can build systems that ‘bake concern for others’ in. I would like to see a greater input into system design and development by, for example, privacy professionals. They understand, deeply, the need for ‘Privacy by Design’ and should be involved with development teams in planning out technologies from the outset. Personally, I also believe developers should be working with evolutionary behaviourists and psychologists too, but I’m sure that’s a debatable point.

    Going back to UMA, I can let you know that one of our focus areas is the trust framework – including the privacy layer – that exists within the UMA architecture. We are currently endeavouring to map out this framework as a means of expressing the areas of least resistance. UMA’s raison d’être is to provide the basis for the type of agent you are proposing – a kind of personal expression of your own trust framework.

    I expect that, ultimately, intelligence (the ability to learn) based on our knowledge of how human behaviour governs our own ‘trust’ relationships will be an integral part of the design of the type of systems you describe in your post; I don’t think we are quite there yet, though.

  • 5 Stephen Wilson   February 16, 2011 at 2:46 pm

    I agree with Robin about “ownership”. As fascinating as they are, I find that philosophical and sociological subtleties are best avoided when building practical privacy protections. Just look at the very words “private” and “public”. For the most part, they don’t actually appear in typical Information Privacy Law, because they are so fraught with ambiguity or subjectivism. Even “privacy” eludes an agreed definition. So instead, Information Privacy Law seeks to put objective, clinical controls on what businesses and governments are allowed to do with PI. Information Privacy is not perfect, and it’s an incomplete project, but it is elegant, objective, and it does deliver a good measure of “privacy” in the fuzzy human sense of the word.

    I even think “trust” is problematic. There is sometimes a confusion of levels when it comes to trust. We need secure, reliable, privacy enhancing transaction systems, composed of layers of standards, technologies, processes and regulations, and the community needs to trust that those layers are serving individuals’ interests. But whether or not individuals within these systems trust one another is an altogether different issue. Interpersonal trust (as opposed to trust in systems) I think is a distraction. Just look at how PKI chased its tail for a decade or more because it started out promising to deliver “trust”.

    So I reckon the defining slogan of Internet sociology should be the old Italian proverb “It is good to trust, but it is better not to”.

  • 6 links for 2011-02-24 | Bare Identity   February 24, 2011 at 8:10 pm

    [...] I “like” you, but I hate your apps – Part 3: Controls and a look at the market (tags: ianglazer uma design ux identity authorization) [...]