Putting People in Control of Personal Data
I was thinking about how much I like using OpenID. I’m registered with myopenid.com, who could do with ironing out some kinks in their user experience, but it’s good enough.
One thing struck me after reading Tomas Baekdal’s excellent blog post on the subject of privacy policies. I summarised it in a comment on his post, but to cut to the chase:
“… statement of intent is all very well, [but] the practical reality of the situation is that data leaks. No matter how much you “respect” the people that gave you their data, respect alone won’t stop you leaving 10,000 names and addresses on a laptop in the local KFC.
This is why the real battleground needs to shift to putting users in control of how much data they release – regardless of privacy policies.
I would like to see, for example, the introduction of revocable keys for personal data. Have my name and address, but only in a form encrypted to you, with a key I can revoke at any time.”
Applying this to explicit data (e.g. names, addresses and anything else you explicitly hand over), the following would need to happen for it to be possible (a rough sketch in code follows the list):
- I would upload my public key to a key server (operated by myopenid.com, for example). The key could, of course, be trusted via whatever mechanism is appropriate.
- When asking for data, sites would let me encrypt it to them (and perhaps to any third parties as well) using this key plus a “modifier.” The modifier would make my key unique to them (i.e. I would be “spawning” many public keys).
- The encryption would be performed as part of the OpenID protocol with the host site.
- If I wanted to revoke my data, I’d simply revoke the key I’d used for them.
- The list of keys I’d used, and the businesses associated with them, would be managed through my account with my OpenID provider (myopenid.com).
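To make the flow concrete, here is a minimal sketch in Python, assuming a hypothetical `KeyServer` class run by the identity provider. HKDF key derivation and Fernet symmetric encryption (from the `cryptography` library) stand in for whatever primitives a real design would use – the post imagines spawned public keys, but a symmetric stand-in keeps the sketch short. The “modifier” is the per-site `info` parameter that makes each spawned key unique.

```python
# A sketch only: KeyServer, spawn_key, revoke and decrypt_for are all
# hypothetical names, not part of any real OpenID implementation.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


class KeyServer:
    """Hypothetical key service operated by an OpenID provider."""

    def __init__(self) -> None:
        self.master_secret = os.urandom(32)  # the user's root key material
        self.active_sites = set()            # sites whose spawned keys are still valid

    def _derive(self, site: str) -> Fernet:
        # Spawn a site-unique key: master secret + per-site "modifier".
        derived = HKDF(
            algorithm=hashes.SHA256(),
            length=32,
            salt=None,
            info=site.encode(),  # the modifier binding this key to one site
        ).derive(self.master_secret)
        return Fernet(base64.urlsafe_b64encode(derived))

    def spawn_key(self, site: str) -> Fernet:
        # Register the site and hand back its unique encryption key.
        self.active_sites.add(site)
        return self._derive(site)

    def revoke(self, site: str) -> None:
        # Revoking a key stops the server decrypting for this site.
        self.active_sites.discard(site)

    def decrypt_for(self, site: str, token: bytes) -> bytes:
        # Sites come back here to read the data, so revocation bites.
        if site not in self.active_sites:
            raise PermissionError(f"key for {site} has been revoked")
        return self._derive(site).decrypt(token)
```

And revocation in action:

```python
server = KeyServer()
key = server.spawn_key("shop.example.com")
token = key.encrypt(b"Alice Smith, 1 High Street")

print(server.decrypt_for("shop.example.com", token))  # b'Alice Smith, 1 High Street'

server.revoke("shop.example.com")
server.decrypt_for("shop.example.com", token)  # raises PermissionError
```

In this sketch the site stores only the encrypted token and must ask the key server to decrypt when it needs the data. If the site kept a decrypted copy, revocation couldn’t claw that back – which is exactly the caveat below.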
This is only an initial concept – and it can almost certainly be improved upon – but the key to it (no pun intended) would be the user experience. It would need to be easy enough for anyone to manage.
Of course, once it can decrypt the data, the receiving party could easily pass it on to anyone as clear text. The scheme also wouldn’t work very easily with implicit data. But implicit data is, as Tomas points out, less of a problem.
The point is that there would be a protocol: a means by which honest parties could be honest, and in being honest prevent leaks. Dishonest parties will always be around, but my faith is that they will remain a tiny minority. The above may well be impractical (or even impossible), but we need to look at ways of protecting privacy beyond legislation alone.