Laws of Identity Iterations - or: The Nexus Between Morality, Subjectivity, and Empirical Knowledge
Kim Cameron has recently tried to shorten his "Laws of Identity". This started an interesting semantic process, which I will address at the end. But first, let's have a look at the iterations.
Kim's original laws are laid out in his whitepaper "The Laws of Identity". Here are the new and shortened ones:
- User Control and Consent: Digital identity systems must only reveal information identifying a user with the user’s consent.
- Limited Disclosure for Limited Use: The solution which discloses the least identifying information and best limits its use is the most stable, long-term solution.
- The Law of Fewest Parties: Digital identity systems must limit disclosure of identifying information to parties having a necessary and justifiable place in a given identity relationship.
- Directed Identity: A universal identity metasystem must support both “omnidirectional” identifiers for use by public entities and “unidirectional” identifiers for private entities, thus facilitating discovery while preventing unnecessary release of correlation handles.
- Pluralism of Operators and Technologies: A universal identity metasystem must channel and enable the interworking of multiple identity technologies run by multiple identity providers.
- Human Integration: A unifying identity metasystem must define the human user as a component integrated through protected and unambiguous human-machine communications.
- Consistent Experience Across Contexts: A unifying identity metasystem must provide a simple consistent experience while enabling separation of contexts through multiple operators and technologies.
Pamela Dingle thinks even this version would not "resonate with people like my Mom". So she rephrased the laws in even more colloquial terms:
- People using computers should be in control of giving out information about themselves, just as they are in the physical world.
- The minimum information needed for the purpose at hand should be released, and only to those who need it. Details should be retained no longer than necessary.
- It should NOT be possible to automatically link up everything we do in all aspects of how we use the Internet. A single identifier that stitches everything up would have many unintended consequences.
- We need choice in terms of who provides our identity information in different contexts.
- The system must be built so we can understand how it works, make rational decisions and protect ourselves.
- Devices through which we employ identity should offer people the same kinds of identity controls - just as car makers offer similar controls so we can all drive safely.
But Pamela has more:
- Don't do anything with my data unless I say so.
- Don't ask for or keep my data unless you have to.
- Don't let anyone see my data unless there is a good reason.
- I get to choose whether my data in one place is connected to my data everywhere else.
- I get to choose who speaks for me and I reserve the right to change my mind.
- If the easiest way to use the tool isn't the safest way to use the tool, the tool isn't built right.
- Agree on one way to do things so that I can be successful everywhere regardless of the tool I use.
"If I could use any terms I wanted and assume that everyone understood them, I could get even shorter":The interesting thing I noticed is how the meaning of the laws changes along the way.
- Don’t share my information behind my back.
- Don’t take more information than you need.
- Don’t expose my information unnecessarily.
- Don’t link me or allow others to link me unless I want to be linked.
- Don’t lock me into silos.
- Don’t tell me to RTFM in order to be secure.
- Don’t let the product interfere with the ceremony.
Kim's original laws have the remainders of empirical laws in them. This important aspect is much clearer in the very long version, but you can still see that the laws are meant as something that is based on observation, like the laws of physics: If you don't keep them in mind, stuff just won't work.
Kim's short version has replaced much of the "must" wording with "should", which makes it sound far more like a moral statement.
Pamela's "for my mum" version goes further down this road. It takes a radically subjective perspective and tells the world what she wants to happen to her data, and how the systems she deals with should be built.
Her "favourite" version again changes the attitude and only works with "don't", which is clearly directed to the technology community from a user perspective, implicating the annoyance with many current systems.
So in the end, we have come full circle, but we know a bit more about the whole thing:
If users don't want it, it simply won't work. And there is even some morality behind it.