So here's the first bridge and challenge: Mike pronounced privacy wrong. How many people say PRIV-acy and how many say PRY-vacy? Right? Namespaces are hard. I've been going around for the last couple of days talking to people and saying: there's a domain of privacy and there's a domain of identity, and what's the Venn overlap? And the most interesting thing is that there was not a single consistent response. There was a lot of "I don't think that's a question that makes sense in my head," and other people did various other things. And what's really interesting to me: my day job is enterprise privacy consultant. A lot of what I do is data mapping and looking at the data. And I have a background before that in IT and operations, so I understand IAM from the perspective of workforce and SSO and all of that stuff.
And because I'm a data guy, I'm thinking to myself, well, that's an easy question. The data that both those domains are interested in, if we set aside entity endpoints, overlaps a hundred percent, because they're both dealing with the same data, by and large. But why are the domains so different? When I go to identity conferences, people look at me as the privacy guy, and when I go to privacy conferences, I keep failing to get on the speakers list with identity talks about privacy, because they go: that's none of our business. So there are these two silos, these two domains, that aren't talking to each other and really need to. How do you build bridges between those two domains? So let's talk about what they have in common. I already talked about the data underlying both, but at a meta level, each domain has at least one professional association.
I'm a member of the International Association of Privacy Professionals, the IAPP, and I go to their conferences; and I'm a member of IDPro, and I participate in those. Both of those organizations have domains of knowledge. If you are in this room, you've probably heard of the IDPro BoK, the Body of Knowledge; shout out to Heather Flanagan and the team that is building that. In the IAPP, the regime tends to be not so much a body of knowledge as the various data protection regulations. And I say data protection, not privacy, for a reason that I may get to later. Both sets of professionals can get certifications, acquire knowledge, and practice in their profession without ever talking to somebody in the other domain. Again, disturbing. What else do they have in common?
They both have norms for controlling data. In the sort of traditional IAM space, it's inside the enterprise: either my workforce data or my customer data, possibly partner data, or possibly, if I'm an outsourcer, data that I'm processing on behalf of a client. So both have norms related to the processing of that data. Which is interesting, because you have identity and access management, you have data protection, but they both have this in common: we, the institution, or we, the experts, will control your data. That's our job. We're the men in the white coats, or the hipsters with the pencils stuck in our beards that the suits won't let their corporate clients see. That was me 20 years ago.
But we control your data. So it's paternalistic, and it removes all the agency from the individual. They both have that in common, and yet they still don't talk to each other. So that's how I want to frame this discussion: two domains that need to talk to each other because, underneath, they're both dealing with us, or the data that represents us. And I want to talk about two linguistic judo tricks that I use in my practice. A lot of people will talk about how we have to protect privacy.
What's wrong with that statement, he asks rhetorically? Protecting privacy removes your agency, removes your control. The whole notice-and-consent thing, which we all know is broken on the internet, is built on this notion of: let me tell you what I'm gonna do with your data, let me pat you on the head, take your data away from you, and control it, because I know best. So I try to talk with my clients about respecting privacy. Because to respect somebody's privacy, you have to take a bit of a journey to understand what it is that they want, what it is that they need (and sometimes those aren't the same things, of course), and what it is that you can do to respect their privacy. And on the IAM or identity side: stop using the word user. Two industries use "user," right?
Drug dealers and IT. Let's talk about individuals instead. And that really clarifies things in terms of the identity space. If we're talking about individual identity, whether it's a persona, an avatar, whatever it is, then we can exclude, for the purpose of this discussion, entity-based endpoints, which are validly part of identity management but not part of what I want to talk about. So we have these two domains that have these common elements but are not talking to each other, and they probably should. Just to frame things in these two domains, I use this often when I'm introducing clients that are new to this: the stages in a personal data life cycle. Collect, use, disclose, retain, and dispose. So anytime you collect data, right? And here's where I want to talk about consent, because in my practice, and as I observe it, the much more important privacy principle, or fair information practice, engages here.
And it's not consent, it's purpose. Have you identified a purpose for the collection of the data? And that purpose has to be legitimate, necessary, maybe tied to a legal basis. Whatever the purpose of the collection is, some branches, some downstream uses, will lead to consent; some won't. If it's in the health space, there's non-discretionary collection and retention of health data, and in national security and some other areas, consent to collection is moot: you have a legal authority, you're going to give me the data. So defining the purpose is critical. Once you define the purpose, you determine the use. To whom can I disclose the data, if I'm the data controller? (Another paternalistic binary: data controller and data subject. I have a whole keynote about going from data subject to data citizen, but that's a different area.) How long should I retain the data, and how do I dispose of it?
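As a rough illustration of that purpose-first framing (the class and field names here are my own sketch, not from any standard or regulation), you could model every collection as naming a purpose and a legal basis, with consent being only one of several possible bases:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical legal bases; labels are illustrative, not a legal taxonomy.
class LegalBasis(Enum):
    CONSENT = "consent"                    # discretionary: the individual may refuse
    LEGAL_OBLIGATION = "legal obligation"  # e.g. mandatory health reporting
    LEGAL_AUTHORITY = "legal authority"    # e.g. national security: consent is moot

@dataclass
class Collection:
    data_item: str
    purpose: str        # every collection must name its purpose first
    basis: LegalBasis

    def requires_consent(self) -> bool:
        """Consent is only engaged when the basis is discretionary."""
        return self.basis is LegalBasis.CONSENT

# A discretionary marketing collection engages consent...
marketing = Collection("email", "newsletter", LegalBasis.CONSENT)
# ...while a mandated health report does not.
reporting = Collection("diagnosis", "public health reporting",
                       LegalBasis.LEGAL_OBLIGATION)
```

The point of the sketch is the ordering: the purpose and basis come first, and only some branches ever reach a consent step.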
So at a high level, we have this life cycle for privacy. But if we're talking about respecting the individual and we're looking at it from that side, are we actually spending the effort to understand what it is that they want? On the top side of this diagram, you see the enterprise view, the data protection view: collect, use, disclose, retain, and dispose. But if it's personal information and there's an individual involved, not a subject under your control, then, if you're the one that's gonna be processing their data and you wanna respect them, you should understand where they're coming from. You talk about collection; from the individual's point of view, it's: am I willing to share my information, on my terms? You want to use the data? I'll permit you temporary access to my information for that purpose. You want to disclose the information to someone else?
What gives you the right to take control of my information and give it to someone else? You want to retain the information? How can I manage or co-manage my information so that I understand the retention is appropriate? And how can I monitor who does what with my information, and why? That's a much fuzzier view of privacy. And I've had conversations with dev teams, with clients globally, whether it's in Silicon Valley or in London or in Shenzhen, and the conversation is the same: I want a checklist. I don't want any of this fuzzy crap. Just give me a set of functional requirements I can write into my code, and go away. And that's not how it works. But it turns out there's another lifecycle.
The digital identity life cycle. And look at this: I got this from the IDPro BoK. We create the identity; we acquire or collect the information necessary to create it. Then we provision the account, we provision access, so we're using the data. We authenticate, and that may involve a call to another entity, so there may be some disclosure. We manage access: we control what you do. And then we deprovision access. So it's not a one-to-one correspondence with the privacy life cycle, and you wouldn't expect it to be, but it's the same kind of thing. The data comes in, we control it on the enterprise side, we share it with people as necessary for enterprise needs, and then we get rid of it. So how does that answer the privacy concerns? Now, the privacy concerns in a workforce are significantly different from the privacy concerns in some kind of customer-facing app, but the process is the same. And if you wanna respect privacy, it takes work to understand what those expectations are.
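Laying the two lifecycles side by side makes the imperfect correspondence visible. The stage names follow the talk; the alignment itself is my sketch, not something taken from the IDPro BoK:

```python
# Sketch: which personal-data lifecycle stages each digital identity
# lifecycle stage touches. Illustrative, and deliberately not one-to-one.
PRIVACY_STAGES = ("collect", "use", "disclose", "retain", "dispose")

IDENTITY_TO_PRIVACY = {
    "create":       ["collect"],          # acquire the data needed for the identity
    "provision":    ["use"],              # account and access provisioning
    "authenticate": ["use", "disclose"],  # may call out to another entity
    "manage":       ["use", "retain"],    # ongoing access control
    "deprovision":  ["dispose"],          # get rid of it
}

# Every privacy stage shows up somewhere in the identity lifecycle,
# even though no identity stage maps cleanly onto exactly one.
covered = {s for stages in IDENTITY_TO_PRIVACY.values() for s in stages}
```

That covering property is the "same kind of thing" observation from the talk: the enterprise takes the data in, controls it, shares it as needed, and gets rid of it, in both lifecycles.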
Okay, just getting things synced up here. There we go. So now I want to talk about where we've started to try and build those bridges, with two projects that I was involved in at the Kantara Initiative. And this reflects my experience doing enterprise consulting, trying to get the dev teams to talk to the privacy people, because, God bless them, there are competing elements of Dunning-Kruger here. I've had more than a few security or CISO people say: don't worry about privacy, we've got security covered, and that covers off privacy. And on the data protection and privacy side, I've had more than a few privacy lawyers say: what are you doing here? This is completely a legal issue. Why is there even such a thing as a privacy consultant that isn't a lawyer?
So the first bit of work I did for Kantara is this report. Go to kantarainitiative.org, under reports and recommendations, and you can download it: Privacy and Identity Protection in Mobile Driving License Ecosystems. A couple of hours ago they were talking about ISO 18013-5, mdoc, and all of that. If you look at the left-hand side of the diagram (my right, your left), these are the data flows between the issuing authority, the MDL holder, and the MDL verifier, and they look very similar to any one of these triangular data flows. But when you think about it: in the mobile world, if there's an ecosystem, you've also got other credentials. How do you protect identity if the mobile driver's license is in a wallet and shares information, or there are other credential-holding artifacts in the hand of the individual?
So how does the MDL exchange data with a mobile device? Because the MDL holder and the individual are the same person, and it shouldn't be that difficult for those two credential holders to exchange information. There may be rules, appropriately so, but the individual still rules the RP reader software, or the MDL reader. It was an interesting report, bringing into the room some people who were thinking just about the MDLs, just about 18013-5, or just about the wallet, or just about the issuance, or just about the experience of the holder, and asking what other elements needed to be addressed to talk about privacy. In the Kantara world, a report is just something we publish for information. The current bit of work, which I wanna spend more time talking about, is the work group I currently chair: Privacy Enhancing Mobile Credentials.
And here's the thing: if you want to build a bridge between privacy and identity, this is where it is. Look at the center of this possibly overly complicated diagram we're working on. The center triangle is the one that everybody in this room knows: it's the issuer, the app, and the reader. And I borrowed the UML component notation for this diagram to say: that's a technical endpoint. What's going on there is controllable by technical specifications, a lot of which have been talked about here today. You can have a zero-trust environment, with data interfaces and data flows based on technical zero-trust foundations. However, if you remember the diagram between data protection and the person, the privacy view, what you're really talking about are the relationships between a holder, who's an individual, an issuer, which is an organization or an entity, and a verifier, which is an organization.
And each one of those uses those components to do something. When I was cute and was challenged for entry into a bar, back in the last millennium, that was the bouncer trusting that I really was 18 (Canadian, not American, so I got to drink earlier) and not 16. And the fake ID that I was using to get into the bar was good enough to pass. So there was that: does the bouncer look at me and look at that piece of plastic, and does he trust it enough to let me go in? Does he trust that the Ministry of Transport, which issued my driver's license, was trustworthy? So there's this kind of real-world trust, not technical trust, that happens outside of that trust triangle. And that's where privacy's gonna live.
Because in one of the keynotes yesterday, I think it was Mike Jones talking about how part of the failed user experience of the original Microsoft Passport was that it didn't ask for a password. People didn't trust it, because it wasn't doing the ceremony, to borrow Ian's phrase; it wasn't a ceremony they were familiar with. So it was technically trustworthy, but they didn't trust it. How do you build privacy requirements that people will see and trust outside of that? So we built this, and we say: okay, there's a holder who uses the app. But when the issuer provisions the credential onto the app or the wallet, or issues it, the issuer may have terms and policies on the provider of the app.
The user has privacy terms and policies that they have to agree to when they use the app or the wallet. So there's a whole set of things going on outside the technical trust triangle we're talking about, so that the holder can present this in a way that they believe in. So: holder privacy is respected when her reasonable expectations for information usage after disclosure are respected. There was a good talk earlier about selective disclosure using SD-JWT or mdoc, and that's great, but after that selective disclosure, what assurances do I have as a holder that the verifier isn't going to use that data for a different purpose? And that's a deeply privacy-preserving type of question, and it has nothing to do with what we can do on that technical trust triangle.
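The selective-disclosure point can be sketched with a toy function. Real SD-JWT and mdoc presentations use salted hashes and signatures; this sketch only shows the data-minimisation step, and deliberately says nothing about what the verifier does afterwards, which is exactly the gap the talk points at:

```python
def selective_disclosure(credential: dict, requested: set) -> dict:
    """Return only the claims the verifier asked for (toy model,
    no cryptography: real SD-JWT/mdoc bind disclosures to signatures)."""
    return {claim: value for claim, value in credential.items()
            if claim in requested}

# Hypothetical MDL claims for illustration.
mdl_claims = {
    "name": "Jane Holder",
    "birth_date": "1990-01-01",
    "address": "123 Main St",
    "age_over_18": True,   # a derived predicate, not the raw birth date
}

# Bar scenario: the reader asks only for the age predicate.
presented = selective_disclosure(mdl_claims, {"age_over_18"})
# The technical layer stops here; whether the verifier later repurposes
# what was presented is a policy question, outside the trust triangle.
```

The design point is that the minimal technically-enforceable unit is the claim set at presentation time; everything after that handoff lives in policy, audit, and trust marks.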
That's the domain of the privacy impact assessment, or, if you do ISO-type audits: it's policies, it's procedures, it's training, it's some technical elements, like have you done encryption properly? The same thing on the issuer's side: the issuer may have reasonable requirements to impose on the provider of the wallet, that it run a certain level of OS, have certain capabilities, and away you go. So we're still working this out. The early implementer's draft report should be out this month or next, depending on the voting process and how much heads-down editing we get to over the hump. Then we're gonna be working on the requirements, and once we have a set of requirements, we'll be building profiles. Think about the profile for John's Bar and Grill to get a trust mark as a verifier, so that he can use a verifying device and call it privacy enhancing.
The experience you want is: I put my mobile device to the verifier, and it pops up with my picture and a green check mark. Minimal disclosure, no retention, no other use. That's gonna get you a profile, and the privacy expectations for that profile are gonna be relatively high. Whereas if I'm using my mobile credential to gain access to a tier three data facility, or to a SCIF (what's the acronym? a Sensitive Compartmented Information Facility), high, high security, or if I want access to airside in an airport, where I have access to airplanes and all that other stuff, then the set of requirements in the profile will be a lot less, because my privacy expectations are reduced. So that's the fuzzy part that drives developers crazy. How do we do that?
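One way to picture the profile idea (the profile names and fields below are hypothetical, ahead of the work group's actual requirements): each verification context gets its own bundle of privacy requirements, stricter for the consumer case than for the high-security one:

```python
# Hypothetical verifier profiles; the work group's real profiles will differ.
PROFILES = {
    # John's Bar and Grill: picture, green check mark, nothing kept.
    "bar_age_check": {
        "minimal_disclosure": True,
        "retention_allowed": False,
        "secondary_use_allowed": False,
    },
    # SCIF or airside access: privacy expectations are reduced.
    "secure_facility_entry": {
        "minimal_disclosure": False,
        "retention_allowed": True,
        "secondary_use_allowed": False,
    },
}

def protections(profile: dict) -> int:
    """Count the privacy-protective requirements a profile imposes."""
    return sum([profile["minimal_disclosure"],
                not profile["retention_allowed"],
                not profile["secondary_use_allowed"]])
```

Under this toy scoring, the bar profile imposes more privacy protections than the secure-facility profile, which is the inversion the talk describes: higher physical security, lower privacy expectations.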
And one continuing sidelight: I will tell you that there is a standard published in 2022 from the IEEE that I helped work on, called 7002-2022 (I hate that I know that off the top of my head), the data privacy process standard, which is about integrating privacy into your software development life cycle. So there's work going on on a couple of fronts, and the nasty clock on the wall tells me I've got five minutes left. I got most of what I wanted to talk about out there, other than this: kantarainitiative.org. If this is something you're interested in, please come along and sign up for the work group. Most of this is happening out in public, so most of the work product is currently available; you can read it and send an email to me if you want to participate. But five minutes left. Are there any questions, anything online, Mike, or anything in the room?
Do we have any questions in the audience? Well, in the absence of a question in the hall, I—

I knew I could trust Mike.

—will come on the stage. It's very interesting, John, because much of what you've described here is work in progress. So what is it that an organization should be doing now? Is this something that they should be planning for? What is your opinion?
In a former life, I was a Six Sigma black belt in a service organization, and there's something in Six Sigma called Voice of the Customer. I've thought for 20 years that the best voice-of-the-customer tool you have is privacy. If you're actually listening to the customer, and not saying, "well, here's what I think you said," the reasonable privacy expectations of an individual tell you what the voice of the customer is. It's true, every customer by themselves is an edge case, and you still have to manage to the center of your population. But here's the thing: privacy, or data protection, has implications in compliance and headlines, and the risk comes from the edge cases. So from a customer satisfaction point of view, listening to customer privacy expectations and creating an IVR journey that's actually a meaningful journey, and not a gauntlet, would be nice. Your onboarding process: is it a journey or is it a gauntlet? Are you actually respecting the person? And on the other side, are you looking at the edge cases? Where are the complaints gonna come from that are gonna get the regulator's attention, or the civil suits, or the headlines? One prominent politician's breach in a hospital is enough to tie up the entire hospital's IT department for a month dealing with it. So: voice of the customer, and the edge cases. That's the short version.
Thank you. A lot of people have come to this conference to hear more about the European initiative, eIDAS. So could you relate what you are doing to eIDAS?
I live in Toronto, in the province of Ontario, in Canada, and we're a federation, so a lot of this stuff happens at the provincial level. People in this room probably know about the BC OrgBook, which is a verifiable credentials initiative for commercial entities in British Columbia. We have a provincial digital ID initiative that's stalled. And here's my take, with no inside knowledge: in Europe, the idea of a mandatory government-issued ID card is normal. In North America, not so much; that's part of the reason the driver's license is taking over as a de facto one, but without any of the controls that eIDAS has. So I think that eIDAS points the way to a privacy-protective identity system. But it's not clear to me that the holding device that the credential is issued onto shouldn't be controlled by the state, and that's gonna be an interesting battle. Don't come at me and say that it's not, because I've been in too many government bureaucratic rooms where it's: well, that's our credential, we have to control it all the way up and down the line. That's gonna be a continuing tension.
Okay, well, there are still no questions from the audience, so I'd like to ask everybody: please thank John for his very good presentation. Thank you.