Event Recording

Allan Foster - Power to the People: Privacy, Trust and Data

Keynote at the Consumer Identity World 2017 APAC in Singapore

So, first of all, please continue this discussion in the coffee break, and please invite me to it, because I would like to be part of that. Second, I would like to hand over to Allan Foster, Vice President of Global Partner Success, which is awesome, and an identity guy for 30 years now. I think I've seen you change positions over the time I have been looking at this field: you went from Netscape to AOL, there's something missing in between, yes, then over to Sun Microsystems, and then I think your ways parted and you went over to ForgeRock. ForgeRock is also providing speakers for us and sponsoring this event, so thank you for that, Allan. And we continue that red line with a keynote called Power to the People: Privacy, Trust and Data. Trust is the red line that goes through all the keynotes.
So while I was listening to the previous two presentations, I was thinking of consent, privacy and trust in three parts. So welcome to part three. And it was actually very interesting in that, with both presentations we've just been through, I found myself sitting in the back saying, hmm, how do I change what I'm going to say so it's somewhat different from what I've just heard? And the stories are actually very, very similar. So really where this talk came from: we're starting to work with GDPR, and I had some fairly strong opinions, obviously in a lunchtime discussion, strong opinions of GDPR. But one of the things that is actually happening is that we are going through a process right now, a change in our internet civilization. We are actually going through a civilizing of an online space; our online experience is being civilized.

And what typically happens when we start civilizing a space is that we start setting out new rules for how we get to interact with each other, right? And the GDPR is one of the ways of trying to do that. We are setting up these new ways of dealing with each other, these new guidelines of what is acceptable. If we map this back to 150 years ago in the American West, if somebody said something bad about your horse, you ended up on Main Street at noon with a gun strapped to your hip, and the person who didn't fall down was the winner, and they were right. At some point we had a civilization of that, and we said that's not how we settle disagreements anymore. Now we have a structure in place: you meet in a court; we have a legal system in place. It's true that there are jobs that are no longer jobs. Cattle rustler?

That's not a job anymore. Pirate is not. I wish it was a job, except in certain places around the world, but we changed the perspective as to what is valid. So I wanted to have a look at privacy and what that means as we start looking at delivering services in this digital space. When I look at privacy, I look at it as an expectation that my data is going to be treated confidentially, right? I'm giving you data, and it's between you and me, and you are to use it appropriately, whatever appropriately means. We can determine what that is, but I expect confidentiality and I expect appropriate use. And that is appropriate for my private data that I am going to share with someone or with an entity.
The same data may have different terms of use with different entities as I choose to share it with them; we have a different contract with each one of those. Ultimately, we've been talking, it started I think yesterday, when Don started out there talking about trust, right? And we've got this issue where, invariably, as consumers we have no trust, or a very low level of trust. I'm not sure that that's actually true, though, because there are a lot of things that consumers actually have quite a lot of trust in. Google, right? Immediately, those of us who are in the identity space cower down under a chair and say, oh no, Google takes all of our information. But I trust Google to go through all of my emails and separate out which is spam. I haven't checked my spam folder in quite some time,

because it actually does quite a good job of it. I trust it to do that job. I also trusted Google, in fact, for dinner last night, to find the restaurant, and I actually get quite upset if it doesn't get me to the restaurant and instead takes me to the other side of the river or something like that. So in general, where you have a service delivered, and it's being delivered consistently and appropriately, a lot of consumers actually do have quite a lot of trust. But in some specific things, around giving private information or personally identifying information to services, we don't know what's going to happen to it. And we have this sort of fear of identity theft, that it's going to be sent out into the gazillion email boxes and messages that we come into. And that is driven by either a feeling of a lack of control or an actual lack of control.
If you have no control over something, you don't really know what's going to happen; you don't trust that it's going to be there, right, or that it's going to do the right thing. And so the real problem that we have to address is not privacy or consent. The real problem, when it comes down to it, is: how do we increase trust? Right? Trust ultimately sits in the relationship between the entity and the individual. So we can try laws. The EU has GDPR, and everybody is either laughing at it or quaking in their shoes. But one way or another, next May there are a bunch of fines that potentially come into place, and we're going through a phase of GDPR. So we have all of these different laws about how we need to treat data, if we look at this as trying to civilize behavior, right?

One of the things that I think GDPR is really trying to do is to redefine what is acceptable behavior in our digital interactions. And it's not only the GDPR: in Canada we've got PIPEDA, with, strangely, almost exactly the same set of requirements. We've got similar things down in Australia. And as we pointed out, each one of the countries is coming up with new definitions around what is acceptable behavior and what is not. This goes back to the breakfast discussion that we were having this morning. Regulation is one way that we get to try and have an outcome. However, I believe regulation is the lowest form of incentive to get something done: it's trying to get somewhere by beating people with a stick, and a carrot sometimes works a lot better. And so when we start looking at things like the GDPR, I like to think, and I think Katrina said it very nicely, I can't remember the exact words, but the terms that I use are: it's the floor on which we stand rather than the reach for which we grasp, right?
This is the very basics that we want to be able to do so that our CFO doesn't go to jail. Which brings us ultimately to consent. Now, consent has been talked about a lot when it comes to GDPR, and specifically, as we were just discussing: we need to get consent; we need to let the user be part of the discussion so that they know what we're going to do with their data. We can find different definitions around it, but consent is giving someone or an entity access to some specific information for a specific purpose or a specific use. And all three of those things are important: it's specific information, to be used for a specific use, by a specific entity. That's really what consent comes down to. The problem, I think, is the path that we are heading down with consent.

And when we come up with this, I always go back to my mom, because my mom thinks she's on the internet because of AOL. That is her internet: AOL. And so when we start asking for consent, every single one of us is familiar with those 200-page legal documents we get before we open a piece of software that tell us exactly what our rights are. I don't understand them. Honestly, I don't read through most of them; I may get through the first page, but that's about it. And a lot of the time, where we end up going with consent is that we get lost in the weeds. We get lost in particular pieces of information most of the time. And I'm reasonably familiar with what happens in identity systems. Most of the time I look at it and I say, hmm, you get my address;

I'm not really sure what's going to happen with that or how it's going to be used. And I understand how the systems in the back end work. My mom doesn't, right? And so we really should be saying to her: this is what I'm going to do with the information, and building trust, so that she will trust us that that's what's going to happen. So how we address this consent is sort of the ongoing issue, and I'm going to talk about that in a moment. Consent is also not an event. It's not a point in time or a checkbox, right? I know opt-in is one of the things that frustrates me a little bit about consent, because that's very little of what we need to do around it. It's not an event. It ties back into the idea that consent sits on top of a trust relationship. The two words in here that I think are really important are trust and relationship. Consent is between two individual parties.
One of whom is saying what they're going to do with the information; the other one is saying, yes, that's okay with me. And it's an ongoing agreement that relies on an underlying trust, because you can give consent, but if you don't trust that they're going to fulfill that consent, what is it worth? Right? So it's really bound by the underlying trust. If it's an ongoing thing, this changes how we have to deal with our software. And this was a term, I think, that I came up with at the Cloud Identity Summit a couple of years back, where we were talking about consent lifecycle management. Consent is now a real business object that we've got to worry about. It has a lifecycle. It has a beginning, where we collect that consent. It has a middle, right? And so at the beginning we have to sort of set the terms of what am I going to do.

Here's what I'm going to do with the data. And then, you know, say what you're going to do, and then do what you've said. How long are we going to share it? What are we going to use it for? But after collecting it, there's a maintenance period where it changes. As we said, our relationship with the customer changes over time. Are we going to share it with other people? Are we changing the purpose around it? What's actually happening? Are we adding new things to it? Are we removing things from it? Am I adding something new to the frequent flyer program, or whatever the case may be? And then finally, there's a point at which the consent terminates. That doesn't terminate the trust relationship, right? Because if we terminate a trust relationship, who cares about the consent at that point? You just carry on. A good example around this one was two years ago.

I lived here in Singapore, and I had a travel agency, an online travel agency, that used to sell me tickets every week to send me to all sorts of exotic places. And they had my credit card; they charged my tickets; everything was great. At the end of the year, I moved back to the States. I wasn't upset with the travel agency; I didn't move back because of them; they were actually doing quite a good job. But I no longer needed to buy tickets here in Singapore, and it was appropriate for me to say: okay, you don't need my credit card information anymore, right? I want to terminate the consent to store that and use that, and I want to trust that you will do that in an ongoing fashion. So it doesn't terminate the trust; we are just going to change. It's really an aspect of changing the underlying relationship that we are dealing with.
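The consent lifecycle described here, collected at the beginning, maintained in the middle, terminated at the end without ending the relationship, can be sketched as a simple business object. This is a minimal illustrative model, not something from the talk; the class, field, and method names are all hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class ConsentState(Enum):
    GRANTED = "granted"      # beginning: terms set, consent collected
    MODIFIED = "modified"    # middle: purposes or scope changed over time
    REVOKED = "revoked"      # end: consent terminated; the relationship may continue

@dataclass
class ConsentRecord:
    """Consent as a business object: specific data, for a specific purpose, by a specific entity."""
    data_subject: str
    entity: str
    data_items: list
    purposes: list
    state: ConsentState = ConsentState.GRANTED
    history: list = field(default_factory=list)  # audit trail of lifecycle events

    def _log(self, action: str) -> None:
        self.history.append((datetime.now(timezone.utc), action, self.state.value))

    def modify(self, purposes: list) -> None:
        """Maintenance phase: the agreed purposes change as the relationship changes."""
        if self.state is ConsentState.REVOKED:
            raise ValueError("cannot modify revoked consent; collect new consent instead")
        self.purposes = list(purposes)
        self.state = ConsentState.MODIFIED
        self._log("purposes changed")

    def revoke(self) -> None:
        # Terminating consent does not terminate the trust relationship.
        self.state = ConsentState.REVOKED
        self._log("consent revoked by data subject")

# The travel-agency example: consent to store a card, later revoked.
card = ConsentRecord("Allan", "travel-agency", ["credit card"], ["ticket purchases"])
card.revoke()
print(card.state.value)  # revoked
```

The point of modeling it this way is that each record names all three elements of consent (information, purpose, entity), and every lifecycle transition is auditable, so "say what you're going to do, then do what you said" can actually be demonstrated later.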
So, sort of thinking about that, I started coming up with the things that we can start thinking about as we start building software and delivering services. And I think we're going to have a wonderful discussion over lunch around email addresses, but it's the same thing; it's about delivering services. One of the things we can start thinking about, and the first principle I've got there, is that as we start building the software, we can identify where there are trust risks that our users are going to run into. Users are going to look and say: ooh, why do they need that? What's going to happen with this? Where you identify those, let's get out in front of them, right? As we are transforming the digital process, say: hey, this is something that may make people uncomfortable. Let's get in front of that and give them some control over it.

Let's say why we're doing it, let's allow them to opt in or opt out, or allow them to actually control the process. And the second one, and I really like this: conceive of personal data as a joint asset. To come directly to Don's question of who owns the email address: the reality is we both do, right? There are many situations where we have multiple owners of things. If you are a co-signer with your spouse on the checking account, there is a clear understanding between the two of you that you don't write a check for the last hundred dollars, or whatever it is, in the account. You leave money in the account so the other person isn't stranded, right? This is exactly the same process that we have. It is still the user's data: all of their private information, their PII, their identifying information is still theirs. And if it's still theirs, they need to be part of the process of managing it and what we're going to do with it.

They need to have some of that control. Lean into consent; it never hurts to ask. Consent gives us transparency, and I really liked that slide earlier on that we had around transparency: say what you're going to do, and then do what you said. That builds trust, right? And trust is one of those things where generally we have to give it, and we lose it over time if we don't behave correctly. One of the questions I've been asked in Europe several times is: when do I actually need to get consent? If I'm going to be doing such and such, do I need to get consent for that, or can I just do it? And I think that comes down to a very simple question: if the end user, the data subject, the person that data is about, would be surprised at what you are about to do, you should probably tell them; you should probably get consent. If they wouldn't be surprised, you are probably okay. It's all about the context and what you're going to do with it.

One of the stories, or one of the thoughts, I had around this is that it's not about the data; it's how it gets used. And the example, or the story, on this one: imagine you've got a rash on your arm and you go to see the doctor. You go into the doctor's office and you show them the rash. The doctor hasn't seen that rash before and says to you: can I consult with my colleague? Your response as a patient is: sure. And he goes next door to the next office, and they talk about the rash. As a patient, I'm perfectly happy with that. However, exactly the same situation: the doctor says to me, can I consult with my colleague? And then that evening, down at the bar, the doctor says to his colleague: hey, you'll never guess what this guy who came into my office this morning had.

The context of that is not what I signed up for, right? And so it's the context of how it's actually being used that matters, rather than the data itself or what it's about. And so, in general: get consent. Let's tell the user what we are doing and be open and clear about it. When that happens, the user becomes in control. And then finally, let's bring the user into the process; give them control over what we're going to do. All three presentations have been talking about access management and identity management, and the reality about this is that the user has control over what's happening. And I don't care if it's financial transactions, health transactions, telecoms; it doesn't matter. In any one of those, the end user has control over what the data is being used for and how it's being used.

That starts building trust, right? Privacy is not about secrecy. I've heard several times people say: oh well, users will just give their information away, therefore it's no longer private. No, it's not about secrecy. Privacy is about that expectation of confidentiality. It's not about "well, you should encrypt that information, because if somebody gets it, it's no longer private." Privacy is about context, control, user choice, and respect for the user, ultimately taking us to making the consumer an active participant in how their data, their personal information, is actually being used. Because if they are an active participant, they have control. And if they have control, they have trust. And so these are the kinds of things, in my mind, that policies like the GDPR, all of these different things, are trying to push us towards. They're trying to change our way of thinking about how we deliver this software and how we deliver these services. And I think that now takes us to coffee. So thank you very much. Thank you.