Everyone operates on the risk-reward continuum. It's true for CEOs, CMOs, CPOs, CIOs...and consumers. What does this mean for each of them in a digitally connected world, when the lines have blurred not only between organization A and organization Z, but also between cars and clouds, washing machines and webs, cradles and cybernets? With new consent regulations, standards, and tools on the scene, now is the time to think strategically about solutions that don't force awkward compromises when it comes to privacy, business growth, and consumer trust.
Another thing we can look at: I found a study from the Mobile Ecosystem Forum on consumer attitudes to the IoT, and it found some interesting results. Asked "What would concern you about a world of connected devices?", 62% cited privacy concerns and 54% cited security concerns, ranking even above machines taking over the earth at 21%. So trust issues were a big concern. Now, IoT wasn't a big thing before 2014, so it wouldn't have made sense to ask this in 2012 or so, but I don't think we would have seen such big numbers prior. So we can ask: what happens when businesses can't form trusted digital relationships with consumers? What's the risk to businesses of not being able to do that? Well, as we can see in the Spotify case, they could see revenue loss. They can certainly see brand damage, because people can take to social media and do a lot of damage very quickly. And overall there's loss of trust. Now, in a totally cash economy where you might have a single interaction with a customer, you might not see them again; you could really hurt them badly and off you go. Game-theoretically, that's one turn. But in a digital economy, where you're going to have an ongoing relationship with them, you have to think about the consequences of those multiple turns, and that can hurt a business.
In a moment I'll talk about missing out on opportunities, the upside potential you forgo when you can't form trusted digital relationships with customers. For now, should we consider compliance costs and penalties to be part of the downside risk? Of course. In the General Data Protection Regulation era, we can measure those costs very specifically: you might have to pay up to 4% of worldwide turnover, you have to put a data protection officer in place, and so on. But I don't know if we can attach those costs directly to the proposition of what we do for our customers, in the same way that we can't say that compliance with security regulations is actually about security.
So, okay: why enable personal data sharing? This is the reward part, for businesses and for consumers. I want to use Health Relationship Trust as an example. What is Health Relationship Trust? On the right there is the logo of HEART, the nickname for the Health Relationship Trust work group. It is a work group of the OpenID Foundation, and I happen to be one of the two co-chairs of this HEART group, where we're doing work on patient-centric, privacy-sensitive, RESTful health data sharing. Believe it or not, that's a thing now. It's very exciting.
One of the use cases we're working on right now in the HEART work group is called "Alice selectively shares health-related data with physicians and others." It makes a really good case for looking at why you would enable personal data sharing, and we've done some analysis on it. Among the flows we're looking at: one enables Alice to choose to share basic data about herself with a doctor before her visit; another looks at letting Alice monitor and control access to her data; there's a flow involving Alice sharing the list of her medications with her spouse; and one where Alice agrees to donate data to clinical research in a de-identified fashion. So, okay, why enable personal data sharing? One good reason: data quality and accuracy. Maybe you can actually get a degree in rocket surgery; data quality and accuracy make sense there. I found some data that is fascinating: one US study showed that there's currently only 5% agreement, in a particular patient set, between the medications listed in patients' electronic health records and what the patients actually take.
Ow. Yeah, I saw some good facial expressions out there. That's not really good if you have some bad drug interactions, or if you're going in for surgery and the record didn't catch the fact that you're taking warfarin or one of those other things that thin your blood. We need better accuracy, and going directly to the source of the data would help with that, versus a pale copy that goes stale. Another thing: improved clinical data for actual research, or for public health reasons for that matter. More data. This was a UK study: over half the respondents supported use of their data by commercial organizations for research if it could be shown that it was being used responsibly, and interestingly, only 17% of the respondents, but a hard 17%, didn't support use of their data at all.
And then finally, better care. There's a bunch of data around this, but one interesting study that Philips did with Banner Health found that patients with a particular chronic disease, using a smart device and an app, would tend to leverage continuously monitored vital signs, resulting in shorter, less expensive, less emergency-room-intensive stays. The savings averaged 10 days per year and $27,000 per year. Significant. Okay, so those are reasons why businesses, and individuals, and patients in that last case, would benefit. But why should businesses actually ensure that individuals have personal control of sharing that personal data? It's something people might like to have, but why should businesses actually go and do it? I've illustrated this with a couple of examples. Pretty recently, Apple released CareKit, which had a really interesting capability added: the ability to share data between people, between parties.
So for example, with your spouse. That's really great for the Apple ecosystem, if you're in it; but you're not always in the Apple ecosystem. Although I see a lot of people here are fanboys and fangirls, as I am, if you're outside the Apple ecosystem, as you sometimes might be, you might want to wear smart socks. Oh, has somebody heard me speak on this before? Yes, I do love my smart socks, not that I'm wearing them at the moment. So, okay, what's the deal with the smart socks, and why am I talking about them here? You can wear dumb socks if you want to; most people still do. What's the difference between dumb socks and smart socks? Smart socks generate data.
And in particular, smart things, Internet of Things things, connected things, generate data that consumers want to share, not just with an application that they themselves use. They want to share it with other people in their lives. That is a new use case that didn't really arise in urgent fashion in the web world or the purely mobile world; it's a new use case that needs to be solved. In other words, that's not just the fear end of the IoT continuum, like we just saw with the MEF data; that's a greed use case, or a reward use case, if you will.
Okay, so now we have to solve this with technology somehow. How bad is this need, and how dire is the situation in terms of solving all these wonderful risk and reward needs? ForgeRock actually did a survey that we published in mid-March, saying: okay, you've got these current methods, like opt-in checkboxes and cookie acknowledgements. I love traveling over here from Seattle, visiting any website, and clicking "Okay, got it" about a million times. So how ready are these technologies to solve all these needs? 9% of companies believe current methods can adapt. I think that's not a surprise, perhaps. But fear not, help is on the way.
The next generation of consent standards is actually riding to the rescue, and in fact it's been arriving for quite some time now. Let's walk through them. OAuth 2.0 has been around for some time; it innovated by letting us do consent withdrawal: you revoke the token. OpenID Connect leverages OAuth by adding portable identity. User-Managed Access (UMA) adds multi-party delegation, finer-grained withdrawal of consent, and the option for a central console. HEART, as I mentioned, profiles those first three, along with a health-specific API called FHIR, Fast Healthcare Interoperability Resources, for patient centricity. A newer one called Consent Receipts, coming from Kantara, is emerging now; it does what it says, and I think you can imagine what it means. And one called CommonAccord: think of it as being able to store legal boilerplate and, should I say, make it ready for the blockchain?
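To make the OAuth 2.0 "revoke the token" form of consent withdrawal concrete, here is a minimal sketch of how a client might assemble a token revocation request per RFC 7009. The endpoint URL, client credentials, and token value are all invented for illustration; nothing here comes from the talk.

```python
# Sketch of OAuth 2.0 token revocation (RFC 7009), the "revoke the token"
# form of consent withdrawal. All names and values are hypothetical.
from urllib.parse import urlencode

def build_revocation_request(token: str, client_id: str, client_secret: str) -> dict:
    """Build the pieces of an RFC 7009 revocation POST (not actually sent here)."""
    return {
        "url": "https://auth.example.com/oauth2/revoke",  # hypothetical AS endpoint
        "headers": {"Content-Type": "application/x-www-form-urlencoded"},
        "body": urlencode({
            "token": token,
            "token_type_hint": "access_token",  # or "refresh_token"
            "client_id": client_id,
            "client_secret": client_secret,
        }),
    }

req = build_revocation_request("example-token-123", "watch-app", "s3cret")
```

A real client would POST that body to the authorization server's revocation endpoint; a successful response means the token, and with it the user's consent for that client, is gone.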
I said the word "blockchain," and I'll just leave it there to make sure there's time to go through the rest. But I want you to get ready for the power of all of these together, and UMA is a bit of a linchpin for putting them all together. So I want to illustrate what a flow might look like for actually having personal control of sharing some consumer health data. Imagine that Bruce Wayne (that's my example) has bought a fitness watch and is looking at the data available from the watch in a mobile-friendly app. You're seeing four kinds of health data available to him as what OAuth and UMA would call a resource owner. He's able to manage the sharing of this data through what is, here, an integrated feature, because in this example the same organization made the watch and is also managing the sharing for him.
This is his "My Resources" panel, and you can see he's managing both the fitness watch and also a scale. He's got a Share button that he can use to start sharing different pieces of information about this watch; you would call them scopes if you know OAuth, as well as UMA. He's going to share with his doctor, Dr. McCoy, and he's got a choice of four different scopes. Those scopes are controlled by the manufacturer of the device, and in this case he's going to choose all four of them. This is kind of like Google Docs sharing, only standardized and available to all the different services that might be UMA-enabled. And here you can see that he has shared with three different people in his life, choosing different scopes for each: all four scopes in the doctor's case, but his trusty servant Alfred gets only three, because Alfred doesn't need to know that he doesn't get enough sleep, and Catwoman only two, because she doesn't need to know he's running to fat.
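The sharing screen described above boils down to a per-person scope grant. Here is a toy Python model of it; the scope names are invented for illustration, and a real UMA deployment would register resources and scopes with an authorization server rather than keep a dictionary in the app.

```python
# Toy model of per-person scope grants for a fitness watch.
# Scope names are hypothetical, not from the talk.
SCOPES = {"steps", "heart_rate", "sleep", "weight"}

policy = {}  # requesting party -> set of granted scopes

def share(with_whom: str, scopes: set) -> None:
    """Record a grant, rejecting scopes the device doesn't define."""
    unknown = scopes - SCOPES
    if unknown:
        raise ValueError(f"unknown scopes: {unknown}")
    policy[with_whom] = set(scopes)

def allowed(who: str, scope: str) -> bool:
    """Check whether a requesting party was granted a scope."""
    return scope in policy.get(who, set())

share("Dr. McCoy", {"steps", "heart_rate", "sleep", "weight"})  # all four
share("Alfred", {"steps", "heart_rate", "weight"})              # no sleep data
share("Catwoman", {"steps", "heart_rate"})                      # just two
```

The design point UMA standardizes is exactly this table: who may get which scopes of which resource, managed by the resource owner in one place.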
Then Dr. McCoy logs into his physician portal, a completely different client application, sees he's gotten a notification, and is able to go to the next step. And because he's Dr. McCoy, the UMA protocol is able to perform what is called trust elevation on him and recognize that he's Dr. McCoy, not Bruce. Bruce has not shared his password with him; Bruce has not emailed him a file. Dr. McCoy is able to see, in a different application, Bruce's information, gotten directly from Bruce's API. Now let's go back to being Bruce for a moment. Oh no, I guess I don't have time for that; I'll go directly to some conclusions. We know that the CMO and the CPO in a company can and must meet in the middle, so we can have a risk perspective and we can have a business perspective, and the twain have to meet.
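Going back to the Dr. McCoy step for a moment: the UMA grant described there, no token at first, then a permission ticket, then claims-based trust elevation at the authorization server, can be simulated in a few lines. This is a heavily simplified in-process sketch with invented names; real UMA 2.0 uses HTTPS endpoints, verifiable claims, and full policy evaluation.

```python
# In-process toy of the UMA grant flow: the resource server issues a
# permission ticket, the authorization server performs "trust elevation"
# by checking the requesting party's claims against the owner's policy,
# and only then issues an access token.
import secrets
from typing import Optional

tickets = {}  # permission ticket -> requested scope
tokens = {}   # access token -> (requesting party, scope)

def resource_server_challenge(scope: str) -> str:
    """Client arrived with no token: answer with a permission ticket
    (in real UMA, an HTTP 401 response carrying the ticket)."""
    ticket = secrets.token_hex(8)
    tickets[ticket] = scope
    return ticket

def authorization_server_token(ticket: str, claimed_party: str) -> Optional[str]:
    """Trade ticket plus claims for a token, if policy allows.
    Toy policy: only 'Dr. McCoy' is authorized."""
    scope = tickets.pop(ticket, None)
    if scope is None or claimed_party != "Dr. McCoy":
        return None
    token = secrets.token_hex(8)
    tokens[token] = (claimed_party, scope)
    return token

ticket = resource_server_challenge("heart_rate")
rpt = authorization_server_token(ticket, "Dr. McCoy")  # "requesting party token"
```

The point of the sketch is the talk's point: Bruce never shares a password or emails a file; the doctor's own identity claims are what unlock access.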
The CMO and the CPO have struggled to meet in the past. On the left I have some quotes from the General Data Protection Regulation, which has an outsized component in its worldview for consent; it's not just codifying current data protection practice, it's actually stepping up a role for consent. And on the right we have the worldview of what businesses believe, and what they're starting to believe: we know that personal data is a corporate asset, it's starting to be seen as a joint asset, and there are starting to be top-line business opportunities to mash up data.
So I want to leave you with some recommendations, some food for thought. Data protection has got to be only the beginning of this conversation; data protection is what I think of as a fetal-crouch version of privacy. Trustworthiness has to be a strategic business goal. I like to say that surprising and delighting customers with data sharing and control options has got to be a worthwhile endeavor. So at the end of the day, your best and most strategic option as a business has got to be not purely risk thinking; it's got to be about reward thinking, mutually, in forming trusted digital relationships with your customers and your end users. I want to thank you for your attention.
Thank you so much. I very much appreciate your second major thought, that ultimately trustworthiness must be the goal of the business, through digital means as well. And this is the problem we are actually facing, right? We are used to playing the trustworthiness card in the classical world, but now there are new ways. There are some questions I've seen.
This first one is funny. Ah: is Apple supporting UMA?
Not to my knowledge, although I imagine they have some similar flows. I mean, CareKit is an example of doing the right thing, and I applaud them for it.
Yeah. Next: what about the smart socks? What data do they collect, and what's the idea?
Those particular ones are made by a company called Sensoria, in Seattle, and the purpose is for runners, marathoners. I've also heard that Netflix was looking into smart socks that would detect when you're falling asleep during the movie you're watching, so they could stop it. There are multiple smart socks.
And where can you buy them?
Sensoria is the name; I encourage you to look into them. I met a guy once in the UK who was a customer of theirs.
I think related to that socks question is the next one as well: is every user of smart things ready, or even willing, to share the data with others and/or the public?
You know, I will say these are not social use cases; in fact, I don't think this is about social at all. There are smart CPAP machines, and if you're one of those people who uses a CPAP machine, you don't want to put that on Facebook, right? The purpose is maybe for your doctor, maybe for a spouse or a partner. I can see use cases where you might have a leaderboard at a gym where you might want to share that kind of data. So I think if you're not ready to share, you shouldn't share. Smart meter data isn't there to be shared with friends; it's maybe there to be shared if you want to save money on energy usage.
Okay, I think we have to stop here because of time. Thank you so much.
Thank you all.