They were able to identify that, but that was the incentive versus the consequence, right? And I think there is a strong balance that needs to be struck. Again, many of us in the room are privacy professionals.
Hopefully I can bring a unique perspective as the marketing guy who says: at the end of the day, my sales team is killing me for leads right now, and that's really what I'm incented on. But how do we make sure that privacy and marketing work together, to the benefit of the consumer?
And I'm Pamela Dingle. I work in the CTO office at Ping Identity, and we really focus on the protocols and the tooling required to transmit identity data across boundaries.
We help people either provide that information or rely on that information in a way that is, hopefully provably, consistent, secure, and predictable, if you will.
Thank you.
Thank you, everyone. So, one question to kick off the discussion was triggered in my mind when you said, how do we balance this, Jason? You said, how do we balance privacy with marketing? And earlier I framed it as moving from cookies to informed consent.
How do we make that transition? One thing: after Tim had his talk this morning, I had a little Q&A with him, and we were following up on a question that had been asked by someone who experienced a sort of creepy incident on Facebook. What was it? It was jewelry I purchased for my wife, that then shows up in a Facebook ad later. Right? Yeah.
So the question I'd asked was, how could Facebook comply with GDPR? Could they just have an opt-in: hey, I'm going to do all kinds of analysis on you, and I'm going to share it with everybody, and you're going to get ads, and you click
yes, and it's okay? And Tim highlighted that, well, not quite: Facebook needed to be a bit more transparent about what kind of analytics it was going to do, and it also needed to get consent each time it shared the data with another company. So that seemed to be a fairly high bar for them to cross.
And it does imply that maybe you can't get that kind of blanket opt-in. So, from the privacy and marketing sides, what take would you have on that?
Well, I think it's actually interesting. As a marketer, I actually use Facebook's marketing tools. What you may or may not know is that it's actually privacy compliant, in that you don't know the audience you're actually targeting or marketing those ads to; it's algorithmically based. If you do retargeting, you upload your email list, and they tell you: we're serving ads, but we can't tell you to whom; you've gotten this many clickthroughs.
And when they come back through to your site, obviously at that point you see them. So Facebook would argue: you know what, we don't need consent, because we're actually not sharing the data with Nordstrom, in the example of the jewelry. And this is where the lines become a little bit gray, right? This is where you start off with the dumb cookie: Nordstrom has a dumb cookie, and they've uploaded that list into Facebook.
They've said: Tim, in that example, we're going to target you with those ads, but Nordstrom actually doesn't know that Tim's being targeted. I think that goes back to targeting or audience marketing, right? I'm marketing to an audience; I'm not marketing to an individual. And I think that's where the mentality needs to change, because certainly one of our mantras at Gigya is to move from unknown to known.
And most people start in an anonymous state.
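The matching mechanism Jason describes, where an advertiser uploads a customer list and the platform matches it against its own users without revealing who was reached, is typically done with hashed identifiers. A minimal sketch, assuming SHA-256 of a normalized email as the matching key (the exact normalization and hashing scheme varies by platform):

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Hash a normalized email so raw addresses never leave the advertiser."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Advertiser side: upload hashes, not email addresses.
customer_list = ["Tim@Example.com", "alice@example.com"]
uploaded = {normalize_and_hash(e) for e in customer_list}

# Platform side: compare against its own users' hashed emails.
platform_users = {normalize_and_hash(e): f"user-{i}"
                  for i, e in enumerate(["alice@example.com", "bob@example.com"])}

# The advertiser never sees which users matched, only aggregate results.
matched_audience = [uid for h, uid in platform_users.items() if h in uploaded]
print(len(matched_audience))
```

The key property is the asymmetry: the platform learns which of its users are in the advertiser's list, while the advertiser only ever learns aggregate numbers like audience size and clickthroughs.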
How many people in the room have seen that cookie consent bar pop up on a website? Did you think, oh my gosh, I'm getting out of here, I'm not going to click accept? Or did you just move past it? I don't think anybody has stopped browsing a website because of that cookie warning. So from my point of view, it's actually a really bad piece of legislation, because most people, with some exceptions, just go: this is a CYA thing, everybody's using cookies, and I'm going to continue.
Whereas I really, personally, like GDPR, because it accepts that consumers shouldn't have to trust companies: it is in my interest as a marketer to use every means possible to reach you as a consumer, right? It's in my best interest. That's what I'm KPI'd on. That's what I want to do.
And really, in the case of the EU, they are saying: you know what, it's the data subjects, the citizens, who should have control of that data and how they're communicated with, and we're going to legislate that.
So as a marketer, somebody whose whole job every day is to wake up and figure out ways to reach all of you and get you interested in my product, I think this is a case where legislating and mandating rights for each of you in the room, in terms of how companies communicate and use your information, is actually the right way to go. Because every incentive I have as a marketer, no matter how moral I think I am or how good a person I believe I am, every incentive in my job is to try to reach you by any means possible.
And I will say: hey, I'm using Facebook retargeting, it's anonymous, it's fine, it's good. But at the end of the day, it can still creep people out. Right?
So I think it's a...
The reason is that you can't be removed from that audience when you actually complete the purchase. People get angry that the mattress they bought keeps showing up in the ads, even though they already bought it and don't need another one. What they don't realize is that that's a privacy consequence: getting put in a large audience of people who looked at mattresses is privacy-respecting, but getting removed individually from that audience because you completed the purchase is not.
That's exactly right.
I'm going to be talking about this later in the afternoon, with ad tech. One of the consequences is that the brand, Nordstrom in this case, of course gets blamed for those annoying ads, and they have no control over it.
So be careful what you say, right? But the point is that with GDPR, we could actually go back to a situation where we have a page which is all about jewelry, and it's actually okay to put jewelry ads on there, because you're not using personal data. Just imagine, wouldn't that be great? We would get back to a predictable kind of world that we could all live with, rather than the creepy one.
That's right. And, not to drag Nordstrom's name through the mud.
I mean, many companies, lots of companies do this. Yeah.
But really, the marketer and the organization at Nordstrom, all the way up to the CEO, they know this marketing is happening, and they know that some customers are going to be creeped out by it. But on the other side, they're looking at the conversion rate and going: for every dollar we spend in that channel, we know we're going to return three, so I'm going to pump money into that channel all day long. A few people getting offended seems acceptable; as long as I have a conversion and an ROI, my incentive is to continue that type of marketing.
It seems to me that misses the broader picture, or it's a very short-term view. I think about it really differently.
I think about it as: look, you're trying to form a long-term relationship with this customer, right? The basis of that long-term relationship is trust, and creepiness is one of the fastest ways to destroy that trust. So if I'm the marketer at Nordstrom, I'm going to think really hard: I might get you on this purchase, but I may never get you again if we're creeping you out.
So as a company, and this is certainly the way we think about it at Microsoft, you've got to be super careful not to damage trust. The fundamental thing you're trying to build is a trusted relationship with the customer. You can do magical things for the customer that don't harm trust, but you have to avoid creepy, because then you're into destroying trust. And I think the way you do magical things for the customer is that you provide them with transparency as to what's going on.
So let me give you an example.
I think one of the coolest things we've done recently is with Outlook.com and my Expedia receipts. I go to Outlook.com, it comes up and says: hey, we found an Expedia receipt for this flight; would you like us to put this on your calendar and set a reminder, and should we always do this? That, I think, is awesome.
I'm like, okay. There's some magical thing happening for me, I can understand exactly what the value trade-off is, and it makes sense, because I know where the data came from: it got it from the email.
Oh, sweet, do it. Right? The creepy one is different. You go search for something on Amazon, or you search for something on, not Facebook, on Nordstrom or wherever.
Then you go to another site, and that exact thing you were looking for, or something super close to it, comes up. Then I think you're in creepy territory, right? Because there was nothing to inform me that this might happen, and it seems disconnected.
I don't even have any reason to think that the data from Nordstrom is somehow going to make it to Facebook. And I think that puts you in the space where you're harming customer trust, and the long-term business value you're forming from that relationship has now been put at risk. So part of what we need to do as organizations is put the right set of principles in place.
I love your suggestion about how to think about these things up front as part of the design: put the right set of principles in place, and make sure that our people are incented and thinking about the value of that long-term relationship, more than just the effectiveness of the current thing they're doing.
I'm not sure whether I wouldn't already be creeped out by Outlook reading the attachments of my emails and then asking whether it should continue to do so, but that might be the German in me speaking. Also, from my experience advising companies, many of them are just not interested in building a long-term relationship. It's more the attitude of: I don't care if I creep that guy out, he already bought.
If I can spend my money acquiring five new customers, instead of spending the same amount on getting this person to maybe buy another item in two weeks, they will spend it on acquiring new customers.
I'd like to compete with those companies, please. That'd be awesome.
And also, from my perception, the effect of feeling creeped out by being shown an ad on Facebook really disappears over time. You might be creeped out by Nordstrom for a day or two, but if you see the same ad again in two weeks, people would just start buying again.
That's just my personal view. We have a question back here in the middle.
So I'm curious, because part of the premise of this is a strategic perspective, right? Strategy is about how companies create and give value to their customers, and I think some of the conversation has focused on the short-term value that we can create. I enjoyed the point you made from the Microsoft perspective; I remember the days when Microsoft had different views in other departments. It was much more short-term.
And I'm curious: what are some of the great examples you've seen? Because I'm thinking to myself, this can potentially be more of a detractor than a source of strategic value. What are some examples in the industry where companies have really integrated this informed-consent idea with the value they create, such that they wouldn't be able to create that value without informed consent?
One of the biggest digital retailers in Europe is a company called ASOS; some of you may be familiar with them. Their target market is millennials: they're fashion retail for millennials. Given some of the comments earlier today, that generation really is digitally savvy; they understand what they're doing on the internet. And as a brand, ASOS really understands that every interaction with that consumer is an opportunity, right?
They don't look at it as: how do I minimize the checkbox information that I capture from this person, in the most innocuous way, to drive conversions? They look at it as an opportunity to communicate with their customers. So ASOS has actually taken a very proactive, forward-leaning stand.
With GDPR, every time you go to sign up or look at something, they're very clear on the communications channels, the opt-ins, the frequencies, which are all GDPR requirements, right? This isn't rocket science, but they're already there. And they are very focused on communicating how that information then translates into recommendations, targeted advertising, and other things, rather than shying away from it. Because I think Lisa's points on trust are so critical: you want to trust the brand you're dealing with.
So it's not so much that Nordstrom is retargeting me; it's that I didn't know they were going to do that, and I didn't appreciate it. ASOS, on the other hand, would say: you know what, we are going to do that, and we're going to be very upfront about it. Here's the checkbox that says we're going to do that and how we're going to use the information, and we're going to present you the dress you just purchased, the shoes, and everything else that matches it along the way. Right?
So they've been very upfront with the transparency that helps build trust, so they can avoid the ad blocking and all the other behaviors that these millennials display. We are managing over 67 million identities on their behalf, as a single organization, today, to build that transparency. That's one good example of a very forward-leaning company that knows its target market is ultra sensitive to this, and has put all of those requirements up front in its process.
What a cool example.
So I just want to remark on how fascinating this topic is, touching on ethics and behavior and marketing and all these things. I want to bring in the standards piece, since we're lucky to have Pam with us, and ask you, Pam: what do you see helping us move from dumb cookies to informed consent?
It's funny: when I think of dumb cookies, I don't actually think of the advertising cookies. I think of what has been the stateless nature of permissioning in our systems, right?
As we build systems that serve customers, there's been a ton of stuff going on in the standards world right now that changes this idea of what people agree to or don't agree to into not just a point in time where I do a database look-up and ask, did they check a box? It's a sum total of policies that involve actions taken over time.
The things you've consented to, the things you've ceased to consent to, the things you've done on one site and on another affiliated site: all of those things get evaluated in a moment, to come up with a much more context-rich vision of what this person wants and doesn't want.
Right.
We talk a lot about GDPR; I'm assuming that everybody here knows that it is an EU regulation, a data protection regulation, and that there are some major tenets of GDPR that push the technology in very specific ways.
So, for example, there's this whole concept of what did I consent to, and when I revoke it, how do I prove as a company that the user has requested that that consent be revoked? So that later, when they don't get something they thought they were going to get, you can tell them why.
From a standards perspective, there are things like consent receipts in Kantara, which are really this idea of creating a structured document to explain what's happened in a standardized way, so that if a person consents in one place, a receipt can be sent that is as readable as your passport, even if you're using different technology, different vendors, different everything else. That would be one example.
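As a sketch of the idea, a consent receipt is just a structured, portable record. The field names below are illustrative, loosely modeled on the Kantara consent receipt concept rather than copied from the spec, which defines its own exact schema:

```python
import json
import time
import uuid

def issue_consent_receipt(subject_id: str, controller: str,
                          purposes: list) -> str:
    """Build a standardized, machine-readable record of what was consented to."""
    receipt = {
        # Illustrative fields only; the Kantara spec defines the real schema.
        "receipt_id": str(uuid.uuid4()),       # unique, so it can be referenced later
        "timestamp": int(time.time()),          # when consent was given
        "pii_principal": subject_id,            # whose data this is about
        "pii_controller": controller,           # who collected the consent
        "purposes": purposes,                   # e.g. ["email-marketing"]
        "revocable": True,                      # revocation must be provable too
    }
    return json.dumps(receipt)

receipt_json = issue_consent_receipt("tim@example.com", "Nordstrom",
                                     ["email-marketing"])
record = json.loads(receipt_json)  # readable by any party, on any vendor stack
```

Because the record is a plain serialized document, it can be exchanged between sharing parties, or handed to the user as a copy, without either side needing the other's technology.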
I don't want to talk too long on it. Does anyone else want to comment on that?
Well, I will comment on the receipt. An interesting document, to be sure.
I was just looking at it the other day. Is it something that is more than just a consent receipt to the users themselves? Is it something that can be exchanged between different companies?
It is not a consent receipt to the user. It is a consent receipt between sharing parties. Yes.
Oh, Andrew, did you wanna come in?
We've got the editor of the spec here. That sounded a little like it was something we gave to the user.
The consent receipt specification is just a data format. You can, as a company, choose to give the user a copy if you want. But the working group that we're starting next, in September, is on requirements for systems that manage consent: the practices of organizations that want to manage consent actively. Clearly, one of the recommendations we're going to have is: use consent receipts. Yeah.
We're going to, yeah. Sorry, sorry, go ahead. We're going to do a couple of experiments this year with consent receipts in our consumer business.
I kind of believe that might actually be a way to increase trust: be super transparent, and essentially give you back a record that says exactly what we did for you. I think that might drive up trust with customers. So we're going to try it out and see what happens. I don't know how far it'll go; there are a lot of problems.
And we're looking for implementations and experiments because we need practices. We don't need to make up the practices.
And from a strategic perspective, back to that concept of the strategy that lays over top: if you look at what happens today, there are specifications like OAuth that do involve consent, right? There is a structure where a developer who's writing an app can request a specific scope of data. This is privacy by design at its absolute best: I want to be able to read your tweets, but I don't get any ability to write a tweet. That's a pretty fundamental data-minimization rule right there.
What we're seeing now is the ability to connect the dots. It's very easy to take a techy, tactical approach to those kinds of scope consents: I agree to this scope, and it lasts for as long as my token lasts.
And if my token dies, I'll just ask you again, and that's fine. But when you start thinking of it not as a stateless thing, but as a relationship over time, then the technology has to change to match that philosophical view.
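Pamela's OAuth example, an app asking only for read access, shows up concretely in the `scope` parameter of the authorization request. A minimal sketch; the endpoint URL, client ID, and scope names below are placeholders, since each provider defines its own:

```python
from urllib.parse import urlencode

# Placeholder authorization server; real providers publish their own endpoint.
AUTHORIZE_ENDPOINT = "https://auth.example.com/authorize"

def build_authorization_url(client_id: str, redirect_uri: str,
                            scopes: list) -> str:
    """Request only the scopes the app needs: data minimization by design."""
    params = {
        "response_type": "code",           # standard authorization-code flow
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),         # e.g. read-only, never write access
    }
    return AUTHORIZE_ENDPOINT + "?" + urlencode(params)

# The app asks to read tweets; it never receives the ability to write one.
url = build_authorization_url("demo-app", "https://app.example.com/cb",
                              ["tweets.read"])
```

The point Pamela makes is that this grant is tactically scoped but stateless: treating it as a relationship over time means the authorization server must also track revocations and changes, not just the one-time checkbox.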
I was just talking about this earlier with someone else. Say I grant an app, like a navigation app, permission to access my contact list, so that I can more easily pull up the name of a friend and navigate there, or request an Uber to get to their house.
How do I know that the navigation app and Uber aren't just going and downloading my contact list? Hold on a second.
You don't.
You don't, right?
What are we doing to solve that?
And by the way, I think the work that's being done is fabulous, and it's very important. We need to solve this before we get to the direct neural interfaces, because at that point, if we haven't got this, all privacy will be dead.
Well, speaking of the direct neural interface, there's a segue there to machine learning. And that reminds me of the discussion that Tim and I had this morning after your talk. This harks back to Facebook already implementing the solution we thought of: rather than sharing your jewelry preferences, your data, with their jewelry vendors, they could just share enough information that the jewelry vendors could push the ads back to them.
And we talked about some privacy consequences of that as well, but then you had a problem with that too, around the machine learning aspect of all this, right? Yeah. Because depending on the type and amount of data being used, there can always be a data protection implication.
The thing we discussed was this collection of non-personal information actually being used in this context. They would be in a position to gather a sufficient amount of non-personal information to form some kind of profile, which is then, as the European data protection authorities have expressed, considered personal information again.
So that will definitely not be a way around it.
And that becomes relevant whenever we try to use technology to find a way around it by using non-personal information. Targeting has to be efficient, and it will only be efficient if I'm able, at least to a certain degree, to identify the person, regardless of whether I use personal or non-personal information.
So the challenge is that, given a couple of points of information about a person, you can, at least with some low confidence, identify who they are. If you know the neighborhood, that they're purchasing diapers, and that they're 50 years old, all of which taken by themselves are not personally identifiable, that's enough to send a Terminator to kill seven people.
Heaven forbid. I think it's worth noting, though, just to go back to GDPR for a minute, that it's a little bit fuzzy on some of this machine learning and algorithmic learning.
And I think there's gonna be litigation in the EU over the coming years to try to settle what exactly the responsibilities are in some of these areas.
Yeah.
So I just want to point out that none of this is new; all of this existed even before big data systems. I was at...
Really enjoyed working there.
...Acxiom, a big data company in Arkansas, right near a place called Toad Suck. I'm not kidding: there was an annual festival called Toad Suck Daze.
Anyway, I was out there visiting them, maybe in '99, evangelizing XML and so on. They gave me this demo where they put in my address, and then they told me the demographics of my neighborhood. It was alarming, the degree to which they had narrowed it down: income levels, are they likely to own a boat, what's the ethnicity of the neighborhood, and so on. So they've been doing this for a long time; it's not a new thing. The difference is they can do it faster, and with larger sets of data.
But it isn't a matter of this being a new problem, to a certain extent.
And the thing about machine learning is that the algorithms are emergent from the learning itself. So how do you do informed consent for what's going to happen out of that? It's a challenge.
And I think there's a design answer. If the application locally accesses your contacts, proposes which friend's house you want to go to, and then just sends the final address to the cloud service to calculate which route is best, there is no leak, because the service only sees a single address. So we must think about solving this from the UX and UI perspective.
That sounds like the idea of putting more control and decision-making in the user's hands: it's happening on the device, with the app under the user's control. Make the deal clear, right? What are you consenting to, and what's the value I'm getting back? Make the deal clear. There might be something like "we're sending your context to the cloud" as a deal I will sign up for, but let me know what the deal is, and let me choose if I want the deal.
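The local-first design the questioner sketches, where the contact is picked on-device and only the final address crosses the boundary, fits in a few lines. This is a hypothetical sketch; `request_route` stands in for whatever cloud routing API would actually be called:

```python
# The contact list lives only on the device and is never uploaded.
CONTACTS = {
    "Alice": "1 Pike Pl, Seattle, WA",
    "Bob": "500 Main St, Bellevue, WA",
}

def pick_destination(name: str) -> str:
    """Resolve a friend's name to an address entirely on-device."""
    return CONTACTS[name]

def request_route(address: str) -> dict:
    """Stand-in for a cloud routing call: it sees one address, not the list."""
    # A real implementation would call the routing service here; the point is
    # that its only input is the single address the user already chose.
    return {"destination": address, "steps": ["(turn-by-turn directions)"]}

# Name resolution happens locally; the cloud learns only the chosen address.
route = request_route(pick_destination("Alice"))
```

This makes "the deal" legible in exactly the sense Lisa describes: the user can see that the only thing leaving the device is the one address they picked.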
Yeah, but there's no reason for them to do that right now.
I mean, how many times have you been offered to have an app scan your contact list? And has any single one of them said: we will not use this data for anything except finding this one thing that you want? I've never seen it. Right.
And to go back to the techy thing: that's an obligation, right? You could say, I will grant you access to my contact list, but you are under the obligation to discard all data except the data that I select.
I mean, it's technically possible. Is it societally acceptable?
You know, if it were societally unacceptable, then I assume there would be pressure for that to change.
But you're putting that on the consumer. Have you gone into your phone's privacy settings? I did, and I totally screwed it up, but I locked down all the privacy: I don't want this application to do that.
And it was interesting, because then I saw what they were doing and when they were doing it. But my mom doing that? Maybe a few millennials, but the non-techy ones? There is a huge part of our population that is not technical. Sure, opt in here, opt out there, but for them to have a full understanding, even in the best-case scenario where all this is implemented and works beautifully... Maybe I'll become a consumer data consultant, and I'll analyze your information and go through and block things.
I mean, maybe there'll be a whole industry, like financial consulting, right?
But I just can't picture how it's user-friendly or usable. I can't picture that.
The overreach is when I don't feel like I'm getting something back. Millennials will say: I won't use your app if you want to look at my photos.
It's a nice idea, but there are a lot of choices here, and that's what we're saying: there's a lot of convenience involved.
If you try to use Google Maps without it knowing your location, and you type in "coast hotel" or something, it gives you places in Hawaii instead of Seattle or Bellevue. That's not good: it's more work for you to get your directions, and you crash your car.
I have to copy a person's address and put it into the Google search; I can't just click on it anymore.
Yeah, but you have to, you know what I mean? It was an interesting experiment. Like I said, it kind of messed up the convenience factor, but that's just one phone. Yeah.
I think it's a great point, and it all comes back to: where's the onus, right? Is it on you as an individual to mandate and control and audit yourself? Or do you expect the government to step up and say, these are the standards by which we expect companies to abide, and we're going to enforce them on your behalf? In the case of GDPR, that's what happened. Otherwise the onus is back on the companies themselves, and it becomes very piecemeal, right?
And that's where I look at it: GDPR will have flaws, but at the end of the day, it's trying to put the consumer back in control, with clear steps and mandates that organizations need to follow. Right.
So I agree with you. I think right now in North America, certainly, it's the wild west: companies can ask for anything and keep it for as long as they want, and there doesn't seem to be any appetite, certainly in the US political system, to address this issue. Which means that, yes, your grandmother, your mother, your sister, your sister's daughter, your eight-year-old son have to fend for themselves.
And that is literally the environment we live in, because there is no will from a larger governmental body to step in.
And all the standards, all the voluntary things you want to do: at the end of the day, as a marketer, I'm like, that's all great, that's all well, but I know retargeting works.
You know,
It's actually the industry who's at fault here. It's the industry's inability to regulate themselves that's caused this. Surely regulation has to be the last resort, and my God, we've got there, simply because the ad industry has not been able to control itself.
I would say restrain themselves.
So should you be regulating the means, or should we be regulating the ends? Should we put a whole bunch of things in place to make sure that my boss can never, under any circumstances, learn that I'm gay? Or should we have a law that says you can't fire me because I'm gay?
Well, GDPR is trying to be technology neutral, but they can't imagine the future, and that's part of the problem. Opt-in is a phrase they use a lot: if you look at the ICO consent guidance, they talk about opt-in and privacy dashboards. What they can't imagine is an actual empowered method of sharing data, which is what healthcare would know as consent directives.
And I'm going to use the UMA phrase here, because I was trying to bring that into the discussion. We see it actually in things like Google Docs, which was never built for regulation; it was built for a business model. The rest of us have existence proofs of it. Regulation is maybe enabling us to try getting there, and the virality of GDPR may force some other people who aren't under that regime to get there, because the other guys are doing it. So maybe we'll see.
I think we're almost out of time here.
And I guess there's probably a reason why they walked in, to see if I was ever going to finish this; otherwise we could just continue through lunch until people opted out. But before I lose control of the situation completely, can we have a 30-second closing thought from each of you, maybe starting with Pamela on the left?
And then we'll wrap. Yeah. I like where we're going; I like the fact that we're starting to connect dots.
I don't know that GDPR is going to fix the world, because I think the chances of actually being GDPR compliant anytime in the next ten years are problematic, from an internal perspective. But I love that there's a conversation about it.
Yeah, I'll second that: the conversation, and also bridging different silos. A lot of people here would self-identify as identity professionals, which is great, but there's a whole marketing team,
a whole digital team out there, whose entire job and reason for being, like I said, is to connect with consumers; those are all of their incentives and all of their thought processes. And I loved that the gentleman brought Acxiom into the conversation. I think a lot of us would shudder to know our organizations are using DMPs and looking at audience segmentation and audience marketing. But guess what? Every incentive is there.
If I go to my CEO and say, look, we're doing this retargeting campaign, we're seeing this click-through rate and this conversion rate, can I have some more money? He'll say, hell yeah, here's a bunch more money.
Keep doing it until it sees diminishing returns, right? I think we need to start building bridges between the digital and digital marketing teams and the privacy teams, to get back to: hey, let's build that trusted relationship with our customers, and do it in a way that builds transparency.
And I think the conversation that's happening is the right one, and about time. For me, it's nice to hear from a German perspective that GDPR is not perceived as the end of the world in the US; that it's perceived as, to a certain extent, positive. As I said this morning in my talk, business will go on and life will go on.
I'm not sure whether that will only be because companies are afraid of fines and consequences, or whether the fact that such a highly interesting regulation is in place will lead to social change as well, both in terms of consumer awareness of the topic and awareness on the company side.
Maybe the fact that this topic is regulated the way it is will also lead to companies developing a strategy that is built more on trust, and less on quick short-term success and lead generation. That would probably be the outcome I would prefer most, rather than just having a clean regulation and advising clients on it.
I just want to echo that.
I hope every single company in this room heard the fear in some people's voices, and the confusion, as they talked about how to navigate privacy settings, and how to have user controls that my seventy-something mom, and my 98-year-old grandmother who, bless her, FaceTimes with my seven-year-old, can understand and use. There is a business opportunity here, I swear to God. Give my mother and my grandmother the list of the companies whose privacy policies are great: if you've got to go with someone, trust this company. There's a business case here.
And I think we should exploit it.
I agree with all that, but I'm going to confine myself to GDPR. I'm of the opinion that GDPR will be good for the world. As an executive at Microsoft, it's costing us unbelievable millions of dollars to get into compliance with it, and there are parts of it that are written in such specific ways that the end user is going to get some terrible user experiences. So there are all kinds of problems with it, but I think it is generally a net good: it will drive us in the right direction.
I'm actually kind of happy that we're doing it. And for us, for instance, American consumers are going to get a lot of the benefits from GDPR, because we're not going to build a different experience for American consumers than for European consumers.
They're all gonna get the GDPR benefits.
So in some ways I'm pretty convinced that this is a good thing for the world, and we just have to work through it, keep getting more and more sophisticated, and coach the European Union on how to improve the regulation so that it doesn't force us to offer up some really strange user experiences. We'll have another panel later this afternoon on user experience, and one tomorrow on survival of the fittest, so I hope to see you all there. And please give everyone on this panel a big hand. Thank you.