So, here we go: facing the post-GDPR reality. It's your turn. I like that T-shirt.
So folks, before we start, I gotta get a selfie for my mom. Smile and wave. Hello. Perfect. Thank you. Now, I have an advantage over most of the speakers here this week in that I'm British, so English is my native language. I also have a singular disadvantage in that I'm British and, you know, may not be here next year. Who knows? That may be a good thing, but you can decide that at the end of the session. So we are going to be talking today about GDPR. I always thought that last year was the GDPR year and that we could stop talking about it, but now we're talking about what's happened after GDPR, so it seems to me there is still some appetite to keep talking about it. As usual in this sort of thing, we want to spice it up a little bit. There won't be too much audience participation, though I may ask a few questions. What I'm trying to do is put a little bit of flesh and reality on what is really going on with GDPR right now. What is the impact it's having?
And I want to start with this quote. The reason is that I think it's fascinating that when our parents and our grandparents learned a skill, that was the skill they practised for the rest of their lives: farming, for instance, or being a blacksmith, or, more recently, being a banker. The half-life of their skills was about 26 years, so pretty much an entire career. Whereas now it's more like four and a half years. I mean, is there anybody here who can say that this time ten years ago they were doing exactly the same kind of work as they're doing today? Probably not many. You can put your hand up if you want. Ah, one person, there we go. There's always the exception that proves the rule.
Thank you. And I think it's important that we recognize this: that we live in a time that is changing so rapidly that regulations like GDPR will need to keep up. Everybody remember this date last year? Come on, a little bit of help here, folks. Who remembers this date last year? Yeah? Do you remember what happened? There'd been so much hype about GDPR that we woke up that morning expecting a post-apocalyptic regulators' world, with regulators shambling around like zombies moaning "braaains... logs... audits". Well, actually, there wasn't, right? It was a bit like the year 2000. Everybody was so focused on this date, convinced that the world was going to come to an end, the cats would lie down with the dogs, and other madness like that.
But actually it was just a normal day, right? And we've not seen a huge amount of activity, and I say that carefully, because obviously here in Germany, and in France, there has probably been more action from the regulators than anywhere else in Europe, and certainly around the world, especially given that the impact of GDPR is so global. But the reason we actually had GDPR in the first place is that the sheer variety of data is increasing exponentially. It's going through the roof. I don't have the exact quote here, but there was a standout slide showing how much data is being processed, something like: the amount of data we produced in the last year is the same as what we produced in the previous 200 years, and that's only going to get exponentially worse.
So we've got so many different types of data, especially when you think about what's produced by an IoT device: the data of when you switch your lights on, when your TV switches on, so much stuff. And we've also got a sheer volume of data as well; it's not just the different kinds, it's that there's so much of it, and it is everywhere. Every single thing across the world has data of some description associated with it, so: massive volume. And then we end up with massive velocity. The speed at which this data is being generated is just phenomenal. I remember when the term "data lake" first came out; it was so evocative. Surely we're now moving into the age of data oceans, or data water-planets, or whatever, because there is so much data out there, and it's coming at such speed, that we find it very difficult to cope. The net result is that everybody's struggling to keep up, even the Americans.
And we know that California is very much more progressive than most of the other states. They come up with much of their privacy regulation around the protection of consumers and individuals before anybody else. They came up with the California Consumer Privacy Act, the 23rd of September last year, and it comes into effect, I think, either January 1st next year or the year after. In some ways it's not quite as strong as GDPR, and in other ways it exceeds it. But surprisingly, what happened was that 13 other states followed suit within about six months, and those 13 states account for 40% of all consumers across the US. So in a matter of months they've gone from zero protection to actually quite significant protection, depending upon where they live.
And this, I think, is important, because a lot of countries are trying to play catch-up. They recognize the importance of data privacy, consumer protection, and so on. And like I said, even in a country that's not necessarily known for it, it's going to rise dramatically. It wouldn't surprise me if this map is out of date, if not by the end of this talk, then maybe by the end of next month. And it's very important. The reason it's important is because, as Satya says here, if we don't trust the technology that we're going to use, we're not going to use it. And we've seen this in many different industries as well, not just within the data protection industry. We've also seen it across the food industry.
There's something termed "conviction-based societal change", which is basically a way of saying: putting your money where your mouth is. Just to run an analogy there: in 2018, US food sales rose by only 2%. That's hardly anything, right? But sales of plant-based alternatives rose by 17%. People were willing to stop drinking cows' milk, or to stop eating meat products, and they were spending their money accordingly. So to a certain extent it's not surprising that we see a huge shift in the way companies, and countries, are responding to regulations like GDPR. They recognize that fundamentally it's all about the money; we need to follow the money.
And this is nowhere better put than at last year's CES conference in Las Vegas, where Apple ran a massive advert across the side of a hotel which didn't say anything about their product. There were no technology specs. They didn't say how good the camera was, how fast the processor was, or how cool it looked. They didn't even do one of those fancy little images and GIFs showing the light glancing off their glass objects. They purely said: our phone keeps your data private. That's a big move for any company, to focus purely on one issue during the largest technology showcase in the world, or certainly one of the largest. And if we look at an advert from a competitor from around the same time, not the same location, it's purely technology-based: "the camera's awesome" is what they're saying.
So you've got a complete distinction between the two approaches a company can take to its products. One is saying: trust us, come to us, and we will look after your data. The other one is saying: you can take really cool pictures with us. And I think we're going to be seeing many more of these examples coming through. Even within identity management, within any industry, any company that's going out to market with its product, there is going to have to be a statement along the lines of: we look after your data, you can trust us with what we do. That is a shift that has happened over the last few years, and it has been somewhat accelerated by GDPR. And so a lot of companies are like this. It's not supposed to be a mutant snail.
It's supposed to be an evolved snail, but you try searching for "evolved snail" on the internet; you don't come up with much. The point is that companies have to evolve. They have to change the way they fundamentally do business in order to continue to do business. And frankly, not many are managing it. Most companies are still on the ground, still doing the same thing, because let's face it, this is an incredibly complex subject. Go back to the variety, the volume and the velocity of data: we are trying to drink from the fire hose, as they say over in North America, trying to deal with a pace of change that is just immense. To the point where another research company, which I won't mention, actually reckons that by 2020, 40% of companies are going to be in direct violation of GDPR. They will be doing things with private data that they really should not be doing, and they may be justifying it to themselves because it's how they stay afloat, how they reach their business goals, how they meet their strategy. But that's quite scary, really: a fundamental piece of regulation that is supposed to protect a fundamental human right of yours may well be totally disregarded in the face of money.
Never seen that happen before.
But why is this so difficult? I think one of the key things we really forget is that privacy has huge cultural and generational differences. If I look at the way my kids want to sign up to YouTube and Snapchat and Instagram and, you know, insert other social media here, and give them their date of birth and their eye colour and all that sort of thing, it's frightening. Whereas I'm from a different generation. I had to register a company with Companies House in the UK just recently, and my accountant said he needed to know the colour of my eyes and my wife's eyes. And I just said, come on, don't be ridiculous, you're taking the mickey because you know what I do. He said, actually, no, it's a real requirement.
"We have to know the colour of your eyes." So I'm very cynical in that sense, but my kids? No, they can have the lot; they don't mind whatsoever. And then when we look at cultural differences, there was a piece of Carnegie Mellon research that compared US and Indian attitudes to privacy. Guess what they found: they were really different. Who would have thought it? Incredibly different. And you know that as well. If you do go to India, and anybody here been to India? Loads of people, I'm sure. Yeah, exactly. You will see shops, effectively on the high street, selling bootleg DVDs. Shops, not somebody walking around the local pub or bar with them under his coat trying to sell them to you, but shops, because the attitudes to privacy, to intellectual property, and so on, are very different.
That's not to say it's wrong. It's wrong by our standards, by US standards, by personal standards; it's not wrong there, it's completely different. And so it shouldn't surprise us when a piece of all-encompassing regulation is actually struggling to make a real impact. I say this next fact with a little bit of caution. I did a lot of research on it, well, 20 minutes on Google, but: in Japan, there is no word for privacy. There are plenty of words that describe what it is, but no singular word for it, because privacy in Japan is a very, very rare thing, so why would you refer to it very commonly? This cultural difference is having a huge effect on what we can do with a piece of regulation as it stands. And if, as Mr. Drucker says, culture eats strategy for breakfast, what is it going to do to regulation? Culture is going to eat regulation for breakfast. Culture is going to pave over it and start again. If a country loses the appetite to heavily enforce a piece of legislation or regulation because, culturally, it just doesn't fit, what's going to happen? Who's going to force them to do it? To a certain extent, it's out of our hands.
And then we get to the big blue one. Who here has a Facebook account? Loads of you. And who here, and it should be the same number of hands, knows that Facebook has all their data, and all their metadata, and all their friends' data? Right, we know this. But who here has the paid-for version of Facebook? Exactly. There is no paid-for version; it's free. And we all know the old adage: if it's free, you are the product. Convenience trumps privacy, trumps human rights, trumps security, every single time. We are happy to hand over troves of data about ourselves if it means we can laugh at pictures of cats; not a problem at all as far as we're concerned. How on earth does regulation force a hand against that? How does regulation address that? Because we are willingly handing data over, despite Facebook's TV adverts telling us how much they value our privacy, and we know it's going to be used when it shouldn't be.
And the best part of it is that business actually flies when you get access to personal data, right? You can customize, you can focus, you can do so much. There was that case in the US of the woman who was living with her parents, and they found out she was pregnant because of some nappies being posted to her from the supermarket, which had worked out, based on her buying choices, that she was pregnant. They now actually have to drip bad data into those kinds of situations so it doesn't look like they know everything about you. But that personalization makes a huge difference to how we can do business, and to the effectiveness and the richness of what we do.
And we have this unbreakable connection, according to an unnamed research company: if it's free, you're going to get mass adoption; if you've got mass adoption, then you are going to be targeted with adverts; and there will be privacy issues. End of story. If we are very happy to be consuming the free stuff, then we should expect and anticipate those privacy issues, and it doesn't matter how many little sticking plasters we put over this to try to stop things from happening. If we are handing our data over freely and willingly, and let's face it, we all knew we were doing it, there will always be privacy issues that we have to address. And why is regulation not always going to be the solution? I'll tell you why: the cobra effect. So, another story about India. Back in the days of the Raj, in a particular part of India, there was a massive cobra problem.
Cobras everywhere; an infestation. So the British government, in its infinite wisdom, because we make really good choices, as we know, when it comes to certain decisions, said: I know, we will pay you for every single cobra that you kill. So what happens? The Indian population starts to farm cobras, starts to make a killing. Then the Brits go, ah, you got us there, right? We're not going to pay you for cobras anymore. So what do the farmers do? Well, these aren't going to make us any profit; let them go. And the region ended up with an even worse cobra problem. So regulation is not always going to be the best move, because when it is a question of money, we are all of the same religion, right? We all want to do the same thing. And the only other option that is open to us when it comes to GDPR is what many news agencies and media outlets decided to do on May
the 25th: just go dark. Oh, I'm sorry, you look like you're coming from Europe; we don't deliver news to you. And that's a valid decision, but a very difficult one to face. So in summary, I would say these three things to you. One: fake it until you make it. I nearly said "you need to be compliant" there; nobody's compliant. You need to be in line with the regulations, and you're not going to be in line with the regulations, even now. Most of you are probably not in line with the regulations, but keep going at it. Two: if you can't look your customer in the eye and tell them what you're doing with their data without feeling ashamed, you're doing it wrong; your business model needs to change. And three, and this is a two-pronged one: don't confuse privacy with security, in the same way that you don't confuse compliance with security, because ultimately, like all regulations, this too shall pass. Possibly. Thank you, folks. I appreciate your attention.
Thank you, Tom. Great presentation. You're a little bit more negative regarding the potential effect of GDPR than I might be. I'm still hopeful; wait and see what happens.
I think, let's be honest about it, let's throw some opinions out there and generate the conversation.
Exactly, and wait and see what happens. So thank you, and hopefully you are able to enter mainland Europe again next year.
Well, if not, how difficult is it to get German citizenship? Just asking.
I think it's easier than the other way around.
Oh yeah, and it costs you money as well.
Okay. So thank you very much.