Keynote at the Consumer Identity World 2017 EU in Paris, France
We begin with Ling, who is head of Foresight at AXA. Here is the microphone. The title is "How is a global insurance leader reconsidering its client relationships in light of new technologies?" I think it's a question not only insurance companies have; in the age of GDPR, every organization has it: what are my relationships with my clients, how do they have to change, and how will they change? We just need to wait for the slide to appear here.
Okay, here we go. And maybe we can also make the clock on the other screen work again. That would be perfect. Okay, it's your turn. Hi everyone. I'm going to start by talking about our relation to our customers. In the past, the relation that an insurer had with its customer was: you have a problem and you get some money. So the relation was kind of transactional; you get the payment. We were a payer.
Now we think that, more and more, with the data society, AI, and predictive analytics, we can do more and be a partner. We can have a prevention relation with the customer: not only be there when there is a problem, but maybe help before something occurs. It is also an emotional relation, because it is a relation to your body and your property, so it is not only about money being transferred to a bank account. With this in mind, we said that we need an ecosystem around data and AI that matches this position of being a partner.
First of all, we decided that we need to protect the data of our customers, but do much more than just protecting it, which is required by law, a compliance issue; we would take a stronger commitment. This is how we came to the commitment not to sell the personal data of our customers. We were the first in the financial sector to take this commitment, and it is a big decision, because selling personal data is a business for many, many fintechs and banks. So this was the first commitment, quite a strong one.
We decided to take it because people entrust us with a lot of personal data, and this is a trust relationship we want to build. So trust was really the key word. The second thing we decided to do was to create experiments such as My Life Online, which is here in the middle. The objective is to explain to a customer, with a very clear and easy website, what it means to be exposed online, what it means to protect your identity online, and what it means to perhaps be attacked.
So we started to work on new cyber and data risks, and not only for ourselves: partnering with the customer means explaining the risks to them so that we are both aware of what can happen when things go wrong. The third thing we did was to say: okay, we don't sell your personal data, we really make sure we protect it together with you, and then we can have an ecosystem where we create value for you based on your data.
This is the example of AXA Drive, an application that, when you drive, sees how you accelerate, how you brake, how you turn, and checks that you are a safe driver. The idea here is that a safe driver is good for society, good for you and your family in the car, and good for us as well. Here too, the idea is to be very clear on where the data is, where the data sits, and to be very committed to having a fair relation with the customer.
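As a rough sketch of the kind of telematics scoring such an app might perform (this is not AXA Drive's actual model; the thresholds, weights, and sample trip below are invented for illustration):

```python
# Toy telematics driving score; not AXA Drive's actual model.
# Assumes a list of (longitudinal_g, lateral_g) samples from the phone's accelerometer.

HARSH_BRAKE_G = -0.35   # invented threshold (g) for harsh braking
HARSH_ACCEL_G = 0.30    # invented threshold (g) for harsh acceleration
HARSH_TURN_G = 0.40     # invented threshold (g) for harsh cornering

def driving_score(samples):
    """Return a 0-100 score: start at 100, subtract points per harsh event."""
    score = 100.0
    for longitudinal_g, lateral_g in samples:
        if longitudinal_g <= HARSH_BRAKE_G:
            score -= 2.0          # harsh braking
        elif longitudinal_g >= HARSH_ACCEL_G:
            score -= 1.0          # harsh acceleration
        if abs(lateral_g) >= HARSH_TURN_G:
            score -= 1.5          # harsh cornering
    return max(score, 0.0)

trip = [(0.05, 0.02), (-0.40, 0.10), (0.10, 0.45), (0.02, 0.01)]
print(driving_score(trip))  # 96.5: one harsh brake, one harsh turn
```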
The second thing we said is: well, you know, algorithms sometimes seem very opaque and very difficult to understand. So we launched the co-construction of products, which we call citizen insurance, where we invited — of course we cannot invite every customer, but we invited associations of customers — to design new products with us. We also created a crowdsourcing platform where consumers can exchange and share about products with us, but also between customers. This was quite new for us, and it was really good, because we could really explain what it means today to not only go into an agency in Paris, but to interact with us on every channel.
At the same time, we developed our presence on Messenger, we developed a presence with apps and websites, and we also launched our first chatbots. But we said trust is also a relationship where you need transparency, so we created an experiment where we share all the feedback from customers online. If customers are not happy with us, it is not only we who can see it, but everyone. This is also to say that customer opinions are very important and we share them with the world.
There is also a lot of debate in Europe, and it is now starting to become international, that many big companies sit on a lot of data, that this data can have value outside those companies, and that you need to share that value. So we launched something called Give Data Back. It is actually live; you can try it. We realized, for instance — let's say Paris, because this conference is in Paris — that if you live in Paris, we have a lot of information on the types of water leakage or the types of burglary that could happen.
So we said that you can go online and enter not your address, because we want to respect your privacy, but the neighborhood where you live. Based on this, you can see how many hours pass between two burglaries and what types of burglary were carried out, so you know how you can protect yourself.
Was it, I don't know, mostly through the window or through the roof? You can also see the same for water leakage. And it is very important because, you know, in Paris, buildings in some areas have the same type of construction.
They date from the same period, so they will all have the same leakage at the same moment. Sharing this information, of course on an anonymized and transparent basis, could help create value for the customer. This is exactly the trust relationship that we are building.
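A rough sketch of the kind of neighborhood-level aggregation behind such a tool might look as follows (the claim records, field names, and function are invented for illustration; this is not the Give Data Back implementation):

```python
# Toy neighborhood aggregation of burglary claims; illustrative only.
from collections import defaultdict
from datetime import datetime

# Hypothetical claims: (neighborhood, timestamp, entry_point)
claims = [
    ("11e arrondissement", datetime(2017, 9, 1, 8), "window"),
    ("11e arrondissement", datetime(2017, 9, 3, 20), "roof"),
    ("11e arrondissement", datetime(2017, 9, 6, 2), "window"),
]

def neighborhood_summary(claims, neighborhood):
    """Average hours between burglaries and counts per entry point for one neighborhood."""
    events = sorted(c for c in claims if c[0] == neighborhood)
    gaps = [(b[1] - a[1]).total_seconds() / 3600 for a, b in zip(events, events[1:])]
    entry_points = defaultdict(int)
    for _, _, entry in events:
        entry_points[entry] += 1
    avg_gap = sum(gaps) / len(gaps) if gaps else None
    return {"avg_hours_between_burglaries": avg_gap, "entry_points": dict(entry_points)}

print(neighborhood_summary(claims, "11e arrondissement"))
# {'avg_hours_between_burglaries': 57.0, 'entry_points': {'window': 2, 'roof': 1}}
```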
But we are also very aware of the biases and the risks that we need to take into account when we move to an artificial intelligence world. I would like to give some examples. As an insurer, for instance, you are not allowed to use gender in the European Union when you price car insurance. The reason is that women were actually better drivers, so women were paying less. It was not discrimination, it was just based on their risk level, but the European Commission said that it was not okay and that women could not have such a better price than men.
So we had to change this by law, okay, we made the change by law. But then there is a question when you use AI and big data: you can delete the variable gender, for instance, but you still have the color of the car or many other variables that can be inferences of, or proxies for, something you have deleted from the database. So we are really working on how not to recreate, through those inferences, something that will not be compliant in the end.
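A minimal sketch of this proxy problem, on invented data (the features, records, and check below are illustrative, not AXA's compliance tooling): you can measure how well a remaining variable predicts the deleted protected attribute.

```python
# Toy proxy check: how well does a remaining variable predict a deleted protected attribute?
from collections import Counter, defaultdict

# Historical records where gender is still known (it will be dropped from the pricing model).
records = [
    {"gender": "F", "car_color": "red",   "car_segment": "city"},
    {"gender": "F", "car_color": "red",   "car_segment": "city"},
    {"gender": "F", "car_color": "black", "car_segment": "city"},
    {"gender": "M", "car_color": "black", "car_segment": "suv"},
    {"gender": "M", "car_color": "black", "car_segment": "suv"},
    {"gender": "M", "car_color": "red",   "car_segment": "suv"},
]

def proxy_accuracy(records, feature, protected="gender"):
    """Accuracy of predicting the protected attribute from one feature alone
    (majority class per feature value). High accuracy = likely proxy."""
    by_value = defaultdict(Counter)
    for r in records:
        by_value[r[feature]][r[protected]] += 1
    correct = sum(counter.most_common(1)[0][1] for counter in by_value.values())
    return correct / len(records)

for feature in ("car_color", "car_segment"):
    print(feature, round(proxy_accuracy(records, feature), 2))
# car_color 0.67   car_segment 1.0  -> car_segment acts as a near-perfect gender proxy here
```

The point is only that dropping a protected column does not remove the information it carried; correlated features can reintroduce it.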
The more we move to dynamic models, the more we need to control for the bias and the deviation of the model, not only when it is launched, but in the long run. We are talking about machine learning, and the evolution of the algorithm means that we really need to keep checking what is happening. So we are also working on that, and we are well aware that a lot of people in the literature are talking about black-box algorithms and the inability to understand or explain what is really happening.
So we are thinking about how to use explainability metrics and other tools so that we actually know and really control what is inside the model. We do not do all this work on our own: AXA is financing research on cyber risk and data risk, and we work with a lot of well-known researchers with whom we talk about those models, because we need to make sure that we import the best out of the new technology and not the worst.
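One common way to keep checking a model after launch, shown here as a small sketch on invented data, is to track input drift with a population stability index; the threshold and the example feature below are assumptions, not anything AXA described:

```python
# Toy drift check with the Population Stability Index (PSI); illustrative only.
import math

def psi(expected, actual, bins=10):
    """Compare a feature's distribution at training time vs. in production.
    Rule of thumb (an assumption, tune per use case): PSI > 0.2 suggests real drift."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(bins + 1)]
    edges[0], edges[-1] = float("-inf"), float("inf")  # catch values outside the training range

    def share(values, left, right):
        count = sum(left <= v < right for v in values)
        return max(count / len(values), 1e-6)  # avoid log(0) for empty bins

    return sum(
        (share(actual, l, r) - share(expected, l, r))
        * math.log(share(actual, l, r) / share(expected, l, r))
        for l, r in zip(edges, edges[1:])
    )

train_ages = [25, 30, 35, 40, 45, 50, 55, 60] * 10  # distribution seen at launch
live_ages  = [20, 22, 25, 27, 30, 32, 35, 38] * 10  # younger population in production
print(round(psi(train_ages, live_ages), 3))          # well above 0.2 -> investigate the model
```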
So let's go a little bit more now into what AI means for the customer relation. First of all, we don't think AI will replace humans. And we don't think that when people have their feet in water, or they have just had a very bad burglary, they want to hear "please press one if you have this, please press two." You are in an emotional state and you really want a human on the line. So we are thinking more about how you build a complementary system with AI and humans together, and how you empower humans to do a better job.
But it means that there are lots of shifts that will happen: shifts in the type of skills that people will need, in the type of working practices, in the type of customer relations. The question is how we manage this shift, and for us upskilling and reskilling are going to be important: both to have people upskilled for this data and cyber world, and to have people upskilled in empathy and many other things that will not easily be replaced by AI.
We just launched a partnership with Coursera; we are the first company to do this B2B partnership. It creates a platform dedicated to our employees with the best Coursera courses for them. They can also get certified and then change their learning curve in terms of what it means to interact with the customer in an AI-driven world.
Going back to the research funding we have: we created an AXA award on responsible AI, so we finance the best researchers who work on how to make sure that we encode fairness, encode accountability, encode all those kinds of values. We are also working with a chair in Toulouse on neuroeconomics, which is really about how you decide, in a process such as a notification process, what should be done by a bot and what should be done by a human, in which order, and how you shape the decision-making process. This is very important to manage this transition.
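A toy sketch of such a bot-versus-human routing decision might look as follows; the signals, thresholds, and claim types are invented for illustration and this is not AXA's actual process:

```python
# Toy triage for a claim notification: decide whether a bot or a human handles the first contact.
from dataclasses import dataclass

@dataclass
class Notification:
    claim_type: str        # e.g. "water_damage", "burglary", "windscreen"
    estimated_cost: float  # rough cost estimate in euros
    distress_score: float  # 0..1, e.g. from tone analysis of the first message

def route(n: Notification) -> str:
    """Return 'human' for emotionally or financially heavy cases, 'bot' for simple ones."""
    if n.distress_score >= 0.6:          # upset customer: always a human on the line
        return "human"
    if n.claim_type in {"burglary", "bodily_injury"}:
        return "human"
    if n.estimated_cost > 5_000:
        return "human"
    return "bot"                          # simple, low-stakes case: bot handles intake

print(route(Notification("windscreen", 300, 0.2)))   # bot
print(route(Notification("burglary", 1500, 0.8)))    # human
```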
And as I mentioned yesterday, we have created a data privacy advisory panel. This panel is made up of ten experts who are external to our company; they sit with us and we review all these topics on a regular basis. We also review concrete projects to make sure that we do it right.
This was the first part, to introduce our position and what we have done. I would like to talk a little bit about the new challenges and what is coming up in the future. The first thing that I see really changing is the MyData movement and the PIMS, the personal information management systems; I'm sure you are aware of them. It is very interesting for us to see that new players are emerging, putting the customer in the middle and making sure the customer is the center of the world.
So not a data-driven but maybe a human-driven world. But it is also very complex to move to this system, because sometimes you can have an illusion of information: just getting the data does not mean you can actually act on the data you have. We are very interested in data portability, you know, the new right in the GDPR, because it means that data is going to flow much more. But it is not only about getting the data; as I said, it is also about making sure that it is the right data with the right level of quality, that there is no cyber risk in it,
and that there is no misunderstanding when you transfer data from one company to another. So we are following quite closely all the PIMS and the MyData movement. We are very eager, because we think it creates value for the customer, and this is really what we want, but we also need to make sure we understand what the customer can really get out of it. There is a second big topic
we are monitoring: liability, which echoes the question from just before. We have a lot of data-intensive objects around us, and as an insurer we have more and more cases. Let me give you a concrete example.
You know: "I turned left because my car said, in five hundred meters, turn left," but actually it was not a street, it was stairs, because it was a very old city in Italy. And the person really turned and found the car stuck in the stairs, a big accident, actually. Then the question is: "I'm not responsible, okay, I was the driver, but the car told me left." So the question is who is actually responsible when the device gives advice that is actually dangerous for the person. Another example: you have an app and this app rings every day
because you are taking the pill and you don't want to have a baby, but when the app is hacked, or there is a problem, it doesn't ring and then you get pregnant. Okay? There are many, many examples. So the European Commission is working on what liability means in a data-intensive world. We are very interested in this conversation as an insurer, because we need to know who is liable when there is an accident. What we notice is that most of the time, to know who is liable, you need to have access to data, because you need to know what happened in this connected car
and what really triggered this message. And this is very difficult, because we live in a society where data is stored or used by some actors or by others. The question of data access will then be key to making sure we can establish liability. There is obviously also the question of digital identities.
Here there are lots of issues. First of all, there are lots of questions around fraud and people stealing one another's identities. So there is a lot of work to do on detecting fraud and making sure we avoid it. When people defraud the company, they also defraud all the other customers, because the price of fraud is paid by everyone in society. So there is this question of fake identities and stolen identities. There are also the questions relating to know your customer; we spend a lot of money on that.
It is a regulatory obligation, but we think that today — we started to mention it yesterday with the example of blockchain — you can have a blockchain system with different keys, public keys, private keys, and so on. So there might be ways to share some keys with regulators, with the public institutions controlling the system, to make it more convenient to use, to make it easier for the customer, and also to make it cost-efficient, both from the compliance and the privacy point of view.
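As a minimal illustration of what sharing keys with a regulator could mean in practice, here is a toy envelope-encryption sketch using the Python cryptography package; the record, the parties, and the flow are invented and not tied to any particular blockchain design:

```python
# Toy envelope encryption: a KYC record readable by the insurer and by a regulator,
# each holding its own key, without re-sharing the plaintext. Illustrative only.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Each party has its own key pair (in practice managed in an HSM or wallet).
insurer = rsa.generate_private_key(public_exponent=65537, key_size=2048)
regulator = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Encrypt the KYC record once with a fresh symmetric key...
record_key = Fernet.generate_key()
ciphertext = Fernet(record_key).encrypt(b'{"customer": "c-42", "kyc_status": "verified"}')

# ...then wrap that symmetric key separately for every party allowed to read it.
wrapped_for_insurer = insurer.public_key().encrypt(record_key, OAEP)
wrapped_for_regulator = regulator.public_key().encrypt(record_key, OAEP)

# The regulator unwraps its copy of the key and can audit the record on demand.
audited = Fernet(regulator.decrypt(wrapped_for_regulator, OAEP)).decrypt(ciphertext)
print(audited)  # b'{"customer": "c-42", "kyc_status": "verified"}'
```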
There are also a lot of issues around records that belong to the same person but have different names, or different ways they were entered into the system, and this is very complicated when you have a lot of duplication. Sometimes, when something happens, like a cyber attack on a service, you really need to know right now which customers are affected. It might be someone having a life insurance, someone having a health insurance, someone having a car insurance, and it is very, very complicated if you cannot make sure that there is interoperability in the identification of the person.
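A minimal sketch of the duplication problem, with invented records and a naive matching rule (real record-linkage systems are far more sophisticated than this):

```python
# Toy record linkage across product lines; illustrative only.
from difflib import SequenceMatcher

life =   [{"name": "Marie Dupont",  "birth": "1980-04-02"}]
health = [{"name": "M. Dupont",     "birth": "1980-04-02"}]
car =    [{"name": "Marie Dupond",  "birth": "1980-04-02"}]

def normalize(name):
    return "".join(ch for ch in name.lower() if ch.isalpha())

def same_person(a, b, threshold=0.75):
    """Heuristic: same birth date and similar enough normalized names."""
    if a["birth"] != b["birth"]:
        return False
    ratio = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
    return ratio >= threshold

# Find all records across products that likely belong to the customer in the life database.
target = life[0]
matches = [r for r in health + car if same_person(target, r)]
print(matches)  # both the health and the car record are linked to the same customer
```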
But I would like to say that all these questions around data and identities can also change the rules of the game. In the presentation just before, I heard about whether you have the right salary to get, for instance, a certain amount of money from a bank, those kinds of things. I am really monitoring many experiments about how big data and a data-driven world are changing the way you can lend money.
I would like to give the example of somebody fixing shoes in Barcelona, a very, very small, tiny shop, not entitled to a loan because, in the usual credit model and risk profile, he is not safe enough for the bank. But now, with data, they made an experiment: they looked at how far people come to the shop to fix their shoes. Okay, how far? And second, they looked at the frequency:
how often are people coming back to the shop? Entering this new data, which they can get thanks to, for instance, people paying by credit card, they decided that if the frequency was high enough and if people were travelling from far enough away, it really means the guy is good, he is an expert, and he is not going to go out of business. So they decided to change the criteria and actually try to lend money to those people. And it worked: they all paid back and there were no problems.
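A toy version of such a data-driven lending rule might look like this; the transactions, thresholds, and decision function are invented for illustration and are not the actual experiment:

```python
# Toy lending heuristic based on customer travel distance and repeat visits; illustrative only.
from collections import Counter

# Hypothetical card transactions at the shop: (customer_id, distance_travelled_km)
transactions = [
    ("c1", 12.0), ("c2", 3.5), ("c1", 12.0), ("c3", 25.0),
    ("c2", 3.5), ("c1", 12.0), ("c4", 18.0), ("c3", 25.0),
]

MIN_AVG_DISTANCE_KM = 10.0   # invented: customers travel far for this shop
MIN_REPEAT_SHARE = 0.5       # invented: at least half the visits are repeat visits

def approve_loan(transactions):
    """Approve if customers come from far away and keep coming back."""
    avg_distance = sum(d for _, d in transactions) / len(transactions)
    visits = Counter(cust for cust, _ in transactions)
    repeat_visits = sum(count - 1 for count in visits.values())
    repeat_share = repeat_visits / len(transactions)
    return avg_distance >= MIN_AVG_DISTANCE_KM and repeat_share >= MIN_REPEAT_SHARE

print(approve_loan(transactions))  # True for this invented shop
```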
What I really mean to say is that we need to pay attention to how we build the criteria, and maybe the criteria will change entirely due to data access. But again, it means that the access has to be reliable, provable, and cyber-secure; it needs to be a trusted relationship around why you use this data, so that people do not feel that someone claimed to use the data for one thing and actually used it for something else.
In other words, there is a lot of discussion now — and I would like to conclude on this — about the ethics of algorithms: can we have principles of fairness, liability, transparency, accountability? In France there is a consultation being run today about this, but there are also discussions on that topic at the European Data Protection Supervisor level. And if you look on the side of the people really coding the world and AI, there are also discussions and reports on this topic. So I think it is not only a topic for business people or for academics; it is a topic for us all.
And I think that it is really, really linked to what it means to be identified online and what it means to share or not share data. So with that, I would like to thank you for your attention, and I welcome maybe one question if we have the time. Thanks a lot. Thank you. And yeah, any questions on the presentation, or is it too early for questions?
What I am curious about is this: we had a talk before that discussed using a lot of identity providers. What is your approach on that: is it more your own identity, or is it something where you say you are open to them? We are considering the fact that it could be useful to have, a bit like Estonia, an ID that could be useful for public and private actors. We are working on it at the level of the federation, and we really think it could be the future, but again, to make that happen, it needs to be really, really well known.
So there is a long way to go, and I am the head of foresight, so I am talking about this from a long-term perspective. But yes, we really think that there needs to be some connection, and maybe having one ID will help us a lot, in particular, as I mentioned, when there is fraud, but also when there is a big event, when you really need to make sure that the people affected get their services at the right time. So we are monitoring what is happening and the experimentation around the world.
And we hope that we can make the best out of it in the European Union, which I think is a very interesting area to live in today, because we are really setting the stage for a new trusted relationship when it comes to data. Okay, thank you. And I believe that the question of the right identity approach is clearly one of the most interesting things today, because we have Estonia, which works perfectly well, we have Germany, where we don't talk about it, and we have the approach of Canada, which is very different again.
So I think we have a broad variety here, and it will be really interesting to see all the discussion around that, with blockchain, et cetera, and what happens over the next year. So thank you very much again.