I'm going to share with you some recent research work on measuring the performance of IAM from the perspective of the end organization, typically the private company that is implementing the IAM processes. So here is our agenda. First, I will cover a number of fundamental definitions around performance measurement, then share some preliminary research results. I would like to stress that there may be discrepancies between what I'm showing here and what will eventually be published in the final report, a few months from now, because these are still preliminary results. Part of this work was a survey that I conducted with IAM professionals, mainly IAM managers, around the world, so we will focus on the results and outcomes of this survey. And then finally we will close the session by looking at what we could do next and where this research could lead us.
Good. So if we want to measure the performance of IAM, obviously one of the key concepts that we need to be clear about is measurement itself. That may look like a very straightforward and trivial concept, but in fact it is much more complex than it looks. If you look at the history of measurement in science, it took centuries to develop the modern definitions of measurement that we are using today. Measurement is an empirical process, which means that we are interested in reality: not in how we would like reality to be, but in how reality actually is. The process of measurement is a process of assigning symbols to the phenomena and objects that we can observe in reality; most often, we use numbers as those symbols. One characteristic that is often sought in measurement is objectivity. Objectivity means that if two different observers try to measure the same object, they should arrive at the same result within the measurement error. And the measurement error is conditioned by your measurement methodology and your measurement instruments.
If we want to measure the performance of IAM, we also need to be clear about what performance means. Performance is broadly concerned with the effectiveness and efficiency of our past actions, but the definition of performance that I prefer is the degree to which we accomplish our goals. This is the definition of performance that I would like you to keep in mind throughout this presentation: the degree to which we are accomplishing our goals. And finally, if we want to measure the performance of IAM, we obviously need to clarify what IAM is. There are many definitions of IAM; we can see it from a multitude of perspectives. Very often we look at IAM from the perspective of the objects that it manipulates, which are the digital identities, the digital assets, and the access of those digital identities to the digital assets.
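To make that definition of performance concrete, here is a minimal sketch, not taken from the talk, of performance as a goal-attainment ratio. The goal, its target value, and the observed value are hypothetical examples of my own.

```python
# Illustrative sketch: performance as the degree to which a goal is
# accomplished, expressed as a simple attainment ratio capped at 100%.

def goal_attainment(actual: float, target: float) -> float:
    """Return the degree of goal accomplishment, capped at 100%."""
    if target == 0:
        raise ValueError("target must be non-zero")
    return min(actual / target, 1.0)

# Hypothetical IAM goal: deprovision 95% of leavers within 24 hours,
# against an observed rate of 88%.
target_rate = 0.95
observed_rate = 0.88
print(f"Goal attainment: {goal_attainment(observed_rate, target_rate):.0%}")
```

The cap at 1.0 reflects one possible design choice: overshooting a target does not count as extra performance against that goal.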
We often look at it from the complementary angle of the life cycle of these objects, which is often exemplified by the joiner-mover-leaver process, for example. Another perspective on IAM comes from the fact that many IAM professionals have an IT background, and they often look at IAM with a focus on IT systems or technology. This limited perspective should be complemented with the people and process perspectives, which are of course as important as IT systems, if not much more so. Finally, a different view of what IAM is comes from the perspective of the populations of identities being managed as part of the IAM activity. Here we may categorize these as workforce IAM, third-party or business-partner IAM, customer IAM, technical IAM, object IAM, and so on and so forth.
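The joiner-mover-leaver life cycle mentioned above can be sketched as a small state machine. The states and transitions below are the conventional ones; the class itself is purely illustrative, not part of the research.

```python
# A minimal sketch of the joiner-mover-leaver identity life cycle,
# modeled as a small state machine with explicit allowed transitions.
from enum import Enum

class IdentityState(Enum):
    JOINER = "joiner"   # identity created, access being provisioned
    ACTIVE = "active"   # identity in steady state
    MOVER = "mover"     # role change, access being re-evaluated
    LEAVER = "leaver"   # identity deprovisioned, terminal state

ALLOWED = {
    IdentityState.JOINER: {IdentityState.ACTIVE},
    IdentityState.ACTIVE: {IdentityState.MOVER, IdentityState.LEAVER},
    IdentityState.MOVER: {IdentityState.ACTIVE, IdentityState.LEAVER},
    IdentityState.LEAVER: set(),  # no way back after deprovisioning
}

def transition(current: IdentityState, new: IdentityState) -> IdentityState:
    """Move an identity to a new state, rejecting illegal transitions."""
    if new not in ALLOWED[current]:
        raise ValueError(f"illegal transition {current.value} -> {new.value}")
    return new
```

Making the leaver state terminal is one way to encode the control that deprovisioned identities must not silently regain access.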
But because we are concerned here with the measurement of IAM performance, another definition of IAM that we should look at is the one derived from its goals. What are the goals of IAM for an end organization? To answer that question, I conducted a review of both the academic and the industry literature and populated an inventory of typical goals for IAM. Depending on how we regroup these goals, we end up with roughly a dozen popular goals for IAM, some of which are very well known, such as information security, resilience, or compliance assurance, but also perhaps lesser-known goals, such as improving our customers' experience, accelerating the digital transformation of our organization, or improving the scalability of the management of our content. From all these goals, the question really becomes: what are your particular IAM goals in your organization? And this is one of the questions I asked IAM professionals as part of my research survey.
What we see on this chart, in the vertical dimension, is a number of typical goals for IAM. I asked IAM professionals to categorize these goals. Primary goals, in dark blue, are defined as those strategic IAM goals that are fully supported by top management, where you have most of your budget and resources invested. Then we have secondary goals, in light blue; these are goals that you are actively pursuing as well, but only as long as they do not conflict with the primary goals. What we see in gray are the nice-to-have goals: goals that you are not actively pursuing, but that you would work on if, by chance, the opportunity arose. And the last category is the "not a goal" category: these are the things that you are not going to work on, that are perhaps even not desirable for you.
What we see from this chart, without surprise, is that cybersecurity and compliance assurance are among the most popular goals for IAM. At the bottom of the list, we find reducing costs, strengthening trust with third parties, enabling the workforce to become global, or enabling new business opportunities. One question that we may ask ourselves is whether this priority is really the priority we would like to have for IAM. But another story that this chart is telling us is that IAM managers and organizations have too many primary goals for IAM. If you look at the management literature on strategy, it is clear that defining a strategy is not about picking goals; it is very easy to pick and select goals. The real difficulty in defining a strategy is making sacrifices, which means defining those goals that you are not going to pursue, and choosing just one, perhaps two, at most three key primary goals where you will make your investments. What this chart shows us is that most organizations have far too many goals assigned to IAM to make it a success.
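The discipline argued above, make sacrifices and keep at most three primary goals, can be sketched as a simple check over a goal classification. The goal names and the particular classification below are hypothetical; only the cap of three comes from the talk's recommendation.

```python
# Illustrative sketch: classify candidate IAM goals into the four survey
# categories and enforce the "at most three primary goals" discipline.
from collections import Counter

classification = {
    "information security": "primary",
    "compliance assurance": "primary",
    "customer experience": "secondary",
    "cost reduction": "nice-to-have",
    "workforce globalization": "not-a-goal",
}

counts = Counter(classification.values())
# Strategy means sacrifice: more than three primary goals is a smell.
assert counts["primary"] <= 3, "too many primary goals: make sacrifices"
```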
So where are all these goals coming from? Maybe part of the answer lies in the reporting line of IAM managers. What we see on this chart are the direct reporting lines of IAM managers. Most of them report directly to the CISO. A good portion of them report to the CIO, and some of them report to both, but only a minority of IAM managers report directly to the COO. Now, considering that IAM is such a transversal discipline, with a value proposition in the fields of customer experience, enabling business partnerships, and so on and so forth, we may ask ourselves whether this governance setup is right for IAM. A complementary research question that we should perhaps investigate is whether those organizations where IAM reports to the COO are better positioned to yield maximum value from IAM than those where IAM has the strong IT and security focus that we often see in our organizations.
I also asked IAM managers and professionals to assess the maturity of their IAM processes. The assumption here is that you will not measure the performance of IAM the same way depending on the maturity of your processes: if you have a chaotic process, you will not implement sophisticated performance measurements. I used a typical CMMI-like maturity model, ranging from the most mature level, optimized, in dark blue, then managed in light blue, defined in gray, repeatable in orange, and initial in red. What this chart tells us is that our most mature process, without surprise, is workforce IAM. Then comes technical IAM, where you will also find your PAM program. Then, only third, comes the customer IAM process, and finally third-party federation, and so on. Here, IoT or object IAM may not be statistically significant, so we shouldn't consider it as part of this conversation. Now, is that priority right for IAM: workforce, then technical, then only customer, and finally business partners?
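The CMMI-like scale used in the survey is an ordinal one, which a sketch can make explicit. The five level names match the talk; the process names and their assessed levels below are invented sample data, not survey results.

```python
# Illustrative sketch: the five-level CMMI-like maturity scale as an
# ordered list, with a helper that finds the least mature IAM process.
MATURITY = ["initial", "repeatable", "defined", "managed", "optimized"]

def maturity_level(name: str) -> int:
    """Map a maturity label to a 1-5 score (1 = initial, 5 = optimized)."""
    return MATURITY.index(name) + 1

# Hypothetical self-assessment of one organization's IAM processes:
processes = {
    "workforce IAM": "managed",
    "technical IAM": "defined",
    "customer IAM": "repeatable",
    "third-party federation": "initial",
}

# The process most in need of attention is the one with the lowest score.
least_mature = min(processes, key=lambda p: maturity_level(processes[p]))
print(f"least mature process: {least_mature}")
```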
This raises a question. As an IAM manager, if you were to ask your top management what should be the most mature IAM process in your organization, what do you think your CEO would answer? Well, my bet is that most CEOs would answer: first, put the priority on the customer, then enable our business partners, and finally work on the workforce and technical access management, PAM; well, perhaps the CEO doesn't even know what that means. I'm of course not saying here that we should not invest in PAM, which is a fundamental piece of our security, but what I'm raising as a question is whether that priority is the right one. Here, we see the results of the survey on how IAM performance indicators are designed in organizations. Basically, it shows that IAM managers and professionals are very satisfied with the way they design their performance indicators. They are deemed actionable, objective, mostly focused on the long term, clearly defined, mostly aligned with strategic goals, and so on and so forth. But is that picture too good to be true? Well, what's missing, and what would require some complementary research, is the perspective of top management on IAM indicators.
We don't see here the viewpoint of top management. We don't see the viewpoint of your chief compliance officer, your chief risk officer, or, for example, the business, which are of course critical perspectives on IAM performance. So this will require complementary research. This chart now shows us the view on the performance measurement process, and here the picture is more nuanced. IAM professionals do recognize some difficulty in assigning owners to performance indicators, for example; some of them also recognize difficulty in disclosing the limitations or imprecisions of the performance indicators that they use. Looking at the performance measurement process, a complementary perspective is the degree to which it has been automated. What this chart shows us is that only a minority of organizations have streamlined and automated the measurement of IAM performance. Most of us, to put it bluntly, are still fighting with Excel spreadsheets and scripts and macros here and there to report the performance of IAM.
If we look at how IAM performance indicators are communicated throughout the organization, and how they are understood, we see that they are generally very well communicated to and understood by those who manage IAM, which is a bit of a joke, because they design their own performance indicators. We see that, in general, these indicators are also well understood by the process workers of IAM, those doing the IAM work within the organization. But at the bottom of the list, we see that we may have more difficulty communicating these indicators to our top management and making them well understood.
Now, considering whether IAM performance indicators within organizations cover the whole activity of IAM, we see that we are pretty good at assessing per-process performance, such as the velocity and throughput of our typical IAM processes. We are also pretty good at assessing the deployment of technology within the organization, such as the scope of single sign-on, multi-factor authentication, or the other technologies and solutions that we deploy. But as a profession, we are less good at reporting and assessing the satisfaction of our internal and external clients. We are also less good at evaluating the residual risk of IAM after the application of typical IAM controls. And we are even less good at evaluating IAM costs. These are a number of areas where we should improve the measurement of our performance.
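To illustrate the per-process indicators we are already good at, here is a hedged sketch of velocity (mean time to fulfil an access request) and throughput (requests closed per day). The timestamps are invented sample data, and the two-day observation window is an assumption of the example.

```python
# Illustrative sketch: computing velocity and throughput indicators
# from (opened, closed) timestamps of access requests.
from datetime import datetime
from statistics import mean

# Invented sample data: three hypothetical access requests.
requests = [
    (datetime(2023, 5, 1, 9, 0), datetime(2023, 5, 1, 17, 0)),   # 8 h
    (datetime(2023, 5, 1, 10, 0), datetime(2023, 5, 2, 10, 0)),  # 24 h
    (datetime(2023, 5, 2, 8, 0), datetime(2023, 5, 2, 12, 0)),   # 4 h
]

# Velocity: mean fulfilment time in hours across all requests.
velocity_hours = mean((closed - opened).total_seconds() / 3600
                      for opened, closed in requests)

# Throughput: requests closed per day over the observation window.
days_observed = 2
throughput = len(requests) / days_observed

print(f"velocity: {velocity_hours:.1f} h/request")
print(f"throughput: {throughput:.1f} requests/day")
```

In practice these figures would come from the ticketing or IGA system rather than hand-written tuples, but the arithmetic is the same.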
So, as an initial conclusion, the key message I wanted to convey here is that the first thing to do when defining an IAM strategy, the prerequisite, is to make a number of sacrifices: not choosing all those goals that you could pursue, but instead choosing those goals that you are not going to pursue. This leads us to the definition of an IAM strategy, and from this strategy you can infer a number of primary goals for IAM within your organization. Once these primary goals are clearly defined, you can start your journey to measure the performance of IAM, which will be the measurement of the degree to which those primary goals are accomplished.
So what's next, and where is this research leading us? First, I need to complete this research project, and you can help me by filling in the online questionnaire about IAM performance measurement, for which you will find the link here. Then, as part of this project, I have collected IAM dashboards, typical IAM indicators, and past performance data from many colleagues around the world. My next objective will be to consolidate this information and make it publicly available in the form of a catalog of standardized performance indicators for IAM; this should be released in the coming weeks, or perhaps a few months from now. And finally, my ultimate goal for this whole project would be to facilitate and enable organizations to benchmark their IAM performance against their peers, in such a way as to give them visibility into their individual performance. So, thank you very much for having me here as part of this session. I hope that this was informative and perhaps helpful in reaching and accomplishing your primary IAM goals.