As a long-term observer of the IAM market, KuppingerCole finds it interesting to see the change in both the size of investments and the type of investors in this market. Just recently, ForgeRock announced an $88 million Series D funding round. This follows other major investments in IAM vendors such as Okta, Ping Identity, and SailPoint, to name a few.
What is particularly interesting about the recent ForgeRock funding is that KKR, one of the very big names among investors, appears on the list. I find that telling because it means that IAM is now on the radar of a different type of investor, beyond the more specialized IT and Information Security investors we have primarily seen until now.
Obviously, such a major investment helps ForgeRock continue their growth, further expand their product offerings, and strengthen their market position. We will closely follow ForgeRock's plans and releases for their Identity Platform and keep you up to date.
I’ve been working in IT my whole life, and since I joined KuppingerCole over ten years ago, cybersecurity has been my job. Needless to say, I like my job: even though we industry analysts are not directly involved in forensic investigations or cyberthreat mitigation, staying up to date with the latest technological developments and sharing our expertise with both end users and security vendors is our daily life, which is challenging and exciting at the same time.
However, occasionally I have doubts about my career choice. Does anything I do even matter? The cybersecurity market is booming, predicted to reach nearly 250 billion USD within the next five years. But do we notice any downward trend in the number of security breaches or in financial losses due to cyberattacks? Not really…
The last time I had these thoughts was back in May, after the notorious WannaCry incident: just as hundreds of top experts were discussing the most highbrow cybersecurity problems at our European Identity and Cloud Conference, a primitive piece of malware exploiting a long-fixed vulnerability in the Windows operating system disrupted hundreds of thousands of computers around the world, affecting organizations from public hospitals to international telecom providers. How could this even have happened? All right, those poor underfunded and understaffed British hospitals at least have a (still questionable) excuse for not being able to maintain the most basic cybersecurity hygiene within their IT departments. But what excuse do large enterprises have for letting their users open phishing emails and for not having proper backups of their servers?
“But users do not care about their security or privacy,” people say. This couldn’t be further from the truth, though! People care very much about not being killed, so they arm themselves with guns. People care about their finances, so they do not keep their money under mattresses. And people surely care about their privacy, so they buy curtains and lock their doors. However, many people still do not realize that having an antivirus on their mobile phone is just as important for their financial stability, and sometimes even physical safety, as having a gun on their night table. And even those who are already aware of that are often sold security products like magical amulets that are supposed to solve their problems without any effort. But should users really be blamed for that?
With enterprises, the situation is often even worse. Apparently, a substantial percentage of security products purchased by companies never gets deployed at all. And more often than not, even the products that do get deployed are actively sabotaged by users who see them as a nuisance hindering their business productivity. Add the “shadow IT” problem into the mix, and you’ll realize that many companies spending millions on cybersecurity are not really getting any substantial return on their investments. This is a classic example of a cargo cult. Sometimes, after reading about another large-scale security breach, I cannot completely suppress a mental image of a firewall made out of a cardboard box, or a wooden backup appliance not connected to anything.
However, the exact reason for today’s rant is somewhat different and, in my opinion, even more troubling. While reading the documentation for a security-related product of one reputable vendor, I realized that it uses an external MySQL database to store its configuration. That got me thinking: a security product is sold with the promise of adding a layer of protection around an existing business application with known vulnerabilities. Yet this security product itself relies on another application with known vulnerabilities (MySQL isn’t exactly known for its security) to fulfill its basic functions. Is the resulting architecture even a tiny bit more secure? Not at all – due to the added complexity, it is in fact even more open to malicious attacks.
Unfortunately, this approach to secure software design is all too common. The notorious Heartbleed vulnerability in the OpenSSL cryptographic library affected millions of systems around the world back in 2014, and three years later at least 200,000 of them still have not been patched. Of course, software vendors have their reasons for not investing in the security of their products: after all, just like any other business, they are struggling to bring their products to market as quickly as possible, and often they have neither the budget nor enough qualified specialists to design a properly secured product.
Nowadays, this problem is especially evident in consumer IoT products, which definitely deserve a whole separate blog post. However, security vendors that fail to make their own products sufficiently secure pose an even greater danger: as I mentioned earlier, for many individuals and organizations, a cybersecurity product is the modern equivalent of a safe. Or an armored car. Or an insulin pump. How can we trust a security product that is in fact about as reliable as a safe with plywood walls?
Well, if you’ve read my past blog posts, you probably know that I’m a strong proponent of government regulation of cybersecurity. I know that this idea isn’t exactly popular among software vendors, but is there really a viable alternative? After all, gunsmiths or medical equipment manufacturers have been under strict government control for ages, and even security guards and private investigators must obtain licenses first. Why not security vendors? For modern digital businesses, the reliability of cybersecurity products is at least as important as pick resistance of their door locks.
Unfortunately, this kind of government regulation probably isn’t going to happen anytime soon, so companies looking for security solutions are still stuck with the “Caveat Emptor” principle. Without enough experience of their own to judge whether a particular product is really capable of fulfilling its declared functionality, one should, of course, turn to an independent third party for qualified advice. For example, to an analyst house like us :)
However, the next most useful thing to look for is probably certification according to government or industry standards. For example, when choosing an encryption solution, it’s wise to look for FIPS 140-2 certification at Level 2 or higher. There are appropriate security certifications for cloud service providers, financial institutions, industrial networks, etc.
In any case, do not take any vendor’s claims for granted. Ask for details regarding the architecture of their products, which security standards they implement, and whether they rely on open source libraries or third-party products. The more pressure you put on vendors regarding secure design, the higher the chances that in the future they will see security by design as a unique selling proposition rather than a waste of resources. And as always, when you don’t know where to start, just ask an expert!
A couple of days ago, DIACC (Digital ID & Authentication Council of Canada), together with IBM Canada and the Province of British Columbia, released information about a PoC (Proof of Concept) for moving corporate registrations to a blockchain-based register. The PoC, which used Hyperledger Fabric, covered corporate registries both within a single province and across multiple jurisdictions.
Such registries, be they corporate registries, land registers, or other types of decentralized ledgers, are the sweet spot for blockchains. Registration is decentralized. The registries and ledgers must be tamper-resistant. The data must be sequenced and, in many cases, time-stamped. All these are fundamental characteristics of blockchains. Simply said: it is far easier to construct solutions for such registries and ledgers based on blockchain technology than on traditional technologies such as relational databases.
Here, the use case and the solution fit perfectly well. Given that improving such registries and other ledgers can have a massive economic and societal impact, it also proves the value of blockchains well beyond cryptocurrencies. Blockchains are here to stay, and I expect to see a strong uptake of use cases around registers and ledgers in the next couple of years – moving beyond PoCs towards practical, widely deployed and used implementations. Honestly, every current investment in IT solutions for such registers should be revisited to evaluate whether stopping it and restarting based on blockchain technology isn’t the better choice nowadays. In most cases, it will turn out that blockchains are the better choice.
More than 12 years ago, the first EIC attracted an already surprisingly large number of practitioners dealing with directory services, user provisioning and single sign-on, as well as vendors, domain experts, thought leaders and analysts. I remember Dick Hardt giving an incredibly visionary keynote on "User-Centrism - The Solution to the Identity Crisis?" at EIC 2007 - a topic which is still highly relevant today. Or the legendary keynote panel back in 2008 on the question of whether there is a difference between the European way of doing IAM and that of the rest of the world, moderated by KuppingerCole's Senior Analyst Dave Kearns. In 2009, Kim Cameron of Microsoft gave a keynote on his Claims Based Model, which eventually came true, even if in an unexpected way... Or look at Eve Maler's keynote on "Care and Feeding of Online Relationships", an early and mature vision of Customer Identity. She held that keynote not in 2017 - it was already back in 2009! "Extending the Principles of Service-Oriented Security to Cloud Computing" was a remarkable keynote at EIC 2010, held by John Aisien, as was André Durand's "Identity in the Cloud - Finding Calm in the Storm".

Now, for the more recent EIC conferences from 2011 to 2017, let me give you a selection of my personal favorite keynotes, even if I would change this selection every time I watch videos from past EIC sessions:

- Doc Searls on "Free Customers - The New Platform", in which Doc actually gave a preview of GDPR before politicians started working on it (EIC 2012)
- Martin Kuppinger's opening keynote back in 2014 on the key trends we talk about today
- Patrick Parker's (EmpowerID) famous keynote on "IAM Meat and Potatoes Best Practices", describing what can go wrong in an IAM project and founding a great EIC tradition of keynotes with high practical relevance
- Amar Singh on "Heartbleed, NSA & Trust"
- Mia Harbitz on IAM, governance and forced migration at EIC 2016
- Dr. Emilio Mordini's great talk "In a Wilderness of Mirrors: Do We Still Need Trust in the Online World?"
- A great analyst/former analyst keynote panel on "Shaping the Future of Identity & Access Management" with Dan Blum, Gerry Gebel, Ian Glazer, Martin Kuppinger and Eve Maler, moderated by Prof. Dr. Sachar Paulus

EIC 2017 just blew my mind. So many great contributions make it impossible to choose. Maybe, to give you an idea, look at these three: William Mougayar on "State of Business in Blockchains", Richard Struse on "Let Them Chase Our Robots", and Joni Brennan on "Accelerating Canada's Digital ID Ecosystem Toward a More Trusted Global Digital Economy". Well, I can't help but add another one: Balázs Némethi of Taqanu Bank on "Financial Inclusion & Disenfranchised Identification". Ok, one more, because it is so relevant: Daniel Buchner (Microsoft) on "Blockchain-Anchored Identity: A Gateway to Decentralized Apps and Services".

Come and join EIC 2018 and enjoy another 120 hours of great and relevant content, with speakers from all around the world. Or propose your own talk through the call-for-speakers feature on this website.
See you in Munich!

Joerg Resch, Head of EIC Agenda
Guest post by Tim Maiorino, Counsel of Osborne Clarke
The newest EU legislation on data protection is the General Data Protection Regulation (GDPR), which will be enforceable from May 25th, 2018. It will bring several important changes, altering the requirements of data protection law in the European Union.
The GDPR will replace the EU Directive on Data Protection and, by extension, all transposing national regulation. The GDPR's objective is to harmonise data protection legislation across the EU and to “protect the fundamental rights of natural persons to the protection of their personal data”, while promoting the free movement of data within the EU.
An examination of the GDPR and its rules is unavoidable for any business handling personal information, as it provides a uniform standard for data protection throughout the EU and is directly applicable in all member states. Content-wise, German law has served as a model for the GDPR. Although it replaces most of the current data protection laws across the EU, the changes – though far-reaching – do not override the fundamental principles of the current regime. Rather, the GDPR preserves the basic principles while implementing stricter and more extensive rules.
The most significant changes concern the scope and applicability, data governance and the allocation of responsibilities, data subjects' rights (facilitation and expansion), and sanctions, which include heavy fines. The fines in case of non-compliance may be up to EUR 20 million or 4% of worldwide turnover in the previous financial year, whichever is higher.
Therefore, further action is required by businesses regarding the handling of personal data.
1. Interacting with Data Subjects
The GDPR establishes detailed requirements for both internal and external facing processes and policies, which can be divided into several steps.
Firstly, the internal processes should be identified and any possible information on the future use of personal data gathered.
Secondly, current policies and any external measures taken or information given with regard to the collection and use of personal data should be identified. You should be aware of the circumstances in which, and the point at which, data is collected, and of what information the data subject is given on the use of their data. It is also necessary to identify the method used to obtain the data subject's consent, where applicable, and the channels data subjects use to file access requests.
Hereafter, discrepancies between internal and external processes can be recognised. It is essential to validate that the external-facing policies match the internal processes and actual use of data. Controllers thus need to be aware of what the data subject (which includes staff, where their data is concerned) is told about how their data is used. When all internal and external processes and policies are known, they can be updated to comply with the detailed requirements of the GDPR. Only if you know your policies and processes will you be able to ascertain the necessary steps to meet the extended requirements and to provide guidance and training to your representatives and staff.
2. Managing Compliance
You might now consider this more trouble than it's worth, but there are several ways to facilitate compliance with the GDPR.
For example, there is the option (and sometimes obligation) to appoint a Data Protection Officer who will, among other things, monitor and work towards compliance with the GDPR. The contact details of the Data Protection Officer should be published and provided to the Data Protection Authorities.
Any data controller should undertake impact assessments and implement privacy by design as required. All existing processing operations should be identified and current record-keeping arrangements reviewed.
With regard to external policies, controllers can make use of industry codes, which may give their employees orientation for handling certain situations. To simplify matters, using templates for external notifications in case of data breaches may also be helpful.
3. Processors and Transfers
Generally, where data is used, it will also be transferred to third parties or processors. As this is regulated by the GDPR, controllers should be aware of and map their (international) data flow.
As requirements for data transfers change, standard form contracts and addenda need to be updated and/or prepared. Updates of procurement processes are also required, and procurement and IT teams need to be trained accordingly to identify potential issues. Moreover, it is always advisable for customers and suppliers to work together to address changes and potential issues, as well as to conduct customer and supplier audits to safeguard both parties' interests.
Although, essentially, it evolves the current data protection law, the GDPR brings several important changes to the obligations of data controllers and processors and to the corresponding rights of the data subjects.
Hence, not only are data protection and the requirements it sets out for businesses dealing with any type of personal information lifted to a significantly higher (meaning: more detailed, stricter and even more complex) level; the prominence of the GDPR also brings significantly more attention to the topic, and potential breaches are sanctioned more strictly than ever before.
Guest post by Christian Goy, Co-founder and Managing Director of Behavioral Science Lab
The world of customer journeys is a terrible mess. The linear path to purchase does not exist. “Predictable shopping patterns, once so fundamental to marketing and advertising strategy, have gone by the wayside. Persona- and demography driven strategies now fall short – the winners in this new era are the brand and retailers who’ve put a plan in place to meet actual shoppers anywhere along their path to purchase,” says BazaarVoice.
Even though marketers claim to understand, use and predict consumers’ shopping patterns with (1) web and mobile analytics, (2) social analytics, (3) media analytics, (4) customer journey analytics, or (5) voice of the customer analytics, most marketers still do not know why buyers bought their brand or their competitors’.
Take Jessica for example. She is looking for a new tote – “a gift for herself,” she would argue. She needs something bigger for her personal items, as well as what her kids need. She wants something that can help her stay organized, is durable, practical and looks good.
She starts her journey on Google search; moves quickly to Pinterest and Instagram; and then reviews a few product sites that sell the kind of bag she is looking for. Two items look promising. She reads a couple of customer reviews and, a week later, looks for them in her favorite department store. Unfortunately, the store does not carry either of the two products at the top of her list. So she goes back online and finds a retailer’s web site that shows a real person holding the tote she wants, along with photos of the bag being used to carry diapers. She can tell from the pictures that it will work well for her, and she buys the tote. According to BazaarVoice, that journey lasted thirteen days.
The problem is not a lack of understanding touch-points, channels used, time between channels and so forth, but rather a lack of understanding as to why Jessica chose the product she did. Current tools can do an incredible job aggregating the bits and pieces of what a person did — but not why. If marketers don’t understand why people choose a certain product, the path to purchase will always be of secondary value because it is only the means to an end, and not the factors that motivated its use in the first place. Without this basic knowledge, marketers can never put a strategy in place that delivers on what customers truly want, but only how they “get there.”
How Do We Solve That?
Human thinking is complex, and trying to make sense of behavior that often appears irrational is difficult. The reason Jessica uses so many channels to gather information is to find a product that can fulfill the utility she expects from her new tote.
This idea of expected utility or value is at the heart of behavioral economics. It assumes that the value of each product or service is determined by a very specific set of psychological and economic elements, which play a role in the buyer’s expectation of its value. The relative value of purchase options determines how much we pay for something and how we decide what to buy.
In Jessica’s case, we might assume she is looking for a tote bag that has certain attributes, such as a size that allows it to perform certain functions (economic), a price she can afford (economic), a look that complements hers (psychological), and perhaps the power to create recognition of her shopping savvy among her friends (psychological). Not only does she put these elements in a certain order; each element is also surrounded by a set of decision heuristics (rules of thumb she has developed for herself) that Jessica uses to evaluate which bag maximizes her expectation of utility.
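The expected-utility idea above can be sketched as a weighted sum over decision elements. This is only an illustration: the element names, weights, and scores below are invented for the example, and a real behavioral model would have to supply measured values.

```python
# Illustrative sketch: scoring purchase options by expected utility.
# Decision elements, weights, and scores are hypothetical placeholders.

def expected_utility(weights, scores):
    """Weighted sum of a buyer's decision-element scores for one option."""
    return sum(weights[element] * scores.get(element, 0.0)
               for element in weights)

# Jessica's (hypothetical) priorities: functionality first, then price, style.
weights = {"functionality": 0.4, "price": 0.3, "style": 0.2, "recognition": 0.1}

options = {
    "tote_a": {"functionality": 0.9, "price": 0.6, "style": 0.7, "recognition": 0.5},
    "tote_b": {"functionality": 0.5, "price": 0.9, "style": 0.8, "recognition": 0.4},
}

# The option with the highest expected utility is the predicted choice.
best = max(options, key=lambda o: expected_utility(weights, options[o]))
```

Under these made-up numbers, the functionality-heavy option wins even though the alternative is cheaper, which mirrors how Jessica's primary driver dominates her decision.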
How Do Marketers Create Utility?
To convince the Jessicas of the world that you have the perfect tote for them, marketers need to first learn what drives utility and then deliver on those elements. Here are some simple but important next steps to accomplish that:
Deeply understand what drives the expectation of your products’ utility — These are the psychological and economic decision elements used by the buyer to define utility.
Define buyers by how they make purchase decisions — Group buyers into segments with similar decision elements. This will allow marketers to be more effective and specific in their messaging and product offerings, because their customers’ specific needs will be addressed. Jessica’s decision type would have started with functionality, then price, style, size, and so on.
Create specific communications and channels to address buyers’ psychological and economic needs — Show the potential functionality buyer that one path is easier and more efficient than any other. This adds value to the search process, rather than just adding to the “work” required to find the “right” product. Fulfilling specific decision requirements through specific and individualized communication channels is the key. We have found in our studies that if you are able to only address and fulfill the primary driver in a buyer’s decision system, the likelihood of purchase is very high. Just imagine the likelihood of purchase if you could address the second and third drivers, as well.
Do not forget, the path to purchase starts in the buyer’s head; marketers should start there to understand how to sell more effectively.
Thanks to Dr. Tim Gohmann, Behavioral Science Lab Chief Science Officer and Ron Mundy, Chief Operations Officer for contributions to this publication.
It seems almost ironic, but the constantly growing number of legal and regulatory requirements might be the most important (and first actually working) catalyst for changing organizations’ attitude towards privacy. The true rationale behind this is most probably the substantial fines that come with several of these regulations, first and foremost the GDPR.
The value of customer data, from basic account data to detailed behavioural profiles, is undisputed. And whether information really is the new oil of the digitalized economy or such comparisons are misleading anyway: customer identity and access management (CIAM) is already a core discipline for almost any organization, and will become even more so.
Changing the attitude towards consumer and employee privacy does not necessarily mean that all those promising new business models built upon data analytics are prevented by design. But it surely means that all this data can be used for these extended purposes if and only if the data subject (consumer, employee, social user, prospective customer, etc.) gives permission. This user consent is something that companies relying on user data will increasingly need to earn.
The problem with trust is that it needs to be strategically grown over long periods of time, but as it is highly fragile it can be destroyed within a very short period of time. This might be through a data breach, which surely is one worst case scenario. But the mere assumption, maybe just a gut feeling or even hearsay that data might be reused or transferred inappropriately or inspected by (foreign) state authorities can immediately destroy trust and make users turn away and turn towards a competitor.
The real question is why many organizations have not yet started actively building this trusted relationship with their users/customers/consumers/employees. Awareness is rising: security and privacy are moving increasingly into the focus not only of tech-savvy users, but also of everyday customers.
Building up trust truly must be the foundation of any business strategy. Designing businesses to be privacy-aware from the ground up is the first and only starting point. This involves both well-thought-out business processes and appropriate technologies. Trustworthy storage and processing of personal data need to be well designed and well executed, and adequate evidence needs to be presented to many stakeholders, including the individual Data Protection Authorities and the users themselves.
Being more trusted and more trustworthy than your competitors will be a key differentiator for many customer decisions, today and in the future. And trusting users will be more readily willing to share relevant business information with an organization acting as a data steward, especially when it turns out to be to the benefit of all involved parties. But this must be based on well-executed consent management lifecycles.
KuppingerCole will embark on the Customer Identity World Tour 2017 with 3 major events in Seattle, Paris and Singapore. Trusted and privacy-aware management of customer data will be a main topic for all events. If you want to see your organization as a leader in customer trust, you might want to benefit from the thought leadership and best practices presented there, so join the discussion.
In a recent press release, IBM announced that they are moving security to a new level, with “pervasively encrypted data, all the time at any scale”. That sounded cool and, after talks with IBM, I must admit that it is cool. However, it is “only” on their IBM Z mainframe system, specifically the IBM Z14.
By massively increasing the encryption capabilities on the processor and through a system architecture that is designed from scratch to meet the highest security requirements, these systems can hold data encrypted at any time, with IBM claiming support of up to 12 billion encrypted transactions per day. Business data and applications running on IBM Z can be better protected than ever before – and better than on any other platform.
One could argue that this is happening in a system environment that is slowly dying. However, IBM in fact has won a significant number of new customers in the past year. Furthermore, while this is targeted as of now at mainframe customers, there is already one service that is accessible via the cloud: a blockchain service where secure containers for the blockchain are operated in the IBM Cloud in various datacenters across the globe.
It will be interesting to see whether and when IBM will make more of these pervasive encryption capabilities available as a cloud service, or in other forms, for organizations not running their own mainframes. The big challenge here obviously will be end-to-end security. If there is a highly secure mainframe-based backend environment, but applications access these services through secure APIs from less secure frontend environments, a security risk will remain. Unfortunately, other platforms don’t come with the same level of built-in security and encryption power as the new IBM Z mainframe.
Such a gap between what is available (or will be available soon) on the mainframe and what we find on other platforms is not new. Virtualization was available on the mainframe way before the age of VMware and other PC virtualization software started. Systems for dynamically authorizing requests at runtime such as RACF are the norm in mainframe environments, while the same approach and standards such as XACML are still struggling in the non-mainframe world.
With its new announcement, IBM on one hand again shows that many interesting capabilities are introduced on mainframes first, while also demonstrating a potential path into the future of mainframes: as the system that manages the highest security environments and maybe in future acts as the secure backend environment, accessible via the cloud. I’d love to see the latter.
Today, Adobe announced that Flash will go end-of-life. Without any doubt, this is great news from an Information Security perspective. Adobe Flash accounted for a significant portion of the most severe exploits, as F-Secure, among others, has analyzed. I also wrote about this topic back in 2012 in this blog.
From my perspective, and as stated in my post from 2012, the biggest challenge hasn’t been the number of vulnerabilities as such, but the combination of vulnerabilities with the inability to fix them quickly and the lack of a well-working patch management approach.
With the shift to standards such as HTML5, today’s announcement finally moves Adobe Flash into the state of a “living zombie” – and with vendors such as Apple and Microsoft either not supporting it or limiting its use, we are ready to switch to better alternatives. Notably, the effective end-of-life date is the end of 2020, and it will still be in use after that. But there will be an end.
Clearly, there are and will be other vulnerabilities in other operating systems, browsers, applications, and so on. They will not go away. But one of the worst tools ever from a security perspective is finally reaching its demise. That is good, and it makes today a great day for Information Security.
Authorization is one of the key concepts and processes involved in security, both in the real world as well as the digital world. Many formulations of the definition for authorization exist, and some are context dependent. For IT security purposes, we’ll say authorization is the act of evaluating whether a person, process, or device is allowed to operate on or possess a specific resource, such as data, a program, a computing device, or a cyberphysical object (e.g., a door, a gate, etc.).
The concept of authorization has evolved considerably over the last two decades. No longer must users be directly assigned entitlements to particular resources. Security administrators can provision groups of users or select attributes of users (e.g. employee, contractor of XYZ Corp, etc.) as determinants for access.
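The shift from directly assigned entitlements to attribute-based rules can be sketched in a few lines. This is a minimal illustration; the attribute names and the rule itself are invented for the example and do not come from any particular product.

```python
# Minimal sketch of attribute-based access control: the decision is made
# against attributes of the subject and resource, not a per-user
# entitlement list. Attribute names and the rule are hypothetical.

def is_allowed(subject, resource, action):
    """Example rule: employees may read any report; contractors of
    XYZ Corp may read only public reports."""
    if resource["type"] != "report" or action != "read":
        return False
    if subject.get("employment") == "employee":
        return True
    return (subject.get("employment") == "contractor"
            and subject.get("company") == "XYZ Corp"
            and resource.get("classification") == "public")

alice = {"employment": "employee"}
bob = {"employment": "contractor", "company": "XYZ Corp"}
report = {"type": "report", "classification": "internal"}
```

Note that neither alice nor bob appears anywhere in the rule: granting a new user access is purely a matter of assigning the right attributes.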
For some of the most advanced authorization and access control needs, the OASIS eXtensible Access Control Markup Language (XACML) standard can be utilized. Created in the mid-2000s, XACML is an example of an Attribute-Based Access Control (ABAC) methodology. XACML is an XML policy language, reference architecture, and request/response protocol. ABAC systems allow administrators to combine specific subject, resource, environmental, and action attributes for access control evaluation. XACML solutions facilitate run-time processing of dynamic and complex authorization scenarios. XACML can be somewhat difficult to deploy, given the complexity of some architectural components and the policy language. Within the last few years, JSON and REST profiles of XACML have been created to make it easier to integrate into modern line-of-business applications.
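To give a feel for the JSON Profile mentioned above, here is a sketch of what a XACML authorization request can look like when serialized as JSON. The attribute identifiers are shortened for readability; a real PDP typically expects full URN-style identifiers, so treat the exact shape as an approximation of the profile rather than a verbatim payload.

```python
# Sketch of a XACML request in the spirit of the JSON Profile:
# who (AccessSubject) wants to do what (Action) to which Resource.
# Attribute IDs are simplified placeholders, not full XACML URNs.
import json

xacml_request = {
    "Request": {
        "AccessSubject": {"Attribute": [
            {"AttributeId": "subject-id", "Value": "alice"},
            {"AttributeId": "role", "Value": "contractor"},
        ]},
        "Resource": {"Attribute": [
            {"AttributeId": "resource-id", "Value": "/records/42"},
        ]},
        "Action": {"Attribute": [
            {"AttributeId": "action-id", "Value": "read"},
        ]},
    }
}

# This JSON body would be POSTed to the PDP's REST endpoint, which
# answers with a decision such as Permit, Deny, or NotApplicable.
payload = json.dumps(xacml_request)
```

Compared to the XML encoding, this structure is far easier to emit from a modern line-of-business application, which is exactly what the JSON and REST profiles were created for.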
Just prior to the development of XACML, OASIS debuted the Security Assertion Markup Language (SAML). Numerous profiles of SAML exist, but the most common usage is identity federation. SAML assertions serve as proof of authentication at the domain of origin, which can be trusted by other domains. SAML can also facilitate authorization, in that other attributes about the subject can be added to the signed assertion. SAML is widely used for federated authentication and limited authorization purposes.
OAuth 2.0 is a lighter-weight IETF standard. It takes the access-token approach, passing tokens on behalf of authenticated and authorized users, processes, and now even devices. OAuth 2.0 now serves as a framework upon which additional standards are defined, such as OpenID Connect (OIDC) and User-Managed Access (UMA). OAuth has become a widely used standard across the web. For example, “social logins”, i.e. using a social network provider for authentication, generally pass OAuth tokens between authorization servers and relying-party sites to authorize the subject user. OAuth is a simpler alternative to XACML and SAML, but is also usually considered less secure.
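To show how lightweight the token approach is in practice, here is a minimal sketch of an OAuth 2.0 client-credentials token request. The client ID, secret, scope, and server URL are placeholder assumptions:

```python
import urllib.parse

# Sketch of the body of an OAuth 2.0 client-credentials token request.
# Client credentials, scope, and the token endpoint are hypothetical.
token_request = urllib.parse.urlencode({
    "grant_type": "client_credentials",
    "client_id": "my-app",        # hypothetical registered client
    "client_secret": "s3cret",    # never hard-code secrets in real code
    "scope": "accounts:read",
})

# The client POSTs this form body to the authorization server's token
# endpoint (e.g. https://auth.example.com/token, hypothetical) and gets
# back JSON like: {"access_token": "...", "token_type": "Bearer", ...}.
# Subsequent API calls then carry: Authorization: Bearer <access_token>
```

Note how little ceremony is involved relative to XACML or SAML: no XML, no signatures at the message level. That simplicity is precisely why OAuth spread so widely, and also why it is usually considered less secure than its heavier siblings.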
From an identity management perspective, authentication has received the lion’s share of attention over the last several years. The reasons for this are two-fold:
- the weakness of username/password authentication, which has led to many costly data breaches
- proliferation of new authenticators, including 2-factor (2FA), multi-factor (MFA), risk-adaptive techniques, and mobile biometrics
However, in 2017 we have noticed an uptick in industry interest in dynamic authorization technologies that can help meet complicated business and regulatory requirements. As authentication technologies improve and become more commonplace, we predict that more organizations with fine-grained access control needs will begin to look at dedicated authorization solutions. For an in-depth look at dynamic authorization, including guidelines and best practices for the different approaches, see the Advisory Note: Unifying RBAC and ABAC in a Dynamic Authorization Framework.
Organizations that operate in strictly regulated environments find that both MFA / risk-adaptive authentication and dynamic authorization are necessary to achieve compliance. Regulations often mandate 2FA / MFA, e.g. US HSPD-12, NIST 800-63-3, EU PSD2, etc. Regulations also occasionally stipulate that certain access-subject or business conditions, expressed as attributes, be met as a precursor to granting permission. In export regulations, for example, these attributes are commonly the access subject’s nationality or whether the subject’s company holds the required license.
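An export-control condition of the kind just described can be expressed as a simple attribute predicate. The country list and company name below are invented for illustration; a real deployment would source both from the applicable regulation and licensing records:

```python
# Illustrative export-control check: permission requires that attribute
# conditions hold (nationality on an allowed list, or the subject's
# company holding a license). All data here is made up for the example.
ALLOWED_NATIONALITIES = {"DE", "FR", "US"}
LICENSED_COMPANIES = {"XYZ Corp"}

def export_permitted(subject: dict) -> bool:
    """Return True if the subject's attributes satisfy either condition."""
    return (subject.get("nationality") in ALLOWED_NATIONALITIES
            or subject.get("company") in LICENSED_COMPANIES)
```

In a dynamic authorization deployment this predicate would live in a central policy (e.g. as a XACML rule), not in application code, so that it can be updated when the regulation changes.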
Authorization becomes extremely important at the API level. Consider PSD2: it will require banks and other financial institutions to expose APIs for 3rd party financial processors to utilize. These APIs will have tiered and firewalled access into core banking functions. Banks will of course require authentication from trusted 3rd party financial processors. Moreover, banks will no doubt enforce granular authorization on the use of each API call, per API consumer, and per account. The stakes are high with PSD2, as banks will need to compete more efficiently and protect themselves from a much greater risk of fraud.
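The granularity described above, per API call, per consumer, per account, can be sketched as a lookup over consent records. The processor IDs, operations, and account number are hypothetical, and a real bank would back this with a consent-management system rather than an in-memory table:

```python
# Sketch of granular, per-call API authorization as a PSD2-style bank
# might enforce it: each (consumer, operation, account) triple is checked
# against recorded consents. The data model and values are hypothetical.
consents = {
    # (third-party processor, API operation, account IBAN) -> allowed
    ("tpp-42", "GET /accounts/{id}/balances", "DE89370400440532013000"): True,
    ("tpp-42", "POST /payments", "DE89370400440532013000"): False,
}

def authorize_call(consumer: str, operation: str, account: str) -> bool:
    """Default-deny: only explicitly consented triples are permitted."""
    return consents.get((consumer, operation, account), False)
```

The default-deny stance matters here: an API call from an authenticated, trusted processor is still rejected unless the specific operation on the specific account has been consented to.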
For more information on authentication and authorization technologies, as well as guidance on preparing for PSD2, please visit the Focus Areas section of our website.
Register now for KuppingerCole Select and get your free 30-day access to a great selection of KuppingerCole research materials and to live trainings.