Blog posts by Martin Kuppinger

Successful IAM Projects Are Not Rocket Science – if You Do It Right

While we still regularly see and hear about IAM (Identity & Access Management) projects that don't deliver on expectations or are in trouble, we also see and hear about many projects that run well. There are some reasons for IAM projects being more complex than many other IT projects, first and foremost the fact that they are cross-system and cross-organization. IAM integrates a variety of source systems such as HR and target systems, from the mainframe to ERP applications, cloud services, directory services, and many others. IAM projects also must connect business and IT, with business people requesting access, defining business roles, and running recertifications.

In a new whitepaper by One Identity, we compiled both the experience of a number of experts from companies across different regions and industries and our own knowledge and experience, to provide concrete, focused recommendations on how to make your IAM project a success. Among the top recommendations is the need to set the expectations of stakeholders and sponsors right: don't promise what you can't deliver. Another major recommendation is splitting the IAM initiative or program into smaller chunks, which can be run successfully as targeted projects. It is also essential not to run IAM as a technology project only. IAM needs technology, but it needs more – the interaction with the business, well-defined processes, and well-thought-out models for roles and entitlements.

Don't miss this new whitepaper if you are already working on your IAM program or will have to do so in the future.

Consolidation in Privilege Management Market Continues: Bomgar Acquires Lieberman Software

Just two weeks after One Identity acquired Balabit, news spread about the next acquisition in this market segment: Bomgar acquires Lieberman Software. Both vendors have been active in this market. While Bomgar entered the market only a couple of years ago, building on a long history in Remote Control solutions, Lieberman Software is one of the Privilege Management veterans.

Looking at their portfolios, there is some functional overlap. However, while the strength of Bomgar comes from Session Management related to its Remote Control features, Lieberman Software is stronger in Shared Account Password Management and related capabilities. By joining forces, the two companies will be able to deliver strong capabilities in most areas of Privilege Management.

With this second merger in a row, the dynamics of the Privilege Management market are changing. Aside from the established leaders in the market, there are now two vendors about to bring strong combined offerings to the market. This will foster competition among the leaders, but also increase pressure on smaller vendors, which need to rethink their positioning and strategy to find their sweet spots in the market. From a customer perspective, however, more competition and more choice are always a good thing.

One Identity Acquires Balabit

Yesterday, One Identity announced that they have acquired Balabit, a company specialized in Privilege Management, headquartered in Luxembourg but with their main team located in Hungary. One Identity, a Quest Software business, counts amongst the leading vendors in the Identity Management market. Aside from their flagship product One Identity Manager, they deliver a number of other products, including Safeguard as their Privilege Management offering. Balabit, on the other hand, is a pure-play Privilege Management vendor, offering several products with particular strengths around Session Management and Privileged Behavior Analytics.

One Identity already has a technical integration with Balabit's Session Management product as part of their Safeguard offering. With the acquisition, One Identity gets direct access not only to one of the leading Session Management technologies, but also to Balabit's Privileged Behavior Analytics capabilities. Combined with the One Identity Safeguard capabilities, this results in a comprehensive Privilege Management offering, from Shared Account Password Management to Session Management and Privileged Behavior Analytics. Given that there is already some integration, we expect One Identity to progress fast on creating a fully integrated solution. Another advantage might arise from the fact that a significant portion of the One Identity Manager team is still based in Germany, geographically relatively close to Hungary.

The acquisition strengthens the position of One Identity in both the Privilege Management market and the overall Identity Management market. For Privilege Management, the combined portfolio and the expected close integration move One Identity into the group of market leaders, with respect to both the number of customers and technical capabilities. One Identity becomes a clear pick for every shortlist when evaluating vendors in this market segment.

When looking at the overall Identity Management market, One Identity improves its position as one of the vendors that cover all major areas of that market, with particular strengths in IGA (Identity Governance and Administration, i.e. Identity Provisioning and Access Governance) and Privilege Management, but also in Identity Federation and Cloud SSO, plus other capabilities such as cloud-based MFA (Multi-Factor Authentication). For companies that focus on single sourcing for Identity Management or at least one core supplier, One Identity becomes an even more interesting choice now.

The acquisition underpins the strategy that One Identity had announced after the split of Quest Software from Dell and the creation of One Identity as a separate business of Quest Software: playing a leading role in the overall Identity Management market as a vendor that covers all major areas of this market segment.

Microsoft Azure Confidential Computing – a Step Forward in Cloud Security

A few days ago, Microsoft announced Azure Confidential Computing. As the name implies, the technology is about adding a new layer of protection to cloud services, specifically Microsoft Azure, but also Windows 10 and Windows Server 2016 running in other public cloud infrastructures on specific hardware.

The foundation for Azure Confidential Computing is formed by so-called TEEs (Trusted Execution Environments). Such environments protect the code running in them, and the data used by that code, from access by other parties. Neither administrators, nor people with direct access to the hardware, nor attackers who gain access to administrator accounts can bypass that protection layer – at least, this is what the TEE concept promises.

Based on TEEs, data can be held encrypted in the cloud services and their data stores and is only decrypted and processed within the TEE. This does not mean data is encrypted at literally every moment, but it remains – if the application is implemented correctly – encrypted in all areas of the public cloud that are accessible from outside the TEE.
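As a purely conceptual illustration of that flow, the following Python sketch simulates the pattern with a class standing in for the enclave; the names, the symmetric-key handling, and the SimulatedEnclave boundary are my own simplifications, not the actual Azure, SGX, or Virtual Secure Mode APIs, which additionally rely on hardware or hypervisor isolation and remote attestation.

```python
# Illustrative sketch only: the enclave boundary is simulated by a class,
# not a real TEE. Real implementations rely on hardware (Intel SGX) or
# hypervisor (Virtual Secure Mode) isolation plus remote attestation.
from cryptography.fernet import Fernet


class SimulatedEnclave:
    """Stands in for a TEE: the key and the plaintext never leave this object."""

    def __init__(self, key: bytes):
        # In a real deployment, the key would be released to the enclave
        # only after successful attestation; the cloud provider never sees it.
        self._cipher = Fernet(key)

    def process(self, ciphertext: bytes) -> bytes:
        record = self._cipher.decrypt(ciphertext)   # decrypted only inside the "TEE"
        result = record.upper()                     # placeholder for the sensitive computation
        return self._cipher.encrypt(result)         # re-encrypted before leaving the "TEE"


key = Fernet.generate_key()                 # held by the data owner, shared with the enclave only
enclave = SimulatedEnclave(key)

# Outside the TEE (cloud storage, queues, administrators) only ciphertext is visible.
stored = Fernet(key).encrypt(b"account=4711;balance=1200")
print(enclave.process(stored))              # still ciphertext from the outside point of view
```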

For now, there are two supported TEEs. One is Virtual Secure Mode, a software-based TEE that is based on Microsoft Hyper-V in Windows 10 and Windows Server 2016. The other is Intel SGX (Software Guard Extensions), which is a hardware-based TEE. Based on Intel SGX, secure TEEs can be used outside of the Microsoft Azure Cloud.

Microsoft has already been using such technologies for some weeks as part of their Coco Framework for enterprise blockchain networks, and is now extending support to Microsoft SQL Server and Azure SQL Database. This is achieved by delegating computations on sensitive data to an "enclave", which is based on a TEE. Azure Confidential Computing, however, supports broader use of this capability for various types of data.

Microsoft Azure Confidential Computing, which is available in an early adopters program, is a great improvement for security and confidentiality in public cloud environments and will enable customers to port workloads to the cloud which have so far been considered too sensitive. The announcement stands in line with the recent IBM announcement of their IBM Z14 systems, where effectively the entire system acts as a TEE. While the use of TEEs in Azure Confidential Computing is restricted to the parts of an application that are specifically moved into the TEE, both announcements are about significantly increasing the level of security in computing. That is good news.

Changes in the Scope of Investors for IAM

As a long-term observer of the IAM market, KuppingerCole finds it interesting to see the change in both the size of investments and the type of investors in this market. Just recently, ForgeRock announced an $88 million Series D funding round. This follows other major investments in IAM vendors such as Okta, Ping Identity, and SailPoint, to name a few.

What is interesting about the recent funding for ForgeRock is that KKR, one of the very big names amongst investors, appears on the list. I found that particularly telling because it means that IAM is now on the radar of a different type of investor, beyond the more specialized IT and Information Security investors we have primarily seen until now.

Obviously, such a major investment helps ForgeRock to continue their growth, further expand their product offerings, and strengthen their market position. We will closely follow ForgeRock's plans and releases for their Identity Platform and keep you up to date.

The Sweet Spot for Blockchains: Registries

A couple of days ago, the DIACC (Digital ID & Authentication Council of Canada), together with IBM Canada and the Province of British Columbia, released information about a PoC (Proof of Concept) for moving corporate registrations to a blockchain-based register. The PoC, which used Hyperledger Fabric, covered both the corporate registry of a single province and registries spanning multiple jurisdictions.

Such registries, be they corporate registries, land registers, or other types of decentralized ledgers, are the sweet spot for blockchains. Registration is decentralized. The registries and ledgers must be tamper-resistant. The data must be sequenced and, in many cases, time-stamped. All of these are fundamental characteristics of blockchains. Simply put: it is far easier to construct solutions for such registries and ledgers based on blockchain technology than on traditional technologies such as relational databases.
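To make those characteristics tangible, here is a minimal Python sketch of an append-only, hash-chained registry; it illustrates sequencing, time-stamping, and tamper-evidence only, and deliberately omits the distributed consensus and access control that a real platform such as Hyperledger Fabric provides. All class and field names are illustrative assumptions.

```python
# Minimal illustration of a hash-chained, time-stamped registry.
# Real systems (e.g. Hyperledger Fabric) add distributed consensus,
# identity, and access control on top of this basic tamper-evidence idea.
import hashlib
import json
import time


def _hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()


class Registry:
    def __init__(self):
        self.chain = []

    def register(self, record: dict) -> dict:
        entry = {
            "seq": len(self.chain),                                   # strict sequencing
            "timestamp": time.time(),                                 # time-stamping
            "record": record,
            "prev_hash": self.chain[-1]["hash"] if self.chain else None,
        }
        entry["hash"] = _hash(entry)
        self.chain.append(entry)
        return entry

    def verify(self) -> bool:
        """Any modification of an earlier entry breaks the hash chain."""
        for i, entry in enumerate(self.chain):
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["hash"] != _hash(body):
                return False
            if i > 0 and entry["prev_hash"] != self.chain[i - 1]["hash"]:
                return False
        return True


reg = Registry()
reg.register({"company": "Example Corp", "jurisdiction": "BC"})
reg.register({"company": "Example Corp", "change": "new director"})
print(reg.verify())   # True until someone tampers with an earlier entry
```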

Here, the use case and the solution fit perfectly well. Given that improving such registries and other ledgers can have a massive economic and societal impact, it also proves the value of blockchains well beyond cryptocurrencies. Blockchains are here to stay, and I expect to see a strong uptake of use cases around registers and ledgers in the next couple of years – moving beyond PoCs towards practical, widely deployed and used implementations. Honestly, every current investment in traditional IT solutions for such registers should be revisited, evaluating whether stopping it and restarting based on blockchains isn't the better choice nowadays. In most cases, it will turn out that blockchains are the better choice.

IBM Moves Security to the Next Level – on the Mainframe

In a recent press release, IBM announced that they are moving security to a new level, with “pervasively encrypted data, all the time at any scale”. That sounded cool and, after talks with IBM, I must admit that it is cool. However, it is “only” on their IBM Z mainframe system, specifically the IBM Z14.

By massively increasing the encryption capabilities on the processor and through a system architecture that is designed from scratch to meet the highest security requirements, these systems can hold data encrypted at any time, with IBM claiming support of up to 12 billion encrypted transactions per day. Business data and applications running on IBM Z can be better protected than ever before – and better than on any other platform.

One could argue that this is happening in a system environment that is slowly dying. However, IBM has in fact won a significant number of new customers in the past year. Furthermore, while this is currently targeted at mainframe customers, there is already one service that is accessible via the cloud: a blockchain service where secure containers for the blockchain are operated in the IBM Cloud in various datacenters across the globe.

It will be interesting to see whether and when IBM will make more of these pervasive encryption capabilities available as a cloud service or in other forms for organizations not running their own mainframes. The big challenge here will obviously be end-to-end security. If there is a highly secure mainframe-based backend environment, but applications access these services through secure APIs from less secure frontend environments, a security risk remains. Unfortunately, other platforms don't come with the same level of built-in security and encryption power as the new IBM Z mainframe.

Such a gap between what is available (or will be available soon) on the mainframe and what we find on other platforms is not new. Virtualization was available on the mainframe long before the age of VMware and other PC virtualization software started. Systems for dynamically authorizing requests at runtime, such as RACF, are the norm in mainframe environments, while the same approach and standards such as XACML are still struggling in the non-mainframe world.

With this new announcement, IBM again shows that many interesting capabilities are introduced on mainframes first, while also demonstrating a potential path into the future of mainframes: as the system that manages the highest-security environments and that maybe, in the future, acts as the secure backend environment, accessible via the cloud. I'd love to see the latter.

A Great Day for Information Security: Adobe Announces End-of-Life for Flash

Today, Adobe announced that Flash will go end-of-life. Without any doubt, this is great news from an Information Security perspective. Adobe Flash accounted for a significant portion of the most severe exploits as, among others, F-Secure has analyzed. I also wrote about this topic back in 2012 in this blog.

From my perspective, and as stated in my post from 2012, the biggest challenge hasn't been the number of vulnerabilities as such, but the combination of vulnerabilities with the inability to fix them quickly and the lack of a well-working patch management approach.

With the shift to standards such as HTML5, today’s announcement finally moves Adobe Flash into the state of a “living zombie” – and with vendors such as Apple and Microsoft either not supporting it or limiting its use, we are ready to switch to better alternatives. Notably, the effective end-of-life date is the end of 2020, and it will still be in use after that. But there will be an end.

Clearly, there are and will be other vulnerabilities in other operating systems, browsers, applications, and so on. They will not go away. But one of the worst tools ever from a security perspective is finally reaching its demise. That is good, and it makes today a great day for Information Security.

PSD2 – the EBA’s Wise Decision to Reject Commission Amendments on Screen Scraping

In a response to the European Commission, the EBA (European Banking Authority) rejected amendments on screen scraping under PSD2 (the Revised Payment Services Directive) that had been pushed by several FinTechs. While it is still the Commission's place to make the final decision, the statement of the EBA is clear. I fully support the position of the EBA: screen scraping should be banned in the future.

In a "manifesto", 72 FinTechs had responded to the PSD2 RTS (Regulatory Technical Standards), focusing on the ban of screen scraping or, as they named it, "direct access". In other comments from that FinTech lobby, we can find statements such as "… sharing login details … is perfectly secure". Nothing could be more wrong. Sharing login details with anyone is never perfectly secure.

Screen scraping involves sharing credentials and providing full access to financial services such as online banking to the FinTechs using these technologies. This concept is not new. It is widely used in such FinTech services because there has been a gap in APIs until now. PSD2 will change that, even if we might not end up with a standardized API as quickly as we should.

But what is the reasoning of the FinTechs in insisting on screen scraping? The main arguments are that screen scraping is well-established and works well – and that it is secure. The latter is obviously wrong: neither sharing credentials nor injecting credentials into websites can earnestly be considered a secure approach. The other argument, that screen scraping works well, is also fundamentally flawed. Screen scraping relies on the target website or application always having the same structure. Once that structure changes, the applications (in this case the FinTech services) accessing these services and websites must be changed as well. Such changes on the target systems might happen without prior notice.

I see two other arguments that the FinTech lobby does not raise. One is about liability. If a customer gives his credentials to someone else, this is a fundamentally different situation regarding liability than with structured access via APIs. Just read the terms and conditions of your bank regarding online banking.

The other argument is about limitations. PSD2 requires providing APIs for AISPs (Account Information Service Providers) and PISPs (Payment Initiation Service Providers) – but only for these services. Thus, APIs might be more restrictive than screen scraping.

However, the EBA has very good arguments in favor of getting rid of screen scraping. One of the main goals of PSD2 is better protection of customers using online services. That is best achieved by a well-thought-out combination of SCA (Strong Customer Authentication) and defined, limited interfaces for TPPs (Third Party Providers) such as the FinTechs.

Clearly, this means a change both for the technical implementations of FinTech services that rely on screen scraping and, potentially, for the business models and the capabilities provided by these services. When looking at technical implementations, even though there is not yet an established standard API supported by all players, working with APIs is straightforward and far simpler than screen scraping ever can be. If there is no standard API, work with a layered approach that maps your own, FinTech-internal API layer to the various variants offered by the banks out there. There will not be that many variants, because the AISP and PISP services are defined.
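Such a layered approach can be sketched as a thin internal API that delegates to bank-specific adapters. The following Python sketch is illustrative only; the class and method names (InternalAccountAPI, get_balance, and so on) are hypothetical and not part of any PSD2 standard or actual bank API.

```python
# Illustrative sketch of a FinTech-internal API layer mapped onto
# bank-specific adapters. Names and interfaces are hypothetical.
from abc import ABC, abstractmethod
from typing import Dict


class AccountInformationAdapter(ABC):
    """One adapter per bank API variant; the internal interface stays stable."""

    @abstractmethod
    def get_balance(self, iban: str) -> float: ...


class BankAAdapter(AccountInformationAdapter):
    def get_balance(self, iban: str) -> float:
        # here: call Bank A's AISP endpoint (hypothetical placeholder)
        return 0.0


class BankBAdapter(AccountInformationAdapter):
    def get_balance(self, iban: str) -> float:
        # here: call Bank B's slightly different AISP endpoint (hypothetical placeholder)
        return 0.0


class InternalAccountAPI:
    """The FinTech's own layer: the rest of the application never sees bank specifics."""

    def __init__(self, adapters: Dict[str, AccountInformationAdapter]):
        self._adapters = adapters

    def balance(self, bank_id: str, iban: str) -> float:
        return self._adapters[bank_id].get_balance(iban)


api = InternalAccountAPI({"bank_a": BankAAdapter(), "bank_b": BankBAdapter()})
print(api.balance("bank_a", "DE00 0000 0000 0000 0000 00"))
```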

Authentication and authorization can be done far better and more conveniently for the customer if they are implemented from a customer perspective – I wrote about this just recently.

Yes, that means changes and even restrictions for the FinTechs. But there are good reasons for doing so. The EBA is right in its position on screen scraping and, hopefully, the European Commission will finally share the EBA's view.

PSD2: Strong Customer Authentication Done Right

The Revised Payment Services Directive (PSD2), an upcoming EU directive, will have a massive impact on the Finance Industry. While the changes to the business are primarily driven by the newly introduced TPPs (Third Party Providers), which can initiate payments and request access to account information, the rules for strong customer authentication (SCA) are also tightened. The goal is better protection for customers of financial online services.

Aside from a couple of exemptions, such as small transactions below 30 € and the use of unsupervised payment machines, e.g. in parking lots, the basic rule is that 2FA (Two-Factor Authentication) becomes mandatory. Under certain circumstances, 1FA combined with RBA (Risk-Based Authentication) will continue to be allowed. I have explained the various terms in an earlier post.
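As a reading aid, the following Python sketch encodes just the rule as stated above – the below-30 € exemption, unsupervised payment machines, and the 1FA-plus-RBA option. The actual PSD2 RTS defines further exemptions and thresholds, so treat this purely as an illustration; the parameter names are my own.

```python
# Purely illustrative decision sketch of the SCA rule described above.
# Only the cases mentioned in the text are modelled; the real PSD2 RTS
# contains additional exemptions and cumulative thresholds.
def required_authentication(amount_eur: float, unattended_terminal: bool,
                            rba_available: bool, risk_is_low: bool) -> str:
    if amount_eur < 30:
        return "exempt (small transaction below 30 EUR)"
    if unattended_terminal:
        return "exempt (unsupervised payment machine, e.g. parking lot)"
    if rba_available and risk_is_low:
        return "1FA combined with risk-based authentication"   # allowed under certain circumstances
    return "2FA (strong customer authentication)"               # the basic rule


print(required_authentication(250.0, False, rba_available=False, risk_is_low=False))
print(required_authentication(12.5, False, rba_available=False, risk_is_low=False))
```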

For the scenarios where 2FA is required, the obvious question is how best to do that. Looking at how banks and other services have implemented 2FA (and 1FA) until now, there is plenty of room for improvement. While many services, such as PayPal, still only mandate 1FA, there generally is little choice in which 2FA approach to use. Most banks mandate one specific form of 2FA, e.g. relying on out-of-band SMS or a certain type of token.

However, PSD2 will change the game for financial institutions. It will open up the fight for the customer: who will provide the interface to the customer, who will directly interact with the customer? To win that fight between traditional and new players, customer convenience is a key success factor. And customer convenience starts with registration as a one-time action and continues with authentication, which is what customers must do every time they access the service.

Until now, strong (and not so strong) authentication to financial services seems to have been driven by an inside-out way of thinking. The institutions think about what works best for them: what fits into their infrastructure; what is the cheapest yet compliant approach? For customers, this means that they must use whatever their service provider offers them.

But the world is changing. Many users have their devices of choice, many of them with some form of built-in strong authentication. They have their preferred ways of interacting with services. They also want to use a convenient method for authentication. And in the upcoming world, with TPPs that can form the new interface to the customer, there will be competition.

Thus, it is about time to think about SCA outside-in, from the customer perspective. The obvious solution is to move to Adaptive Authentication, which allows the use of all (PSD2-compliant) forms of 2FA and leaves it to the customer to choose which one they prefer. There must be flexibility for the customer. The technology is available, with platforms that support a large variety of authenticators and their combinations for 2FA, and with standards such as the FIDO Alliance standards that provide interoperability with the ever-growing and ever-changing range of consumer devices in use.
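A minimal sketch of what such customer-driven Adaptive Authentication could look like is shown below; the factor catalogue, risk threshold, and policy are illustrative assumptions, not a specific product's behavior or the FIDO protocols themselves.

```python
# Illustrative sketch of Adaptive Authentication with customer choice.
# The factor catalogue and policy are hypothetical; real platforms and the
# FIDO standards handle enrolment, attestation, and risk scoring in far more depth.
from typing import List

SUPPORTED_FACTORS = {"fido2_authenticator", "otp_app", "sms_otp", "hardware_token"}


def authenticate(preferred_factors: List[str], risk_score: float) -> str:
    """Let the customer's preferred, PSD2-compliant factor drive the flow."""
    chosen = next((f for f in preferred_factors if f in SUPPORTED_FACTORS), None)
    if chosen is None:
        return "deny: no mutually supported second factor"
    if risk_score > 0.8:
        # step up regardless of preference for high-risk transactions (made-up threshold)
        return f"challenge with {chosen} plus out-of-band confirmation"
    return f"challenge with {chosen}"


print(authenticate(["fido2_authenticator", "sms_otp"], risk_score=0.3))
```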

There is room for being both compliant with the SCA requirements of PSD2 and convenient for the customer. But that requires a move to outside-in thinking, starting with what the customers want – and many customers do not want one single imposed option, they want a real choice. Adaptive Authentication thus is a key success factor for doing SCA right in the days of PSD2.
