BOMGAR, owned by PE firm Francisco Partners, has announced its acquisition of Avecto, a UK-based Endpoint Privilege Management (EPM) company. The move, coming within six months of BOMGAR’s acquisition of Lieberman Software, clearly reflects its quest to strengthen its position in the PAM market by offering a full-featured PAM suite.
Originally a provider of remote support solutions, BOMGAR had offered remote session management capabilities for some time before acquiring Argentina-based Pitbull Software in late 2015, entering the PAM market with its password management technology. Since then, BOMGAR has been on an acquisition spree to expand its portfolio of PAM technologies and compete more effectively against the market leaders.
Avecto has been a market leader in the niche market of Endpoint Privilege Management (EPM). Its flagship product, Avecto Defendpoint, manages the threats associated with local administrative rights on Windows and macOS endpoints by offering controlled and monitored escalation of admin privileges. Defendpoint also offers effective application whitelisting and sandboxing capabilities for enhanced endpoint protection, which has positioned it uniquely in the market with almost twice the number of managed endpoints as its closest competitor. For a couple of years before acquiring Viewfinity in late 2015, CyberArk licensed Defendpoint through an OEM agreement with Avecto in order to sell a more complete PAM solution to its customers and to compete against the then-leading EPM product, BeyondTrust PowerBroker for Windows.
Endpoint Privilege Management (EPM) has become one of the fastest-growing sub-segments of the PAM market, closing in on approximately 28% YoY growth. With EPM capabilities, PAM solutions are poised to offer an effective second line of defense for endpoint threat protection in the coming years. The increased demand for better EPM capabilities embedded in PAM solutions has led many market-leading vendors to acquire or develop their own EPM capabilities in the recent past. CyberArk, for example, acquired Viewfinity, and Thycotic acquired Arellia, to bring EPM capabilities into their PAM portfolios.
At KuppingerCole, we define EPM solutions to primarily offer three distinct technologies:
- Application Control: This allows organizations to control which applications are allowed to run on an endpoint. It is usually achieved through application whitelisting, in which only known good applications are placed on a pre-approved list and permitted to run. Application control provides effective protection against shadow IT challenges for most organizations.
- Sandboxing: This technology isolates the execution of unknown applications or programs by restricting the resources they can access (e.g., files, registries). Also known as application isolation, it provides effective protection against cyberattacks by confining the execution of malicious programs and limiting their means to cause harm.
- Privilege Management: This technology encompasses both user and application privilege management. User privilege management deals with controlled and monitored elevation to local admin privileges, while application privilege management deals with exception- or policy-based elevation of administrative rights so that known and approved applications can execute successfully.
Avecto Defendpoint offers a good mix of these EPM technologies, providing effective endpoint protection against a range of cyber threats. The acquired EPM capabilities are a natural fit for BOMGAR, offering great cross-sell opportunities in the short to mid-term. While integration under a common PAM platform should begin soon, no immediate changes are expected to either product line. In the short term, Avecto will continue to operate under the rebranded entity “Avecto, a BOMGAR company” until it is fully integrated into the BOMGAR organization through the remainder of 2018.
BOMGAR’s approach of obtaining additional PAM capabilities through acquisitions is expected to bring rapid growth and deliver quick synergies, but it also carries the risks of integration failures and the long-term effects of dampened organic growth. No doubt, the acquisitions of Lieberman Software and now Avecto place BOMGAR on the list of the top five PAM vendors by revenue, but not necessarily on the list of market leaders for technology innovation. As the PAM market continues to evolve, consolidation is inevitable; however, vendors focusing on completeness of features rather than innovation in order to compete can stifle healthy market growth by failing to deliver on the opportunities that innovation creates.
While a clear integration roadmap for Lieberman Software is still awaited, the acquisition of Avecto adds to the workload of the product and engineering teams tasked with developing an integrated PAM platform that realizes the potential of these acquisitions. With a good track record of delivering growth and profitability as well as driving operational excellence, we expect BOMGAR to steer clear of such challenges in the short to mid-term and to deliver on the actual synergies these acquisitions create.
Cybersecurity needs to be at the heart of the digital transformation, but organisational models will have to evolve
Cybersecurity is in the process of becoming an essential component of any organisation’s digital transformation journey. There is no way around this, especially as policymakers start dipping their toes into privacy and security issues, and societal norms are shifting on the topic.
Most new technology layers enabling the digital transformation need to be protected from interference, intrusion, or corruption. This is especially the case across industry sectors seeking to take advantage of the enormous opportunities on offer: with driverless vehicles, the logistics sector – amongst others – could be unrecognizable in ten years’ time.
New technologies will also generate and feed on massive amounts of data - most of it sensitive or private - that will need to be collected, processed, and safeguarded in a way that is both sensible and ethical. The concepts of security by design and of privacy by design will inevitably become any organisation’s best allies in its innovative endeavours and must be taken seriously by all digital transformation players, especially as the regulatory and social contexts become harder to navigate.
There is no doubt – in our opinion – that organisations which put information security and privacy at the heart of their digital transformation from the start could obtain a real competitive advantage in the mid-to-long run.
As a matter of fact, the recent entry into force of the General Data Protection Regulation (GDPR) in the EU is dramatically changing the incentives landscape for all businesses active in Europe. In addition to fines of up to 4% of global turnover, firms are now required to report any relevant data breach to the regulator within 72 hours. This requires detection, analysis, and reaction capabilities that go far beyond the scope of the security teams and will force many corporate stakeholders to work together on these matters (security, IT, legal, DPO teams, senior management, etc.). As such, the GDPR could be a painful lesson as to why cybersecurity is necessarily a transversal matter for organisations of all sizes.
Finally, and perhaps most importantly, respect for privacy and the protection of personal data are likely to become a true competitive advantage as our societies become increasingly wary of these issues.
This shift is well illustrated by the first complaints filed under the GDPR framework. Privacy activists such as Max Schrems or the French Quadrature du Net, for example, have already started to drag high-profile tech companies (Facebook, Google, Instagram, etc.) into what could become lengthy legal proceedings. Depending on how the regulators react, this could have deep implications for how data-driven businesses operate in Europe.
Increasingly, security and privacy become intertwined, but it makes little sense from a corporate governance perspective to allow a new privacy organisation under a DPO to grow in parallel to – or in conflict with – existing security structures. Synergies are obvious and need to be leveraged, and where security practices are deemed dysfunctional or in need of improvement, this could provide an ideal opportunity.
In fact, it could be the start of a major evolution around corporate perceptions of security and privacy, from burden, annoyance and costs, towards becoming central management functions. But organisational models will have to evolve as a result to accommodate the truly transversal nature of security and privacy matters and carve out a niche for those new corporate functions.
At this juncture, the traditional role of the CISO – heavily influenced by a technical bias, tactically oriented and project-driven in many firms – could become exposed.
Not in its functional existence – IT security is more essential than ever – but in its corporate prominence. Having failed to project their roles beyond the tactical and technical fields for the best part of the last decade, many CISOs could find themselves pushed down the organisation while CSO and DPO roles take centre stage at the top.
With those new roles should come new people and a new focus, and probably a different way to approach security matters and talk about them.
We could be at the start of an exciting decade for all security professionals.
Learn more about this topic in my session at the Cybersecurity Leadership Summit 2018 Europe, November 12-14, 2018 in Berlin.
*** Please note this is a guest blog post and does not necessarily represent the opinion of KuppingerCole ***
On June 15th, 2018 I attended an OIX Workshop in London on this subject. The workshop was led by Don Thibeau of the Open Identity Exchange and Distributed Ledger foundation and was held in the Chartered Accountants’ Hall, Moorgate Place, London.
Blockchain and Distributed Ledger Technology (DLT) is often associated with cryptocurrencies like Bitcoin. However, it has a much wider applicability and holds the potential to solve a wide range of challenges. While a technology is evolving, governance is often neglected until incidents force greater participation by the involved parties and regulators to define operating guidelines. Governance is a wide subject, covering markets, laws and regulations, corporate activities, and individual projects; the workshop covered many of these areas.
One question that often arises while evaluating or adopting a new technology is whether the existing legal framework is sufficient to protect your interests. According to technology lawyer Hans Graux (time.lex), existing EU legislation on electronic signatures works well for blockchain. However, where blockchain is sold as a technology, there is no guarantee of governance to back it up. EU law allows the prohibition of electronic contracts for certain forms of transaction (e.g. real estate), so there are regional variations in the applicability of blockchain within the EU. Some countries have created laws but, in his opinion, these are intended to show that those countries are open for business rather than because they are needed. He recommended that organizations take a risk-based approach, similar to that for GDPR, to gauge their readiness for blockchain and to document the risks arising from early adoption as well as the controls required to manage those risks.
There was a panel on Smart Contracts and the legal framework surrounding them. A key takeaway was that Smart Contracts are not deemed legal contracts, which raises the question of how they can be made legally enforceable. Tony Lai (CodeX & Legal.io) outlined the Accord project from the CodeX Stanford Blockchain Group. The initial focus of this group is on:
1. Regulatory frameworks and ethical standards around token generation events (also known as ICOs or Initial Coin Offerings);
2. Legal issues and opportunities presented by blockchain technologies and their intersection with existing legal frameworks;
3. Smart contracts and governance design for token ecosystems; and
4. Legal empowerment and legal services use cases for blockchain technologies.
The panel then discussed the ‘Pillars of Trust’ – Governance, Identity, Security and Privacy in DLT. During this panel Geoff Goodell (UCL) provided an interesting set of perspectives, including the need for people to have multiple identities. He described how electronic funds transfer systems provide the best surveillance network in the world, and stated that only now are the risks associated with linking people’s activities becoming clear. To ensure privacy, only the minimum information needed should be required to be disclosed. Systems need to be accountable to their users. DLTs are not immutable – the people in control can decide to make changes (for example, a code fork) in a way that is unaccountable. Peter Howes then discussed the evidentiary value of IoT data – he expressed the view that blockchain will not obviate disputes but will reduce the number of areas for dispute.
During the afternoon some Real-World Use-Cases for blockchain and DLT were discussed:
Laura Bailey (Qadre & British Blockchain Association) described how Qadre has developed its own blockchain system, “PL^G”, and how this is being prototyped for pharmaceutical anti-counterfeiting in support of the EU Falsified Medicines Directive.
Jason Blick (EQI Trade) described how they aim to launch the world’s first offshore bank that bridges fiat and cryptocurrencies using blockchain technologies. He announced that they will shortly launch a blockchain-based KYC system, EQI Check.
Brian Spector (Qredo) described a Distributed Ledger Payments Platform for the telecoms industry. It could not use proof of work because of the compute overhead; instead, the network will use a “proof of speed” consensus algorithm.
KuppingerCole is actively researching blockchain and DLT, including its applications to identity, privacy and security. Recently at EIC (the European Identity & Cloud Conference) in Munich, there were several workshops and sessions devoted to practical implementations of blockchain. In the opening keynote at EIC, Martin Kuppinger described the areas where blockchain technology has the potential to help solve real-world identity challenges. There are already so many KYC (Know Your Customer) use cases based on blockchain with valid business models that this is now a reality, or at least close to becoming one. Blockchain also has the potential to simplify authentication by having various authenticators and IDs associated with a wallet. Its application to authorization, privacy and smart contracts also has obvious potential.
However, a practical realization of these potentials requires trustworthiness which takes us back to the question of governance. Good governance remains vital to avoid traditional challenges of DLT and to ensure that these inherent problems are not exacerbated in blockchain implementations due to a lack of governing principles.
Guest Author: Jordan L. Fischer, Esq., Co-Founder & Managing Partner of XPAN Law Group, LLC
Technology is changing rapidly, resulting in an increasing amount of data collected every second. These technologies cross borders and allow businesses to operate on a global scale at a rate never before seen. However, the corresponding legal infrastructures operate with borders -- hard borders -- that make the exchange of data, both internally and externally, complicated and challenging.
In the last two years, new data protection regulations have gone into effect in a number of different regions: Japan, China, Australia, and most recently (and with the largest “bang”), the European Union. Each of these regulations imposes nuanced requirements on companies, often asserting data localization requirements, implementing the principle of transparency, and including consent initiatives when organizations collect and process data. Most importantly, companies need to be proactively aware of the implications of the technology they use and the data they collect, depending on the regions in which they operate.
This changing legal landscape is nowhere more apparent than in the European Union (EU), with the General Data Protection Regulation (GDPR). The GDPR imposes a number of proactive privacy measures on entities, both within and outside the EU, that are poised to drastically change the way businesses maintain and exchange data from within the EU. At its core, the GDPR asserts data privacy and security principles on companies. It does not discriminate by industry or by the size of the organization. It universally and equally requires data minimization, data localization, transparency, and accountability from all organizations. The GDPR empowers data subjects to take control of the data companies collect about them, and to require those companies to account for all processing of that data and for all third parties who have access to it.
The “GDPR model” is becoming the de facto standard. Canadian data protection laws are changing this fall, bringing them more in line with the GDPR. Even individual US states are moving towards providing data protections similar to the GDPR’s: California is in the midst of a debate over how much control to give data subjects regarding their data. What started as a potential ballot initiative for the fall elections has now become a bill in the California state legislature and appears to provide data protections similar to many of these international regulations.
These varying principles of data privacy and cybersecurity converge when organizations exchange, transfer and process sensitive information across borders and, as such, implicate a number of different regulations. Take for example the growing prevalence of cloud storage, with companies opting to store data and systems off premise, in a data center located in a specific location, or in multiple data centers. Either option directly correlates with a legal obligation and potential ramifications for regulatory compliance and contractual agreements.
When addressing cross-border data management, companies should take key steps to better understand any legal obligations or liabilities before an issue arises. The first step is knowledge: What data is collected? What is done with that data? Where is that data stored? These regulations increase the power of the data subject, which dovetails into a burden on companies to provide the necessary transparency, both prior to and after the collection of data. In order to provide accurate information to meet these obligations, a company needs to know, before collecting the data, what it intends to do with that data.
Second, a company needs to know who has access to that data. This means both internal access -- a company’s own employees -- and external access -- third parties or partners. Understanding who is involved in a “data transaction” is key to ensuring security along the entire chain and providing the necessary transparency to the data subject. The use of processors and sub-processors is common -- but companies need to ensure that each party involved understands its obligations and adequately protects and secures the data.
Third, a company needs to understand the data lifecycle: how long is the data needed? What happens when it is no longer needed? Data storage is expensive, especially if additional security measures such as encryption or redundancy are needed. Often, companies are not even aware of all of the “old” data they maintain -- data that is no longer useful but remains a liability in the event of a breach. Creating “house cleaning” policies (i.e. data retention and destruction policies) is key to decreasing costs and potential legal ramifications.
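In practice, a retention policy of this kind often reduces to a very simple rule: purge whatever has aged past the retention window. The sketch below is a hypothetical illustration only; the two-year window and the record layout are assumptions, not taken from any regulation or product.

```python
from datetime import datetime, timedelta

# Assumed two-year retention window; actual periods depend on the
# applicable legal and business requirements.
RETENTION = timedelta(days=730)

def purge_expired(records, now):
    """Return only the records still within the retention window.

    Each record is assumed to carry a 'created_at' timestamp; anything
    older than RETENTION is dropped (i.e., scheduled for destruction).
    """
    return [r for r in records if now - r["created_at"] < RETENTION]
```

The value of codifying the policy is that "house cleaning" becomes a scheduled, auditable job rather than an ad hoc manual effort.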
Ultimately, companies need to understand this convergence of domestic and international data obligations and its effect on creating efficient and secure data management practices that meet the needs of the business. Technology and data are like a spiderweb within an organization -- they impact a number of different business units and require a holistic approach. Taking key steps early in the data collection process can drastically minimize long-term costs and liabilities.
Learn more about this topic in my session at the Consumer Identity World September 19-21, 2018 in Seattle.
* * * * *
Nothing contained in this blog post should be construed as creating an attorney-client relationship or providing legal advice of any kind. If you have a legal issue regarding cybersecurity, domestic or international data privacy, or electronic discovery, you should consult a licensed attorney in your jurisdiction.
It’s May 25 today, and the world hasn’t ended. Looking back at the last several weeks before the GDPR deadline, I have an oddly familiar feeling. It seems that many companies have treated it as another “Year 2000 disaster” - a largely imaginary but highly publicized issue that has to be addressed by everyone before a set date, and then it’s quickly forgotten because nothing has really happened.
Unfortunately, applying the same logic to GDPR is the biggest mistake a company can make. First of all, obviously, you can only be sure that all your previous preparations actually worked after they are tested in courts, and we all hope this happens to us as late as possible. Furthermore, GDPR compliance is not a one-time event, it’s a continuous process that will have to become an integral part of your business for years (along with other regulations that will inevitably follow). Most importantly, however, all the bad guys out there are definitely not planning to comply and will double their efforts in developing new ways to attack your infrastructure and steal your sensitive data.
In other words, it’s business as usual for cybersecurity specialists. You still need to keep up with the ever-changing cyberthreat landscape, react to new types of attacks, learn about the latest technologies and stay as agile and flexible as possible. The only difference is that the cost of your mistake will now be much higher. On the other hand, the chance that your management will give you a bigger budget for security products is also somewhat bigger, and you have to use this opportunity wisely.
As we all know, the cybersecurity market is booming, since companies are spending billions on it, but the net effect of this increased spending seems to be quite negligible – the number of data breaches or ransomware attacks is still going up. Is it a sign that many companies still view cybersecurity as a kind of a magic ritual, a cargo cult of sorts? Or is it caused by a major skills gap, as the world simply doesn’t have enough experts to battle cybercriminals efficiently?
It’s probably both, and the key underlying factor here is the simple fact that in the age of Digital Transformation, cybersecurity can no longer be a problem of your IT department only. Every employee is now constantly exposed to security threats, and humans, not computers, are now the weakest link in any security architecture. Unless everyone is actively involved, there will be no security anymore. Luckily, we already see awareness of this fact growing steadily among developers, for example. The whole notion of DevSecOps revolves around integrating security practices into all stages of the software development and operations cycle.
However, that is by far not enough. As business people like your CFO, not administrators, are becoming the most privileged users in your company, you have to completely rethink substantial parts of your security architecture to address the fact that a single forged email can do more harm to your business than the most sophisticated zero-day exploit. Remember, the victim is doing all the work here, so no firewall or antivirus will stop this kind of attack!
To sum it all up, a future-proof cybersecurity strategy in the “post-GDPR era” must, of course, be built upon a solid foundation of data protection and privacy by design. But that alone is not enough – only by constantly raising awareness of the newest cyberthreats among all employees and by gradually increasing the degree of intelligent automation of your daily security operations do you have a chance of staying compliant with the strictest regulations at all times.
Humans and robots fighting cybercrime together – what a time to be alive! :)
Why does it seem to be getting harder to delete information online? GDPR will take effect in just a few days. GDPR empowers EU people to take control of their personal information. When in force, GDPR will mandate that companies and other organizations which control or process personal information must comply with delete requests. Users around the world are more cognizant of the data they create and leave online. Even outside the EU, people want to be able to delete data which they deem is no longer useful.
Enter the “archive” button. On some social media sites and other popular applications, the archive button appears to have replaced the old familiar “delete” button. Why? It is ostensibly to make it easier for users to retrieve information that they want out of sight. App makers reason that you don’t always want to delete something once you hit delete. Sometimes, they’re right. But most of the time, “delete” should mean delete. If one searches hard enough, one can usually find ways to actually delete data, even though the top-level UIs only show options to archive.
Another reason “archive” has replaced “delete” is that all information has some value, or at least that is the guiding principle in Big Data circles. Just because a user wants data removed doesn’t mean that it doesn’t have value for others. Social network operators make money off user data, so they believe it must be retained for historical analysis.
Turbulence in the markets and bad press for social media companies may be a leading indicator as to the importance of personal data control for an increasing number of users worldwide. In advance of GDPR, and for the benefit of all users, we urge app makers to bring back the delete button.
With mere days left till the dreaded General Data Protection Regulation comes into force, many companies, especially those not based in the EU, still haven’t quite figured out how to deal with it. As we mentioned countless times earlier, the upcoming GDPR will profoundly change the way companies collect, store and process personal data of any EU resident. What is understood as personal data and what is considered processing is very broad and is only considered legal if it meets a number of very strict criteria. Fines for non-compliance are massive – up to 20 million Euro or 4% of a company’s annual turnover, whichever is higher.
Needless to say, not many companies feel happy about massive investments they’d need to make into their IT infrastructures, as well as other costs (consulting, legal and even PR-related) of compliance. And while European businesses don’t really have any other options, quite a few companies based outside of the EU are considering pulling out of the European market completely. A number of them even made their decision public, although we could safely assume that most would rather keep the matters quiet.
However, before you even start looking for similar solutions, consider one point: the GDPR protects EU data subjects’ privacy regardless of their geographic location. A German citizen staying in the US and using a US-based service is, at least in theory, supposed to have the same control over their PII as back home. And even without traveling, an IP blacklist can easily be circumvented using readily available tools like a VPN. Trust me, Germans know how to use them – until recently, the majority of YouTube videos were not available in Germany because of a copyright dispute, so a VPN was needed to enjoy “Gangnam Style” or any other musical hit of the time.
On the other hand, thinking that the EU intends to track every tiniest privacy violation worldwide and then drag every offender to court is ridiculous; just consider the huge resources the European bureaucrats would need to put into a campaign of that scale. In reality, their first targets will undoubtedly be the likes of Facebook and Google – large companies whose business is built upon collecting and reselling their users’ personal data to third parties. So, unless your business is in the same market as Cambridge Analytica, you should probably reconsider the idea of blocking out European visitors – after all, you’d miss nearly 750 million potential customers from the world’s largest economy.
Finally, the biggest mistake many companies make is to think that GDPR’s sole purpose is to somehow make their lives more miserable and to punish them with unnecessary fines. However, like any other compliance regulation, GDPR is above all a comprehensive set of IT security, data protection and legal best practices. Complying with GDPR - even if you don’t plan to do business in the EU market - is thus a great exercise that can prepare your business for some of the most difficult challenges of the Digital Age. Maybe in the same sense as a volcano eruption is a great test of your running skills, but running exercises are still quite useful even if you do not live in Hawaii.
As the May 25th, 2018 GDPR enforcement date approaches, more and more companies are actively taking steps to find, evaluate, and protect the personally identifiable information (Personal Data) of EU persons. Organizations that do business with EU persons are conducting data protection impact assessments (DPIAs) to find Personal Data under their control. Many are also asking “do we need to keep the data?” and putting into practice data minimization principles. These are good measures to take.
IT and privacy professionals are inventorying HR, CRM, CIAM, and IAM systems, which is reasonable since these likely contain Personal Data. Administrators should also consider performing DPIAs on security solutions.
Security solutions such as SIEMs, EMMs, and Endpoint Security/EDR tools collect lots of data, including Personal Data, for analysis. Many of the following types of Personal Data (as defined by GDPR) are routinely harvested for ongoing security and risk analysis:
- Email address
- User attributes, including organizational affiliations, citizenship, group membership
- IP address
- User-created data files
Most security solutions offer options for on-premises or cloud-based analysis. For example, most anti-malware products “scoop up” files for deep inspection at the vendor’s cloud, which may be outside the EU. Some vendor solutions are configurable in terms of which attributes can be collected and/or sent elsewhere for analysis; some are not.
Any processing of Personal Data is controlled under the GDPR. The definition of processing is so wide that it likely includes these forms of scanning and analysis.
In light of GDPR, one question administrators should ask is: “Is this information collected with user consent?” In some cases, user consent will be required. However, according to GDPR Article 6, personal information collection may also proceed for the following purposes:
- for the performance of a contract or legal obligation;
- to protect the vital interests of the data subject;
- for a task in the public interest;
- or where processing is necessary for the legitimate interests of the controller.
Moreover, there will be situations in which Personal Data may be processed by more than one Data Processor. In these joint-processor scenarios, all entities involved in processing share responsibility for ensuring that the use of Personal Data is authorized under one of the GDPR-specified purposes above.
Security administrators should work with their DPOs and legal team to address the following additional points:
- Determine which of your deployed security solutions collect which kinds of data; in effect, do DPIAs on security solutions.
- Ascertain where this data goes: local storage? Telemetry transmitted to the cloud? If so, does it stay in the EU? Could it go outside the EU? GDPR defines the notion of data protection adequacy with regard to countries and organizations outside the EU. The Official Journal of the EU will publish and maintain a list of locations for which no additional data transfer agreements will be required.
- If the security scanning or analysis is performed by a third party or cloud provider, then wherever this is done, there must be a written legal agreement in place as set out in GDPR Article 28(3).
- Do your security solutions permit Personal Data anonymization? GDPR Recital 26 states that data which is sufficiently masked to prevent the identification of the user will not be subject to the data protection mandates. However, SIEMs and forensic tools sometimes need to be able to pinpoint users. Specifically, IP addresses and user credentials are almost always necessary and serve as “primary keys” on which security analyses are based. Within your security solutions, is it possible to mask user data at a high level for external analysis, but leave details encrypted locally, so that they can be unmasked by authorized security analysts during investigations? This is a difficult technical challenge, which is not supported yet by many security vendors. Regardless, even local processing of data elements such as IP address falls under the jurisdiction of GDPR.
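One way to approach the masking problem described above is a keyed hash: events shared for external analysis carry a stable token instead of the raw IP address or credential, while the key stays on-premises with authorized analysts. Strictly speaking this is pseudonymization rather than anonymization in the sense of Recital 26, so the data remains Personal Data; it does, however, reduce exposure. A minimal sketch (the key and token length are illustrative assumptions):

```python
import hashlib
import hmac

# The key must remain on-premises, accessible only to authorized analysts.
SECRET_KEY = b"keep-this-key-on-premises"

def pseudonymize(value: str) -> str:
    """Keyed hash (HMAC-SHA256): yields a stable token for correlation,
    infeasible to reverse without the key."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

# The same IP always maps to the same token, so external analysis can still
# correlate events across logs, while the raw address never leaves local storage.
token_a = pseudonymize("192.0.2.10")
token_b = pseudonymize("192.0.2.10")
print(token_a == token_b)
```

Because the mapping is deterministic, a SIEM can still join events on the token; an investigation that needs the real value goes back to the locally held key and source data.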
In summary, don’t forget your security solutions when running DPIAs. Check with vendors about what information they collect and how it is treated. Work closely with your DPOs and legal counsel to plan the best course of action if you find that remediation or some re-design is needed.
Ten years ago, for the second EIC, we published a report and survey on the intersection of IAM and SOA (in German). The main finding back then was that most businesses did not secure their SOA approaches adequately, if at all.
Ten years later, we are talking microservices. Everything is DevOps, and a small but growing part of it is DevSecOps. And again the question is whether we have appropriate approaches in place to protect a distributed architecture. This question is even more important in an age where deployment models are agile and hybrid.
So how to do IAM for this microservices world? Basically, there are two challenges: supporting the environments and supporting the services and applications.
The former is about securing containers. That includes privileged access to the environments the containers run in as well as to the containers themselves, but also fine-grained access management and governance of such environments. It also includes the interesting challenge of segregating access to development, test, and production in the DevOps world, which is an even more demanding task than in traditional IT.
The second challenge is about how to secure communication between microservices. One of the technologies that inevitably comes into play here is API Management & Security. Beyond that, we will have to rethink authorization for services, but also how to manage and govern identities and their access at both the level of individual microservices and the orchestrated services and applications provided to the business.
Reasonably defined microservices, fully encapsulated and providing their functionality to connected services and applications exclusively via secure, authenticated and auditable APIs, are an important step towards secure architectures “by design”.
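What "secure, authenticated and auditable APIs" can mean in the simplest case is sketched below: a hypothetical service-to-service request signature based on a shared secret, with a freshness check against replay. Real deployments would more likely use an API gateway with OAuth2/JWT tokens or mutual TLS; the service names, secret handling, and the 300-second window here are assumptions for illustration only:

```python
import hashlib
import hmac
import time

# In practice this would come from a secrets vault, per service pair.
SHARED_SECRET = b"rotate-me-regularly"

def sign_request(service_id: str, timestamp: int) -> str:
    """Caller signs its identity and a timestamp with the shared secret."""
    msg = f"{service_id}:{timestamp}".encode()
    return hmac.new(SHARED_SECRET, msg, hashlib.sha256).hexdigest()

def verify_request(service_id: str, timestamp: int, signature: str,
                   max_age: int = 300) -> bool:
    """Callee rejects stale requests (replay) and invalid signatures."""
    if time.time() - timestamp > max_age:
        return False
    expected = sign_request(service_id, timestamp)
    return hmac.compare_digest(expected, signature)

now = int(time.time())
sig = sign_request("billing-service", now)
print(verify_request("billing-service", now, sig))
```

Verified calls can then be logged with the caller's identity, which is exactly what makes the API auditable, not just authenticated.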
Notably, we must also start thinking about deploying security components as services, externalizing and standardizing them. I discussed this topic a while ago in a webinar – you might want to watch the webcast. As IT moves to a more agile approach, where changes are quickly deployed to production environments, identity and security must become adequately agile as well. Automation becomes key to success. We see some interesting trends and offerings arriving; however, most of them currently focus on privileged users – a good start, but by far not the end of our journey towards secure microservices architectures.
It’s about time to make our IAM services ready to support the new way IT is done: agile and modular. Otherwise we will end up in a security nightmare.
For as long as I have been observing the IAM business, it has been in constant change. However, there is a change underway that is bigger than many of the innovations we have seen over the past decade: IAM adopting the architectural concept of microservices.
This will have a massive impact on the way we can do IAM, and it will impact the type of offerings in the market. In a nutshell: microservices can make IAM far more agile and flexible. But let’s start with the Wikipedia definition of Microservices:
Microservices is a software development technique—a variant of the service-oriented architecture (SOA) architectural style that structures an application as a collection of loosely coupled services. In a microservices architecture, services are fine-grained and the protocols are lightweight. [Source: Wikipedia]
Basically, it is about moving to loosely coupled services and lightweight protocols for integration. In practice, this also includes the lightweight APIs these services expose. Each microservice delivers specific, defined capabilities, which are then orchestrated.
When we look at the evolution of the IAM market, most applications have been architected more or less monolithically in the past. Most of them have some “macroservice” model built in, with a couple of rather large functional components such as a workflow engine or an integration layer. Some vendors have already come a bit further in their journey towards microservices, but looking at today’s on-premises IAM solutions, for most vendors that journey has only just started, if at all.
Looking at IDaaS (Identity as a Service), the situation is different. Many (but not all) of the IDaaS solutions on the market have been architected from scratch following the microservices approach. However, in most cases, they do so internally, while still being exposed as a monolithic service to the customer.
The emerging trend – and, even more importantly, the growing demand of customers – is for IAM to be delivered as a set of microservices and via containers (which might contain multiple microservices, or even a few legacy components not yet updated to the new model). Such an approach allows for more flexible deployments, customization, and integration, and delivers the agility businesses are asking for today.
From a deployment point of view, such an architecture gives businesses the option to decide where to run which of the services and, for example, to support a hybrid deployment or a gradual shift from on-premises to private cloud and on to public cloud.
From a customization and integration perspective, orchestrating services via APIs with IAM services and other services such as IT Service Management is more straightforward than coding, and more flexible than just relying on customization. Lightweight APIs and standard protocols help.
Finally, a microservice-style IAM solution (and the containers its microservices reside in) can be deployed in a far more agile manner by adding services and orchestrations, instead of the “big bang” style rollout of rather complex toolsets we know today.
But as always, this comes at a price. Securing the microservices and their communication, particularly in hybrid environments, is an interesting challenge. Rolling out IAM in an agile approach, integrated with other services, requires strong skills in both IT and security architecture, as well as a new set of tools and automation capabilities. Mixing services of different vendors requires well-thought-out architectural approaches. But it is feasible.
Moving to a microservices approach for IAM holds huge potential for both customers and vendors. For customers, it delivers flexibility and agility. They can also integrate services provided by different vendors in a much better way, and they can align their IAM infrastructure with an IT service model much more efficiently.
For vendors, it allows supporting hybrid infrastructures with a single offering, instead of developing or maintaining both an on-premises product and an IDaaS offering. But it also raises many questions, starting with the one on the future licensing or subscription models for microservices – particularly if customers only want some, not all services.
There is little doubt that the shift to microservices architectures in IAM will significantly affect the style of IAM offerings provided by vendors, as it will affect the way IAM projects are done.