Blog posts by Martin Kuppinger

Complexity Kills Agility: Why the German Reference Architecture Model for Industry 4.0 Will Fail

The German ZVEI (Zentralverband Elektrotechnik- und Elektroindustrie), the association of the electrical and electronic industries, and the VDI (Verein Deutscher Ingenieure), the association of German engineers, have published a concept called RAMI (Referenzarchitekturmodell Industrie 4.0). This reference architecture model is about 25 pages long, which is OK. The first target listed for RAMI 4.0 is “providing a clear and simple architecture model as reference”.

However, when analyzing the model, there is little clarity and simplicity in it. The model is full of links to other norms and standards. It is full of multi-layer, sometimes three-dimensional architecture models. On the other hand, the model doesn’t provide answers on the details, and offers only a few links to other documents.

For example, RAMI 4.0 says that the minimal infrastructure of Industry 4.0 must fulfill the principles of Security-by-Design. There is no doubt that Industry 4.0 should consistently implement the principles of Security-by-Design. Unfortunately, there is not even a link to a description of what Security-by-Design means in concrete terms.

Notably, security (and safety) are covered in a section of the document spanning not even 1% of the entire content. In other words: security is largely ignored in this reference architecture, in these days of ever-increasing cyber-attacks against connected things.

RAMI 4.0 has three fundamental faults:

  1. It is not really concrete. It lacks details in many areas and doesn’t even provide links to more detailed information.
  2. Despite being only 25 pages long and not very detailed, it is still overly complex, with its multi-layered models.
  3. It ignores the fundamental challenges of security and safety.

Hopefully, we will see better concepts soon – concepts that focus on addressing the challenges of agility and security instead of over-engineering the world of things and Industry 4.0.

There Is No Such Thing as an API Economy

Martin Kuppinger explains why there is no API economy.

Free MFA on Windows for the masses

Cion Systems, a US-based IAM (Identity and Access Management) vendor, recently released a free service that allows Windows users to implement two-factor authentication. It works only for Windows and supports either a PIN code sent via SMS (not available in all countries) or a standard USB storage key as the second factor. Thus, users can increase the security of their Windows devices in a simple and efficient manner, without paying for a service or purchasing specialized hardware.
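
To make the mechanism tangible, here is a minimal Python sketch of how the SMS-PIN variant of such a second factor typically works conceptually – generate a short-lived one-time PIN, deliver it out of band, and verify it at login. This is an illustration only, not Cion Systems’ implementation; the SMS gateway stub and the PIN lifetime are invented.

```python
# Illustrative sketch of an SMS-delivered one-time PIN as a second factor.
import secrets
import time
import hmac

PIN_LIFETIME_SECONDS = 300  # assumed lifetime; real services choose their own

def issue_pin(send_sms) -> tuple[str, float]:
    """Create a 6-digit one-time PIN and hand it to an out-of-band delivery callback."""
    pin = f"{secrets.randbelow(1_000_000):06d}"
    send_sms(pin)  # e.g. an SMS gateway; stubbed out here
    return pin, time.time()

def verify_pin(expected_pin: str, issued_at: float, entered_pin: str) -> bool:
    """The second factor passes only if the PIN matches and has not expired."""
    if time.time() - issued_at > PIN_LIFETIME_SECONDS:
        return False
    return hmac.compare_digest(expected_pin, entered_pin)

# Usage: after the password check succeeds, require the PIN as the second factor.
pin, issued_at = issue_pin(lambda p: print(f"[SMS gateway stub] would send PIN {p}"))
print(verify_pin(pin, issued_at, pin))        # True
print(verify_pin(pin, issued_at, "000000"))   # False (unless the random PIN happens to match)
```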

The free service is based on the same technology Cion Systems uses for its enterprise self-service and multi-factor authentication solutions and services, which, for example, also support password synchronization with Microsoft Office 365.

While such a service is not unique, it is a convenient way for Windows users to increase the security of systems that are still commonly protected by username and password alone.

There is no Consumer Identity & Access Management at all – at least not as a separate discipline

These days, there is a lot of talk about Consumer Identity & Access Management, or CIAM. However, there is no such thing as CIAM, at least not as a separate discipline within IAM. There are technologies that are of higher relevance when dealing with customers and consumers than when dealing with employees. But there are neither technologies that are required only for CIAM, nor is there any benefit in trying to set up a separate CIAM infrastructure.

This does not mean that IAM should not or must not focus on consumers – on the contrary. But it is about extending and, to some extent, renovating the existing on-premises IAM, which commonly is focused on employees and some business partners. It is about one integrated approach for all identities (employees, partners, consumers, …), managing their access to all services regardless of the deployment model, using all types of devices and things. It is about seamlessly managing all access of all identities in a consistent way. Notably, “consistent way” is not the same as “from a single platform”.

So why don’t we need a separate CIAM? The easiest answer is found by asking a simple question: “Is there any single application in your organization that is accessed only by consumers?” This implies “and not by at least some of your employees, e.g. for customer service, administration and operations, or analyzing the data collected.” The obvious answer to that question is that there is no such application. There are applications that are used only by employees, but not the other way round. So why should there be separate IAM deployments for applications that are used by a common group of users? That could only result in security issues and management trouble.

The other aspect is that the way applications are used within the enterprise is changing anyway. Mobile users access cloud applications without even touching the internal network anymore. Thus, technologies such as Adaptive Authentication, Cloud IAM or IDaaS (Identity Management as a Service), Identity Federation, etc. are relevant not only for consumer-facing solutions but for all areas of IAM.
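
As an illustration of why such technologies belong in one IAM rather than a separate CIAM stack, here is a minimal sketch of an adaptive, risk-based authentication policy that treats employees, partners and consumers uniformly. The risk factors, weights and thresholds are invented for the example.

```python
# Illustrative sketch: one adaptive authentication policy for all identity types.
from dataclasses import dataclass

@dataclass
class LoginContext:
    identity_type: str      # "employee", "partner", "consumer"
    device_known: bool
    network: str            # "corporate", "home", "public"
    via_social_login: bool

def required_authentication(ctx: LoginContext) -> str:
    """The risk of the context, not the user type, drives step-up authentication."""
    risk = 0
    risk += 0 if ctx.device_known else 2
    risk += {"corporate": 0, "home": 1, "public": 2}.get(ctx.network, 2)
    risk += 1 if ctx.via_social_login else 0
    if risk >= 3:
        return "password + second factor"
    if risk >= 1:
        return "password"
    return "single sign-on"

print(required_authentication(LoginContext("employee", True, "corporate", False)))  # single sign-on
print(required_authentication(LoginContext("consumer", False, "public", True)))     # password + second factor
```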

Finally, there is the aspect that users frequently have multiple digital identities, even in relation to their employer. Many employees of car manufacturers are also customers of their company. Many employees of insurance companies also hold insurance contracts with those companies, and some even act as freelance insurance brokers. Managing such complex relationships becomes far easier when there is one IAM for all – employees, partners, and consumers. One IAM that serves all applications, on-premises and in the Cloud. And one IAM that supports all types of access.

This might still result in projects that focus on managing consumer access to services, IAM for cloud services, and so on. But all these projects should be part of moving IAM to the next level: an IAM that serves all requirements, from the traditional access of an employee using a PC in the corporate LAN to reach a legacy backend system, to the mobile consumer coming in via a social login and accessing a cloud service.

“Disruptive Change”: The right time to think security anew

Is “Digital Transformation” something of the future? Definitely not. It has long since become reality. With connected things and connected production, the business models of enterprises are already changing profoundly. Communication with customers no longer happens over traditional websites alone; it increasingly encompasses apps and connected things as well. Rapidly changing business models and partnerships lead to new application architectures such as microservices and, above all, to a more intensive use of APIs (Application Programming Interfaces, the interfaces applications expose for external function calls) in order to combine functions of various internal and external services into new solutions.

This rapid change is often used as an argument that security can’t be improved, based on the belief that doing so would prevent business requirements from being met on time and in full. Due to alleged time pressure, no new, better, up-to-date and future-oriented security concepts are implemented in applications. However, exactly the opposite is the case: precisely this change is the chance to implement security faster than ever before. In any case, for communication from apps to backend and external systems, for user authentication, and of course for the complete handling of connected things, one can’t use the same concepts that were introduced for websites five, ten or fifteen years ago.

Furthermore, by now there is a whole range of established standards, from the more traditional SAML (Security Assertion Markup Language) to more modern, worldwide standards designed for a world in which REST-based access from apps to services and between services is the norm. OAuth 2.0 and OpenID Connect are good examples. Or, in other words: mature options for better security solutions are already a reality, in the form of standards as well as on a conceptual level.
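
As a simple illustration of what this looks like in practice, here is a minimal sketch of an app obtaining an OAuth 2.0 access token (client credentials grant) and calling a REST API with it as a bearer token. The endpoints, client credentials and scope are placeholders, not a real service.

```python
# Minimal sketch: calling a REST API protected with OAuth 2.0 bearer tokens.
import requests

TOKEN_URL = "https://idp.example.com/oauth2/token"        # hypothetical authorization server
API_URL = "https://api.example.com/v1/vehicle/telemetry"  # hypothetical protected resource

def get_access_token(client_id: str, client_secret: str, scope: str) -> str:
    """Client-credentials grant: the app authenticates itself and receives a short-lived token."""
    response = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials", "scope": scope},
        auth=(client_id, client_secret),
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["access_token"]

def call_protected_api(token: str) -> dict:
    """The access token is presented as a bearer token; the API validates the token, not a password."""
    response = requests.get(API_URL, headers={"Authorization": f"Bearer {token}"}, timeout=10)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    token = get_access_token("demo-client", "demo-secret", "telemetry.read")
    print(call_protected_api(token))
```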

Another good example is the new (and not yet really established) UMA (User-Managed Access) standard of the Kantara Initiative. With this standard, users can purposefully share “their” data with applications, going beyond the basic OAuth 2.0 functions. If you look, for example, at some of the data challenges associated with the “connected car”, it soon becomes clear how useful new concepts can be.

UMA and other new standards make it easy to control who gets access to which data, and when. Traditional concepts don’t allow this – as soon as diverse user groups need access to diverse data sources in diverse situations, one hits a wall or has to cobble together solutions with considerable effort. If you look, for example, at the crash data recorder, to which insurers, manufacturers and the police need access – but not always, and definitely not to all data – it becomes clear how laboriously some of the new challenges of the digital transformation have to be solved if they are not built on modern security concepts.
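
The following sketch illustrates the idea – not the UMA protocol itself, but the kind of owner-defined, condition-based sharing it enables, using the crash data recorder as an example. Parties, scopes and conditions are invented.

```python
# Illustrative sketch of owner-controlled, condition-based data sharing (in the spirit of UMA).
from dataclasses import dataclass

@dataclass
class AccessRequest:
    party: str          # e.g. "insurer", "manufacturer", "police"
    scope: str          # e.g. "crash_events", "diagnostics", "full_telemetry"
    accident_reported: bool

# Policies the data owner (the driver) has set up; a central authorization server enforces them.
OWNER_POLICIES = {
    ("insurer", "crash_events"): lambda req: req.accident_reported,
    ("manufacturer", "diagnostics"): lambda req: True,
    ("police", "crash_events"): lambda req: req.accident_reported,
}

def authorize(req: AccessRequest) -> bool:
    """Grant access only if an owner policy exists for this party/scope and its condition holds."""
    policy = OWNER_POLICIES.get((req.party, req.scope))
    return bool(policy and policy(req))

print(authorize(AccessRequest("insurer", "crash_events", accident_reported=True)))    # True
print(authorize(AccessRequest("insurer", "full_telemetry", accident_reported=True)))  # False
```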

“Disruption”, the fundamental change we are experiencing in the digital transformation in many places – in contrast to the slow, continual development that was the rule in many industries for years – is the chance to become faster, more agile and more secure. For this, we need to deploy new concepts that are oriented towards these new requirements. Often you are quicker with this approach even in the first project than by trying to adapt old concepts to new problems. We should use the chance to make security stronger, especially in the digital transformation. The alternative is to risk not being agile enough to withstand the competition, due to outdated software and old security architectures.

Thycotic acquires Arellia – moving beyond pure Privilege Management

On February 23rd, 2016, Thycotic, one of the leading vendors in the area of Privilege Management (also commonly referred to as Privileged Account Management or Privileged Identity Management), announced the acquisition of Arellia. Arellia delivers Endpoint Security functionality and, in particular, Application Control capabilities. Both Thycotic and Arellia have built their products on the Microsoft Windows platform, which should allow more rapid integration of the two offerings.

Thycotic, with its Secret Server product, has evolved over the past years from an entry-level solution into an enterprise-level product, with significant enhancements in functionality. With the addition of the Arellia products, Thycotic will be able not only to protect access to shared accounts, discover privileged identities, and manage sessions, but also to actually control what users do with their privileged accounts and to restrict account usage. Applications can be whitelisted or blacklisted, further enhancing control.
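
Conceptually, such Application Control comes down to policy decisions like the following minimal sketch – an allowlist/blocklist check tied to whether a session is privileged. The rules and executable names are invented and not taken from the Thycotic or Arellia products.

```python
# Illustrative sketch of application control combined with privilege context.
ALLOWLIST = {"sqlservr.exe", "mmc.exe", "powershell.exe"}   # invented example rules
BLOCKLIST = {"mimikatz.exe", "unknown_tool.exe"}

def may_launch(executable: str, elevated: bool) -> bool:
    """Block known-bad binaries outright; require an allowlist entry for elevated sessions."""
    name = executable.lower()
    if name in BLOCKLIST:
        return False
    if elevated:
        return name in ALLOWLIST
    return True  # non-elevated sessions fall back to standard endpoint policy

print(may_launch("powershell.exe", elevated=True))    # True: allowlisted for privileged use
print(may_launch("unknown_tool.exe", elevated=True))  # False: blocked
```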

With this acquisition, another vendor is combining Privilege Management and Application Control, after CyberArk’s acquisition of Viewfinity some months ago. While it might be too early to call this a trend, there is logic in extending Privilege Management beyond the account management or session management aspect. Not only protecting access to privileged accounts but also limiting and controlling the use of such accounts had already become part of Privilege Management – through Session Management capabilities and, more commonly in Unix and Linux environments, through restrictions on the use of shell commands. Thus, adding Application Control and other Endpoint Security features is just a logical step.

Our view of Privilege Management has always gone beyond pure Shared Account Password Management. The current evolution towards integration with Application Control and other features fits our broader view of protecting all accounts with elevated privileges at any time, both in access and in use.

Beyond Datacenter Micro-Segmentation – start thinking about Business Process Micro-Segmentation!

Sometime last autumn I started researching the field of Micro-Segmentation, particularly as a consequence of attending a Unisys analyst event and, subsequently, VMworld Europe. Unisys talked a lot about their Stealth product, while at VMworld there was much talk about the VMware NSX product and its capabilities, including security by Micro-Segmentation.

The basic idea of Datacenter Micro-Segmentation, the most common approach to Micro-Segmentation, is to split the network into small (micro) segments for particular workloads, based on virtual networks with additional capabilities such as integrated firewalls, access control enforcement, etc.

Using Micro-Segmentation, there might even be multiple segments for a particular workload, such as the web tier, the application tier, and the database tier. This allows security to be strengthened further by applying different access and firewall policies to the various segments. In virtualized environments, such segments can be created and managed easily, far better than in physical environments with a multitude of disparate elements from switches to firewalls.
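
Conceptually, the result is a default-deny flow policy between segments, as in this minimal sketch; the segment names, ports and rules are invented and not tied to any particular product such as NSX or Stealth.

```python
# Minimal sketch of a per-segment flow policy in a micro-segmented datacenter.
SEGMENT_POLICY = {
    # (source segment, destination segment): set of allowed destination ports
    ("web", "app"): {8443},
    ("app", "db"): {5432},
}

def flow_allowed(src_segment: str, dst_segment: str, port: int) -> bool:
    """Default deny: only explicitly allowed segment-to-segment flows pass."""
    return port in SEGMENT_POLICY.get((src_segment, dst_segment), set())

print(flow_allowed("web", "app", 8443))  # True
print(flow_allowed("web", "db", 5432))   # False: the web tier may not talk to the database directly
```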

Obviously, by having small, well-protected segments with well-defined interfaces to other segments, security can be increased significantly. However, it is not only about datacenters.

The applications and services running in the datacenter are accessed by users. This might happen through fat-client applications or web interfaces; furthermore, we see a massive uptake in the use of APIs, by client-side apps but also by backend applications consuming and processing data from other backend services. In addition, there is a variety of services where data is, for example, stored or processed locally, starting with documents downloaded from backend systems.

Obviously, not everything can be protected perfectly well. Data accessed through browsers is out of control once it is at the client – unless the client can become a part of the secure environment as well.

Still, there are more options, particularly within organizations that have good control of everything within the perimeter and at least some level of control over the devices. Ideally, everything becomes protected across the entire business process, from the backend systems to the clients. Within that segmentation, other segments can exist, such as micro-segments at the backend. Such “Business Process Micro-Segmentation” does not stand in contrast to Datacenter Micro-Segmentation, but extends that concept.

From my perspective, we will need two major extensions for moving beyond Datacenter Micro-Segmentation to Business Process Micro-Segmentation. One is encryption. While there is limited need for encryption within the datacenter (don’t consider your datacenter to be 100% safe!) due to the technical approach to network virtualization, the client resides outside the datacenter. The minimal approach is protecting the transport by means such as TLS. More advanced encryption is available in solutions such as Unisys Stealth.

The other area for extension is policy management. When looking at the entire business process – and not only the datacenter part – protecting the clients by integrating areas like endpoint security into the policy becomes mandatory.

Neither Business Process Micro-Segmentation nor Datacenter Micro-Segmentation will solve all of our Information Security challenges. Both are only building blocks within a comprehensive Information Security strategy. In my opinion, thinking beyond Datacenter Micro-Segmentation towards Business Process Micro-Segmentation is also a good example of the fact that there is not a “holy grail” for Information Security. Once organizations start sharing information with external parties beyond their perimeter, other technologies such as Information Rights Management – where documents are encrypted and distributed along with the access controls that are subsequently enforced by client-side applications – come into play.

While there is value in Datacenter Micro-Segmentation, it is clearly only a piece of a larger concept – in particular because the traditional perimeter no longer exists, which also makes it more difficult to define the segments within the datacenter. Once workloads are flexibly distributed between various datacenters in the Cloud and on-premises, pure Datacenter Micro-Segmentation reaches its limits anyway.

Secured by Design: The smart streets of San Francisco

On April 18th, 1906, an earthquake and the ensuing fires destroyed nearly three quarters of San Francisco. Around 3,000 people lost their lives. Right up to the present day, many other, less severe tremors have followed. The danger of another catastrophe can’t be ignored. In a city like San Francisco, however wonderful it might be to live there, people always have to be aware that their whole world can change in an instant. Now the Internet of Things (IoT) can help make alarm systems better. People in this awesome city can at least be sure that the mayor and his office staff do their best to keep them safe and secure in all respects.

Not only that: with the help of the Internet of Things (IoT), they’re also looking for new ways to make the lives of citizens more convenient. That became clear to me when I saw ForgeRock’s presentation about their IoT and Identity projects in San Francisco. I noticed with pleasure that Lasse Andresen, ForgeRock’s CTO and founder, confirmed what I have been saying for quite some time: security and privacy must not be an afterthought. Properly designed in from the start, both do not hinder successful new business models but actually enable them. In IoT, security and privacy are integral elements. They lead to more agility and less risk.

Andresen says in the presentation that Identity, Security and Privacy are core to IoT: “It’s kind of what makes IoT work or not work. Or making big data valuable or not valuable.” San Francisco is a great example of what that means in practice. Everything – “every thing” – in this huge city shall have its own unique identity, from the utility meter to the traffic lights and parking spaces to the police, firefighters and ambulances. This allows fast, secure and orderly action in emergencies. Because of their identities, and with geolocation, the current position of each vehicle is always exactly known to the emergency coordinators. Firefighters identify themselves with digital key cards at the scene to show that they are authorized to be there. Thus everything and everyone – people, things and services – becomes connected with each other, with identity as the glue.

Identity information enables business models that, for example, improve life in the city. The ForgeRock demonstration shows promising examples such as optimizing traffic flow and road planning with big data, street lights that reduce power consumption by turning on and off automatically, smart parking that allows drivers to reserve a space online in advance, combined with demand-based pricing of parking spaces, and, last but not least, live optimization of service routes.

The ForgeRock solution matches the attributes and characteristics of human users to those of things, devices, and apps, collects all the notifications in a big data repository, and then flexibly manages the relationships between all entities – people and things – from this central authoritative source. Depending on their role, each user is carefully provisioned with access to certain devices as well as certain rights and privileges. That is why identity is a prerequisite for secure relationships. Things are just another channel demanding access to the internet. It has to be clear what they are allowed to do: for example, may item A send sensitive data to a certain server B? If so, does the information have to be encrypted? Without a concept for identities, their relationships, and the management of their access, there are too many obstacles to successfully changing business models and regulations.
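
A minimal sketch of what such an identity-and-relationship check could look like is shown below – may device A send sensitive data to server B, and must it be encrypted? The identities, relationships and rules are invented and do not reflect the ForgeRock implementation.

```python
# Illustrative sketch of an identity-and-relationship check for things.
from dataclasses import dataclass

@dataclass
class Relationship:
    sender: str
    receiver: str
    may_send_sensitive: bool
    encryption_required: bool

# Simulated central authoritative source of identities and their relationships.
REGISTRY = {
    ("meter-0042", "utility-backend"): Relationship("meter-0042", "utility-backend", True, True),
    ("streetlight-17", "city-ops"): Relationship("streetlight-17", "city-ops", False, False),
}

def check_transfer(sender: str, receiver: str, sensitive: bool) -> str:
    """Decide whether this thing may send this data to this receiver, and under what conditions."""
    rel = REGISTRY.get((sender, receiver))
    if rel is None:
        return "deny: no registered relationship between these identities"
    if sensitive and not rel.may_send_sensitive:
        return "deny: sensitive data not permitted on this relationship"
    return "allow (encrypted)" if rel.encryption_required else "allow"

print(check_transfer("meter-0042", "utility-backend", sensitive=True))  # allow (encrypted)
print(check_transfer("streetlight-17", "city-ops", sensitive=True))     # deny: sensitive data not permitted
```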

Besides the questions about security and privacy, the lack of standards has long been the biggest challenge for a fully functioning IoT. Manifold platforms, various protocols and many different APIs have made overall integration of IoT systems problematic. Indeed, there are even many competing “standards”. However, with User-Managed Access (UMA), a standard has finally evolved that takes care of the management of access rights. With UMA, millions of users can manage their own access rights and keep full control over their own data without handing it over to the service provider. They alone decide which information they share with others. While the resources may be stored on several different servers, a central authorization server ensures that the rules laid down by the owner are reliably applied. Any enterprise that adopts UMA early now has the chance to build a new, strong and long-lasting relationship with customers, built on security and privacy by design.

Why Distributed Public Ledgers such as Blockchain will not solve the identification and thus the authentication problem

There is a lot of talk about Blockchain and, more generally, Distributed Public Ledgers (DPLs) these days. Some try to position DPLs as a means for better identification and, in consequence, authentication. Unfortunately, this will not really work. We might see a few approaches for stronger or “better” identification and authentication, but no real solution. Not even by DPLs, which I see as the most disruptive innovation in Information Technology in a very, very long time.

Identification is the act of finding out whether someone (or something) really is the person (or thing) they claim to be. It is about knowing whether the person claiming to be Martin Kuppinger really is Martin Kuppinger or in fact someone else.

Authentication, in contrast, is the proof that you hold such a proof – a key, a password, a passport, or whatever. The quality of authentication depends on the one hand on the quality of identification (needed to obtain the proof) and on the other hand on aspects such as protection against forgery and the strength of the authentication mechanism itself.

Identification is, basically, the challenge in the enrollment process for an authenticator. There are various ways of doing it. People might be identified by their DNA or fingerprints – which works only as long as you already know to whom that DNA or fingerprint belongs. And even then, you might not have the real name of that person. People might be identified by showing their ID cards or passports – which works well unless they use forged ID cards or passports. People might be identified by linking profiles of social networks together – which, to be honest, doesn’t help much: they might use fake profiles, or fake names in real profiles. There is no easy solution for identification.

In the end, it is about trust: do we trust the identification performed when rolling out authenticators enough to trust those authenticators afterwards?

Authentication can be performed with a variety of mechanisms. Again, this is about trust: how much do we trust a certain authenticator? However, authentication does not identify you. It proves that you know a username and password, that you possess a token, or that someone has access to your fingerprints. Some approaches are more trustworthy; others are less trustworthy.

So why don’t DPLs such as Blockchain solve the challenge of identification and authentication? For identification, this is obvious. They might provide a better proof that an ID is linked to various social media profiles (such as with Onename), but they don’t solve the underlying identification challenge.

DPLs also don’t solve the authentication issue. If you have such an ID, it must either be unlocked in some way (e.g. by a password, in the worst case) or be bound to something (e.g. a device ID). That is the same challenge we have today.
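
The following minimal sketch illustrates the point: a ledger (simulated here as a simple dictionary) can attest that a public key has been registered under a claimed ID, but authenticating still comes down to proving possession of the corresponding private key, which must be protected, unlocked or device-bound locally – and the ledger never verified who actually registered it. It uses the Python “cryptography” package; all names are invented.

```python
# Illustrative sketch: ledger-anchored keys still leave the identification and unlock problems open.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

ledger = {}  # simulated DPL: maps a claimed ID to a registered public key

# Enrollment: whoever controls this private key registers it under the name "Martin Kuppinger".
# The ledger records key ownership, not whether that person really is Martin Kuppinger.
private_key = ed25519.Ed25519PrivateKey.generate()   # must be protected/unlocked on a device
ledger["Martin Kuppinger"] = private_key.public_key()

def authenticate(claimed_id: str, challenge: bytes, signature: bytes) -> bool:
    """Verify only that the caller controls the key registered under claimed_id."""
    public_key = ledger.get(claimed_id)
    if public_key is None:
        return False
    try:
        public_key.verify(signature, challenge)
        return True
    except InvalidSignature:
        return False

challenge = b"random-server-nonce"
print(authenticate("Martin Kuppinger", challenge, private_key.sign(challenge)))  # True
```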

DPLs can help improve trust, e.g. by showing that the same social media profiles are still linked. They can support non-repudiation, which is an essential element. The trust level will increase with a growing number of parties participating in a DPL. But DPLs can’t solve the underlying challenges of identification and authentication. Simply put: technology will never know exactly who someone is.

Cyber Security: Why Machine Learning is Not Enough

Currently, there is a lot of talk about new analytical approaches in the field of cyber security. Anomaly detection and behavioral analytics are some of the overarching trends along with RTSI (Real Time Security Intelligence), which combines advanced analytical approaches with established concepts such as SIEM (Security Information and Event Management).

Behind all these changes and other new concepts, we find a number of buzzwords such as pattern-matching algorithms, predictive analytics, or machine learning. Aside from the fact that such terms frequently aren’t used correctly and precisely, some of the concepts – machine learning, for example – have limitations by design.

Machine learning implies that the “machine” (a piece of software) is able to “learn”. In fact, this means that the machine is able to improve its results over time by analyzing the effects of previous actions and then adjusting future actions.

One of the challenges in cyber security is the fact that there are continuously new attack vectors. Some of them are just variants of established patterns; some of them are entirely new. In an ideal world, a system is able to recognize unknown vectors. Machine learning per se doesn’t – the concept is about learning from things that have gone wrong before.

This is different from anomaly detection, which identifies unknown or changing patterns. Here, something new is identified precisely because it is an anomaly.
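
The following deliberately simplified sketch illustrates the distinction with invented numbers: a detector that only compares against patterns of past incidents misses a new attack vector, while a simple baseline check flags it because it deviates from normal behavior.

```python
# Deliberately simplified contrast: learning from past incidents vs. detecting deviation from a baseline.
import numpy as np

# Features per minute: [requests, failed logins] - all values are invented
baseline = np.array([[10, 0], [12, 1], [9, 0], [11, 1], [10, 1]], dtype=float)
known_attack_signatures = [np.array([300.0, 5.0])]   # the only pattern ever labeled "bad"
novel_attack = np.array([11.0, 60.0])                 # new vector: normal volume, many failed logins

def matches_known_attack(x, signatures, tolerance=50.0):
    """'Learning from things that have gone wrong': compare against known bad patterns."""
    return any(np.linalg.norm(x - s) < tolerance for s in signatures)

def is_anomalous(x, baseline, threshold=4.0):
    """Anomaly detection: flag behavior far outside the learned baseline (per-feature z-score)."""
    mean, std = baseline.mean(axis=0), baseline.std(axis=0) + 1e-9
    return bool(np.max(np.abs((x - mean) / std)) > threshold)

print(matches_known_attack(novel_attack, known_attack_signatures))  # False: never seen before
print(is_anomalous(novel_attack, baseline))                         # True: deviates from the baseline
```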

Interestingly, some of the technologies whose marketing talks about “machine learning” in fact do a lot more than ex-post-facto machine learning. Frequently, it is not a matter of technology but of the wrong use of buzzwords in marketing. In any case, customers should be careful about buzzwords: ask the vendor what is really meant by them. And ask yourself whether the information provided by the vendor really is valid and solves your challenge.
