KuppingerCole Blog

Elementary, My Dear Watson

A couple of weeks ago, just as we were busy running our European Identity & Cloud Conference, we got news from IBM announcing the company’s foray into the area of Cognitive Security. And although I have yet to see their solution in action (a closed beta starts this summer), I have to admit I have rarely felt so excited about news from the IT industry.

First of all, a quick reminder: the term “cognitive computing” broadly describes technologies based on machine learning and natural language processing that mimic the functions of human brains. Such systems are able to analyze vast amounts of unstructured data usually inaccessible to traditional computing platforms and not just search for answers, but create hypotheses, perform reasoning and support human decision making. This is really the closest we have come to Artificial Intelligence as seen in science fiction movies.

Although the exact definition of the term still causes much debate among scientists and marketing specialists around the world, cognitive computing solutions in the form of specialized hardware and software platforms have existed for quite some time, and the exponential growth of cloud computing has been a big boost for their further development. In fact, IBM has always been one of the leading players in this field with their Watson platform for natural language processing and machine learning.

IBM Watson was initially conceived in 2005 as a challenge to beat human players in the game of Jeopardy, and its eventual victory in a 2011 match is probably its best publicized achievement, but the platform has been used for a number of more practical applications for years, including business analytics, healthcare, legal and government services. The company continues to build an entire ecosystem around the platform, partnering with numerous companies to develop new solutions that depend on unstructured data analysis, understanding natural language and complex reasoning.

In hindsight, the decision to utilize Watson’s cognitive capabilities for cyber security applications seems completely reasonable. After all, with their QRadar Security Intelligence Platform, IBM is also one of the biggest players in this market, and expanding its scope to incorporate huge amounts of unstructured security intelligence makes a lot of sense. By tapping into various sources like analyst publications, conference presentations, forensic reports, blogs and so on, cognitive technology will provide security analysts with powerful new tools to support and augment their decision making. By providing access to the collective knowledge from tens of thousands of sources, constantly adapted and updated with the newest security intelligence, Watson for Cyber Security is supposed to solve the biggest problem the IT security industry is currently facing – a dramatic lack of skilled workforce to cope with the ever-growing number of security events.

Naturally, the primary source of knowledge for Watson is IBM’s own X-Force research library. However, the company is now teaming with multiple universities to expand the amount of collected security intelligence to feed into the specialized Watson instance running in the cloud. The ultimate goal is to unlock the estimated 80% of all security intelligence data, which is currently available only in an unstructured form.

It should be clear, of course, that this training process is still a work in progress and by definition will never end. There are also some issues to be solved, such as the obvious concerns about privacy and data protection. Finally, it’s still not clear whether this new area of application will generate any substantial revenue for the company. But I’m very much looking forward to seeing Watson for Cyber Security in action!

By the way, I was somewhat disappointed to find out that Watson wasn’t actually named after Sherlock Holmes’ famous friend and assistant, but in fact after IBM’s first CEO Thomas Watson. Still, the parallels with “The Adventure of the Empty House” are too obvious to ignore :)

Complexity Kills Agility: Why the German Reference Architecture Model for Industry 4.0 Will Fail

The German ZVEI (Zentralverband Elektrotechnik- und Elektronikindustrie), the association of the electrical and electronic industries, and the VDI (Verein Deutscher Ingenieure), the association of German engineers, have published a concept called RAMI (Referenzarchitekturmodell Industrie 4.0). This reference architecture model has a length of about 25 pages, which is OK. The first target listed for RAMI 4.0 is “providing a clear and simple architecture model as reference”.

However, when analyzing the model, there is little clarity or simplicity in it. The model is full of links to other norms and standards. It is full of multi-layer, sometimes three-dimensional architecture models. On the other hand, the model doesn’t provide answers on details, offering only a few links to other documents.

RAMI 4.0 states, for example, that the minimal infrastructure of Industry 4.0 must fulfill the principles of Security-by-Design. There is no doubt that Industry 4.0 should consistently implement the principles of Security-by-Design. Unfortunately, there is not even a link to a description of what Security-by-Design means in concrete terms.

Notably, security (and safety) are covered in a section of the document spanning not even 1% of the entire content. In other words: security is largely ignored in this reference architecture – in these days of ever-increasing cyber-attacks against connected things.

RAMI 4.0 has three fundamental faults:

  1. It is not really concrete. It lacks details in many areas and doesn’t even provide links to more detailed information.
  2. While only being 25 pages in length and not being very detailed, it is still overly complex, with multi-layered, complex models.
  3. It ignores the fundamental challenges of security and safety.

Hopefully, we will soon see better concepts that focus on addressing the challenges of agility and security, instead of over-engineering the world of things and Industry 4.0.

There Is No Such Thing as an API Economy

Martin Kuppinger explains why there is no API economy.

Free MFA on Windows for the masses

Cion Systems, a US-based IAM (Identity and Access Management) vendor, recently released a free service that allows Windows users to implement two-factor authentication. It works only for Windows and supports either a PIN code sent via SMS (not available in all countries) or a standard USB storage key as the second factor. Thus, users can increase the security of their Windows devices in a simple and efficient manner, without the need to pay for a service or purchase specialized hardware.

Their free service is based on the same technology Cion Systems is using for their enterprise self-service and multi-factor authentication solutions and services, which, e.g., also support password synchronization with Microsoft Office 365.

While such a service is not unique, it is a convenient way for Windows users to increase the security of their systems, which still commonly are protected only by username and password.

Winners of the European Identity & Cloud Awards 2016

KuppingerCole presented The European Identity & Cloud Awards 2016 last night at the 10th European Identity & Cloud Conference (EIC). These awards honor outstanding projects and initiatives in Identity & Access Management (IAM), Governance, Risk Management and Compliance (GRC), as well as Cloud Security. Numerous projects were nominated by vendors and end-user companies during the last 12 months. The winners have been chosen by KuppingerCole analysts from among the most outstanding examples of applications and ideas in the areas of IAM, GRC, and Cloud security. Additionally, two Special Awards have been given.

In the category Best Innovation / New Standard, the award was granted to STIX, TAXII & CybOX. They enable cyber threat intelligence data to be shared among trusted partners and communities. Originating as a community project sponsored by the US Department of Homeland Security, STIX, TAXII, and CybOX are now being advanced as international standards via the OASIS open standards consortium.
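
To illustrate what such shared threat intelligence looks like in practice, here is a minimal sketch of an indicator expressed in the JSON-based STIX 2.x style being standardized at OASIS. The identifier, timestamps, campaign name and IP address are purely hypothetical.

```python
import json
from uuid import uuid4

# A hypothetical indicator for a malicious IP address, expressed as a
# STIX 2.x-style JSON object. The field names follow the OASIS STIX
# specification; the identifier, timestamps, name and IP value are made
# up for illustration.
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    "id": f"indicator--{uuid4()}",
    "created": "2016-05-25T00:00:00Z",
    "modified": "2016-05-25T00:00:00Z",
    "name": "Command-and-control server observed in a phishing campaign",
    "pattern": "[ipv4-addr:value = '203.0.113.42']",
    "pattern_type": "stix",
    "valid_from": "2016-05-25T00:00:00Z",
}

# Serialized like this, the indicator could be published via a TAXII
# service and consumed by the security tools of trusted partners.
print(json.dumps(indicator, indent=2))
```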

The European Identity & Cloud Award for the category Best Innovation in eGovernment / eCitizen went to the UK Government Digital Service (GDS), which has added a new verification service called GOV.UK Verify. This service is a simple way for UK citizens to access an increasing range of UK government services online. UK GDS received the award for its continued efforts in enabling citizen IDs for access to a variety of services.

Another award was given to TomTom in the category Best Consumer Identity Project. TomTom has initiated a program for delivering a new identity platform that manages identities of customers and devices worldwide at very large scale. The identity platform is a global solution covering countries like China, where strict legal regulations must be met. It is a great example of a way to manage all identities ─ of people, devices, and things ─ in a consistent way.

The award for Best Approach to Improving Governance and Mitigating Risks in 2016 went to the Qvarn Platform, a free, open source solution for managing the professional identities of today’s mobile workforces. It handles the security challenges in that area in an innovative way. Qvarn is currently used to serve the construction industry in the Nordic and Baltic regions, and is designed for pan-European use.

dm-drogerie markt again received the award for the Best IAM Project, this time for their integration of SSO and the use of RFID tokens for authentication. The RFID token not only enables users in branch offices to access the workstation, but at the same time provides access to all relevant applications via SSO. Using RFID tokens for personal identification and SSO allows full traceability of workstation and application accesses. Such applications of IAM in retail stores are still rare.

Orange Business Services was honored with the Best Cloud Security Award for providing its customers with highly secure access to business-critical cloud-based applications. They deliver a unified solution that offers seamless end-to-end connectivity with additional security using multi-factor authentication.

Finally, Taqanu Bank received the Special Award for Responsive Innovation for building a blockchain-based, mobile bank aimed at people without fixed addresses.

The Special Award for Best Project in Research was given to Leeds Beckett University & III Taiwan for implementing a Cloud Computing Adoption Framework (CCAF) as part of large-scale systems shared between universities and industry.

The European Identity & Cloud Awards honor projects that promote the awareness for and business value of professional Identity Management and Cloud Security.

Bridging the Gap Between IT, OT and Business in the Digital Transformation Age

The Digital Transformation age is focussed on integrating digital technologies such as social, mobile, manufacturing, cloud computing etc. It will inherently lead to new types of innovation and creativity and is already having far-reaching applications across business, government, medicine and mass communications, to name a few. The Internet of Things (IoT), which is connecting everything to everything, also presents new challenges to organisations. This new world places businesses at risk because they have not embraced security standardisation, developed a holistic view of risks across the business, or determined how Information Technology (IT) and Operational Technology (OT) will work together to minimise those risks.

Digital Transformation is really a business transformation. Business models need to be rewritten to take advantage of the new possibilities that Digital Transformation brings and to monetise these opportunities. It is not just about deploying smart objects on the factory floor or implementing a blockchain solution to take care of one aspect of the business; it is about developing a go-to-market blueprint that includes reorganising the business, embracing the new technologies, optimising processes, binding customers and aiming for a profitable outcome.

There is a huge trend away from offering just products and towards replacing them with customer services. We have seen this for years with cloud-based software licensing and, as an example, several markets have introduced electric motor vehicles on a “user pays” basis: instead of buying a car for city use, you rent one by the hour or the day (find one on the street, walk up, and open it with an app on your smartphone), much like bicycle rental services.

In the manufacturing sector, Smart Manufacturing has brought with it a whole new set of business opportunities but also increased risks. The objective of Industry 4.0 is to connect the manufacturing environment and OT to optimise end-to-end processes and to build a service infrastructure between the business and the end customer. Optimisation will be disruptive and may well disenfranchise the middlemen, such as brokers and dealers, from the new operating model.

Optimising the end-to-end view of an organisation by joining the business view to the manufacturing view opens up the manufacturing side to attack as well as the business systems. This changes the security paradigm and puts everything at risk. The IoT and the “things” controlling a manufacturing process open up areas of cyber threat that were not previously there. With smart vehicles, a black box could capture data such as performance, location or payment information, which would be made available to service providers, motor manufacturers, insurance companies, law enforcement etc. There are a myriad of possibilities, and they all need to be managed in an optimal, controlled, safe and secure manner.

A new Business Model must incorporate the requirement to adopt a standardised and configurable security infrastructure to manage cyber risk and, at the same time, enable the business to become agile. Agility will enable the business to react quickly to new opportunities or changed circumstances and improve its competitive advantage.

Businesses must also develop a Risk Management Plan to deal with the new circumstances, with a focus on risk mitigation. While risk cannot be totally eliminated, the major risks that could endanger the organisation from a number of different perspectives – cost, reputation, regulation, legal, business process, or technical – can be identified and mitigated. A comprehensive communications plan is also vital to addressing incident responses across the spectrum of the enterprise.

In this new Digital Transformation age, organisations have to think about security by design and, as a result, agility by design. The IT/OT group must implement a secure, standardised and configurable security infrastructure that embraces security and privacy by design. This will give an organisation the flexibility required to open or close configurations to meet changing regulatory demands, exchange information with the outside world, and address risks as they occur in a quick and economical way – not in the old, inefficient way of costly and risky code changes.

Organisations might consider merging the IT and OT organisations to deliver their part of the Business Model in a more efficient and integrated manner. OT has always been challenging in its own right. OT systems are required to control valves, engines, conveyors and other machines to regulate various process values, such as temperature, pressure and flow, and to monitor them to prevent hazardous conditions. OT systems have used various technologies for hardware design and communications protocols that are unknown in IT. The most common problems are legacy systems and devices and numerous vendor architectures and standards. The focus of OT has been availability, rather than confidentiality, integrity and availability as is the case with IT. As OT embraces smart devices, integrating OT into an overall enterprise solution will require standardised data exchange abilities and standardised, configurable security to manage the environment. Combining the IT and OT organisations can help facilitate and optimise an organisation’s end-to-end security and data management in a consistent and optimal manner.

Multi-Factor, Adaptive Authentication Security Cautions

KuppingerCole has written previously on the benefits of adaptive authentication and authorization, and the need for authentication challenges that go beyond the password. These benefits fall largely under the categories of an improved user experience, since the user is only presented with additional authentication challenges based on risk and context, and improved security, precisely due to the use of multi-factor, multi-channel authentication challenges.
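
As a rough illustration of that adaptive idea – not any particular vendor’s implementation – the following sketch escalates to a second factor only when simple context signals push a risk score over a threshold. The signal names, weights and threshold are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class LoginContext:
    known_device: bool       # has this device been seen before for this user?
    usual_location: bool     # does the geolocation match the user's typical one?
    high_risk_action: bool   # e.g. payment, admin console, bulk data export

def risk_score(ctx: LoginContext) -> int:
    """Very simple additive risk score; weights are invented for the example."""
    score = 0
    if not ctx.known_device:
        score += 40
    if not ctx.usual_location:
        score += 30
    if ctx.high_risk_action:
        score += 50
    return score

def requires_second_factor(ctx: LoginContext, threshold: int = 50) -> bool:
    return risk_score(ctx) >= threshold

# Known device, usual location, low-risk action: the password alone is enough.
print(requires_second_factor(LoginContext(True, True, False)))   # False
# Unknown device performing a high-risk action: step up to a second factor.
print(requires_second_factor(LoginContext(False, True, True)))   # True
```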

However, these multi-factor authentication challenges only offer additional security if the channels used for them are sufficiently separated. Common approaches to multi-factor authentication include one-time passwords sent via an SMS message, or smartphone applications which function as soft tokens generating time-limited passwords. These are generally a good idea and do offer additional security benefits. But if the application that depends on multi-factor authentication as an additional security measure is itself a mobile application, then the lack of separation between the channels used for multi-factor authentication vitiates the possible security benefits of MFA.
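
For context, the sketch below shows roughly how such a time-based one-time password (the kind displayed by smartphone soft tokens) is derived, following RFC 6238; the shared secret is a demo value. The code is computed on the device itself, so if that same device is also the – possibly compromised – endpoint the user logs in from, both “factors” effectively travel over one channel.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(shared_secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Derive the current time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(shared_secret_b32)
    counter = int(time.time()) // period                  # current 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Demo secret (base32); in practice it is provisioned to the phone via a QR code.
# The server computes the same value, so the code adds nothing if the device
# displaying it is the same, possibly compromised, device used to log in.
print(totp("JBSWY3DPEHPK3PXP"))
```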

Security researchers have recently demonstrated how both compromised Android and iOS smartphones can be manipulated by attackers to capture the additional step-up authentication password from the smartphone itself. This is one of the outstanding challenges of anywhere computing. Another attack that is not mitigated by the additional security of multi-factor authentication is the man-in-the-browser (MITB) attack. With this type of attack, a malicious actor gains control of a user’s browser via a browser exploit. The user then logs into, for example, online banking and successfully completes all necessary multi-factor authentication challenges to perform a high-risk action such as an electronic funds transfer; the hijacked browser can then be used by the attacker to substitute the form data the user is entering. In this example, the sum could be redirected to a stranger’s bank account.

With the MITB attack, the user is seen by the accessed service as fully authenticated, but since the browser itself has been compromised, any action the user could have done legitimately can also appear to have been done by the attacker.

With a user’s smartphone already receiving emails and being used for browsing, the additional use of smartphones for multi-factor authentication must be carefully considered. Otherwise, it only provides the illusion of security. These attacks do not make adaptive, multi-factor authentication useless, but they do show that there is no single mitigation approach that allows an organization to ignore the ever-evolving cybersecurity threat landscape.

Tactical security approaches here include careful selection and separation of authentication channels when MFA is used, as well as the use of additional web service and browser scripting protection approaches which have been developed to mitigate MITB attacks.

Yet the strategic solution remains an approach that is not solely focused on prevention. With the digital transformation well underway, it is difficult to control employee endpoints, and almost impossible to control consumer endpoints. A strategic, holistic security approach should focus on prevention, detection and response, an approach known as Real-Time Security Intelligence. It should also focus on data governance, regardless of the location of the information asset, an approach known as Information Rights Management.

Unknown and sophisticated attack vectors will persist, and balancing security and user experience does remain a challenge, but the RTSI approach recognizes this and does not ever assume that a system or approach can be 100% immune to vulnerabilities.

Be careful not to DROWN

On March 1st, OpenSSL published security advisory CVE-2016-0800, known as “DROWN”. This is described as a cross-protocol attack on TLS using SSLv2 and is classified as High Severity. The advice given by OpenSSL is:

“We strongly advise against the use of SSLv2 due not only to the issues described below, but to the other known deficiencies in the protocol as described at https://tools.ietf.org/html/rfc6176”

This vulnerability illustrates how vigilant organizations need to be about the specific versions of software that they use. However, this is easier said than done. Many organizations have a website or application that was built by a third party. The development may have been done some time ago, using what were then the current versions of readily available Open Source components. The developers may or may not have a contract to keep the package they developed up to date.

The application or website may be hosted on premise or externally; wherever it is hosted, the infrastructure upon which it runs also needs to be properly managed and kept up to date. OpenSSL is part of the infrastructure upon which the website runs. While there may be some reasons for continuing to use SSLv2 for compatibility, there is no possible excuse for reusing SSL Private Keys between websites. It just goes against all possible security best practices.
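
As a small illustration of that kind of vigilance (not a full scanner – Python’s ssl module cannot even speak SSLv2, so a dedicated tool is needed to test for DROWN itself), the sketch below records which TLS version each site negotiates with a modern client and the SHA-256 fingerprint of the certificate it serves; identical fingerprints across different sites point to the same certificate, and hence the same private key, being reused. The hostnames are placeholders.

```python
import hashlib
import socket
import ssl

def probe(host: str, port: int = 443) -> tuple[str, str]:
    """Return the negotiated TLS version and the SHA-256 fingerprint of the served certificate."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            version = tls.version()                      # e.g. "TLSv1.2"
            der_cert = tls.getpeercert(binary_form=True)
    return version, hashlib.sha256(der_cert).hexdigest()

# Placeholder hostnames - replace with the sites you are responsible for.
for host in ("www.example.com", "shop.example.com"):
    version, fingerprint = probe(host)
    print(f"{host}: {version}, cert sha256={fingerprint[:16]}...")
```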

It may be difficult to believe but I have heard auditors report that when they ask “what does that server do?” they get the response “I don’t know – it’s always been here and we never touch it”. The same can be true of VMs in the cloud which get created, used and then forgotten (except by the cloud provider who keeps on charging for them).

So as vulnerabilities are discovered, there may be no process to take action to remediate the operational package. The cyber criminals just love this. They can set up an automated process to externally scan to find where known vulnerabilities exist unpatched and exploit the results at their leisure.

There are two basic lessons from this:

  1. Most code contains exploitable errors, and its evolution generally leads to a deterioration in its quality over time unless there are very stringent controls over change. It is attractive to add functionality, but the increase in size and complexity leads to more vulnerabilities. Sometimes it is useful to go back to first principles and recode using a stringent approach.
    I provided an example of this in my blog AWS Security and Compliance Update. AWS has created a replacement for OpenSSL TLS – S2N, an Open Source implementation of TLS. S2N replaces the 500,000 lines of code in OpenSSL with approximately 6,000 lines of audited code. This code has been contributed to Open Source and is available from the S2N GitHub Repository.

  2. Organizations need to demand maintenance as part of the development of code by third parties. This is to avoid the need to maintain out of date infrastructure components for compatibility.
    The infrastructure, whether on premise or hosted, should be kept up to date. This will require change management processes to ensure that changes do not impact operation. This should be supported by regular vulnerability scanning of operational IT systems, using one of the many tools available, together with remediation of the vulnerabilities detected.

IT systems need to have a managed lifecycle. It is not good enough to develop, deploy and forget.

IBM Acquires Resilient Systems

Yesterday at the RSA Conference, IBM officially confirmed what had already been a rumor for some time – the company is planning to acquire Resilient Systems for an undisclosed amount.

Resilient Systems, a relatively small privately held company based in Cambridge, MA, is well known for its Incident Response Platform, a leading solution for orchestrating and automating incident response processes. With the number of security breaches steadily growing, the focus within the IT security industry is currently shifting more and more from detection and prevention towards managing the consequences of an attack that has already happened. Such an incident response solution can provide a company with a predefined strategy for responding to various types of attacks, tailored to specific laws and industry regulations. It then supports the IT department at every step of the process, helping to get the affected infrastructure back online, address privacy concerns, solve organizational and legal issues and so on.

Despite being on the market for less than 5 years, Resilient Systems has already become a leading player in this segment, with their IRP solution being used by a variety of clients in all verticals, from mid-size businesses to Fortune 500 companies. Among other features, the product is known for its integration with multiple leading security solutions. In fact, Resilient Systems has been IBM’s partner for some time, integrating their product with IBM’s QRadar.

So, in hindsight, the announcement doesn’t really come as a big surprise. For IBM Security, this acquisition means not just incorporating a leading incident response solution into their cyber security portfolio, but also hiring a 100-person-strong team of security experts, including the venerable Bruce Schneier, who is currently serving as Resilient Systems’ CTO. What’s in the deal for Resilient Systems is not as easy to say, since the financial details are not disclosed, but we can definitely be sure that gaining access to IBM’s vast partner network opens a lot of interesting business prospects.

By adding the new Incident Response capabilities to their existing QRadar security intelligence solution and X-Force Exchange threat intelligence platform, IBM is hoping to become the world’s first vendor with a fully integrated platform for security operations and response. In the same press release, the company has already announced their new IBM X-Force Incident Response Services.

There is no Consumer Identity & Access Management at all – at least not as a separate discipline

These days, there is a lot of talk about Consumer Identity & Access Management, or CIAM. However, there is no such thing as CIAM, at least not as a separate discipline within IAM. There are technologies that are of higher relevance when dealing with customers and consumers than when dealing with employees. But there are neither technologies that are required for CIAM only, nor is there any benefit in trying to set up a separate CIAM infrastructure.

This does not mean that IAM should not or must not focus on consumers – quite the contrary. But it is about extending and, to some extent, renovating the existing on-premise IAM, which commonly is focused on employees and some business partners. It is about one integrated approach for all identities (employees, partners, consumers, …), managing their access to all services regardless of the deployment model, using all types of devices and things. It is about seamlessly managing all access of all identities in a consistent way. Notably, “consistent way” is not the same as “from a single platform”.

So why don’t we need a separate CIAM? The easiest answer is found by asking a simple question: “Is there any single application in your organization that is accessed only by consumers?” This implies “and not by at least some of your employees, e.g. for customer service, administration & operations, or analyzing the data collected.” The obvious answer to that question is that there is no such application. There are applications which are used only by employees, but not the other way round. So why should there be separate IAM deployments for applications that are used by a common group of users? That could only result in security issues and management trouble.

The other aspect is that the way applications are used within the enterprise is changing anyway. Mobile users access cloud applications without even touching the internal network anymore. Thus, technologies such as Adaptive Authentication, Cloud IAM or IDaaS (Identity Management as a Service), Identity Federation, etc. are relevant not only for consumer-facing solutions but for all areas of IAM.

Finally, there is the aspect that users frequently have multiple digital identities, even in relation to their employers. Many employees of car manufacturers are also customers of that company. Many employees of insurance companies also have insurance contracts with those companies, and some even act as freelance insurance brokers. Managing such complex relationships becomes far easier when having one IAM for all – employees, partners, and consumers. One IAM that serves all applications, on-premise and in the Cloud. And one IAM that supports all types of access.

This might still result in projects that focus on managing consumer access to services, IAM for cloud services, and so on. But all these projects should be part of moving IAM to the next level: an IAM that serves all requirements, from the traditional access of an employee using his PC in the corporate LAN to access a legacy backend system, to the mobile consumer coming in via a social login and accessing a cloud service.


