KuppingerCole Blog

Obsession? Hype? Revolution? It Might Be a Bit of Everything: Moving Into the Age of Blockchain in Cybersecurity, Identity and Privacy

Looking at the current investor craze around blockchain's primary use case, Bitcoin, it sometimes gets difficult to think beyond the bubble and track those blockchain projects that are indeed on their way to becoming useful: changing the way we sell or buy things, digitally move value, immutably store any kind of document or data, consume information, create and manage digital IDs, or otherwise influencing and changing most aspects of our social, political and economic interactions. What we see happening in the crypto world is an explosion of creativity and innovation, well funded through initial coin offerings (ICOs). Most of the blockchain projects we are observing show a high potential for disrupting whole industries.

Blockchain in Cybersecurity

Based on decades of research in cryptography and resilience, cybersecurity and blockchain technology have the same roots and look like natural allies. By offering a totally new way of securing information integrity, performing transactions and creating trust relationships between parties that don’t know each other, blockchains are secure by design and are well suited for use cases with high security requirements. It is therefore easy to understand why DARPA (the US Defense Advanced Research Projects Agency) has been funding a number of interesting blockchain startups experimenting with secure, private and failsafe communication infrastructures. DARPA’s program manager behind the blockchain effort, Timothy Booher, describes the paradigm shift blockchain implies for cybersecurity well with an analogy: “Instead of trying to make the walls of a castle as tall as possible to prevent an intruder from getting in, it’s more important to know if anyone has been inside the castle, and what they’re doing there.”

Blockchain Identity & Privacy: It all Depends on the Governance Model

Managing digital identities and linking them to real humans (identification) is becoming a primary playground for blockchain technology: it is fundamental for any blockchain use case, and it seems not only to reduce the vulnerabilities of traditional infrastructures, but finally to offer a solution that gives control over personal information back to the user it belongs to (“Self-Sovereign Identity”, SSI). However, the assumption that blockchain is the only way to repair the missing internet identity layer would be as wrong as the opposite assumption. There is no doubt that blockchain will change the way we deal with identity and privacy, but there are some vital challenges to be solved first, with blockchain governance being the one that matters most, as all the other problems being discussed depend on selecting the right governance model:

  • How do we deal with change? We have been in the IT space long enough to know that the only constant is permanent change. Who would decide on “updating” the blockchain? How much of the pure-play blockchain do we need to give up to avoid messing with hard forks?
  • Scalability: The proof-of-work based Bitcoin blockchain has its limits (see the sketch after this list). Is proof-of-stake the only viable alternative, or will we soon see massively parallel blockchain infrastructures?
  • Private vs. Public, "permissioned vs. unpermissioned": Are we facing a future of walled blockchain gardens?
  • Off-Chain vs. On-Chain Governance: What are the risks of on-chain Governance? Will self-amending ledgers be the ones that rule the identity field?
  • Future Governance Models based on prediction markets
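To make the scalability point above concrete, here is a deliberately simplified proof-of-work sketch in Python (an illustration only, not how Bitcoin is implemented in detail): a miner brute-forces a nonce until the block hash meets a difficulty target, and every additional hex digit of difficulty multiplies the expected work by 16, which is why proof-of-work throughput is capped by design.

```python
import hashlib
import itertools

def mine(block_data: str, difficulty: int = 5) -> int:
    """Brute-force a nonce so that SHA-256(block_data + nonce)
    starts with `difficulty` leading hex zeros."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

# Even at this toy difficulty the search takes on the order of a million hashes.
print("proof-of-work nonce:", mine("block #1: Alice pays Bob 2 BTC"))
```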

Shaping the Future of Blockchain ID, Privacy & Security: Be part of it!

The blockchain discussion will continue to be a core element of KuppingerCole’s upcoming events.

For the first time ever, we’ll offer a “Blockchain ID Innovation Night” at #EIC18, where you will meet developers, evangelists and experts from most, if not all, of the blockchain ID projects out there.

McAfee Acquires Skyhigh Networks

McAfee, founded in 1987, has a long history in the world of cyber-security. Acquired by Intel in 2010, it was spun back out, becoming McAfee LLC, in April 2017. According to the announcement on April 23rd, 2017 by CEO Christopher D. Young, the new company will be “one that promises customers cybersecurity outcomes, not fragmented products.” So it is interesting to consider what the acquisition of Skyhigh Networks, which McAfee announced on November 27th, will mean.

Currently, McAfee solutions cover areas including antimalware, endpoint protection, network security, cloud security, database security, endpoint detection and response, and data protection. Skyhigh Networks is well known for its CASB (Cloud Access Security Broker) product. So how does this acquisition fit into the McAfee portfolio?

Well, the nature of the cyber-risks that organizations are facing has changed. Organizations are increasingly using cloud services because of the benefits that they can bring in terms of speed of deployment, flexibility and price. However, the governance over the use of these services is not well integrated into the normal organizational IT processes and technologies. CASBs address these challenges. They provide security controls that are not available through existing security devices such as Enterprise Network Firewalls, Web Application Firewalls and other forms of web access gateways. They provide a point of control over access to cloud services by any user and from any device. They help to demonstrate that the organizational use of cloud services meets regulatory compliance needs.

In KuppingerCole’s opinion, the functionality to manage access to cloud services and to control the data that they hold should be integrated with the normal access governance and cyber-security tools used by organizations. However, the vendors of these tools were slow to develop the required capabilities, and the market in CASBs evolved to plug this gap. The McAfee acquisition of Skyhigh Networks is the latest of several recent acquisitions of CASBs by major security software and hardware vendors.

CASBs fit into the overall cloud governance process through four basic functionalities (a toy sketch of the first, discovery, follows the list):

  1. Discovery of what cloud services are being used, by whom and for what data.
  2. Control over who can use which services and what data can be transferred.
  3. Protection of data in the cloud against unauthorized access and leakage.
  4. Regulatory compliance and protection against cyber threats through the above controls.
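As an illustration of the discovery step, here is a hypothetical sketch: a CASB typically starts by mining web proxy or firewall logs for traffic to known cloud services. The log format and the tiny service catalogue below are assumptions made for the example; real products maintain registries of tens of thousands of services and enrich them with risk ratings.

```python
from collections import Counter

# Hypothetical catalogue of cloud services (real CASBs ship far larger registries).
KNOWN_SERVICES = {
    "dropbox.com": "Dropbox",
    "drive.google.com": "Google Drive",
    "salesforce.com": "Salesforce",
}

def discover(proxy_log_lines):
    """Count which known cloud services appear in proxy logs.
    Assumes each line reads '<user> <destination-host> <bytes-sent>'."""
    usage = Counter()
    for line in proxy_log_lines:
        user, host, _bytes_sent = line.split()
        for domain, service in KNOWN_SERVICES.items():
            if host == domain or host.endswith("." + domain):
                usage[(user, service)] += 1
    return usage

sample_log = [
    "alice drive.google.com 10240",
    "bob www.dropbox.com 55000",
    "alice eu1.salesforce.com 2048",
]
for (user, service), requests in discover(sample_log).items():
    print(f"{user} used {service} ({requests} request(s))")
```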

So, in this analysis, CASBs are closer to Access Governance solutions than to traditional cyber-security tools. They recognize that identity and access management is the new cyber frontier, and that cyber-defense needs to operate at this level. By providing these functions, Skyhigh Networks offers a solution that is complementary to those already offered by McAfee and extends McAfee’s capabilities in the direction needed to meet the needs of the cloud-enabled, agile enterprise.

The Skyhigh Networks CASB provides comprehensive functionality that strongly matches the requirements described above. It is also featured in the leadership segment of KuppingerCole’s Leadership Compass: Cloud Access Security Brokers - 72534. This acquisition is consistent with KuppingerCole’s view on how cyber-security vendors need to evolve to meet the challenges of cloud usage. Going forward, organizations need a way to provide consistent access governance for both on-premises and cloud-based services. This requires functions such as segregation of duties, attestation of access rights and other compliance-related governance aspects. Therefore, in the longer term, CASBs need to evolve in this direction. It will be interesting to watch how McAfee integrates the Skyhigh product and how the McAfee offering evolves towards this in the future.

Please! No More GDPR Related Blog Posts!

You have heard it all before: May 25th, 2018, enormous fines, “you have to act now”, the “right to be forgotten”, DPOs and breach notification. Every vendor whose marketing database contains your data will send you information, whitepapers, webinars, product information and reminders about GDPR, and of course they can help you on your way to compliance. So you have set up a filter in your mail client that sorts GDPR messages directly into spam, and #gdpr is muted in your Twitter client.

Because you started your journey towards GDPR compliance early? Compliance activities have long been established and your employees are informed? Consent management is not just theory? Data leakage is prevented, monitored, detected and, if it does occur, communicated appropriately?

But there might still be a good reason to read on: unlike other regulations, there is no regular inspection of compliance with the requirements. Rather, individuals (including customers, employees or other relevant data subjects) and the competent supervisory authorities can make enquiries if alleged or actual omissions or offences are to be investigated. However, there is as yet no proof of GDPR compliance in the form of a regular, permanent seal of quality.

It is difficult to identify sufficient indicators of good preparation. Yes, vendors and integrators provide some basic questionnaires, but you might still need a neutral set of criteria for determining the maturity level of your organization's readiness to comply with regulatory or industry-specific regulations or frameworks. To support such reviews, KuppingerCole provides Maturity Level Matrixes that are specifically targeted at distinct areas of the IT market, in this case GDPR readiness.

Assessing the quality and maturity of the controls, systems and processes implemented by your organization is essential. Given the level of agility demanded by business and market requirements, this assessment needs to be executed on a regular basis. Continuous improvement is essential to achieve an adequate level of compliance in all key areas of the GDPR.

To achieve the highest level (5) of GDPR maturity, it is essential to continuously measure GDPR readiness, so that an organization can understand its status quo, document it and, where possible, realize the potential benefits of investing in improved data protection. Then you might happily ignore further GDPR-related blog posts.

The KuppingerCole Maturity Level Matrix for GDPR readiness provides neutral criteria exactly for that purpose. Find it here for download. 

And get in touch with us if you feel that an independent assessment (along the lines of exactly the same maturity levels) might be even more meaningful. 

Not Just Another Buzzword: Cyber Risk Governance

Today, companies increasingly operate on the basis of IT systems and are thus dependent on them. Cyber risks must therefore be understood as business risks. Detecting and preventing cyber security threats, and responding appropriately to them, are among the most important activities for protecting the core business from risk.

In practice, however, many challenges arise here. The requirement to arrive at a uniform, and thus informed, view of all types of business risk often fails due to a multitude of organisational, technical and communication challenges:

Technical risk monitoring systems in the enterprise (e.g. systems for monitoring compliance with SoD rules, or systems for monitoring network threats at the outer edge of an enterprise network) are often extremely powerful in their specific area of application. Interoperability across these system boundaries, however, usually fails due to the lack of a common language (protocols) or of shared semantics for the information to be exchanged (uniform risk concepts and metrics).

The same is happening in the structure of large organizations: although we have observed this trend for only a few years, it leads to independently operating IT operations teams, IT security teams and (cyber) governance teams, each focusing on its own tasks and solutions for individual, yet very similar, problems. They typically act without adequate integration into a corporate security strategy or a consolidated communication approach for the joint, coordinated management of risks. They do this without correlating their results to determine a comprehensive IT security maturity, and thus without identifying the overall risk situation of the company.

Management boards and executives must act and react on the basis of incomplete and mostly very technical data, which can only lead to inadequate and incomplete results. The implicit link between cyber risks and business risks is lost when only individual aspects of cyber security are considered. Management decisions made on the basis of this information are usually far from adequate and efficient. 

The only way to solve this problem is to move from tactical to strategic approaches. Recently the term “Cyber Risk Governance” has been coined to describe holistic solutions to this problem, covering organization, processes and technologies. More and more companies and organizations are realizing that cyber risk governance is a challenge that needs to be addressed at management level. Cyber security and regulatory compliance are strong drivers for rethinking and redesigning a mature approach to improve cyber resilience.  

This requires an adequate strategic approach instead of tactical, more or less unplanned ad-hoc measures. It must be underpinned by a strong risk governance organisation, a strategic framework for a comprehensive cyber risk governance process, and the related technological components. This can only be achieved by bundling corporate expertise, taking a holistic view of the overall risk situation and understanding the sum of all implemented risk mitigation measures.

If the situation described above sounds familiar, read more about “Cyber Risk Governance” as a strategic architecture and management topic in the free KuppingerCole white paper “TechDemocracy: Moving towards a holistic Cyber Risk Governance approach”.

Administrative Security in Security Products

At KuppingerCole, cybersecurity and identity management product/service analysis are two of our specialties. As one might assume, one of the main functional areas in vendor products we examine in the course of our research is administrative security. There are many components that make up admin security, but here I want to address weak authentication for management utilities.

Most on-premises and IaaS/PaaS/SaaS security and identity tools allow username and password for administrative authentication. Forgot an admin password? Recover it with KBA (knowledge-based authentication).

Many products accept other, stronger forms of authentication, and these should be the default. Here are some better alternatives:

  • Web console protected by existing Web Access Management solution utilizing strong authentication methods 
  • SAML for SaaS 
  • Mobile apps (if keys are secured in Secure Enclave, Secure Element, and app runs as Trusted App in Trusted Execution Environment [TEE]) 
  • FIDO UAF Mobile apps 
  • USB Tokens 
  • FIDO U2F devices 
  • Smart Cards 

Even OATH TOTP and Mobile Push apps, while having some security issues, are still better than username/passwords.
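To show how little code stands between password-only and two-factor admin logins, here is a minimal sketch using the open-source pyotp library (the enrolment flow and account names are assumptions for the example, not tied to any particular vendor's console):

```python
import pyotp

# Provisioning: generate a per-admin secret once and enrol it in the admin's
# authenticator app via the otpauth:// URI (usually rendered as a QR code).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print(totp.provisioning_uri(name="admin@acme.example", issuer_name="Acme Console"))

# Login: after the password check, also require the current 6-digit code.
submitted_code = input("Code from your authenticator app: ")
if totp.verify(submitted_code, valid_window=1):  # allow one time-step of clock drift
    print("Second factor accepted")
else:
    print("Second factor rejected")
```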

Why? Let’s do some threat modeling.

Scenario #1: Suppose you’re an admin for Acme Corporation, and Acme just uses a SaaS CIAM solution to host consumer data. Your CIAM solution is collecting names, email addresses, physical addresses for shipping, purchase history, search history, etc. Your CIAM service is adding value by turning this consumer data into targeted marketing, yielding higher revenues. Until one day a competitor comes along, guesses your admin password, and steals all that business intelligence. Corporate espionage is real - the “Outsider Threat” still exists.

Scenario # 2: Same CIAM SaaS background as #1, but let’s say you have many EU customers. You’ve implemented a top-of-the-line CIAM solution to collect informed consent to comply with GDPR. If a hacker steals customer information and publishes it without user consent, will Acme be subject to GDPR fines? Can deploying username/password authentication be considered doing due diligence?

Scenario # 3: Acme uses a cloud-based management console for endpoint security. This SaaS platform doesn’t support 2FA, only username/password authentication. A malicious actor uses KBA to reset your admin password. Now he or she is able to turn off software updates, edit application whitelists, remove entries from URL blacklists, or uninstall/de-provision endpoint agents from your company’s machines. To cover their tracks, they edit the logs. This would make targeted attacks so much easier.

Upgrading to MFA or risk-adaptive authentication would decrease the likelihood of these attacks succeeding, though better authentication is not a panacea. There is more to cybersecurity than authentication. However, the problem lies in the fact that many security vendors allow password-based authentication to their management consoles. In some cases, it is not only the default but also the only method available. Products or services purporting to enhance security or manage identities should require strong authentication.

For Oracle, the Future Is Autonomous

Recently, I attended Oracle OpenWorld in San Francisco. For five days, the company spared no expense to inform, educate and (last but not least) entertain its customers and partners as well as developers, journalists, industry analysts and other visitors – in total, a crowd of over 50 thousand. As a person somewhat involved in organizing IT conferences (on a much smaller scale, of course), I could not but stand in awe thinking about all the challenges the organizers of such an event had to overcome to make it successful and safe.

More important, however, was the almost unexpected thematic twist that dominated the whole conference. As I was preparing for the event, browsing the agenda and the list of exhibitors, I found way too many topics and products quite outside of my area of coverage. Although I do have some database administrator (DBA) experience, my current interests lie squarely within the realm of cybersecurity, and I wasn’t expecting to hear a lot about it. Well, I could not have been more wrong! In the end, cybersecurity was definitely one of the most prominent topics, starting right with Larry Ellison’s opening keynote.

The Autonomous Database, the world’s first database, according to Oracle, that comes with fully automated management, was the first and the biggest announcement. Built upon the latest Oracle Database 18c, this solution promises to completely eliminate human labor and hence human error thanks to complete automation powered by machine learning. This includes automated upgrades and patches, disaster recovery, performance tuning and more. In fact, an autonomous database does not have any controls available for a human administrator – it just works™. Of course, it does not replace all the functions of a DBA: a database specialist can now focus on more interesting, business-related aspects of his job and leave the plumbing maintenance to a machine.

The offer comes with a quite unique SLA that guarantees 99.995% availability without any exceptions. And, thanks to more elastic scalability and optimized performance, “it’s cheaper than AWS” as we were told at least a dozen times during the keynote. For me however, the security implications of this offer are extremely important. Since the database is no longer directly accessible to administrators, this not only dramatically improves its stability and resilience against human errors, but also substantially reduces the potential cyberattack surface and simplifies compliance with data protection regulations. This does not fully eliminate the need for database security solutions, but at least simplifies the task quite a bit without any additional costs.

Needless to say, this announcement has caused quite a stir among database professionals: does it mean that a DBA is now completely replaced by an AI? Should thousands of IT specialists around the world fear for their jobs? Well, the reality is a bit more complicated: the Autonomous Database is not really a product, but a managed service combining the newest improvements in the latest Oracle Database release with the decade-long evolution of various automation technologies, running on the next generation of Oracle Exadata hardware platform supported by the expertise of Oracle’s leading engineers. In short, you can only get all the benefits of this new solution when you become an Oracle Cloud customer.

This is, of course, a logical continuation of Oracle’s ongoing struggle to position itself as a Cloud company. Although the company already has an impressive portfolio of cloud-based enterprise applications and it continues to invest a lot in expanding their SaaS footprint, when it comes to PaaS and IaaS, Oracle still cannot really compete with its rivals that started in this business years earlier. So, instead of trying to beat competitors on their traditional playing fields, Oracle is now focusing on offering unique and innovative solutions that other cloud service providers simply do not have (and in the database market probably never will).

Another security-related announcement was the unveiling of the Oracle Security Monitoring and Analytics – a cloud-based solution that enables detection, investigation and remediation of various security threats across on-premises and cloud assets. Built upon the Oracle Management Cloud platform, this new service is also focusing on solving the skills gap problem in cybersecurity by reducing administration burden and improving efficiency of cybersecurity analysts.

Among other notable announcements are various services based on applied AI technologies like intelligent conversation bots and the newly launched enterprise-focused Blockchain Cloud Service based on the popular Hyperledger Fabric project. These offerings, combined with the latest rapid application development tools unveiled during the event as well, will certainly make the Oracle Cloud Platform more attractive not just for existing Oracle customers, but for newcomers of all sizes – from small startups with innovative ideas to large enterprises struggling to make their transition to the cloud as smooth as possible.

The Power of Utility in the Future of Marketing

Guest post by Christian Goy, Co-founder and Managing Director of Behavioral Science Lab

In the future, marketing will be driven not by demographics, online or offline behavioral identifiers, or psychographics, but by understanding and fulfilling the individual utility expectations of the consumer.

Mitch Joel captures this view of future marketing by concluding, “If the past decade was about developing content and engagement strategies in social channels (in order to provide value, humanize the brand, be present in search engines and more), the next decade will be about the brands that can actually create a level of utility for the consumer.” 

No one disputes that a persuasive marketing message or social media campaign drives web traffic. However, if your brand does not deliver utility, it will not be purchased. Consumers do not love brands because of their brilliant ad campaigns or funny videos on Facebook. Consumers love brands that create utility or true value for themselves; this is what creates affinity between the consumer and the brand, not just the brand attributes. Utility is what consumers believe they cannot live without.

Utility is at the heart of behavioral economics. The utility of each product or service is determined by a very specific set of psychological and economic elements, which shape how the consumer arrives at the expected value (utility) of each brand. Relative differences in the expected utility associated with each choice option determine how much consumers will pay, what they purchase and how loyal they expect to be. Interestingly, we are learning that the economic and psychological factors that determine utility and purchase have little or nothing to do with the buyer’s demographics or psychographics.

In none of our studies did demographic or psychographic segmentation explain why consumers switch or remain loyal to a brand. However, when consumers were typed by their utility expectation for individual brands, our clients were able to predict with extreme accuracy whether the consumer would stay loyal, switch away from their brand, and more importantly, why.

Knowing the expectation of utility explains why Instacart — an app that lets shoppers buy all their groceries online from any grocery store and have them delivered to their doorstep — became an instant hit for a small, but important, percentage of US shoppers.

Old-line marketers assumed that a certain percentage of US shoppers with relatively high household income and education, who were environmentally savvy and attracted to organic produce, would remain loyal Whole Foods or Trader Joe’s shoppers. What they didn’t understand was that the utility for those shoppers was not driven by their demographics or psychographics, but by what they were looking for — convenience, ease of shopping and minimal shopping time, which could not be fulfilled by either Whole Foods or Trader Joe’s.

What is next for marketing?

To remain effective, marketing must move beyond traditional segmentation, psychographics, and message development strategies. Marketers should first understand what drives true utility for their consumers — what consumers value, what they can or cannot live without. As some are already doing, marketers will create personalized messages that maximize individual utility expectations by:

Deeply understanding what drives the expectation of the utility of their products — These are the psychological and economic decision elements used by the buyer to define utility.

Defining buyers by their utility expectation — Group buyers on the basis of a similar utility expectation. This allows marketers to be more cost-efficient and effective in their messaging and product offerings because the specific needs of their customers will be met.

Creating products and services that address consumers’ psychological and economic needs — Do not just focus on the product. Understand how the consumer defines utility, and then deliver on it. In our studies, we have found that by only addressing and fulfilling the primary driver in consumers’ utility “equation,” the likelihood of purchase is very high. Just imagine how much greater the likelihood of purchase could be if the second and third drivers were addressed as well.

Product and service utility are the future of effective marketing. Start today by understanding how your consumers arrive at their utility expectation to stay ahead of the game.

Learn more in my session at the Consumer Identity World from November 27-29, 2017 in Paris.

Cryptography’s Darkest Hour

For anyone working in IT security, this week surely did not start well. Not one, but two major cryptography-related vulnerabilities have been disclosed, and each of them is at least as massive in scale and potential consequences as the notorious Heartbleed incident from 2014.

First, Belgian researcher Mathy Vanhoef from the University of Leuven has published the details of several critical weaknesses discovered in WPA2 – the de-facto standard protocol used for securing modern Wi-Fi networks. By exploiting these weaknesses, an attacker can launch so-called key reinstallation attacks (hence the name KRACK, and we’ve discussed the importance of catchy names for vulnerabilities before) and eventually decrypt any sensitive data transmitted over a supposedly secured wireless channel.

As opposed to Heartbleed, however, the vulnerability is not found in a particular library or product – it’s caused by an ambiguity in the definition of the WPA2 protocol itself, so any operating system or library that implements it correctly is still vulnerable. Thus, all desktop and mobile operating systems are affected by this attack, as well as numerous embedded and IoT devices with built-in Wi-Fi capabilities. Somewhat luckily, this protocol weakness can be fixed in a backwards-compatible manner, so we do not have to urgently switch to WPA3 (and by no means should you switch to WEP or any other even less secure connection method in your wireless network). However, there is no way to mitigate the problem other than patching each client device. Changing the Wi-Fi password, for example, won’t help.

Of course, quite a few vendors have already released updates (including Microsoft), but how long will it take for everyone to apply them? And what about the huge numbers of legacy products that will never be patched? The only way to secure them properly is to disable Wi-Fi and basically repurpose them as expensive paperweights. For desktop and mobile users, using HTTPS-only websites or encrypted VPN tunnels for accessing sensitive resources is recommended, just as for any other untrusted network, wireless or not. In general, we should all slowly get used to the notion of treating every network as untrusted, even our own home Wi-Fi.

The second vulnerability, revealed just recently, is of a different nature but is already classified as even more devastating by many experts. The ROCA (Return of Coppersmith’s Attack) vulnerability is an implementation flaw discovered by an international team of British, Czech and Italian researchers in a cryptographic library used in security chips produced by Infineon Technologies. This flaw essentially means that RSA encryption keys generated using these chips are not cryptographically strong and are much easier to crack.

In theory, this problem should not be as widespread as the KRACK vulnerability, but in reality it affects numerous security products from vendors such as Microsoft, Google, HP and Lenovo, and existing RSA keys dating as far back as 2012 can be vulnerable. Also, since public key cryptography is so widely used in IT – from network encryption to signing application code to digital signatures in eGovernment projects – this opens up a broad range of potential exploits: spreading malware, performing identity theft or bypassing Trusted Platform Modules to run malicious code in secure environments.

What can we do to minimize the damage from this vulnerability? Again, it is first and foremost about checking for available security updates and applying them in a timely manner. Secondly, all potentially affected keys must be replaced (nobody should be using 1024-bit RSA keys in 2017 anyway).
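As a first, very coarse check, undersized keys are easy to spot programmatically. The sketch below uses the Python cryptography package and a hypothetical file name; note that it only flags short moduli and says nothing about the ROCA fingerprint itself, for which the researchers have published a dedicated detection tool.

```python
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

def check_rsa_key_size(pem_bytes: bytes, minimum_bits: int = 2048) -> None:
    """Warn about RSA public keys whose modulus is below a minimum size."""
    key = serialization.load_pem_public_key(pem_bytes)
    if isinstance(key, rsa.RSAPublicKey) and key.key_size < minimum_bits:
        print(f"WEAK: {key.key_size}-bit RSA key, replace it")
    else:
        print("Key size OK (this says nothing about the ROCA fingerprint)")

# 'server_pub.pem' is a placeholder for whatever PEM-encoded public key you audit.
with open("server_pub.pem", "rb") as f:
    check_rsa_key_size(f.read())
```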

And, of course, we always have to be ready for new announcements. The week has only just begun, after all!

The Need for Speed: Why the 72-hour breach notification rule in GDPR is good for industry

The EU’s General Data Protection Regulation (GDPR) will force many changes in technology and processes when it comes into effect in May 2018.  We have heard extensively about how companies and other organizations will have to provide capabilities to:

  • Collect explicit consent for the use of PII per purpose
  • Allow users to revoke previously given consent
  • Allow users to export their data
  • Comply with users’ requests to delete the data you are storing about them
  • Provide an audit trail of consent actions (a minimal sketch of such a record follows this list)
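As a minimal sketch of what such an append-only consent audit record might look like (the field names and identifiers are illustrative assumptions, not a prescribed schema):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ConsentRecord:
    """One consent action; records are appended, never edited, to form the audit trail."""
    subject_id: str                 # pseudonymous identifier of the data subject
    purpose: str                    # the specific purpose this consent covers
    granted: bool                   # True = consent given, False = consent revoked
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    evidence: Optional[str] = None  # e.g. version of the consent dialog shown

audit_trail: List[ConsentRecord] = []
audit_trail.append(ConsentRecord("user-4711", "newsletter", granted=True,
                                 evidence="consent-form-v3"))
audit_trail.append(ConsentRecord("user-4711", "newsletter", granted=False))
```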

Software vendors are preparing, particularly those providing solutions for IAM, CIAM, ERP, CRM, PoS, etc., by building in these features where they are not already available. These are necessary precursors for GDPR compliance. However, end-user organizations have other steps to take, and they should begin now.

GDPR mandates that, within 72 hours of discovering a data breach, the responsible custodian, in many cases the organization’s Data Protection Officer (DPO), must notify the Supervisory Authority (SA). If EU persons’ data is found to have been exfiltrated, those users should also be notified. Organizations must begin preparing now for how to execute notifications: define responsible personnel, draft the notifications, and plan for remediation.

Consider some recent estimated notification intervals for major data breaches in the US:

  • Equifax: 6 weeks to up to 4-5 months
  • Deloitte:  perhaps 6 months
  • SEC: up to 1 year
  • Yahoo: the latest revelations after the Verizon acquisition indicate up to 4 years for complete disclosure

The reasons data custodians need to be quick about breach notifications are very clear and very simple:

  • The sooner victims are notified, the sooner they can begin to remediate risks.  For example, Deloitte’s customers could have begun to assess which of their intellectual property assets were at risk and how to respond earlier.
  • Other affected entities can begin to react.  In the SEC case, the malefactors had plenty of time to misuse the information and manipulate stock prices and markets. 
  • Cleanup costs will be lower for the data custodian.  Selling stocks after breaches are discovered but prior to notification may be illegal in many jurisdictions.
  • It will be better for the data custodian’s reputation in the long run if they quickly disclose and fix the problems.  The erosion of Yahoo’s share price prior to purchase is clear evidence here.

Understandably, executives can be reticent in these matters. But delays give the impression of apathy, incompetence, or even a malicious attempt by executives to hide or cover up such events. Though GDPR is an EU regulation, it applies directly to companies and organizations outside the EU that host data on EU member nations’ citizens. Even for those organizations not subject to GDPR, fast notification of data breaches is highly recommended.

CIAM Vendor Gigya to be Acquired by SAP Hybris

This past weekend we learned that Gigya will be acquired by SAP Hybris. California-based Gigya has been a top vendor in our CIAM Platforms Leadership Compass reports. Gigya offers a pure SaaS CIAM solution and has one of the largest customer bases in the market. SAP’s own identity solution was previously positioned more as an IDaaS for SAP customers and SAP use cases.

What is most interesting is the pairing of Gigya with SAP Hybris. Hybris is SAP’s suite of marketing tools, analytics, and automation. It already has a considerable customer base and a big feature set. Gigya is also very strong in this area, with specialties in leveraging consumer data for personalization and more accurate targeted marketing.

This Gigya – SAP transaction is the latest in an active year of CIAM startups, funding rounds, buyouts, and even an IPO. CIAM is the fastest-growing segment of the identity market, and all the aforementioned activity is evidence that this trend is being recognized and rewarded by investors.

CIAM is an essential component in contemporary business architectures.  Consider the following:

  • EU GDPR – CIAM solutions collect and retain user consent for compliance
  • EU PSD2 – CIAM solutions are significant competitive advantages for banks and financial services providers that implement them; CIAM can also offer strong customer authentication options
  • Retail – consumers will shop elsewhere if the user experience is cumbersome
  • Digital transformation – IoT and SmartHome gadgets are best managed via consumer identities with strong consumer security and privacy protections 

Traditional IAM is also capturing a larger portion of most organizations’ budgets, as the C-Suite begins to understand the importance of cybersecurity and the pivotal function of digital identity within cybersecurity strategies. The lines between IDaaS and CIAM have begun to blur.  Traditional IAM vendors have been modifying and rebranding their solutions to meet the changing needs of consumer identity.  Some CIAM vendors are offering their services as IDaaS.  If SAP chooses to leverage the full CIAM feature set that Gigya has, rather than just integrating the marketing analytics and automation capabilities with Hybris, it will broaden SAP’s reach in the CIAM and IDaaS space.

KuppingerCole will continue to monitor and research the dynamic CIAM market. If CIAM is a trending topic for you, join us at our Consumer Identity World events in Paris (November 28-29) and Singapore (December 13-14).
