There can be many reasons why a company takes an initiative to improve its information security. However, there is one specific reason that repeats itself time and again: "Because the auditors say so, we have to..."
The reality and the resulting logic have so far often been as follows: The enforcement of regulatory or legal requirements includes sanctions for non-compliance. These had to be avoided. This led to a check-list approach for regulatory compliance. If this was done with the absolute minimum possible cost and effort in order to avoid non-compliance and thus the fine, the "most advantageous" approach for the company was found. This could not and cannot be regarded as a well-thought-out strategic view of governance and compliance.
But over time the requirements change, they become more and more specific. The latest example from the insurance industry is the document "Versicherungsaufsichtliche Anforderungen an die IT" (VAIT), which was finalised in July 2018 and published by BaFin (Bundesanstalt für Finanzdienstleistungsaufsicht - German Federal Financial Supervisory Authority), providing insurance companies with more tangible requirements for the implementation of their business processes using IT.
The similarity of the names to BAIT and thus to the banking supervisory requirements for IT is by no means a coincidence: both documents originate from BaFin and also have strong parallels in terms of content. Thus, both documents represent challenges that must be met appropriately, transparently and well-documented by the affected companies. And since these are only refinements, they are valid immediately, because the underlying regulations originally to be refined are already valid.
However, it is not only external requirements that are changing. Companies also understand that IT today is a central component of their core business - or is their core business. Backup, contingency management, security, audit and governance are therefore increasingly becoming requirements demanded by a growing number of internal stakeholders to maintain and improve the business basis. IT risk management produces meaningful key figures such as "key risk indicators" that lead to clear guidelines on possible downtimes and restart times, but also to statements on SoD, privilege management, assignment of rights and access governance.
It is also clear that, with the slightly earlier publication of the BAIT, banks may have a certain head start in implementing effective measures. Conversely, it can be very useful for insurance companies to benefit, directly or through consolidated best practices, from the experience of a closely related industry.
Proactive companies that demonstrably have to meet a large number of requirements (both external and internal) through policies, controls, documentation and reporting will want to cover VAIT as part of an efficient "Control once, comply to many" strategy. And with the much more specific (but still interpretable) requirements of VAIT, some insurance companies will have a concrete need for action, be it the analysis of a reliable status quo or the identification and execution of concrete implementation projects.
Put as a challenge: The VAIT are openly published on the Internet and available to everyone, with an English version expected soon. Truly proactive CISOs in companies beyond the financial sector will take them as a starting point and as a benchmark for the quality of their own security and compliance: not because of concrete regulatory requirements, but to secure their own company.
The primary factor that most organizations consider when choosing a cloud service is how well the service meets their functional needs. However, this must be balanced against the non-functional aspects such as compliance, security and manageability. These aspects are increasingly becoming a challenge in the hybrid multi-cloud IT environment found in most organizations. This point was emphasized by Virtustream during their briefing in London on September 6th, 2018.
Virtustream was founded in 2009 with a focus on providing cloud services for mission-critical applications like SAP. In order to achieve this, Virtustream developed its xStream cloud management platform to meet the requirements of complex production applications in the private, public and hybrid cloud. This uses patented xStream cloud resource management technology (μVM) to deliver assured SLA levels for business-critical applications and services. Through a series of acquisitions, Virtustream is now a Dell Technologies business.
The hybrid multi-cloud IT environment has made the challenges of governance, compliance and security even more complex. There is no single complete solution currently on the market to this problem.
Typically, organizations use multiple cloud services including office productivity tools from one CSP (Cloud Service Provider), a CRM system from another CSP, and a test and development service from yet another one. At the same time, legacy applications and business critical data may be retained on-premises or in managed hosting. This hybrid multi-cloud environment creates significant challenges relating to the governance, management, security and compliance of the whole system.
What is needed is a consistent approach with common processes supported by a single platform that provides all the necessary functions across all the various components involved in delivering all the services.
Most CSPs offer their own proprietary management portal, which may in some cases extend to cover on-premises deployments. This makes it important, when choosing a cloud service, to evaluate how the needs for management, security and compliance will be integrated with the existing processes and components that make up the enterprise IT architecture. The hybrid IT service model requires an overall IT governance approach as described in KuppingerCole Advisory Note: Security Organization Governance and the Cloud - 72564.
An added complexity is that the division of responsibility for the different layers of the service depends upon how the service is delivered. There are five layers:
- The lowest layer is the physical service infrastructure, which includes the data center, the physical network, the physical servers and the storage devices. In the case of IaaS, this is the responsibility of the CSP.
- Above this sit the operating systems, basic storage services and the logical network. For IaaS, the management of this layer is the responsibility of the customer.
- The next plane includes the tools and middleware needed to build and deploy business applications. For PaaS (Platform as a Service) these are the responsibility of the CSP.
- Above the middleware are the business applications and for SaaS (Software as a Service) these are the responsibility of the CSP.
- The highest plane is the governance of business data and control of access to the data and applications. This is always the responsibility of the customer.
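The shared-responsibility split across these five layers can be captured in a small lookup table. The following is an illustrative sketch (the layer names and the Python layout are ours, not part of any standard), but it shows how the responsible party flips from CSP to customer depending on the delivery model:

```python
# Illustrative mapping of the five service layers to the responsible
# party under each delivery model, as described in the list above.
MODELS = ("IaaS", "PaaS", "SaaS")

RESPONSIBILITY = {
    # layer                         IaaS         PaaS         SaaS
    "physical infrastructure":    ("CSP",       "CSP",       "CSP"),
    "OS, storage, network":       ("customer",  "CSP",       "CSP"),
    "middleware and tooling":     ("customer",  "CSP",       "CSP"),
    "business applications":      ("customer",  "customer",  "CSP"),
    "data and access governance": ("customer",  "customer",  "customer"),
}

def responsible_party(layer: str, model: str) -> str:
    """Who manages a given layer under a given delivery model?"""
    return RESPONSIBILITY[layer][MODELS.index(model)]

# The highest layer is always the customer's responsibility:
assert responsible_party("data and access governance", "SaaS") == "customer"
# Under PaaS, the CSP's responsibility ends below the application layer:
assert responsible_party("business applications", "PaaS") == "customer"
```

Whatever management platform an organization chooses, it needs to cover at least the layers on the customer side of this table for every service model in use.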
An ideal solution would be a common management platform that covers all the cloud and on-premises services and components. However, most cloud services only offer a proprietary management portal that covers the management of their service.
So, does Virtustream provide a solution that completely meets these requirements? The answer is: Not yet. However, there are two important points in its favour:
- Firstly, Virtustream has highlighted that the problem exists. Acceptance is the first step on the road to providing a solution.
- Secondly, Virtustream is part of Dell, and Dell also owns VMware. VMware provides a solution to this problem, but only where VMware is used across the different IT service delivery models. VMware is used by Virtustream and is also supported by several other CSPs.
In conclusion, the hybrid multi-cloud environment presents a complex management challenge, particularly in the areas of security and compliance. There are five layers, each with six dimensions, that need to be managed, and the responsibilities are shared between the CSP and the customer. It is vital that organizations consider this when selecting cloud services and that they implement a governance-based approach. Look for the emergence of tools to help with this challenge. There was a workshop on this subject at KuppingerCole’s EIC earlier this year.
Guest Author: Vinny Lingham, CEO, Civic Technologies
Bitcoin. Blockchain. Crypto. Decentralization. Tokens. A lot of buzzwords have emerged alongside the rise of blockchain technology. Yet, there is often a lack of context about what those terms actually mean and the impact they will have.
Decentralized identity re-envisions the way people access, control, and share their personal information. It gives people power back over their identity.
Current identity challenges all tie back to the way we collect and store data. The world has evolved from floppy disks to the Cloud, but now, every single time that data is collected, processed, or stored, security and privacy concerns emerge. With the rise of the digital economy, consumers have unintentionally turned banks, governments, and stores into identity management organizations, responsible for the storage and protection of an unprecedented amount of personal data. Unfortunately, as recent hacks have shown, not all of them were ready to deal with this new role.
Decentralized identity puts that power and responsibility back in the hands of the individual, giving them the ability to control and protect their own personal information. This concept is made possible by the decentralized nature of blockchain and the trust created by consensus algorithms.
How Blockchain Creates Trust
The most prominent blockchain application to date is Bitcoin, a technology that emerged following the U.S. financial crisis of 2008 when trust in institutions was at an all-time low. Blockchain technology, specifically the public blockchain, has several unique characteristics that solve problems of trust and make it a great fit for identity solutions.
First, blockchain is immutable, or unchangeable. Blockchain transactions are processed by a network. Computers work together to confirm a transaction, and every computer in the network must eventually confirm every transaction in the chain. These transactions are processed in blocks, and each block is linked to the preceding block. This structure makes it reasonably impossible to go back and alter a transaction. Additionally, blockchain is transparent. Every computer in the network has a record of every transaction that occurred.
Decentralization is the essence of blockchain: no one party controls the data, so there is no single point of failure and no one who can override a transaction. And this is how blockchain builds trust: when data cannot be modified and is independently verifiable, it can be trusted.
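The immutability property described above can be sketched in a few lines: each block embeds the hash of its predecessor, so altering any earlier transaction invalidates every hash that follows. This is a minimal illustration only; a real blockchain adds consensus, signatures, and Merkle trees on top:

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    # Each block's hash covers both its own data and the previous hash.
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(transactions):
    chain = []
    prev = "0" * 64  # genesis
    for tx in transactions:
        h = block_hash(prev, tx)
        chain.append({"data": tx, "prev": prev, "hash": h})
        prev = h
    return chain

def verify_chain(chain) -> bool:
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != block_hash(prev, block["data"]):
            return False
        prev = block["hash"]
    return True

chain = build_chain(["alice->bob:5", "bob->carol:2"])
assert verify_chain(chain)
chain[0]["data"] = "alice->bob:500"  # tamper with an early block...
assert not verify_chain(chain)       # ...and verification fails
```

Because every participant holds a copy of the chain and can rerun this verification independently, a tampered history is detected without trusting any single party.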
How Blockchain Helps Decentralized Identity
Currently, there is a presumption that knowledge of information is identity. If a person knows a social security number or password, they are presumed to be the person that information represents. And if a person knows your personal information, they can impersonate you.
Using blockchain technology to decentralize identity is about digital validation and keys: for example, a digital wallet holding cryptographic keys that cannot be recreated. Validating an identity then requires physical access to a device. With a decentralized identity system, a remote hacker might have access to pieces of personal information, but proving an actual identity would require physical possession of that person’s device. Decentralized identity literally puts the power back in the hands of the people.
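A minimal sketch of the possession-based idea above: the verifier issues a fresh challenge, and only the device holding the enrolled secret can answer it. For simplicity this sketch uses a shared-secret HMAC (as in TOTP-style schemes); a real decentralized-identity wallet would use an asymmetric key pair, so the verifier never holds the secret at all:

```python
import hashlib
import hmac
import secrets

# The secret lives only on the user's device; knowledge of static
# personal data (SSN, password) is not enough to answer a challenge.
device_secret = secrets.token_bytes(32)

def device_respond(secret: bytes, challenge: bytes) -> str:
    # Computed on the device; the secret itself is never transmitted.
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def verifier_check(enrolled_secret: bytes, challenge: bytes, response: str) -> bool:
    expected = hmac.new(enrolled_secret, challenge, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

challenge = secrets.token_bytes(16)  # fresh per attempt, so responses cannot be replayed
response = device_respond(device_secret, challenge)
assert verifier_check(device_secret, challenge, response)

# A remote attacker without the device cannot produce a valid response:
attacker_response = device_respond(secrets.token_bytes(32), challenge)
assert not verifier_check(device_secret, challenge, attacker_response)
```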
Why It Matters
In 2017, the Equifax breach became one of the worst data breaches in corporate history, exposing the personal information of over 147 million people, including Social Security numbers, dates of birth, home addresses, driver’s license numbers, and credit card numbers.
In 2018, the Cambridge Analytica scandal over user data misuse has continued to unfold, as the F.B.I. and the Justice Department investigate Facebook for failing to safeguard 87 million user profiles.
Equifax and Cambridge Analytica are two prime examples of how current systems for sharing and storing personal information have proven to be not as safe, secure, or trustworthy as previously thought.
And everyone feels this impact.
Governments are implementing more stringent laws and regulations for consumer protection. In May, the General Data Protection Regulation (GDPR), a standard for data collection and storage, went into effect. In July, California passed the California Consumer Privacy Act, enacting similar standards. And these are probably just the first in a wave of consumer protection and privacy policies that will come to life.
Consumers are concerned as well. In a recent Deloitte study, 81 percent of U.S. respondents feel they have lost control over the way their personal data are collected and used.
The ability to prove you are who you say you are is critical to engaging with the world and being a part of the economy. Decentralized identity gives that control back to people.
Get to know more about Blockchain and listen to my Keynote "Practical Examples of Decentralized ID's in the Real World" at the Consumer Identity World USA in Seattle in September.
Entrust Datacard, founded in 1969 and headquartered in Minnesota, announced today that it is making a strategic investment in CensorNet and acquiring the SMS Passcode business from CensorNet (originally a Danish company). Entrust Datacard is a strong brand in IAM, with card and certificate issuance, and financial and government sector business.
CensorNet was founded in 2007 in the UK. Their original product was a secure web gateway. It now includes multi-mode in-line and API-based CASB service. It also has an email security service, which utilizes machine learning algorithms to scan email looking for potential malicious payloads. Entrust Datacard already has substantial capabilities in the adaptive and multi-factor authentication areas, and the SMS Passcode product line will add to that. With this investment and acquisition, Entrust Datacard plans to move beyond digital transformation to realize continuous authentication and enhance its e-government offerings.
The results of the acquisition will be reflected in product roadmaps, likely starting in 2019. Entrust Datacard products and services will continue to handle initial authentication, and CensorNet’s capabilities will add user activity monitoring through the CASB piece. The integration of identity-linked event data from CensorNet CASB will help security analysts to know, for example, which files users are moving around, and who and what users are emailing. This functionality will help administrators reduce the possibility of fraud and data loss.
Broadcom, after having been blocked by the Trump administration from acquiring Qualcomm earlier this year on national security grounds, has decided to acquire CA Technologies, marking one of the greatest shifts in acquisition strategy from a semiconductor business to an IT software and solutions business. The proposed Qualcomm acquisition by the then Singapore-based Broadcom carried the likelihood of several 5G patents passing beyond US control.
The CA Technologies acquisition still puts over 1,200 patents and mission-critical software deployments at US government sites into the hands of Broadcom, and yet appears to be getting a green light from the Trump administration. Defying acquisition basics, with absolutely no or very little commercial synergy, Broadcom’s objective to acquire ‘established mission-critical technology businesses’ is fully satisfied by this move, which could be considered one of the most ambitious acquisitions of this size and scale in recent times. Not to forget Intel’s acquisition of McAfee, which didn’t work well for the company due to the little synergy between McAfee’s endpoint protection business and Intel’s core hardware strategy, and finally resulted in a divestment of McAfee after seven years of a rough marriage.
CA Technologies itself is built on a series of smaller acquisitions in almost every segment of IT software – ranging from IT operations management, application performance, mainframes, DevOps, IT security and automation to analytics. CA Technologies has, however, had a good overall success rate in driving product and roadmap integrations to achieve the expected synergies from its past acquisitions. Broadcom must consider using some of the CA management’s expertise, gathered over more than a decade, to drive this acquisition towards a successful business integration. There is no similar business unit at Broadcom that delivers IT software or services, which should make it even easier for CA Technologies to continue operating under the larger umbrella without the need for any immediate shift in operating strategy.
The dissimilarity of the businesses and customer bases will offer only limited cross-sell opportunities from this acquisition in the short to mid-term. However, CA Technologies’ recurring profitable bookings are guaranteed to bring stability through increased future cash flow for Broadcom in the short term, accommodating the expected fluctuations in its business due to the uncertainties arising from the recent (though still proposed and under review) US trade tariffs against semiconductor goods manufactured in China.
Besides mainframes, which remain the majority revenue stream, and some other areas such as IT project & portfolio management, CA Technologies has invested significantly in building its IT security portfolio over the last decade, starting with Netegrity, then IDFocus, Eurekify, Arcot, Layer 7, Xceedium, IdMLogic and Veracode – all within the Identity and Access Management (IAM) domain alone. CA’s aggressive acquisition strategy has kept innovation outside the company’s doors for a long time, and now, with Broadcom’s acquisition of CA Technologies, there is little hope that innovation will be the key to revenue generation for the new entity anytime in the near future. With numerous acquisitions, CA’s Identity and Access Management portfolio has taken a bumpy ride over the past decade, but despite all the challenges and long-term ramifications, its excellent IAM product and engineering teams have ensured a seamless absorption of acquired products into its IAM and broader security software portfolio.
While the uncertainties will continue to loom over its acquisition objectives and alignment of synergies for some more time, it will be interesting to see how Broadcom decides to nurture CA’s enterprise software and services business and where that would lead its still very well-positioned IAM product line.
BOMGAR, owned by PE firm Francisco Partners, has recently announced that it has acquired Avecto, a UK-based Endpoint Privilege Management (EPM) company. The move, coming within six months of Lieberman Software’s acquisition by BOMGAR, clearly reflects its quest to strengthen its position in the PAM market by offering a full-featured PAM suite.
Originally a provider of remote support solutions, BOMGAR had offered remote session management capabilities in the market for a while until it acquired Argentina-based Pitbull Software in late 2015, entering the PAM market with its password management technology. Since then, BOMGAR has been on an acquisition spree to expand its portfolio of PAM technologies to compete more effectively against the market leaders.
Avecto has been a market leader in the niche market of Endpoint Privilege Management (EPM). Its flagship product, Avecto Defendpoint, offers capabilities to manage the threats associated with local administrative rights on Windows and Mac endpoints by providing controlled and monitored escalation of admin privileges. Avecto Defendpoint also offers effective application whitelisting and sandboxing capabilities for enhanced endpoint protection, which has positioned it uniquely in the market with almost twice the number of managed endpoints as its closest competitor. For a couple of years before acquiring Viewfinity in late 2015, CyberArk embedded Defendpoint, licensed through an OEM agreement with Avecto, to sell a more complete PAM solution to its customers and compete against the then leading EPM product, BeyondTrust PB for Windows.
Endpoint Privilege Management (EPM) has become one of the fastest growing sub-segments of the PAM market, closing in on approximately 28% YoY growth. With EPM capabilities, PAM solutions are poised to offer an effective second line of defense for endpoint threat protection in the coming years. The increased demand for better EPM capabilities embedded in PAM solutions has led many market-leading vendors to acquire or develop their own EPM capabilities in the recent past. CyberArk, for example, acquired Viewfinity and Thycotic acquired Arellia in recent years to bring EPM capabilities into their PAM portfolios.
At KuppingerCole, we define EPM solutions to primarily offer three distinct technologies:
- Application Control: This allows organizations to control which applications are allowed to run on an endpoint. This is usually achieved through application whitelisting, in which only known good applications are placed on the pre-approved list and allowed to run. Application control provides effective protection against shadow IT challenges for most organizations.
- Sandboxing: This technology isolates the execution of unknown applications or programs by restricting the resources they can access (e.g., files, registries). This technology, also known as application isolation, provides effective protection against cyberattacks by confining the execution of malicious programs and limiting their means to cause harm.
- Privilege Management: This technology encompasses user and application privilege management. User privilege management deals with controlled and monitored elevation to local admin privileges. Application privilege management deals with exception- or policy-based elevation of administrative rights for known and approved applications to execute successfully.
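The first and third of these technologies can be sketched as a single policy decision. The application names and policy structure below are invented for illustration; real EPM products match on file hashes, signatures and publishers rather than bare names:

```python
# Toy sketch of two EPM controls described above: application
# whitelisting (application control) and policy-based elevation
# of admin rights (privilege management).
WHITELIST = {"winword.exe", "excel.exe", "msiexec.exe"}
ELEVATION_POLICY = {"msiexec.exe"}  # approved apps that may run elevated

def decide(app: str, wants_admin: bool) -> str:
    if app not in WHITELIST:
        return "block"                    # application control
    if wants_admin and app in ELEVATION_POLICY:
        return "run-elevated"             # policy-based elevation
    if wants_admin:
        return "run-as-standard-user"     # admin rights stripped
    return "run"

assert decide("dropbox.exe", False) == "block"
assert decide("msiexec.exe", True) == "run-elevated"
assert decide("winword.exe", True) == "run-as-standard-user"
```

The point of the model is that users never hold standing local admin rights; elevation happens per application, under policy, and is logged.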
Avecto Defendpoint offers a good mix of these EPM technologies to provide effective endpoint protection against a range of cyber threats. The acquisition of these EPM capabilities makes a natural fit for BOMGAR, offering great cross-sell opportunities in the short to mid-term. While their integration under a common PAM platform should begin soon, no immediate changes are expected to either product line. In the short term, Avecto will continue to operate under the rebranded entity ‘Avecto, a BOMGAR company’ until it is fully integrated into the BOMGAR organization through the remainder of 2018.
BOMGAR’s approach of obtaining additional PAM capabilities through acquisitions is expected to bring rapid growth and deliver quick synergies, but is also accompanied by the risks of integration failures and the long-term effects of dampened organic growth. No doubt the Lieberman Software and now Avecto acquisitions place BOMGAR on the list of the top 5 PAM vendors by revenue, but not necessarily on the list of market leaders for technology innovation. As the PAM market continues to evolve, consolidation is inevitable; however, a stronger vendor focus on completeness of features rather than on innovation can stifle healthy market growth by failing to deliver on the opportunities that innovation creates.
While a clear integration roadmap for Lieberman Software is still awaited, the acquisition of Avecto adds to the growing pipeline for the product and engineering teams tasked with developing an integrated PAM platform and realizing the essence of these acquisitions. With a good track record of delivering growth and profitability as well as driving operational excellence, we expect BOMGAR to steer clear of any such challenges in the short to mid-term by delivering on the actual synergies created by these acquisitions.
Cybersecurity needs to be at the heart of the digital transformation, but organisational models will have to evolve
Cybersecurity is in the process of becoming an essential component of any organisation’s digital transformation journey. There is no way around this, especially as policymakers start dipping their toes into privacy and security issues, and societal norms are shifting on the topic.
Most new technology layers enabling the digital transformation need to be protected from interference, intrusion, or corruption. This is especially the case across industry sectors seeking to take advantage of the enormous opportunities offered by new technologies: with driverless vehicles, for instance, the logistics sector – amongst others – could be unrecognizable in ten years’ time.
New technologies will also generate and feed on massive amounts of data - most of it sensitive or private - that will need to be collected, processed, and safeguarded in a way that is both sensible and ethical. The concepts of security by design and of privacy by design will inevitably become any organisation’s best allies in its innovative endeavours and must be taken seriously by all digital transformation players, especially as the regulatory and social contexts become harder to navigate.
There is no doubt – in our opinion – that organisations which put information security and privacy at the heart of their digital transformation from the start could obtain a real competitive advantage in the mid-to-long run.
As a matter of fact, the recent entry into force of the General Data Protection Regulation (GDPR) in the EU is dramatically changing the incentives landscape for all businesses active in Europe. In addition to fines of up to 4% of global turnover, firms are now required to report any relevant data breach to the regulator within 72 hours. This requires detection, analysis and reaction capabilities which go far beyond the scope of the security teams and will force many corporate stakeholders to work together on these matters (security, IT, legal, DPO teams, senior management, etc.). As such, the GDPR could be a painful lesson as to why cybersecurity is necessarily a transversal matter for organisations of all sizes.
Finally, and perhaps most importantly, respect for privacy and the protection of personal data is likely to become a true competitive advantage as our societies become increasingly wary of these issues.
This shift is well illustrated by the first complaints filed under the GDPR framework. Privacy activists such as Max Schrems or the French Quadrature du Net, for example, have already started to drag high-profile tech companies (Facebook, Google, Instagram, etc.) into what could become lengthy legal proceedings. Depending on how the regulators react, this could have deep implications for how data-driven businesses are to operate in Europe.
Increasingly, security and privacy become intertwined, but it makes little sense from a corporate governance perspective to allow a new privacy organisation under a DPO to grow in parallel to – or in conflict with – existing security structures. Synergies are obvious and need to be leveraged, and where security practices are deemed dysfunctional or in need of improvement, this could provide an ideal opportunity.
In fact, it could be the start of a major evolution around corporate perceptions of security and privacy, from burden, annoyance and costs, towards becoming central management functions. But organisational models will have to evolve as a result to accommodate the truly transversal nature of security and privacy matters and carve out a niche for those new corporate functions.
At this juncture, the traditional role of the CISO – heavily influenced by a technical bias, tactically oriented and project-driven in many firms – could become exposed.
Not in its functional existence – IT security is more essential than ever – but in its corporate prominence. Having failed to project their roles beyond the tactical and technical fields for the best part of the last decade, many CISOs could find themselves pushed down the organisation while CSO and DPO roles take centre stage at the top.
With those new roles should come new people and a new focus, and probably a different way to approach security matters and talk about them.
We could be at the start of an exciting decade for all security professionals.
Learn more about this topic in my session at the Cybersecurity Leadership Summit 2018 Europe, November 12-14, 2018 in Berlin.
*** Please note this is a guest blog post and does not necessarily represent the opinion of KuppingerCole ***
On June 15th, 2018 I attended an OIX Workshop in London on this subject. The workshop was led by Don Thibeau of the Open Identity Exchange and Distributed Ledger foundation and was held in the Chartered Accountants’ Hall, Moorgate Place, London.
Blockchain and Distributed Ledger Technology (DLT) is often associated with crypto-currencies like Bitcoin. However, it has a much wider applicability and holds the potential to solve a wide range of challenges. When a technology is evolving, its governance is often neglected until incidents force the involved parties and regulators to define operating guidelines. Governance is a wide subject, covering markets, laws and regulations, corporate activities as well as individual projects; the workshop covered many of these areas.
One question that often arises while evaluating or adopting a new technology is whether the existing legal framework is sufficient to protect your interests. According to the technology lawyer Hans Graux (time.lex), existing EU legislation on electronic signatures works well for blockchain. However, where blockchain is sold as a technology, there is no guarantee of governance to back it up. EU law allows the prohibition of electronic contracts for certain forms of transaction (e.g. real estate), so there are regional variations to the applicability of blockchain within the EU. Some countries have created laws but, in his opinion, these are intended to show that these countries are open for business rather than because they are needed. He recommended that organizations take a risk-based approach, similar to that for GDPR, to gauge their readiness for blockchain and document the risks arising from an early adoption of blockchain as well as the controls required to manage these risks.
There was a panel on Smart Contracts and the legal framework surrounding them. A key takeaway from the panel was that Smart Contracts are not deemed legal contracts, which raises the question of how they can be made legally enforceable. Tony Lai (CodeX & Legal.io) outlined the Accord project from the CodeX Stanford Blockchain Group. The initial focus of this group is in the areas of:
1. Regulatory frameworks and ethical standards around token generation events (also known as ICOs or Initial Coin Offerings);
2. Legal issues and opportunities presented by blockchain technologies and their intersection with existing legal frameworks;
3. Smart contracts and governance design for token ecosystems; and
4. Legal empowerment and legal services use cases for blockchain technologies.
The panel then discussed the ‘Pillars of Trust’ – Governance, Identity, Security and Privacy in DLT. During this panel Geoff Goodell (UCL) provided an interesting set of perspectives, including the need for people to have multiple identities. He described how electronic funds transfer systems provide the best surveillance network in the world, and stated that it is only now becoming clear what risks are associated with linking people’s activities. To ensure privacy, only the minimum information needed should be required to be disclosed. Systems need to be accountable to their users. DLTs are not immutable – the people in control can decide to make changes (for example a code fork) in a way that is unaccountable. Peter Howes then discussed the evidentiary value of IoT data – he expressed the view that blockchain will not obviate disputes but will reduce the number of areas for dispute.
During the afternoon some Real-World Use-Cases for blockchain and DLT were discussed:
Laura Bailey (Qadre & British Blockchain Association) described how Qadre has developed their own blockchain system “PL^G” and how this is being prototyped for pharmaceutical anti-counterfeiting in support of the EU Falsified Medicines Directive.
Jason Blick (EQI Trade) described how they are aiming to launch the world’s first offshore bank that bridges fiat and cryptocurrencies using blockchain technologies. He announced that they will shortly launch a blockchain-based KYC system, EQI Check.
Brian Spector (Qredo) described a Distributed Ledger Payments Platform for the telecoms industry. This could not use proof of work because of the compute overhead; instead, they will use a “proof of speed” consensus algorithm.
KuppingerCole is actively researching blockchain and DLT, including its applications to identity, privacy and security. Recently, at the EIC (European Identity & Cloud Conference) in Munich, there were several workshops and sessions devoted to practical implementations of blockchain. In the opening keynote at EIC, Martin Kuppinger described the areas where blockchain technology has the potential to help solve real-world identity challenges. There are already so many KYC (Know Your Customer) use cases based on blockchain with valid business models that this is now a reality, or at least close to becoming one. Blockchain also has the potential to simplify authentication by having various authenticators and IDs associated with a wallet. Its application to authorization, privacy and smart contracts also has obvious potential.
However, a practical realization of these potentials requires trustworthiness, which takes us back to the question of governance. Good governance remains vital to address the inherent challenges of DLT and to ensure that they are not exacerbated in blockchain implementations by a lack of governing principles.
Guest Author: Jordan L. Fischer, Esq., Co-Founder & Managing Partner of XPAN Law Group, LLC
Technology is changing rapidly, resulting in an increasing amount of data collected every second. These technologies cross borders and allow businesses to operate on a global scale, at a rate never before seen. However, the corresponding legal infrastructures operate with borders -- hard borders -- that make the exchange of data, both internally and externally, complicated and challenging.
In the last two years, new data protection regulations have gone into effect in a number of different regions: Japan, China, Australia, and most recently (and with the largest “bang”), the European Union. Each of these regulations imposes nuanced requirements on companies, often asserting data localization requirements, implementing the principle of transparency, and including consent initiatives when these organizations collect and process data. Most importantly, companies need to be proactively aware of the implications of the technology they use and the data they collect, depending on the regions in which they operate.
This changing legal landscape is nowhere more apparent than in the European Union (EU), with the General Data Protection Regulation (GDPR). The GDPR imposes a number of proactive privacy measures on entities, both within the EU and outside of the EU, that are poised to drastically change the way businesses maintain and exchange data from within the EU. At its core, the GDPR asserts data privacy and security principles on companies. The GDPR does not discriminate depending on the industry or the size of the organization. It universally and equally requires data minimization, data localization, transparency, and accountability by all organizations. The GDPR empowers data subjects to take control of the data collected by companies about them, and to require those companies to account for all processing of that data, and for all third parties who have access to that data.
The “GDPR model” is becoming the de facto standard. Canadian data protection laws are changing this fall, bringing them more in line with the GDPR. Even individual U.S. states are moving towards providing data protections similar to the GDPR: California is in the midst of a debate over how much control to give data subjects regarding their data. What started as a potential ballot measure to be included in the fall elections has now become a bill in the California state legislature, and appears to provide data protections similar to many of these international regulations.
These varying principles of data privacy and cybersecurity converge when organizations exchange, transfer and process sensitive information across borders and, as such, implicate a number of different regulations. Take for example the growing prevalence of cloud storage, with companies opting to store data and systems off premise, in a data center located in a specific location, or in multiple data centers. Either option directly correlates with a legal obligation and potential ramifications for regulatory compliance and contractual agreements.
When addressing cross-border data management, companies should take key steps in order to better understand any legal obligations or liabilities, before an issue arises. The first step is knowledge: What data is collected? What is done with that data? Where is that data stored? These regulations increase the power of the data subject, which dovetails into a burden on companies to provide the necessary transparency, both prior to and after the collection of data. In order to provide accurate information to meet these obligations, companies need to know, before collecting the data, what they intend to do with that data.
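The “knowledge” step above is often implemented as a data inventory: a record, per category of data, of what is collected, why, and where it lives. The following is a minimal sketch of such an inventory in Python; all field names, categories, and locations are invented for illustration and do not come from any regulation’s template.

```python
from dataclasses import dataclass

@dataclass
class ProcessingRecord:
    """One entry in a data inventory (illustrative fields only)."""
    data_category: str     # what data is collected, e.g. "email address"
    purpose: str           # what is done with that data
    storage_location: str  # where it is stored; region matters for localization rules
    legal_basis: str       # e.g. "consent" or "contract"

inventory = [
    ProcessingRecord("email address", "newsletter delivery", "EU (Frankfurt)", "consent"),
    ProcessingRecord("purchase history", "order fulfilment", "US (Virginia)", "contract"),
]

# A simple transparency query: which data categories are held outside the EU?
non_eu = [r for r in inventory if not r.storage_location.startswith("EU")]
for r in non_eu:
    print(f"{r.data_category}: stored in {r.storage_location} for {r.purpose}")
```

Even a toy structure like this makes the later questions (who has access, how long is it kept) answerable per record rather than per guess.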
Second, a company needs to know who has access to that data. This means both internal access -- a company’s own employees -- and external access -- third parties or partners. Understanding who is involved in a “data transaction” is key to ensuring security along that entire chain and providing the necessary transparency to the data subject. The use of processors and sub-processors is common -- but companies need to ensure that each party involved understands its obligations and adequately protects and secures the data.
Third, a company needs to understand the data lifecycle: How long is the data needed? What happens when we no longer need the data? Data storage is expensive, especially if additional security measures are needed such as encryption or redundancy. Often, companies are not even aware of all of the “old” data that they maintain -- data that is no longer useful but remains a liability in the event of a breach. Creating “house cleaning” policies (i.e. data destruction and retention policies) is key to decreasing costs and potential legal ramifications.
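A “house cleaning” policy of the kind described often reduces to one rule per data category: a maximum retention period, after which records are destroyed. Here is a minimal sketch of that idea; the category names and retention periods are invented for illustration, not taken from any statute.

```python
from datetime import date, timedelta

# Illustrative retention periods per data category, in days.
RETENTION_DAYS = {
    "marketing_contacts": 365,
    "server_logs": 90,
    "invoices": 365 * 10,  # financial records are often kept longer for tax/audit reasons
}

def is_expired(category: str, collected_on: date, today: date) -> bool:
    """True if a record has outlived its retention period and should be destroyed."""
    return today - collected_on > timedelta(days=RETENTION_DAYS[category])

today = date(2018, 9, 1)
# A four-month-old server log has exceeded its 90-day retention period...
assert is_expired("server_logs", date(2018, 5, 1), today)
# ...while a 2015 invoice is still well within its ten-year period.
assert not is_expired("invoices", date(2015, 1, 1), today)
```

Running a check like this on a regular schedule turns retention from an aspiration in a policy document into an enforceable, auditable process.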
Ultimately, companies need to understand this convergence of domestic and international data obligations and its effect on creating efficient and secure data management practices that meet the needs of the business. Technology and data are like a spiderweb within an organization -- they impact a number of different business units and require a holistic approach. Taking key steps early on in the data collection process can drastically minimize long-term costs and liabilities.
Learn more about this topic in my session at the Consumer Identity World September 19-21, 2018 in Seattle.
* * * * *
Nothing contained in this blog post should be construed as creating an attorney-client relationship or providing legal advice of any kind. If you have a legal issue regarding cybersecurity, domestic or international data privacy, or electronic discovery, you should consult a licensed attorney in your jurisdiction.
Register now for KuppingerCole Select and get your free 30-day access to a great selection of KuppingerCole research materials and to live trainings.
Companies continue spending millions of dollars on their cybersecurity. With an increasing complexity and variety of cyber-attacks, it is important for CISOs to set correct defense priorities and be aware of state-of-the-art cybersecurity mechanisms. [...]