Blog posts by Mike Small

UK Open Banking – Progress and Challenges

On January 13th, 2018, a new set of rules for banking came into force that open up the market by allowing new companies to offer electronic payment services. These rules follow from the EU Directive 2015/2366 of 25 November 2015, commonly referred to as the Payment Services Directive II (PSD II). They promise innovation that some believed the large banks in the UK would otherwise fail to provide. However, as well as providing opportunities, they also introduce new risks. Nevertheless, it is good to see the progress that has been made in the UK towards implementing this directive.

Under this new regime, banks, building societies, credit card issuers, e-money institutions and others (known as Account Servicing Payment Service Providers, or ASPSPs) must provide electronic interfaces (APIs) that allow third parties (Payment Service Providers, or PSPs) to operate an account on behalf of its owner. This opens up the banking system to organizations that can provide better ways of making payments, for example through new and better user interfaces (apps), as well as completely new services that could depend upon an analysis of how you spend your money. These new organizations do not need to run a complete banking service with all that that entails; they just need to provide additional services that are sufficiently attractive to pay their way.

This introduces security challenges by increasing the potential attack surface and, according to some, may introduce conflicts with GDPR privacy obligations. It is therefore essential that security is top of mind when designing, implementing and deploying these systems; in the worst case they present a whole new opportunity for cyber criminals. One example of a potential conflict concerns providing the details of the recipient of an erroneous transfer who refuses to return the money; these conflicts with GDPR will be the subject of a session at KuppingerCole’s Digital Finance World in February.

To meet the requirements of this directive, the banking industry is moving its IT systems towards platforms that support multiple channels to customers. This can be achieved in various ways. The cheap and cheerful method is “screen scraping”, which needs no change to existing systems: the new apps simply interact through the existing user interfaces. This creates not only security challenges but also a very messy technical architecture. A much better approach is to extend existing systems with open APIs, and this is the approach being adopted in the UK.

PSD II is a directive and therefore each EU member state needs to implement it locally. However, the job of implementing some of the provisions, including regulatory technical standards (RTS) and guidelines, has been delegated to the European Banking Authority (EBA). In the UK, HM Treasury published the final Payment Services Regulations 2017, and the Financial Conduct Authority (FCA) then issued a joint communication with the Treasury on PSD II and open banking.

While PSDII prevents the UK regulators from mandating a particular method of access, the UK’s Competition and Markets Authority set up the Open Banking Implementation Entity (OBIE) to create software standards and industry guidelines that drive competition and innovation in UK retail banking. 

As of now, the OBIE has published API specifications that include:

Open Data API specifications allow API providers (e.g. banks, building societies and ATM providers) to develop API endpoints which can then be accessed by API users (e.g. third-party developers) to build mobile and web applications for banking customers. These allow providers to supply up-to-date, standardised information about the latest available products and services so that, for example, a comparison website can more easily and accurately gather information and thereby develop better services for end customers.
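As a minimal illustration of how an API user might consume such an open data endpoint, the hedged Python sketch below fetches product information over HTTPS. The host name and version path are assumptions made purely for the example; real providers publish their own base URLs.

```python
import requests  # third-party HTTP library

# Hypothetical ASPSP host; real providers publish their own base URLs.
BASE_URL = "https://openbanking.examplebank.co.uk/open-banking/v2.2"

def fetch_open_data(resource: str) -> dict:
    """Fetch an unauthenticated open data resource (e.g. 'atms', 'branches',
    'personal-current-accounts') and return the parsed JSON payload."""
    response = requests.get(f"{BASE_URL}/{resource}", timeout=10)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    products = fetch_open_data("personal-current-accounts")
    # A comparison website could now normalise and rank the returned products.
    print(len(products.get("data", [])), "products returned")
```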

Open Banking Read/Write APIs enable Account Servicing Payment Service Providers to develop API endpoints to an agreed standard so that Account Information Service Providers (AISPs) and Payment Initiation Service Providers (PISPs) can build web and mobile applications for Payment Service Users (PSUs, e.g. personal and business banking customers).
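To make the read/write model concrete, here is a hedged sketch of an AISP-style account information request. The host, version path and token value are placeholders; a production call would additionally require approved status, client certificates, a registered consent and the mandatory profile headers.

```python
import requests

# Placeholder values for illustration only.
ASPSP_API = "https://api.examplebank.co.uk/open-banking/v1.1"
ACCESS_TOKEN = "<access token obtained via the OAuth flow sketched further below>"

def list_accounts() -> dict:
    """Read the accounts that the PSU has consented to share with this AISP."""
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/json",
    }
    response = requests.get(f"{ASPSP_API}/accounts", headers=headers, timeout=10)
    response.raise_for_status()
    return response.json()
```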

These specifications are now in the public domain, which means that any developer can access them to build their endpoints and applications. However, use of these in a production environment is limited to approved/authorised ASPSPs, AISPs and PISPs.

Approved/authorised organisations will be enrolled in the Open Banking Directory. This provides digital identities and certificates that enable organisations to connect and communicate securely via the Open Banking Security Profile, in a standard manner that best protects all parties.

Open Banking OIDC Security Profile - In many cases, Fintech services such as aggregation services use screen scraping and store user passwords. This is not adequately secure, and the approach being taken is to use a token model such as OAuth [RFC6749, RFC6750]. The aim is to develop a REST/JSON model protected by OAuth. However, OAuth needs to be profiled for financial use cases. Therefore, the Open Banking Profile differs in some respects from the FAPI Read/Write profile, where this is necessary to reduce delivery risk for ASPSPs.
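As a rough sketch of the token model that replaces credential sharing and screen scraping, the following fragment exchanges an authorization code for an access token at a hypothetical ASPSP token endpoint. The endpoint and client details are assumptions, and the additional requirements of the FAPI/Open Banking profile (such as stronger client authentication and signed request objects) are deliberately omitted.

```python
import requests

# Hypothetical endpoint and client registration details.
TOKEN_ENDPOINT = "https://auth.examplebank.co.uk/token"
CLIENT_ID = "example-tpp-client"
CLIENT_SECRET = "example-secret"  # FAPI profiles require stronger client authentication
REDIRECT_URI = "https://tpp.example.com/callback"

def exchange_code_for_token(authorization_code: str) -> dict:
    """Standard OAuth 2.0 authorization code grant (RFC 6749, section 4.1.3);
    the PSU has already authenticated at the ASPSP and approved the consent."""
    response = requests.post(
        TOKEN_ENDPOINT,
        data={
            "grant_type": "authorization_code",
            "code": authorization_code,
            "redirect_uri": REDIRECT_URI,
        },
        auth=(CLIENT_ID, CLIENT_SECRET),  # HTTP Basic client authentication
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # contains access_token, token_type, expires_in, ...
```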

This all seemed straightforward until the publication of the draft supplement to the EU regulatory technical standards. This appears to prohibit the use of many secure approaches, and I will cover it in a later blog.

In conclusion, the UK banking industry has taken great strides to define an open set of APIs that will allow banks to open up their services as required by PSD II. It would appear that, in this respect, the UK is ahead of the rest of the EU. At the moment, these APIs only cover a limited set of use cases, principally making an immediate transfer of funds in UK pounds. In addition, the approach to strong authentication is still under discussion. One further concern is to ensure that all of the potential privacy issues are handled transparently. To hear more on these subjects, attend KuppingerCole Digital Finance World in Frankfurt in February 2018.

McAfee Acquires Skyhigh Networks

McAfee, from its foundation in 1987, has a long history in the world of cyber-security.  Acquired by Intel in 2010, it was spun back out, becoming McAfee LLC, in April 2017.  According to the announcement on April 23rd, 2017 by CEO Christopher D. Young, the new company will be “One that promises customers cybersecurity outcomes, not fragmented products.”  So, it is interesting to consider what the acquisition of Skyhigh Networks, which was announced by McAfee on November 27th, will mean.

Currently, McAfee solutions cover areas that include: antimalware, endpoint protection, network security, cloud security, database security, endpoint detection and response, as well as data protection.   Skyhigh Networks are well known for their CASB (Cloud Access Security Broker) product.  So how does this acquisition fit into the McAfee portfolio?

Well, the nature of the cyber-risks that organizations are facing has changed.  Organizations are increasingly using cloud services because of the benefits that they can bring in terms of speed to deployment, flexibility and price.  However, the governance over the use of these services is not well integrated into the normal organizational IT processes and technologies; CASBs address these challenges. They provide security controls that are not available through existing security devices such as Enterprise Network Firewalls, Web Application Firewalls and other forms of web access gateways. They provide a point of control over access to cloud services by any user and from any device.  They help to demonstrate that the organizational use of cloud services meets with regulatory compliance needs.

In KuppingerCole’s opinion, the functionality to manage access to cloud services and to control the data that they hold should be integrated with the normal access governance and cyber security tools used by organizations.  However, the vendors of these tools were slow to develop the required capabilities, and the market in CASBs evolved to plug this gap.  The McAfee acquisition of Skyhigh Networks is the latest of several recent examples of CASBs being acquired by major security software and hardware vendors.

The diagram illustrates how the functions that CASBs provide fit into the overall cloud governance process. These basic functionalities are:

  1. Discovery of what cloud services are being used, by whom and for what data (a simple discovery sketch follows this list).
  2. Control over who can use which services and what data can be transferred.
  3. Protection of data in the cloud against unauthorized access and leakage.
  4. Regulatory compliance and protection against cyber threats through the above controls.
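To illustrate the discovery function, the hedged Python sketch below shows the kind of analysis a CASB performs over egress proxy logs. It is not Skyhigh’s implementation; the log format and the tiny catalogue of known cloud domains are assumptions made purely for the example (real CASBs use registries of tens of thousands of services with risk ratings).

```python
import csv
from collections import defaultdict

# Assumed: a minimal catalogue of cloud service domains.
CLOUD_SERVICES = {
    "dropbox.com": "Dropbox",
    "drive.google.com": "Google Drive",
    "salesforce.com": "Salesforce",
}

def discover_cloud_usage(proxy_log_csv: str) -> dict:
    """Summarise which cloud services are used, by whom and how much data is
    uploaded, from a proxy log with columns: user, host, bytes_out."""
    usage = defaultdict(lambda: {"users": set(), "bytes_out": 0})
    with open(proxy_log_csv, newline="") as log:
        for row in csv.DictReader(log):
            for domain, service in CLOUD_SERVICES.items():
                if row["host"].endswith(domain):
                    usage[service]["users"].add(row["user"])
                    usage[service]["bytes_out"] += int(row["bytes_out"])
    return dict(usage)
```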

So, in this analysis CASBs are closer to Access Governance solutions than to traditional cyber-security tools.  They recognize that identity and access management are the new cyber-frontier, and that cyber-defense needs to operate at this level.  By providing these functions Skyhigh Networks provides a solution that is complementary to those already offered by McAfee and extends McAfee’s capabilities in the direction needed by the cloud-enabled, agile enterprise.

The Skyhigh Networks CASB provides comprehensive functionality that strongly matches the requirements described above.  It is also featured in the leadership segment of KuppingerCole’s Leadership Compass: Cloud Access Security Brokers - 72534.  This acquisition is consistent with KuppingerCole’s view on how cyber-security vendors need to evolve to meet the challenges of cloud usage.  Going forward, organizations need a way to provide consistent access governance for both on premise and cloud based services.  This requires functions such as segregation of duties, attestation of access rights and other compliance related governance aspects.  Therefore, in the longer term CASBs need to evolve in this direction.  It will be interesting to watch how McAfee integrates the Skyhigh product and how the McAfee offering evolves towards this in the future.

Grizzly Steppe – What Every Organization Needs to Do

On December 29th, the FBI together with CERT finally released a Joint Analysis Report on the cyber-attacks on the US Democratic Party during the US presidential election.  Every organization, whether based in the US or not, would do well to read this report and to ensure that it takes account of its recommendations.  Once released into the wild, the tactics, techniques and procedures (TTPs) used by state actors are quickly taken up and become widely used by other adversaries.

This report is not a formal indictment of a crime, as was the case with the charges filed in 2014 over the alleged hacking of US companies by the Chinese.  It is, however, important cyber threat intelligence.

Threat intelligence is a vital part of cyber-defence and cyber-incident response, providing information about the threats, TTPs, and devices that cyber-adversaries employ; the systems and information that they target; and other threat-related information that provides greater situational awareness.  This intelligence needs to be timely, relevant, accurate, specific and actionable.  This report provides such intelligence.

The approaches described in the report are not new.  They involve several phases, and some have been observed using targeted spear-phishing campaigns that leverage web links to a malicious website which installs code.  Once executed, the code delivers Remote Access Tools (RATs) and evades detection using a range of techniques.  The malware connects back to the attackers, who then use the RATs to escalate privileges, search Active Directory accounts, and exfiltrate email through encrypted connections.

Another attack process uses internet domains with names that closely resemble those of targeted organizations to trick potential victims into entering legitimate credentials.  A fake webmail site that collects user credentials when they log in is a favourite.  This time, a spear-phishing email tricked recipients into changing their passwords through a fake webmail domain. Using the harvested credentials, the attacker was able to gain access and steal content.

Sharing Threat Intelligence is a vital part of cyber defence and OASIS recently made available three foundational specifications for the sharing of threat intelligence.  These are described in Executive View: Emerging Threat Intelligence Standards - 72528.  Indicators of Compromise (IOCs) associated with the cyber-actors are provided using these standards (STIX) as files accompanying the report.
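The accompanying indicator files can be fed directly into detection tooling. As a minimal, hedged illustration, assuming the IOCs have first been exported from the published STIX files into a flat JSON list of IP addresses and domains, the sketch below checks proxy or firewall logs for matches.

```python
import json

def load_indicators(path: str) -> set:
    """Assumed input: a JSON file containing a flat list of IP addresses and
    domain names extracted from the published STIX indicator files."""
    with open(path) as f:
        return set(json.load(f))

def find_matches(log_path: str, indicators: set) -> list:
    """Return log lines that mention any known indicator of compromise."""
    hits = []
    with open(log_path) as log:
        for line in log:
            if any(ioc in line for ioc in indicators):
                hits.append(line.rstrip())
    return hits

if __name__ == "__main__":
    iocs = load_indicators("grizzly_steppe_iocs.json")  # hypothetical export
    for hit in find_matches("/var/log/squid/access.log", iocs):
        print(hit)
```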

There are several well-known areas of vulnerability that are consistently used by cyber-attackers.  These are easy to fix but are, unfortunately, still commonly found in many organizations’ IT systems.  Organizations should take immediate steps to detect and remove these from their IT systems.

The majority of these attacks exploit human weaknesses in the first stage.  While technical measures can and should be improved, it is also imperative to provide employees, associates and partners training on how to recognize and respond to these threats.

The report describes a set of recommended mitigations and best practices.  Organizations should consider these recommendations and take steps to implement them without delay.  KuppingerCole provides extensive research on securing IT systems and on privilege management in particular.

What Value Certification?

In the past weeks, there have been several press releases from CSPs (Cloud Service Providers) announcing new certifications for their services.  In November, BSI (the British Standards Institution) announced that Microsoft Azure had achieved Cloud Security Alliance (CSA) STAR Certification. On December 15th, Amazon Web Services (AWS) announced that it had successfully completed the assessment against the compliance standard of the Bundesamt für Sicherheit in der Informationstechnik (BSI), the Cloud Computing Compliance Controls Catalogue (C5).

What value do these certifications bring to the customer of these services?

The first value is compliance. A failure by the cloud customer to comply with laws and industry regulations in relation to the way data is stored or processed in the cloud could be very expensive.  Certification that the cloud service complies with a relevant standard provides assurance that data will be processed in a way that is compliant.

The second value is assurance.  The security, compliance and management of the cloud service is shared between the CSP and the customer.  Independent certification provides reassurance that the CSP is operating the service according to the best practices set out in the standard.  This does not mean that there is no risk that something could go wrong – it simply demonstrates that the CSP is implementing the best practices to reduce the likelihood of problems and to mitigate their effects should they occur.

There are different levels of assurance that a CSP can provide – these include:

CSP Assertion – the CSP describes the steps they take.  The value of this level of assurance depends upon the customer’s trust in the CSP.

Contractual assurance – the contract for the service provides specific commitments concerning the details of the service provided.  The value of this commitment is determined by the level of liability specified in the contract where the CSP is in default, as well as by the cost and difficulty of enforcing it.

Independent validation – the cloud service has been evaluated by an independent third party that provides a certificate or attestation.  Examples of this include some forms of Service Organization Control (SOC) reports using the standards SSAE 16 or ISAE 3402.  The value of this depends upon the match between the scope of the evaluation and the customer’s requirements, as well as how frequently the validation is performed.

Independent testing – the service provided has been independently tested to demonstrate that it conforms to the claims made by the CSP.  This extends the assessment to include measuring the effectiveness of the controls.  Examples include SOC 2 Type II reports as well as some levels of certification against the Payment Card Industry Data Security Standard (PCI DSS).  The value of this depends upon the match between the scope of the evaluation and the customer’s requirements, as well as how frequently the testing is performed.

The last of these – independent testing – is what customers should be looking for.  However, it is important that the customer asks the following questions:

1) What is the scope of the certification?  Does it cover the whole service delivered or just parts of it – like the data centre?

2) How does the standard compare with the customer’s own internal controls?  Are the controls in the standard stronger or weaker?

3) Is the standard relevant to the specific use of the cloud service by the customer?  Many CSPs now offer an “alphabet soup” of certifications.  Many of these certifications only apply to certain geographies or certain industries.

4) How well is your side of cloud use governed?  Security and compliance of the use of cloud services is a shared responsibility.  Make sure that you understand what your organization is responsible for and that you meet these responsibilities.

For more information on this subject see: Executive View: Using Certification for Cloud Provider Selection - 71308 - KuppingerCole

AWS re:Invent 2016 Blog

In the last week of November I attended the AWS re:Invent conference in Las Vegas – an impressive event with around 32,000 attendees. There were a significant number of announcements; many were essentially more of the same, but bigger and better, based on what customers were asking for. It is clear that AWS is going from strength to strength. AWS announced many faster compute instances with larger amounts of memory, optimized for various specific tasks. This may seem boring, but these announcements were received with rapturous applause from the audience. This is AWS's bread and butter and just what many customers are looking for. The value of these improvements is that a customer can switch their workload onto one of these new instances without the need to specify, order, pay for, and await delivery of new hardware, as they would have had to do for on premise equipment.

Continuing on that theme, James Hamilton, VP & Distinguished Engineer, described the work that AWS does behind the scenes to deliver its services. The majority of AWS traffic runs on a private network (except in China); this improves latency, packet loss and overall quality, avoids capacity conflicts, and gives AWS greater operational control. AWS designs and manages its own network routers, its own custom compute nodes to optimize power versus space, and even its own custom power utility controls to cater for rare power events.

You may think: well, so what? The reason why this matters is that an AWS customer gets all of this included in the service that they receive. These are time-consuming processes that the customer would otherwise have to manage for their on premise IT facilities. Furthermore, these processes need specialized skills that are in short supply. In the opening keynote at the conference, AWS CEO Andy Jassy compared AWS with the “legacy software vendors”. He positioned these vendors as locking their customers into long-term, expensive contracts. In comparison, he described how AWS allows flexibility and works to save customers money through price reductions and customer reviews.

However, to get the best out of AWS services, just like most IT technology, you need to exploit proprietary functionality. Once you use proprietary features it becomes more difficult to migrate away from that technology. Mr. Jassy also gave several examples of how customers had been able to migrate around 13,000 proprietary database workloads to the AWS database services. While this shows the care that AWS has put into its database services, it also slightly contradicts the claim that customers are locked in to proprietary software.

Continuing on the theme of migration – while AWS is still strong among the “born on the cloud” startups and for creating new applications, organizations are increasingly looking to migrate existing workloads. This has not always been straightforward, since any differences between the on premise IT and the AWS environment can make changes necessary. The announcement previously made at VMworld, that a VMware service will be offered on AWS, will be welcomed by many organizations. This will allow the many customers using VMware and the associated vSphere management tools to migrate their workloads to AWS while continuing to manage the hybrid cloud / on premise IT using the tools they are already using.

Another problem related to migration is that of transferring data. Organizations wishing to move their workloads to the cloud need to move their data and, for some, this can be a significant problem. The practical bandwidth of communications networks can be the limiting factor, and the use of physical media introduces security problems. In response to these problems, AWS has created storage devices that can be used to physically transfer terabytes of data securely. The first of these devices, the “AWS Snowball”, was announced at AWS re:Invent last year and has now been improved and upgraded to the “AWS Snowball Edge”. However, the highlight of the conference was the announcement of the “AWS Snowmobile”. This is a system mounted in a shipping container carried on a transport truck that can be used to securely transfer exabytes of data. Here is a photo that I took of one of these, which was driven into the conference hall.
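A back-of-the-envelope calculation shows why physical transfer can be the only practical option; the link speeds below are assumptions chosen purely for illustration, and protocol overhead and contention are ignored.

```python
# Rough time to move a data set over a dedicated network link.
def transfer_days(data_bytes: float, link_bits_per_second: float) -> float:
    return (data_bytes * 8) / link_bits_per_second / 86_400  # seconds per day

PETABYTE = 10**15
EXABYTE = 10**18

print(f"1 PB over 1 Gbps:  {transfer_days(PETABYTE, 1e9):.0f} days")   # ~93 days
print(f"1 EB over 10 Gbps: {transfer_days(EXABYTE, 10e9):.0f} days")   # ~9,300 days, roughly 25 years
```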

So, is this just an eye-catching gimmick? Not so according to the first beta customer.  The customer’s on premise datacenter was bursting at the seams and could no longer support their expanding data based business.  They wanted to move to the AWS cloud but it was not practical to transfer the amount of data they had over a network and they needed an alternative secure and reliable method.  AWS Snowmobile provided exactly the answer to this need.

Last but not least, security: at the event AWS announced AWS Shield.  This is a managed Distributed Denial of Service (DDoS) protection service that safeguards web applications running on AWS.  The value of this was illustrated in an interesting talk, SAC327 – “No More Ransomware: How Europol, the Dutch Police, and AWS Are Helping Millions Deal with Cybercrime”.  This talk described a website set up to help victims of ransomware attacks recover their data.  Not surprisingly, this site has come under sustained attack from cyber-criminals. The fact that this site has withstood these attacks is confirmation that AWS can be used to create and securely host applications, and that AWS Shield can add an extra layer of protection.

In conclusion, this event demonstrates that AWS is going from strength to strength.  Its basic value proposition of providing cost effective, flexible and secure IT infrastructure remains strong and continues to be attractive.  AWS is developing services to become more Hybrid Cloud and enterprise friendly while extending its services upwards to include middleware and intelligence in response to customer demand.  

For KuppingerCole’s opinion on cloud services see our research reports Cloud Reports - KuppingerCole

Democratized Security

At the AWS Enterprise Security Summit in London on November 8th, Stephen Schmidt, CISO at AWS, gave a keynote entitled “Democratized Security”.  What is Democratized Security, and does it really exist?

Well, to quote Humpty Dumpty from Lewis Carroll’s Through the Looking-Glass: “When I use a word it means just what I choose it to mean—neither more nor less.”  So, what Mr. Schmidt meant by this phrase may or may not be what other people would understand it to mean.  This is my interpretation.

The word democracy originates in ancient Greece, where it meant rule by the common people.  It described the opposite of rule by an elite.  More recently, the “democratization of technology” has come to mean the process whereby sophisticated technology becomes accessible to more and more people.  In the 1990s, Andrew Feenberg described a theory for democratizing technological design. He argued for what he calls “democratic rationalization”, where participants intervene in the technological design process to shape it toward their own ends.

How does this relate to cloud services?  Cloud services are easily accessible to a wide range of customers, from individual consumers to large organizations.  These services survive and prosper by providing the functionality that their customers value at a price that is driven down by their scale.  Intense competition means that they need to be very responsive to their customers’ demands.  Cloud computing has made extremely powerful IT services available at an incredibly low cost in comparison with the traditional model, where the user had to invest in the infrastructure, the software and the knowledge before they could even start.

What about security? There have been many reports of cyber-attacks, data breaches and legal government data intercepts impacting on some consumer cloud services (not AWS).  The fact that many of these services still survive seems to indicate that individual consumers are not overly concerned.   Organizations however have a different perspective – they do care about security and compliance.  They are subject to a wide range of laws and regulations that define how and where data can be processed with significant penalties for failure.  Providers of cloud services that are aimed at organizations have a very strong incentive to provide the security and compliance that this market demands.

Has the security elite been eliminated?  The global nature of the internet and cyber-crime has made it extremely difficult for the normal guardians – the government and the law – to provide protection.  Even worse, the attempts by governments to use data interception to meet the challenges of global crime and terrorism have made them suspects.  The complexity of the technical challenges around cyber-threats make it impractical for all but the largest organizations to build and operate their own cyber-defences.  However, the cloud service provider has the necessary scale to afford this.  So, the cloud service providers can be thought of as representing a new security elite – albeit one that is subject to the market demands for the security of their services.

With democracy comes responsibility.  In relation to security this means that the cloud customer must take care of the aspects under their control.  Many, but not all, of the previously mentioned consumer data breaches involved factors under the customers’ control, like weak passwords.  For organizations using cloud services the customer must understand the sensitivity of their data and ensure that it is appropriately processed and protected.  This means taking a good governance approach to assure that the cloud services used meet these requirements.

Cloud services now provide a wide range of individuals and organizations with access to IT technology and services that were previously beyond their reach.  While the main driving force behind cloud services has been their functionality; security and compliance are now top of the agenda for organizational customers.  The cloud can be said to be democratizing security because organizations will only choose those services that meet their requirements in this area.  In this world, the cloud service providers have become the security elite through their scale, knowledge and control.  The cloud customer can choose which provider to use based on their trust in this provider to deliver what they need.

For more information see KuppingerCole’s research in this area: Reports - Cloud Security.

Be careful not to DROWN

On March 1st OpenSSL published security advisory CVE-2016-0800, known as “DROWN”. This is described as a cross-protocol attack on TLS using SSLv2 and is classified as high severity. The advice given by OpenSSL is:

“We strongly advise against the use of SSLv2 due not only to the issues described below, but to the other known deficiencies in the protocol as described at https://tools.ietf.org/html/rfc6176”.

This vulnerability illustrates how vigilant organizations need to be over the specific versions of software that they use. However, this is easier said than done. Many organizations have a website or application that was built by a third party. The development may have been done some time ago and used what were the then current versions of readily available Open Source components. The developers may or may not have a contract to keep the package they developed up to date.

The application or website may be hosted on premise or externally; wherever it is hosted, the infrastructure upon which it runs also needs to be properly managed and kept up to date. OpenSSL is part of the infrastructure upon which the website runs. While there may be some reasons for continuing to use SSLv2 for compatibility, there is no possible excuse for reusing SSL Private Keys between websites. It just goes against all possible security best practices.

It may be difficult to believe but I have heard auditors report that when they ask “what does that server do?” they get the response “I don’t know – it’s always been here and we never touch it”. The same can be true of VMs in the cloud which get created, used and then forgotten (except by the cloud provider who keeps on charging for them).

So as vulnerabilities are discovered, there may be no process to take action to remediate the operational package. The cyber criminals just love this. They can set up automated processes to scan externally for systems where known vulnerabilities remain unpatched and exploit the results at their leisure.

There are two basic lessons from this:

  1. Most code contains exploitable errors, and its evolution generally leads to a deterioration in its quality over time unless there are very stringent controls over change. It is attractive to add functionality, but the increase in size and complexity leads to more vulnerabilities. Sometimes it is useful to go back to first principles and recode using a stringent approach.
    I provided an example of this in my blog AWS Security and Compliance Update. AWS has created a replacement for OpenSSL TLS - S2N Open Source implementation of TLS. S2N replaces the 500,000 lines of code in OpenSSL with approximately 6,000 lines of audited code. This code has been contributed to Open Source and is available from the S2N GitHub Repository.

  2. Organizations need to demand maintenance as part of the development of code by third parties. This is to avoid the need to maintain out of date infrastructure components for compatibility.
    The infrastructure, whether on premise or hosted, should be kept up to date. This will require change management processes to ensure that changes do not impact operations. This should be supported by regular vulnerability scanning of operational IT systems, using one of the many tools available, together with remediation of the vulnerabilities detected; a simple protocol check is sketched below.
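As a minimal, hedged example of such a check, the sketch below reports the TLS version each server negotiates with a modern client and flags anything below TLS 1.2. It does not probe for SSLv2/DROWN directly (modern client libraries no longer speak SSLv2); a dedicated scanner should be used for that.

```python
import socket
import ssl

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Connect with the platform's default security settings and report the
    protocol version that the server negotiates."""
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g. 'TLSv1.2' or 'TLSv1.3'

if __name__ == "__main__":
    for host in ["www.example.com"]:  # replace with your own server inventory
        version = negotiated_tls_version(host)
        flag = "" if version in ("TLSv1.2", "TLSv1.3") else "  <-- investigate"
        print(f"{host}: {version}{flag}")
```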

IT systems need to have a managed lifecycle. It is not good enough to develop, deploy and forget.

ISO/IEC 27017 – was it worth the wait?

On November 30th, 2015 the final version of the standard ISO/IEC 27017 was published.  This standard provides guidelines for information security controls applicable to the provision and use of cloud services.  This standard has been some time in gestation and was first released as a draft in spring 2015.  Has the wait been worth it?  In my opinion yes.

The gold standard for information security management is ISO/IEC 27001, together with the guidance given in ISO/IEC 27002.  These standards remain the foundation, but the guidelines are largely written on the assumption that an organization processes its own information.  The increasing adoption of managed IT and cloud services, where responsibility for security is shared, is challenging this assumption.  This is not to say that these standards and guidelines are not applicable to the cloud; rather, they need interpretation in a situation where the information is being processed externally.  The ISO/IEC 27017 and ISO/IEC 27018 standards provide guidance to deal with this.

ISO/IEC 27018, which was published in 2014, establishes controls and guidelines for measures to protect Personally Identifiable Information for the public cloud computing environment.  The guidelines are based on those specified in ISO/IEC 27002, with control objectives extended to include the requirements needed to satisfy the privacy principles in ISO/IEC 29100.  These are easily mapped onto the existing EU privacy principles.  This standard is extremely useful to help an organization assure compliance when using a public cloud service to process personally identifiable information.  Under these circumstances the cloud customer is the Data Controller and, under current EU laws, remains responsible for processing breaches by the Data Processor.  To provide this level of assurance, some cloud service providers have obtained independent certification of their compliance with this standard.

The new ISO/IEC 27017 provides guidance that is much more widely applicable to the use of cloud services.  Specific guidance is provided for 37 of the existing ISO/IEC 27002 controls; separate but complementary guidance is given for the cloud service customer and the cloud service provider.  This emphasizes the shared responsibility for security of cloud services.  This includes the need for the cloud customer to have policies for the use of cloud services and for the cloud service provider to provide information to the customer.

For example, as regards restricting access (ISO 27001 control A.9.4.1) the guidance is:

  • The cloud service customer should ensure that access to information in the cloud service can be restricted in accordance with its access control policy and that such restrictions are realized.
  • The cloud service provider should provide access controls that allow the cloud service customer to restrict access to its cloud services, its cloud service functions and the cloud service customer data maintained in the service.

 In addition, the standard includes 7 additional controls that are relevant to cloud services.  These new controls are numbered to fit with the relevant existing ISO/IEC 27002 controls; these extended controls cover:

  • Shared roles and responsibilities within a cloud computing environment
  • Removal and return of cloud service customer assets
  • Segregation in virtual computing environments
  • Virtual machine hardening
  • Administrator's operational security
  • Monitoring of Cloud Services
  • Alignment of security management for virtual and physical networks

 

In summary, ISO/IEC 27017 provides very useful guidance, and KuppingerCole recommends that this guidance be followed by both cloud customers and cloud service providers. While it is helpful for cloud service providers to have independent certification that they comply with this standard, this does not remove the customer’s responsibility for ensuring that they also follow the guidance.

KuppingerCole has conducted extensive research into cloud service security and compliance and into cloud service providers, as well as engaging with cloud service customers.  This research has led to a deep understanding of the real risks around the use of cloud services and how to approach these risks to safely gain the potential benefits.  We have created services, workshops and tools designed to help organizations manage their adoption of cloud services in a secure and compliant manner while preserving the advantages that these kinds of IT service bring.

Why Governance Matters to IT Security

MetricStream, a US company that supplies Governance, Risk and Compliance applications, held their GRC Summit in London on November 11th and 12th.  Governance is important to organizations because of the increasing burden of regulations and laws upon their operations.  It is specifically relevant to IT security because these regulations touch upon the data held in the IT systems.  It is also highly relevant because of the wide range of IT service delivery models in use today.

Organizations using IT services provided by a third party (for example a cloud service provider) no longer have control over the details of how that service is delivered.  This control has been delegated to the service provider.  However, the organization will likely remain responsible for ensuring that the data is processed and held in a way that is compliant.  This is the challenge that governance can address, and it is why governance of IT service provision is becoming so important.

The distinction between governance and management is clearly defined in COBIT 5. Governance ensures that business needs are clearly defined and agreed and that they are satisfied in an appropriate way.  Governance sets priorities and the way in which decisions are made; it monitors performance and compliance against agreed objectives.  Governance is distinct from management in that management plans, builds, runs and monitors activities in alignment with the direction set by the governance body to achieve the objectives.  This is illustrated for cloud services in the figure below.

Governance provides an approach to IT security that can be applied consistently across the many different IT service delivery models.  By focussing on the business objectives and monitoring outcomes, it decouples the activities involved in providing the service from those concerned with its consumption.  Most large organizations have a complex mix of IT services provided in different ways: on premise managed internally, on premise managed by a third party, hosted services and cloud services.  Governance provides a way for organizations to ensure that IT security and compliance can be directed, measured and compared across this range of delivery models in a consistent way.

Since this specification and measurement process can involve large amounts of data from a wide variety of sources, it helps to use a common governance framework (such as COBIT 5) and a technology platform such as the MetricStream GRC Platform.  This platform provides centralized storage of and access to risk and compliance data, and a set of applications that allow this data to be gathered from a wide variety of sources and the results to be shared through a consistent user interface available on different devices.

The need for this common platform and integrated approach was described at the event by Isabel Smith, Director of Corporate Internal Audit at Johnson & Johnson.  Ms Smith explained that an integrated approach is particularly important because Johnson & Johnson has more than 265 operating companies located in 60 countries around the world, with more than 125,000 employees.  These operating companies have a wide degree of autonomy to allow them to meet local needs.  However, the global organization must comply with regulations ranging from financial, such as Sarbanes-Oxley, to those relating to health care and therapeutic products. Using the common platform enabled Johnson & Johnson to achieve a number of benefits, including getting people across the organization to use a common language around compliance and risk, streamlining and standardizing policies and controls, and obtaining an integrated view of control test results.

In conclusion, organizations need to take a governance-led approach to IT security across the heterogeneous IT service delivery models in use today.  Many of these are outside the direct control of the customer organization, and their use places control of the service and infrastructure in the hands of a third party.  A governance-based approach allows trust in the service to be assured through a combination of internal processes, standards and independent assessments.  Adopting a common governance framework and technology platform are important enablers for this.

AWS Security and Compliance Update

Security is a common concern of organizations adopting cloud services and so it was interesting to hear from end users at the AWS Summit in London on November 17th how some organizations have addressed these concerns.

Financial services is a highly regulated industry with a strong focus on information security.  At the event, Allan Brearley, Head of Transformation Services at Tesco Bank, described the challenges they faced exploiting cloud services to innovate and reduce cost, while ensuring security and compliance.  The approach that Tesco Bank took, which is the one recommended in KuppingerCole Advisory Note: Selecting your Cloud Provider, is to identify and engage with the key stakeholders.  According to Mr Brearley, it is important to adopt a culture of satisfying all of the stakeholders’ needs all of the time.

In the UK the government has a cloud first strategy. Government agencies using cloud services must follow the Cloud Security Principles, first issued by the UK Communications-Electronics Security Group (CESG) in 2014.  These describe the need to take a risk-based approach to ensure suitability for purpose.  Rob Hart of the UK DVSA (Driver & Vehicle Standards Agency), which is responsible for road safety in the UK, described the DVSA’s journey to the adoption of AWS cloud services.  Mr Hart explained that the information being migrated to the cloud was classified according to UK government guidelines as “OFFICIAL”, which is equivalent to commercially sensitive or personally identifiable information.  The key to success, according to Mr Hart, was to involve the Information Security Architects from the very beginning.  This was helped by these architects being in the same office as the DVSA cloud migration team.

AWS has always been very open that the responsibility for security is shared between AWS and the customer.  AWS publish their “Shared Responsibility Model” which distinguishes between the aspects of security that AWS are responsible for, and those for which the customer is responsible. 

Over the past months AWS has made several important announcements around the security and compliance aspects of their services.  There are too many to cover here, so I have chosen three around compliance and three around security.  Firstly, the announcements around compliance include:

  • ISO/IEC 27018:2014 – AWS has published a certificate of compliance with this ISO standard which provides a code of practice for protection of personally identifiable information (PII) in public clouds acting as PII processors.

  • UK CESG Cloud Security Principles.  In April 2015 AWS published a whitepaper to assist organisations using AWS for United Kingdom (UK) OFFICIAL classified workloads in alignment with CESG Cloud Security Principles.

  • Security by Design – In October 2015 AWS published a whitepaper describing a four-phase approach for security and compliance at scale across multiple industries.  This points to the resources available to AWS customers to implement security into the AWS environment, and describes how to validate that controls are operating.

Several new security services were also announced at AWS re:Invent in October.  The functionality provided by these services is not unique; however, it is tightly integrated with AWS services and infrastructure.  Therefore, these services provide extra benefits to a customer that is prepared to accept the risk of added lock-in.  Three of these include:

  • Amazon Inspector – this service, which is in preview, scans applications running on EC2 for a wide range of known vulnerabilities. It includes a knowledge base of rules mapped to common security compliance standards (e.g. PCI DSS) as well as up-to-date known vulnerabilities. A brief usage sketch follows this list.

  • AWS WAF Web Application Firewall – this is a Web Application Firewall that can detect suspicious network traffic.  It helps to protect web applications from attack by blocking common web exploits like SQL injection and cross-site scripting.

  • S2N Open Source implementation of TLS – This is a replacement created by AWS for the commonly used OpenSSL (which contained the “Heartbleed” vulnerability).  S2N replaces the 500,000 lines of code in OpenSSL with approximately 6,000 lines of audited code.  This code has been contributed to Open Source and is available from the S2N GitHub Repository.
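As a hedged illustration of how findings from such a service can be consumed programmatically, the sketch below uses the boto3 SDK against the Amazon Inspector (classic) API; the assessment run ARN is a placeholder, and credentials and region are assumed to be configured in the environment.

```python
import boto3

inspector = boto3.client("inspector")

def high_severity_findings(assessment_run_arn: str) -> list:
    """Return the titles of High severity findings from one assessment run."""
    arns = inspector.list_findings(assessmentRunArns=[assessment_run_arn])["findingArns"]
    titles = []
    # describe_findings accepts a small batch of ARNs per call, so chunk them.
    for i in range(0, len(arns), 10):
        batch = inspector.describe_findings(findingArns=arns[i:i + 10])
        titles += [f["title"] for f in batch["findings"] if f.get("severity") == "High"]
    return titles

if __name__ == "__main__":
    run_arn = "arn:aws:inspector:eu-west-1:123456789012:target/0-example/template/0-example/run/0-example"
    for title in high_severity_findings(run_arn):
        print(title)
```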

AWS has taken serious steps to help customers using its cloud services to do so in a secure manner and to assure that they remain compliant with laws and industry regulations.  The customer experiences presented at the event confirm that AWS’s claims around security and compliance are supported in real life.  KuppingerCole recommends that customers using AWS services should make full use of the security and compliance functions and services provided by AWS.
