
Blog posts by Mike Small

Getting the Cloud under Control

Oct 06, 2015 by Mike Small

Many organizations are concerned about the use of cloud services; the challenge is to securely enable the use of these services without negating the benefits that they bring. To meet this challenge it is essential to move from IT Management to IT Governance.

Cloud services are outside the direct control of the customer’s organization, and their use places control of the service and infrastructure in the hands of the Cloud Service Provider (CSP). The customer cannot ensure the service and the security it provides; the customer can only assure the service through a governance process. A governance-based approach allows trust in the CSP to be assured through a combination of internal processes, standards and independent assessments.

Governance is distinct from management in that management plans, builds, runs and monitors the IT service in alignment with the direction set by the governance body to achieve the objectives. This distinction is clearly defined in COBIT 5. Governance ensures that business needs are clearly defined and agreed and that they are satisfied in an appropriate way. Governance sets priorities and the way in which decisions are made; it monitors performance and compliance against the agreed objectives.

The starting point for a governance-based approach is to define the organizational objectives for using cloud services; everything else follows from these. Then set the constraints on the use of cloud services in line with the organization’s objectives and risk appetite. There are risks involved with all IT service delivery models; assessing these risks in a common way is fundamental to understanding the additional risk (if any) involved in the use of a cloud service. Finally, there are many concrete steps that an organization can take to manage the risks associated with their use of cloud services. These include:

  • Common governance approach – the cloud is only one way in which IT services are delivered in most organizations. Adopt a common approach to governance and risk management that covers all forms of IT service delivery.
  • Discover Cloud Use – find out what cloud services are actually being used by the organization. There is now a growing market in tools to help with this. Assess the services that you have discovered are being used against the organization’s objectives and risk appetite.
  • Govern Cloud Access – govern access to cloud services with the same rigour as if they were on premise. There should be no need for you to use a separate IAM system for this – identity federation standards like SAML 2.0 are well defined and the service should support these. The service should also support the authentication methods your organization requires, provide granular access controls and monitor individuals’ use of the services.
  • Identify who is responsible for each risk relating to the cloud service – the CSP or your organization. Make sure that you take care of your responsibilities and assure that the CSP meets their obligations.
  • Require Independent certification – an important way to assure that a cloud service provides what it claims is through independent certification. Demand the CSP provides independent certification and attestations for the aspects of the service that matter to your organization.
  • Use standards – standards provide the best way of avoiding technical lock-in to a proprietary service. Understand what standards are relevant and require the service to support these standards.
  • Encrypt your data – there are many ways in which data can be leaked or lost from a cloud service. The safest way to protect your data against this risk is to encrypt it. Make sure that you retain control over the encryption keys.
  • Read the Contract – make sure you read and understand the contract. Most cloud service contracts are offered by the CSP on a take it or leave it basis. Make sure that what is offered is acceptable to your organization.
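The “encrypt your data” step above can be sketched in a few lines. This is only a minimal illustration, assuming the third-party Python cryptography package; the point is that the key is generated and retained by the customer, so the CSP only ever handles ciphertext:

```python
# Minimal sketch of client-side encryption before uploading data to a
# cloud service. The key is generated and kept on premise, so the CSP
# only ever sees ciphertext. Requires the "cryptography" package.
from cryptography.fernet import Fernet

def encrypt_for_upload(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt locally; only the ciphertext leaves the organization."""
    return Fernet(key).encrypt(plaintext)

def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
    """Decrypt retrieved data using the locally held key."""
    return Fernet(key).decrypt(ciphertext)

key = Fernet.generate_key()        # retained by the customer, never sent
record = b"customer record"
blob = encrypt_for_upload(record, key)
assert blob != record                          # CSP stores only this blob
assert decrypt_after_download(blob, key) == record
```

The essential design choice is in the first comment: because the key never leaves the organization, a breach at the CSP exposes only ciphertext.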

KuppingerCole has extensive experience of guiding organizations through their adoption of cloud services as well as many published research notes. Remember that the cloud is only one way of obtaining an IT service – have a common governance process for all. If a cloud service meets your organization’s need then the simple motto is “to trust your Cloud Provider but verify everything they claim”.

This article originally appeared in KuppingerCole Analysts' View newsletter.


Windows 10: How to Ensure a Secure and Private Experience

Aug 13, 2015 by Mike Small

Together with many others I received an offer from Microsoft to upgrade my Windows 7 desktop and Windows 8.1 laptop to Windows 10. Here is my initial reaction to successfully performing this upgrade with a specific focus on the areas of privacy and security.

As always when considering security, the first and most important step is to understand what your requirements are. In my case, I have several computers which I mainly use with Microsoft Office, for internet research and to store personal photos. My main requirements are for consistency and synchronization across these systems together with security and reliability. The critical dimensions that I considered are privacy, confidentiality, integrity and availability. Let’s start with availability:


  1. Make sure you back up your files before you start the upgrade! My files were preserved without problems but it is better to be safe than sorry. It is also a good idea to understand how you could roll back if there is a catastrophic failure during the upgrade. One really big improvement over Windows 8 is the ability to restore files from a Windows 7 backup.
  2. Check that your computer is compatible with the upgrade. The Microsoft upgrade tool checks your computer for compatibility and some manufacturers provide information on which systems they have tested. The Dell support site informed me that my new laptop was tested but my old desktop wasn’t. Both upgraded without problems, although I did need to re-install some software for my HP printer.
  3. Consider whether you want new features as soon as they are available (with the risk that they may cause problems). The default setting for updates is for these to be automatically installed. You can change this through the advanced setting menu by checking the box to defer upgrades. You will still receive security fixes but new features will be delayed.

  4. Windows 10 has a number of recovery options – you can roll back to your previous OS for up to 30 days after the upgrade as well as performing a complete reset. 


Integrity:

  1. Windows 10 automatically includes Windows Defender for protection – make sure this is activated. If you prefer another anti-malware product you will need to install this yourself.
  2. If you already use OneDrive then you will notice some changes. Previous versions of the OneDrive App supported a placeholder function that allowed File Explorer to display files that were held online but not sync’d onto your PC. This is no longer available; any directories that are not sync’d are not visible through File Explorer. I experienced sync problems with files that were previously held online only. I was able to resolve this using the OneDrive settings menu – first uncheck the folder(s) and save the settings. The folders and files are then erased on your device (scary!). Then repeat the process, but this time check the folders for sync in the menu. When you save these settings the files in the folders are re-synced from the cloud.


Confidentiality:

  1. The user accounts are copied from your previous OS – if these were all local accounts then they remain so. If you have a Microsoft account then you can link this with one of these local accounts. Doing this allows you to use a PIN instead of a password to log in.
  2. If you are using Office 365 you will already have a Microsoft account; you can also set up a free account, which provides some free OneDrive space. However, if you use the Microsoft account it is a good idea to understand and manage your privacy settings.
  3. The files in OneDrive are all held in the Microsoft cloud and you need to accept the risk that this poses bearing in mind that most breaches result from weak user credentials.
  4. If you are using BitLocker to encrypt your files then the encryption key will also be held on your OneDrive unless you opt out. 


Privacy:

  1. You should review the privacy settings from the Express setup and decide what to change.

    A future blog will provide more detailed advice on what these mean and how best to set things up. My short advice is to go through these settings carefully and choose which Apps you are happy to allow to access the various functions. In particular I would disable the App Connector since this gives access to unknown apps. I would also not allow Apps to access my name, picture and other info – but then I’m just paranoid.
  2. You also need to consider the privacy settings for the new Edge browser. These are to be found under “Advanced Settings”. Consider whether you really need Flash enabled, since this has been a frequent target for attacks. Also consider enabling the option to send “Do Not Track” requests.

  3. If you decide to use Cortana – this may involve setting your region and language and downloading a language pack – make sure you check out the privacy agreement.

My personal experience with this upgrade has been very positive. The upgrades went smoothly, and performance – especially the boot time of my old desktop – is much better than with Windows 7. The settings are now much more understandable and accessible, but you need to take the time to review the defaults to achieve your objectives for privacy and confidentiality. KuppingerCole plan a series of future blogs that will give more detailed guidance on how to do this.


Security and Operational Technology / Smart Manufacturing

Jul 07, 2015 by Mike Small

Industry 4.0 is the German government’s strategy to promote the computerization of the manufacturing industry. This strategy foresees that industrial production in the future will be based on highly flexible mass production processes that allow rich customization of products. This future will also include the extensive integration of customers and business partners to provide business and value-added processes. It will link production with high-quality services to create so-called “hybrid products”.

At the same time, in the US, the Smart Manufacturing Leadership Coalition is working on their vision for “Smart Manufacturing”. In 2013 the UK’s Institute for Advanced Manufacturing, which is part of the University of Nottingham, received a grant of £4.6M for a study on Technologies for Future Smart Factories.

This vision depends upon the manufacturing machinery and tools containing embedded computer systems that will communicate with each other inside the enterprise, and with partners and suppliers across the internet. This computerization and communication will enable optimization within the organizations, as well as improving the complete value adding chain in near real time through the use of intelligent monitoring and autonomous decision making processes. This is expected to lead to the development of completely new business models as well as exploiting the considerable potential for optimization in the fields of production and logistics.

However there are risks, and organizations adopting this technology need to be aware of and manage these risks. Compromising the manufacturing processes could have far reaching consequences. These consequences include the creation of flawed or dangerous end products as well as disruption of the supply chain. Even when manufacturing processes based on computerized machinery are physically isolated they can still be compromised through maladministration, inappropriate changes and infected media. Connecting these machines to the internet will only increase the potential threats and the risks involved.

Here are some key points to securely exploiting this vision:

  • Take a Holistic Approach: the need for security is no longer confined to the IT systems – the business systems of record – but needs to extend to cover everywhere that data is created, transmitted or exploited. Take a holistic approach and avoid creating another silo.
  • Take a Risk-based Approach: the security technology and controls that need to be built should be determined by balancing risk against reward, based on the business requirements, the assets at risk, the needs for compliance and the organizational risk appetite. This approach should seek to remove identifiable vulnerabilities and put in place appropriate controls to manage the risks.
  • Trusted Devices: This is the most immediate concern since many devices that are being deployed today are likely to be in use, and hence at risk, for long periods into the future. These devices must be designed and manufactured to be trustworthy. They need an appropriate level of physical protection as well as logical protection against illicit access and administration. It is highly likely that these devices will become a target for cyber criminals who will seek to exploit any weaknesses through malware. Make sure that they contain protection that can be updated to accommodate evolving threats.
  • Trusted Data: the organization needs to be able to trust the data from these devices. It must be possible to confirm the device from which the data originated, and that this data has not been tampered with or intercepted. Low-power security technology and standards already exist that were developed for mobile communications and banking, and these should be appropriately adopted or adapted to secure the devices.
  • Identity and Access Management – to be able to trust the devices and the data they provide means being able to trust their identities and control access. There are a number of technical challenges in this area; some solutions have been developed for some specific kinds of device however there is no general panacea. Hence it is likely that more device specific solutions will emerge and this will add to the general complexity of the management challenges.
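The trusted-data point above can be illustrated with a message authentication code: the device and the organization share a secret key, the device appends a MAC to every reading, and the receiver can then confirm both which device the data came from and that it was not tampered with in transit. This is only a sketch using Python’s standard library; the device ID, key and reading format are hypothetical:

```python
# Sketch of device-to-enterprise data authentication using an HMAC.
# A shared per-device secret lets the receiver confirm which device a
# reading came from and detect tampering in transit.
import hashlib
import hmac

def sign_reading(device_key: bytes, device_id: str, reading: str) -> str:
    """Bind the reading to the originating device with an HMAC-SHA256 tag."""
    message = f"{device_id}:{reading}".encode()
    return hmac.new(device_key, message, hashlib.sha256).hexdigest()

def verify_reading(device_key: bytes, device_id: str, reading: str, tag: str) -> bool:
    """Recompute the tag; compare in constant time to avoid timing leaks."""
    expected = sign_reading(device_key, device_id, reading)
    return hmac.compare_digest(expected, tag)

key = b"per-device secret provisioned at manufacture"   # hypothetical
tag = sign_reading(key, "press-017", "temp=182C")
assert verify_reading(key, "press-017", "temp=182C", tag)       # authentic
assert not verify_reading(key, "press-017", "temp=200C", tag)   # tampered
```

In a real deployment the per-device key would live in tamper-resistant hardware, in line with the trusted-devices point above.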

More information on this subject can be found in Advisory Note: Security and the Internet of Everything and Everyone - 71152 - KuppingerCole


From Hybrid Cloud to Standard IT?

Jun 18, 2015 by Mike Small

I have recently heard from a number of cloud service providers (CSP) telling me about their support for a “hybrid” cloud. What is the hybrid cloud and why is it important? What enterprise customers are looking for is a “Standard IT” that would allow them to deploy their applications flexibly wherever is best. The Hybrid Cloud concept goes some way towards this.

There is still some confusion about the terminology that surrounds cloud computing, so let us go back to basics. The generally accepted definition of cloud terminology is in NIST SP-800-145. According to this there are three service models and four deployment models. The service models are IaaS, PaaS and SaaS. The four deployment models for cloud computing are: Public Cloud, Private Cloud, Community Cloud and Hybrid Cloud. So “Hybrid” relates to the way cloud services are deployed. The NIST definition of the Hybrid Cloud is:

“The cloud infrastructure is a composition of two or more distinct cloud infrastructures (private, community, or public) that remain unique entities, but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load balancing between clouds).”

However sometimes Hybrid is used to describe a cloud strategy – meaning that the organization using the cloud will use cloud services for some kinds of application but not for others. This is a perfectly reasonable strategy but not quite in line with the above definition. So I refer to this as a Hybrid Cloud Strategy.

In fact this leads us on to the reality for most enterprises: the cloud is just another way of obtaining some of their IT services. Cloud services may be the ideal solution for development because of the speed with which they can be obtained. They may be good for customer interaction services because of their scalability. They may be the best way to perform data analytics needing the occasional burst of very high performance computing. Hence, to the enterprise, the cloud becomes another added complexity in their already complex IT environment.

So the CSPs have recognised that in order to tempt enterprises to use their cloud services they need to recognise this complexity challenge that enterprises face and provide help to solve it. So the “Hybrid” cloud that will be attractive to enterprises needs to:

* Enable the customer to easily migrate some parts of their workload and data to a cloud service. This is because there may be some data that is required to remain on premise for compliance or audit reasons.

* Orchestrate the end to end processing which may involve on premise as well as services from other cloud providers.

* Allow the customer to assure the end to end security and compliance for their workload.

When you look at these requirements it becomes clear that standards are going to be a key component to allow this degree of flexibility and interoperability. The standards needed go beyond the support for hypervisors, operating systems, databases and middleware to include the deployment, management and security of workloads in a common way across on premise and cloud deployments as well as between cloud services from different vendors.

There is no clear winner in the standards yet – although OpenStack has wide support including from IBM, HP and Rackspace – but one of the challenges is that vendors offer versions of this with their own proprietary additions. Other important vendors have their own proprietary offerings that they would like customers to adopt including AWS, Microsoft and VMWare. So the game is not over yet, but the industry should recognize that the real requirement is for a “Standard IT” that can easily be deployed in whatever way is most appropriate at any given time.


EMC to acquire Virtustream

May 27, 2015 by Mike Small

On May 26th EMC announced that it is to acquire the privately held company Virtustream. What does this mean and what are the implications?

Virtustream is both a software vendor and a cloud service provider (CSP). Its software offering includes a cloud management platform xStream, an infrastructure assessment product Advisor, and the risk and compliance management software, ViewTrust. It also offers Infrastructure as a Service (IaaS) with datacentres in the US and Europe. KuppingerCole identified Virtustream as a “hidden gem” in our report: Leadership Compass: Infrastructure as a Service - 70959

The combination of these products has been used by Virtustream to target the Fortune 500 companies and help them along their journey to the cloud. Legacy applications often have very specific needs that are difficult to reproduce in the vanilla cloud, and risk and compliance issues are the top concerns when migrating systems of record to the cloud.

In addition the Virtustream technology works with VMware to provide an extra degree of resource optimization through their Micro Virtual Machine (µVM) approach. This approach uses smaller units of allocation for both memory and processor, which removes artificial sizing boundaries, makes it easier to track the resources consumed, and results in less wasted resource.

The xStream cloud management software enables the management of hybrid clouds through a “single pane of glass” management console using open published APIs. It is claimed to provide enterprise-grade security with integrity built upon capabilities in the processors. Virtustream was the first CSP to announce support for NIST draft report IR 7904 Trusted Geolocation in the Cloud: Proof of Concept Implementation. This allows the user to control the geolocation of their data held in the cloud.

EMC already provides their Federation Enterprise Hybrid Cloud Solution — an on premise private cloud offering that provides a stepping stone to public cloud services. EMC also recently entered the cloud service market with vCloud Air, an IaaS service based on VMware. Since many organizations already use VMware to run their IT on premise, this was intended to make it possible to migrate these workloads to the cloud without change. An assessment of vCloud Air is also included in our Leadership Compass Report on Infrastructure as a Service – 70959.

The early focus by CSPs was on DevOps, but the market for enterprise grade cloud solutions is a growth area as large organizations look to save costs by datacentre consolidation and “cloud sourcing” IT services. However success in this market needs the right combination of consultancy services, assurance and trust. Virtustream seems to have met with some success in attracting US organizations to their service. The challenge for EMC is to clearly differentiate between the different cloud offerings they now have and to compete with the existing strong players in this market. As well as the usual challenges of integrating itself into the EMC group, Virtustream may also find it difficult to focus on both providing cloud services and developing software.


Risk and Governance in Analytics

May 12, 2015 by Mike Small

There is now an enormous quantity of data being generated in a wide variety of forms. However this data, in itself, has little meaning or value; it needs interpretation to make it useful. Analytics are the tools, techniques and technologies that can be used to turn this data into information with value. These analytics are now being widely adopted by organizations to improve their performance. However, what are the security and governance aspects of the use of these tools?

For example, Dunnhumby was created in 1989 by a husband and wife team to help businesses better understand their customers by being 'voyeurs of the shopping basket'. Within a few years they were working with Tesco to develop the Clubcard loyalty program. The insights from this helped Tesco stock the right products, optimize prices, run relevant promotions and communicate personalized offers to customers across all contact channels.

However another side to this kind of analysis was described in the NY Times article How Companies Learn Your Secrets - NYTimes.com. According to this article a statistician working for the US retailer Target figured out how to identify customers in the second trimester of their pregnancy based on buying habits and other customer data. The targeted advertising based on this led to an angry father complaining to a store manager about advertising for baby clothes and cribs being sent to his daughter who was still in high school. It turned out that the analytics had worked out she was in fact pregnant but she had not told her father.

These examples based on loyalty cards illustrate the value of data analytics, but the problem is now even more difficult. This is because the amount of data being generated through smart devices and Apps vastly exceeds that from the occasional use of a loyalty card.

So where is the line between improving service to customers and invading their privacy? At what point does the aggregation and analysis of data become a threat rather than a benefit? These are difficult questions to answer and regulations and the law provide little help. For example when a customer in the UK accepts a customer loyalty card they accept the terms and conditions. These will almost certainly include an agreement that the card provider can use the data collected through its use in a wide variety of ways. Most people do not read the small print – they simply want the loyalty rewards. Those who do read the small print are unlikely to understand the full implication of what they are agreeing to. However under the data protection laws this agreement is considered to be “informed consent”. So is this a fair bargain? Based on the take up of loyalty cards in the UK - for most people it is.

So from the point of view of an organization that wants to get closer to its customers, to provide better products and to become more competitive, data analytics are a powerful tool. According to Erik Brynjolfsson, Professor at the MIT Sloan School of Management: “Companies with ‘data driven decision making’ actually show higher performance”. Working with Lorin Hitt and Heekyung Kim, Professor Brynjolfsson analyzed 179 large publicly-traded firms and found that the ones that adopted this method are about 5% more productive and profitable than their competitors. Furthermore, the study found a relationship between this method and other performance measures such as asset utilization, return on equity and market value.

But what are the risks to the organization in using these forms of analytics? Firstly, it is important to be sure of the accuracy of the data. Can you be sure of the source of data which originates from outside of your organization and outside of your control? Many consumers take steps to cloak their identity by using multiple personas, and the Internet of Things may provide a rich source of data but without guarantees regarding its provenance or accuracy.

Secondly, if you are sure of the data, what about the conclusions from the analysis? Can the analytics process provide an explanation of why it has reached its conclusions that you can understand? If not, be careful before you bet the farm on the results.

Finally, are you sure that you have permission to use the data at all, and in that way in particular? In the EU there are many rules regarding the privacy of personal information. An individual gives data to a third party (known as the data controller) for a specific purpose. The data controller is required to hold only the minimum data and to process it only for the agreed purpose.

If you are going to use analytics, the decision should involve the board of directors. They should set the business objectives for its use, define the policies for its governance, and set their appetite for the risks relating to its use.

This article originally appeared in the KuppingerCole Analysts' View newsletter.


AWS Announces Machine Learning Service

Apr 10, 2015 by Mike Small

AWS has recently announced the Amazon Machine Learning service – what is this and what does it mean for customers? 

Organizations now hold enormous quantities of data, and more data in a wide variety of forms is rapidly being generated. Research has shown that organizations that base their decision making and processes on data are more successful than those that do not. However, interpretation and analysis are needed to transform this data into useful information. Data analysis and interpretation is not easy, and there are many tools on the market to help transform raw data into valuable information.

The challenge that most organizations face is that special skills are needed to analyze their data, and these skills are not widely available. In addition, to make use of the data, the analysis and results need to be tightly integrated with the existing data sources and applications. However, in general, software developers do not have the required data analysis skills. AWS believe that their newly launched Amazon Machine Learning service will overcome these two challenges.

AWS leveraged the data analysis tools and techniques that were developed for the Amazon.com retail organization when designing and building the ML service.  These are the underlying tools that try to anticipate the interests of buyers so as to direct them to the item they want and hence to make a purchase more likely.  Given the success of Amazon.com these tools and techniques ought to be very useful to the organizations wanting to get closer to their retail customers. 

In addition, according to AWS, the service can be used without expertise in the area of data analytics. The service provides features that can be used by software developers to build a model based on imperfect data, to validate that the predictions from the model are accurate, and then to deploy that model in a way that can easily be integrated with existing applications without change. AWS shared an anecdotal example in which their service was able to create a model in 20 minutes with the same accuracy as a model that took two software developers a month to create manually.

As you would expect, the new service is tightly integrated with AWS data sources such as Amazon S3, Amazon Redshift and Amazon RDS. It can be invoked to provide predictions in real time; for example, to enable an application to detect fraudulent transactions as they come in.
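The build, validate and deploy flow described above can be sketched against the Amazon Machine Learning API as exposed by boto3's machinelearning client. This is a hedged illustration rather than AWS documentation: the bucket, schema, IDs and the fraud-detection record are hypothetical placeholders, and in practice you would wait for the data source, model and endpoint to reach a ready state between steps:

```python
# Sketch of the Amazon Machine Learning flow: point the service at
# training data in S3, train a binary model, expose a real-time
# endpoint and score a single record. All names are placeholders.

def training_params(bucket: str, key: str, schema_json: str) -> dict:
    """Arguments for create_data_source_from_s3: where the training data
    lives in S3 and how its columns are typed."""
    return {
        "DataSourceId": "ds-fraud-training",          # hypothetical ID
        "DataSpec": {
            "DataLocationS3": f"s3://{bucket}/{key}",
            "DataSchema": schema_json,
        },
        "ComputeStatistics": True,   # statistics are needed before training
    }

def train_and_predict(params: dict, record: dict) -> dict:
    """Create the data source, train a BINARY model (e.g. fraudulent vs
    legitimate) and request one real-time prediction."""
    import boto3   # deferred so training_params works without boto3 installed
    ml = boto3.client("machinelearning")
    ml.create_data_source_from_s3(**params)
    ml.create_ml_model(
        MLModelId="ml-fraud-detector",
        MLModelType="BINARY",
        TrainingDataSourceId=params["DataSourceId"],
    )
    endpoint = ml.create_realtime_endpoint(MLModelId="ml-fraud-detector")
    return ml.predict(
        MLModelId="ml-fraud-detector",
        Record=record,   # e.g. {"amount": "250.0", "country": "DE"}
        PredictEndpoint=endpoint["RealtimeEndpointInfo"]["EndpointUrl"],
    )
```

The appeal of the service, as described above, is that a developer only supplies the data location and schema; feature statistics, model training and hosting are handled by AWS.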

However, there are security and governance aspects to the use of this kind of tool. The recent KuppingerCole Newsletter on Data Analytics discussed the problem of how to draw the line between improving service to customers and invading their privacy. At what point does the aggregation and analysis of data become a threat rather than a benefit? These are difficult questions to answer, and regulations and the law provide little help.

However, from the point of view of an organization that wants to get closer to its customers, to provide better products, and to become more competitive, data analytics are a powerful tool. In the past the limiting factor has been the skills involved in the analysis, and machine learning is a way to overcome this limitation.

Using this form of analytics does have some risks. Firstly, it is important to be sure of the accuracy of the data. This is especially true if the data comes from a source which is outside of your control. Secondly, can you understand the model and the conclusions from the analytics process? An explanation would be nice. If not, be careful before you bet the farm on the results. Correlations and associations are not cause and effect – make sure the results are valid. Finally, are you sure that you have permission to use the data at all, and in that way in particular? Privacy rules can limit the use you can make of personal data.

Overall, AWS Machine Learning provides an attractive solution to enable an organization to become more data driven. However it is important to set the business objectives for the use of this approach, to define the policies for its governance, and the appetite for risks relating to its use.


Migrating IT Infrastructure to the Cloud

Mar 10, 2015 by Mike Small

Much has been written about “DevOps” but there are other ways for organizations to benefit from the cloud. Moving all or part of their existing IT infrastructure and applications could provide savings in capital and, in many cases, increase security.

The cloud has provided an enormous opportunity for organizations to create new markets, to experiment and develop new applications without the need for upfront investment in hardware and to create disposable applications for marketing campaigns. This approach is generally known as DevOps; where the application is developed and deployed into operation in an iterative manner which is made possible by an easily expansible cloud infrastructure.

While DevOps has produced some remarkable results, it doesn’t help with the organization’s existing IT infrastructure. There are many reasons why an organization could benefit from moving some of their existing IT systems to the cloud. Cost is one, but there are others, including the need to constantly update hardware and to maintain a data centre. Many small organizations are limited to operating in premises that are not suitable as a datacentre; for example in offices over a shopping mall. Although the organization may be wholly dependent upon their IT systems, they may have no control over sprinkler systems, power, telecommunications, or even guaranteed 24x7 access to the building. They may be at risk of theft as well as fire, and of incidents outside of their control. These are all factors which are well taken care of by cloud service providers (CSP) hosted in Tier III data centres.

However, moving existing IT systems and applications to the cloud is not as simple. These legacy applications may depend upon very specific characteristics of the existing infrastructure, such as IP address ranges or a particular technology stack, which may be difficult to reproduce in standard cloud environments. It is also important for customers to understand the sensitivity of the systems and data that they are moving to the cloud and the risks to which these may be exposed. Performing a cloud readiness risk assessment is an essential prerequisite for an organization planning to use cloud services. Many of the issues around this relate to regulation and compliance and are described in KuppingerCole Analysts' View on Compliance Risks for Multinationals.
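A cloud readiness risk assessment can be as simple as scoring each workload against a handful of weighted criteria. The criteria, weights and thresholds below are invented for illustration and are not KuppingerCole's methodology; a real assessment would be far more thorough.

```python
# Illustrative sketch of a cloud readiness scoring exercise.
# Criteria, weights and thresholds are invented for the example.

CRITERIA = {
    "data_sensitivity": 3,    # personal or regulated data involved?
    "hard_ip_dependency": 2,  # relies on fixed IP ranges / a legacy stack?
    "regulatory_limits": 3,   # data residency or sector regulation applies?
    "downtime_tolerance": 1,  # can the business absorb migration downtime?
}

def readiness_score(answers):
    """Weighted risk score: answers map criterion -> 0 (no issue) .. 2 (blocker)."""
    return sum(CRITERIA[c] * answers.get(c, 0) for c in CRITERIA)

def verdict(score):
    if score <= 4:
        return "migrate"
    if score <= 9:
        return "migrate with mitigation"
    return "re-architect or keep on premises"

# A hypothetical legacy application with sensitive data and hard-coded IPs.
legacy_app = {"data_sensitivity": 2, "hard_ip_dependency": 2,
              "regulatory_limits": 1, "downtime_tolerance": 0}
score = readiness_score(legacy_app)
print(score, verdict(score))
```

Even a crude scorecard like this forces the questions about sensitivity, dependencies and regulation to be asked before the migration rather than after it.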

However, it was interesting to hear of a US-based CSP, dinCloud, that is focussing on this market. dinCloud first brought a hosted virtual desktop to the market. It has now expanded its offering to include servers, applications and IT infrastructure. dinCloud claims that its “Business Provisioning” service can help organizations to quickly and easily migrate all or part of their existing infrastructure to the cloud.

This is a laudable aim; dinCloud claims some successes in the US and intends to expand worldwide. However, some of the challenges that it will face in Europe are the same as those currently faced by all US-based CSPs: a lack of trust. Some of this has arisen through the Snowden revelations, and the ongoing court case in which Microsoft in Ireland is being required to hand over emails to the US authorities is fanning these flames. On top of this, the EU privacy regulations, which are already strict, face being strengthened, and in some countries certain kinds of data must remain within the country. These challenges are discussed in Martin Kuppinger's blog Can EU customers rely on US Cloud Providers?

This is an interesting initiative but to succeed in Europe dinCloud will need to win the trust of their potential customers. This will mean expanding their datacentre footprint into the EU/EEA and providing independent evidence of their security and compliance. When using a cloud service a cloud customer has to trust the CSP; independent certification, balanced contracts taking specifics of local regulations and requirements into account, and independent risk assessments are the best way of allowing the customer to verify that trust.


Organization, Security and Compliance for the IoT

Mar 03, 2015 by Mike Small

The Internet of Things (IoT) provides opportunities for organizations to get closer to their customers and to provide products and services that are more closely aligned to their needs. It provides the potential to enhance the quality of life for individuals through better access to information and more control over their environment. It makes possible more efficient use of infrastructure by more precise control based on detailed and up-to-date information. It will change the way goods are manufactured by integrating manufacturing machinery, customers and partners, allowing greater product customization as well as optimizing costs, processes and logistics.

However, the IoT comes with risks. The US Federal Trade Commission recently published a report of a workshop it held on this subject. This report, which is limited in its scope to IoT devices sold to or used by consumers, identifies three major risks: enabling unauthorised access and misuse of personal information, facilitating attacks on other systems, and creating risks to personal safety. In KuppingerCole's view the wider risks are summarized in the following figure:

Organizations adopting this technology need to be aware of and manage these risks. As with most new technologies there is often a belief that there is a need to create a new organizational structure. In fact it is more important to ensure that the existing organization understands and addresses the potential risks as well as the potential rewards.

Organizations should take a well-governed approach to the IoT by clearly defining the business objectives for its use and by setting constraints. The IoT technology used should be built to be trustworthy and should be used in a way that is compliant with privacy laws and regulations. Finally, the organization should be able to audit and assure its use of the IoT.

The benefits from the IoT come from the vast amount of data that can be collected, analysed and exploited. Hence the challenges of Big Data governance, security and management are inextricably linked with the IoT. The data needs to be trustworthy, and it should be possible to confirm both its source and integrity. The infrastructure used for the acquisition, storage and analysis of this data needs to be secured; yet the IoT is being built using many existing protocols and technologies that are weak and vulnerable.
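One common building block for confirming the source and integrity of sensor data is a keyed message authentication code. The sketch below uses Python's standard hmac module; the device key and the reading format are invented for the example, and real deployments would need proper key provisioning and rotation.

```python
import hashlib
import hmac

# Hypothetical shared key provisioned to a sensor at manufacture time.
DEVICE_KEY = b"example-device-key-not-for-production"

def sign_reading(payload: bytes) -> bytes:
    """Tag a sensor reading so the collector can verify source and integrity."""
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()

def verify_reading(payload: bytes, tag: bytes) -> bool:
    """Constant-time comparison guards against timing attacks."""
    return hmac.compare_digest(sign_reading(payload), tag)

reading = b'{"sensor": "t-17", "temp_c": 21.4}'
tag = sign_reading(reading)
assert verify_reading(reading, tag)             # untampered data verifies
assert not verify_reading(reading + b"x", tag)  # tampered data is rejected
```

A reading whose tag does not verify should be discarded rather than fed into analytics, since decisions built on untrusted data inherit its weaknesses.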

The devices which form part of the IoT must be designed, manufactured, installed and configured to be trustworthy. The security built into these devices for the risks identified today needs to be extensible to be proof against future threats, since many of these devices will have lives measured in decades. There are existing low-power secure technologies and standards that have been developed for mobile communications and banking, and these should be appropriately adopted, adapted and improved to secure the devices.

Trust in the devices is based on trust in their identities, and so these identities need to be properly managed. There are a number of challenges relating to this area, but as yet there is no general solution.

Organizations exploiting data from the IoT should do so in a way that complies with laws and regulations. For personal information, particular care should be given to aspects such as ensuring informed consent, data minimisation and information stewardship. There is a specific challenge in ensuring that users understand and accept that ownership of the device does not imply complete “ownership” of its data. It is important that the lifecycle of data from the IoT is properly managed, from creation or acquisition to disposal. An organization should have a clear policy which identifies which data needs to be kept, why it needs to be kept and for how long. There should also be a clear policy for the deletion of data that is not retained for compliance or regulatory reasons.
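Such a retention policy can be expressed very directly in code. The categories and retention periods below are invented for illustration; the one design point worth noting is that unclassified data defaults to deletion, which forces every data set to be explicitly classified.

```python
from datetime import date, timedelta

# Illustrative retention policy: categories and periods are invented.
RETENTION = {
    "billing": timedelta(days=7 * 365),  # kept for regulatory reasons
    "telemetry": timedelta(days=90),     # kept only while operationally useful
    "marketing": timedelta(days=30),
}

def should_delete(category: str, created: date, today: date) -> bool:
    """Delete anything past its retention period; unknown categories
    default to deletion, forcing them to be classified explicitly."""
    period = RETENTION.get(category)
    if period is None:
        return True
    return today - created > period

today = date(2015, 3, 3)
assert should_delete("telemetry", date(2014, 11, 1), today)       # past 90 days
assert not should_delete("billing", date(2014, 11, 1), today)     # within 7 years
assert should_delete("unclassified", date(2015, 3, 1), today)     # no category
```

Running such a check as part of routine housekeeping turns the written policy into an enforced one.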

This article originally appeared in the KuppingerCole Analysts' View newsletter.


Where is my Workload?

Jan 15, 2015 by Mike Small

One of the major challenges facing organizations that use a cloud or hosting service is knowing where their data is held and processed. This may be to ensure that they remain in compliance with laws and regulations, or simply because they mistrust certain geo-political regions. The location of this data may be defined in the contract with the CSP (Cloud Service Provider), but how can the organization using the service be sure that the contract is being met? This question has made many organizations reluctant to use the cloud.

Using the cloud is not the only reason for this concern; my colleague Martin Kuppinger has previously blogged on this subject. Once information is outside of the system, it is out of control and potentially lost somewhere in an information heaven or hell.

One approach to this problem is to encrypt the data so that, if it moves outside of your control, it is protected against unauthorized access. This can be straightforward encryption for structured application data, or encryption using private and public keys, as in some RMS systems, for unstructured data like documents. However, as soon as the data is decrypted the risk re-emerges. One approach to this could be to make use of “sticky access policies”.
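As a concrete sketch of the first approach, the snippet below encrypts a record before it is handed to an external provider. It uses the third-party cryptography package (an assumption; any authenticated encryption library would serve), and key management, the hard part in practice, is deliberately out of scope.

```python
# Sketch of protecting data before it leaves your control, using the
# third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in reality, held in an HSM or key vault
f = Fernet(key)

record = b"customer=acme; credit-limit=50000"
token = f.encrypt(record)    # safe to store with an external provider

# Without the key, the ciphertext is useless to the provider or a thief.
assert token != record
# With the key, the data, and the risk, re-emerge on decryption.
assert f.decrypt(token) == record
```

The asymmetry is the point: the provider stores only the token, while the key, and therefore control, stays with the customer.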

However, while these approaches may protect against leakage, they don't let you ensure that your data is being processed in a trusted environment. What is needed is a way to control where your workload is run, in a secure and trusted manner. This control must be achieved without adding extra security concerns; for example, allowing you to control where your data is must not allow an attacker to find your data more easily.

Two years ago NIST published a draft report, IR 7904 Trusted Geolocation in the Cloud: Proof of Concept Implementation. The report describes the challenges that this poses and sets out a proposed approach that meets them and could be implemented as a proof of concept. The US-based cloud service provider Virtustream recently announced that its service now supports this capability. It states: “This capability allows our customers to specify what data centre locations that their data can be hosted at and what data centres cannot host their data. This is programmatically managed with our xStream cloud orchestration application.”

The NIST document describes three stages that are needed in the implementation of this approach:

  1. Platform Attestation and Safer Hypervisor Launch. This ensures that the cloud workloads are run on trusted server platforms. To achieve this you need to:
    1. Configure a cloud server platform as being trusted.
    2. Before each hypervisor launch, verify (measure) the trustworthiness of the cloud server platform.
    3. During hypervisor execution, periodically audit the trustworthiness of the cloud server platform.
  2. Trust-Based Homogeneous Secure Migration. This stage allows cloud workloads to be migrated among homogeneous trusted server platforms within a cloud.
    1. Deploy workloads only to cloud servers with trusted platforms.
    2. Migrate workloads on trusted platforms to homogeneous cloud servers on trusted platforms; prohibit migration of workloads between trusted and untrusted servers.
  3. Trust-Based and Geolocation-Based Homogeneous Secure Migration. This stage allows cloud workloads to be migrated among homogeneous trusted server platforms within a cloud, taking into consideration geolocation restrictions.
    1. Have trusted geolocation information for each trusted platform instance.
    2. Provide configuration management and policy enforcement mechanisms for trusted platforms that include enforcement of geolocation restrictions.
    3. During hypervisor execution, periodically audit the geolocation of the cloud server platform against geolocation policy restrictions.
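At migration time, the three NIST stages boil down to a simple enforcement rule: a workload may only move to a server that is both attested as trusted and located in a permitted region. A minimal sketch of that rule, with invented server names, attestation results and regions:

```python
# Minimal sketch of trust- and geolocation-based migration policy
# (in the spirit of NIST IR 7904 stages 2 and 3).
# Server names, attestation results and regions are invented.

SERVERS = {
    "host-a": {"trusted": True,  "geo": "DE"},
    "host-b": {"trusted": True,  "geo": "US"},
    "host-c": {"trusted": False, "geo": "DE"},
}

def can_migrate(workload_geo_policy, target):
    """Allow migration only to an attested-trusted platform in a permitted location."""
    server = SERVERS[target]
    return server["trusted"] and server["geo"] in workload_geo_policy

policy = {"DE", "NL"}  # the workload must stay in these jurisdictions
assert can_migrate(policy, "host-a")      # trusted and in DE: allowed
assert not can_migrate(policy, "host-b")  # trusted but outside the policy
assert not can_migrate(policy, "host-c")  # in DE but the platform is untrusted
```

In a real implementation, the "trusted" flag would come from periodic platform attestation and the geolocation from a trusted source, as the NIST stages require, rather than from a static table.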
This is an interesting initiative by Virtustream and, since it is implemented through their xStream software, which is used by other CSPs, it is to be hoped that this kind of functionality will be more widely offered. When using a cloud service, a cloud customer has to trust the CSP; KuppingerCole's advice is trust but verify. This approach has the potential to allow verification by the customer.


Author info

Mike Small
Fellow Analyst