
Blog posts by Mike Small

EMC to acquire Virtustream

May 27, 2015 by Mike Small

On May 26th EMC announced that it is to acquire the privately held company Virtustream. What does this mean and what are the implications?

Virtustream is both a software vendor and a cloud service provider (CSP). Its software offering includes the cloud management platform xStream, the infrastructure assessment product Advisor, and the risk and compliance management software ViewTrust. It also offers Infrastructure as a Service (IaaS) with datacentres in the US and Europe. KuppingerCole identified Virtustream as a “hidden gem” in our report Leadership Compass: Infrastructure as a Service - 70959.

Virtustream has used the combination of these products to target Fortune 500 companies and help them along their journey to the cloud. Legacy applications often have very specific needs that are difficult to reproduce in a vanilla cloud, and risk and compliance issues are the top concerns when migrating systems of record to the cloud.

In addition, the Virtustream technology works with VMware to provide an extra degree of resource optimization through its Micro Virtual Machine (µVM) approach. This approach uses smaller units of allocation for both memory and processor, which removes artificial sizing boundaries, makes it easier to track the resources consumed, and wastes fewer resources.
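To illustrate the effect of allocation granularity, here is a small sketch in Python; the unit sizes and workload figures are invented for illustration and are not Virtustream's actual values. Rounding each workload up to a coarse allocation unit leaves more capacity idle than rounding to a fine-grained one.

```python
import math

# Hypothetical workload demands as (CPU in MHz, memory in MB) -- illustrative only.
workloads = [(350, 700), (1200, 1800), (90, 300), (2600, 5100)]

def allocated(demands, cpu_unit, mem_unit):
    """Round each demand up to the allocation unit and sum the totals."""
    cpu = sum(math.ceil(c / cpu_unit) * cpu_unit for c, _ in demands)
    mem = sum(math.ceil(m / mem_unit) * mem_unit for _, m in demands)
    return cpu, mem

requested = (sum(c for c, _ in workloads), sum(m for _, m in workloads))
# Coarse units: whole vCPUs (say 2000 MHz) and 1 GB memory blocks.
coarse = allocated(workloads, cpu_unit=2000, mem_unit=1024)
# Finer units in the spirit of a micro-VM approach: 200 MHz and 256 MB.
fine = allocated(workloads, cpu_unit=200, mem_unit=256)

print("requested:         ", requested)
print("coarse allocation: ", coarse)  # much more headroom is wasted
print("fine allocation:   ", fine)    # allocation tracks demand more closely
```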

The xStream cloud management software enables the management of hybrid clouds through a “single pane of glass” management console using open, published APIs. It is claimed to provide enterprise-grade security with integrity built upon capabilities in the processors. Virtustream was the first CSP to announce support for NIST draft report IR 7904 Trusted Geolocation in the Cloud: Proof of Concept Implementation. This allows the user to control the geolocation of their data held in the cloud.

EMC already provides its Federation Enterprise Hybrid Cloud Solution — an on-premise private cloud offering that provides a stepping stone to public cloud services. EMC also recently entered the cloud service market with vCloud Air, an IaaS service based on VMware. Since many organizations already use VMware to run their IT on premise, the intention was to make it possible to migrate these workloads to the cloud without change. An assessment of vCloud Air is also included in our Leadership Compass Report on Infrastructure as a Service – 70959.

The early focus by CSPs was on DevOps, but the market for enterprise-grade cloud solutions is a growth area as large organizations look to save costs by datacentre consolidation and “cloud sourcing” IT services. However, success in this market needs the right combination of consultancy services, assurance and trust. Virtustream seems to have met with some success in attracting US organizations to its service. The challenge for EMC is to clearly differentiate between the different cloud offerings it now has and to compete with the existing strong players in this market. As well as the usual challenges of integrating itself into the EMC group, Virtustream may also find it difficult to focus on providing cloud services as well as developing software.



Risk and Governance in Analytics

May 12, 2015 by Mike Small

There is now an enormous quantity of data being generated in a wide variety of forms. However this data, in itself, has little meaning or value; it needs interpretation to make it useful. Analytics are the tools, techniques and technologies that can be used to turn this data into information with value. These analytics are now being widely adopted by organizations to improve their performance. However, what are the security and governance aspects of the use of these tools?

Take, for example, Dunnhumby, which was created in 1989 by a husband-and-wife team to help businesses better understand their customers by being 'voyeurs of the shopping basket'. Within a few years, they were working with Tesco to develop the Clubcard loyalty program. The insights from this helped Tesco stock the right products, optimize prices, run relevant promotions and communicate personalized offers to customers across all contact channels.

However another side to this kind of analysis was described in the NY Times article How Companies Learn Your Secrets - NYTimes.com. According to this article, a statistician working for the US retailer Target figured out how to identify customers in the second trimester of their pregnancy based on buying habits and other customer data. The targeted advertising based on this led to an angry father complaining to a store manager about advertising for baby clothes and cribs being sent to his daughter, who was still in high school. It turned out that the analytics had worked out that she was in fact pregnant, but she had not told her father.

These examples based on loyalty cards illustrate the value of data analytics, but the problem is now even more difficult, because the amount of data being generated through smart devices and apps vastly exceeds that from the occasional use of a loyalty card.

So where is the line between improving service to customers and invading their privacy? At what point does the aggregation and analysis of data become a threat rather than a benefit? These are difficult questions to answer, and regulations and the law provide little help. For example, when a customer in the UK accepts a customer loyalty card they accept the terms and conditions. These will almost certainly include an agreement that the card provider can use the data collected through its use in a wide variety of ways. Most people do not read the small print – they simply want the loyalty rewards. Those who do read the small print are unlikely to understand the full implications of what they are agreeing to. However, under the data protection laws this agreement is considered to be “informed consent”. So is this a fair bargain? Based on the take-up of loyalty cards in the UK, for most people it is.

So from the point of view of an organization that wants to get closer to its customers, to provide better products, and to become more competitive, data analytics are a powerful tool. According to Erik Brynjolfsson, Professor at the MIT Sloan School of Management: “Companies with ‘data driven decision making’ actually show higher performance”. Working with Lorin Hitt and Heekyung Kim, Professor Brynjolfsson analyzed 179 large publicly-traded firms and found that the ones that adopted this method are about 5% more productive and profitable than their competitors. Furthermore, the study found a relationship between this method and other performance measures such as asset utilization, return on equity and market value.

But what are the risks to the organization in using these forms of analytics? Firstly, it is important to be sure of the accuracy of the data.

Can you be sure of the source of data which originates from outside of your organization and outside of your control? Many consumers take steps to cloak their identity by using multiple personas, and the Internet of Things may provide a rich source of data but without guarantees regarding its provenance or accuracy. If you are sure of the data, what about the conclusions drawn from the analysis?

Can the analytics process provide an explanation, which you can understand, of why it has reached its conclusions? If not, be careful before you bet the farm on the results.

Are you sure that you have permission to use the data at all and in that way in particular? In the EU there are many rules regarding the privacy of personal information. An individual gives data to a third party (known as the data controller) for a specific purpose. The data controller is required to only hold the minimum data and to only process it for the agreed purpose.

The decision to use analytics is one that should involve the board of directors. They should set the business objectives for its use, define the policies for its governance, and define their appetite for the risks relating to its use.

This article originally appeared in the KuppingerCole Analysts' View newsletter.



AWS Announces Machine Learning Service

Apr 10, 2015 by Mike Small

AWS has recently announced the Amazon Machine Learning service – what is this and what does it mean for customers? 

Organizations now hold enormous quantities of data, and more data in a wide variety of forms is rapidly being generated. Research has shown that organizations that base their decision making and processes on data are more successful than those that do not. However, interpretation and analysis are needed to transform this data into useful information. Data analysis and interpretation are not easy, and there are many tools on the market to help transform raw data into valuable information.

The challenge that most organizations face is that special skills are needed to analyze their data, and these skills are not widely available. In addition, to make use of the data, the analysis and results need to be tightly integrated with the existing data sources and applications. However, in general, software developers do not have the required data analysis skills. AWS believe that their newly launched Amazon Machine Learning service will overcome these two challenges.

AWS leveraged the data analysis tools and techniques that were developed for the Amazon.com retail organization when designing and building the ML service. These are the underlying tools that try to anticipate the interests of buyers so as to direct them to the item they want and hence make a purchase more likely. Given the success of Amazon.com, these tools and techniques ought to be very useful to organizations wanting to get closer to their retail customers.

In addition, according to AWS, the service can be used without the need for expertise in the area of data analytics. The service provides features that software developers can use to build a model based on imperfect data, to validate that the predictions from the model are accurate, and then to deploy that model in a way that can easily be integrated with existing applications without changing them. AWS shared an anecdotal example in which their service was able to create a model in 20 minutes with the same accuracy as a model that took two software developers a month to create manually.

As you would expect, the new service is tightly integrated with AWS data sources such as Amazon S3, Amazon Redshift and Amazon RDS. It can be invoked to provide predictions in real time; for example, to enable an application to detect fraudulent transactions as they come in.
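As a rough sketch of what such a workflow might look like in code, the following uses the boto3 machinelearning client. The S3 locations, identifiers and record fields are hypothetical, the calls are asynchronous in practice (the model must finish training before predictions can be requested), and the exact parameters should be checked against the AWS SDK documentation.

```python
import boto3

ml = boto3.client("machinelearning", region_name="us-east-1")

# Point the service at training data already held in S3 (bucket and schema are hypothetical).
ml.create_data_source_from_s3(
    DataSourceId="ds-transactions-train",
    DataSourceName="transaction training data",
    DataSpec={
        "DataLocationS3": "s3://example-bucket/transactions.csv",
        "DataSchemaLocationS3": "s3://example-bucket/transactions.csv.schema",
    },
    ComputeStatistics=True,
)

# Train a binary classification model, e.g. fraudulent / not fraudulent.
ml.create_ml_model(
    MLModelId="ml-fraud-model",
    MLModelName="fraud detection model",
    MLModelType="BINARY",
    TrainingDataSourceId="ds-transactions-train",
)

# Once training has completed, expose the model as a real-time endpoint
# and ask for a prediction on a single incoming transaction.
endpoint = ml.create_realtime_endpoint(MLModelId="ml-fraud-model")
response = ml.predict(
    MLModelId="ml-fraud-model",
    Record={"amount": "250.00", "merchant": "example-store", "country": "DE"},
    PredictEndpoint=endpoint["RealtimeEndpointInfo"]["EndpointUrl"],
)
print(response["Prediction"])
```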

However, there are security and governance aspects to the use of this kind of tool. The recent KuppingerCole Newsletter on Data Analytics discussed the problem of how to draw the line between improving service to customers and invading their privacy. At what point does the aggregation and analysis of data become a threat rather than a benefit? These are difficult questions to answer, and regulations and the law provide little help.

However, from the point of view of an organization that wants to get closer to its customers, to provide better products, and to become more competitive, data analytics are a powerful tool. In the past the limiting factor has been the skills involved in the analysis, and machine learning is a way to overcome this limitation.

Using this form of analytics does have some risks. Firstly, it is important to be sure of the accuracy of the data. This is especially true if the data comes from a source outside of your control. Secondly, can you understand the model and the conclusions from the analytics process, and can it provide an explanation? If not, be careful before you bet the farm on the results. Correlations and associations are not cause and effect – make sure the results are valid. Finally, are you sure that you have permission to use the data at all, and in that way in particular? Privacy rules can limit the use you can make of personal data.

Overall, AWS Machine Learning provides an attractive solution to enable an organization to become more data driven. However, it is important to set the business objectives for the use of this approach, to define the policies for its governance, and to define the appetite for the risks relating to its use.



Migrating IT Infrastructure to the Cloud

Mar 10, 2015 by Mike Small

Much has been written about “DevOps” but there are other ways for organizations to benefit from the cloud. Moving all or part of their existing IT infrastructure and applications could provide savings in capital and, in many cases, increase security.

The cloud has provided an enormous opportunity for organizations to create new markets, to experiment and develop new applications without the need for upfront investment in hardware, and to create disposable applications for marketing campaigns. This approach is generally known as DevOps, where the application is developed and deployed into operation in an iterative manner, made possible by an easily expandable cloud infrastructure.

While DevOps has produced some remarkable results, it doesn’t help with the organization’s existing IT infrastructure. There are many reasons why an organization could benefit from moving some of its existing IT systems to the cloud. Cost is one, but there are others, including the need to constantly update hardware and to maintain a data centre. Many small organizations are limited to operating in premises that are not suitable as a datacentre; for example, in offices over a shopping mall. Although the organization may be wholly dependent upon its IT systems, it may have no control over sprinkler systems, power, telecommunications, or even guaranteed 24x7 access to the building. It may be at risk of theft as well as fire, and of incidents outside of its control. These are all factors which are well taken care of by cloud service providers (CSPs) hosted in Tier III data centres.

However, moving existing IT systems and applications to the cloud is not so simple. These legacy applications may be dependent upon very specific characteristics of the existing infrastructure, such as IP address ranges or a particular technology stack, which may be difficult to reproduce in standard cloud environments. It is also important for customers to understand the sensitivity of the systems and data that they are moving to the cloud and the risks that these may be exposed to. Performing a cloud readiness risk assessment is an essential pre-requisite for an organization planning to use cloud services. Many of the issues around this relate to regulation and compliance and are described in KuppingerCole Analysts' View on Compliance Risks for Multinationals.

However, it was interesting to hear of dinCloud, a US-based CSP that is focusing on this market. dinCloud first brought a hosted virtual desktop to the market. They have now expanded their offering to include servers, applications and IT infrastructure. dinCloud claim that their “Business Provisioning” service can help organizations to quickly and easily migrate all or part of their existing infrastructure to the cloud.

This is a laudable aim; dinCloud claims some successes in the US and intends to expand worldwide. However, one of the challenges that it will face in Europe is the same as that currently faced by all US-based CSPs – a lack of trust. Some of this has arisen through the Snowden revelations, and the ongoing court case in which Microsoft in Ireland is being required to hand over emails to the US authorities is fanning these flames. On top of this, the EU privacy regulations, which are already strict, face being strengthened, and in some countries certain kinds of data must remain within the country. These challenges are discussed in Martin Kuppinger’s blog Can EU customers rely on US Cloud Providers?

This is an interesting initiative but to succeed in Europe dinCloud will need to win the trust of their potential customers. This will mean expanding their datacentre footprint into the EU/EEA and providing independent evidence of their security and compliance. When using a cloud service a cloud customer has to trust the CSP; independent certification, balanced contracts taking specifics of local regulations and requirements into account, and independent risk assessments are the best way of allowing the customer to verify that trust.



Organization, Security and Compliance for the IoT

Mar 03, 2015 by Mike Small

The Internet of Things (IoT) provides opportunities for organizations to get closer to their customers and to provide products and services that are more closely aligned to their needs. It provides the potential to enhance the quality of life for individuals, through better access to information and more control over their environment. It makes possible more efficient use of infrastructure by more precise control based on detailed and up-to-date information. It will change the way goods are manufactured by integrating manufacturing machinery, customers and partners, allowing greater product customization as well as optimizing costs, processes and logistics.

However, the IoT comes with risks. The US Federal Trade Commission recently published a report on a workshop it held on this subject. This report, which is limited in its scope to IoT devices sold or used by consumers, identifies three major risks: enabling unauthorised access and misuse of personal information, facilitating attacks on other systems, and creating risks to personal safety. In KuppingerCole’s view the risks are wider than these.

Organizations adopting this technology need to be aware of and manage these risks. As with most new technologies there is often a belief that there is a need to create a new organizational structure. In fact it is more important to ensure that the existing organization understands and addresses the potential risks as well as the potential rewards.

Organizations should take a well-governed approach to the IoT by clearly defining the business objectives for its use and by setting constraints. The IoT technology used should be built to be trustworthy and should be used in a way that is compliant with privacy laws and regulations. Finally, the organization should be able to audit and assure its use of the IoT.

The benefits from the IoT come from the vast amount of data that can be collected, analysed and exploited. Hence the challenges of Big Data governance, security and management are inextricably linked with the IoT. The data needs to be trustworthy and it should be possible to confirm both its source and integrity. The infrastructure used for the acquisition, storage and analysis of this data needs to be secured; yet the IoT is being built using many existing protocols and technologies that are weak and vulnerable.
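As a minimal sketch of what confirming source and integrity can look like at the data level, the following signs each sensor reading with an HMAC over a per-device shared key; the device identifier, key and fields are hypothetical, and a real deployment would also need secure key provisioning and rotation.

```python
import hashlib
import hmac
import json

# Hypothetical secret provisioned into the device at manufacture.
DEVICE_KEY = b"per-device-secret-provisioned-at-manufacture"

def sign_reading(reading: dict, key: bytes) -> dict:
    """Attach an HMAC so the receiver can check the reading's source and integrity."""
    payload = json.dumps(reading, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": reading, "mac": tag}

def verify_reading(message: dict, key: bytes) -> bool:
    """Recompute the HMAC and compare it with the one supplied by the device."""
    payload = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["mac"])

message = sign_reading({"device_id": "sensor-042", "temp_c": 21.5}, DEVICE_KEY)
print(verify_reading(message, DEVICE_KEY))   # True
message["payload"]["temp_c"] = 99.9          # tampering in transit...
print(verify_reading(message, DEVICE_KEY))   # False
```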

The devices which form part of the IoT must be designed, manufactured, installed and configured to be trustworthy. The security built into these devices for the risks identified today needs to be extensible to be proof against future threats, since many of these devices will have lives measured in decades. There are existing low-power secure technologies and standards that have been developed for mobile communications and banking, and these should be appropriately adopted, adapted and improved to secure the devices.

Trust in the devices is based on trust in their identities and so these identities need to be properly managed. There are a number of challenges relating to this area but there is no general solution.

Organizations exploiting data from the IoT should do this in a way that complies with laws and regulations. For personal information, particular care should be given to aspects such as ensuring informed consent, data minimisation and information stewardship. There is a specific challenge to ensure that users understand and accept that ownership of the device does not imply complete “ownership” of the data. It is important that the lifecycle of data from the IoT is properly managed, from creation or acquisition to disposal. An organization should have a clear policy which identifies which data needs to be kept, why it needs to be kept and for how long. There should also be a clear policy for the deletion of data that is not retained for compliance or regulatory reasons.
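As a minimal sketch of what such a lifecycle policy can look like in executable form, the following maps each (hypothetical) data category to a documented reason and retention period, and flags for deletion anything that falls outside the policy.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy: category -> (reason for keeping it, retention period).
RETENTION_POLICY = {
    "billing_records":   ("regulatory requirement", timedelta(days=365 * 7)),
    "sensor_telemetry":  ("service improvement",    timedelta(days=90)),
    "marketing_profile": ("consented analytics",    timedelta(days=365)),
}

def should_delete(category: str, created: datetime, now: datetime) -> bool:
    """Data past its retention period, or with no documented reason to keep it, is deleted."""
    if category not in RETENTION_POLICY:
        return True
    _, period = RETENTION_POLICY[category]
    return now - created > period

now = datetime.now(timezone.utc)
print(should_delete("sensor_telemetry", now - timedelta(days=120), now))  # True: past 90 days
print(should_delete("billing_records",  now - timedelta(days=120), now))  # False: kept for 7 years
```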

This article originally appeared in the KuppingerCole Analysts' View newsletter.



Where is my Workload?

Jan 15, 2015 by Mike Small

One of the major challenges facing organizations using a cloud or hosting service is knowing where their data is held and processed. This may be to ensure that they remain in compliance with laws and regulations, or simply because they mistrust certain geo-political regions. The location of this data may be defined in the contract with the CSP (Cloud Service Provider), but how can the organization using the service be sure that the contract is being met? This question has led many organizations to be reluctant to use the cloud.

Using the cloud is not the only reason for this concern – my colleague Martin Kuppinger has previously blogged on this subject. Once information is outside of the system it is out of control and potentially lost somewhere in an information heaven or hell.

One approach to this problem is to encrypt the data so that if it moves outside of your control it is protected against unauthorized access. This can be straightforward encryption for structured application data, or encryption using private and public keys (as in some RMS systems) for unstructured data like documents. However, as soon as the data is decrypted the risk re-emerges. One approach to this could be to make use of “sticky access policies”.
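As a minimal sketch of the straightforward case, symmetric encryption of a structured record before it leaves your control, the following uses the third-party Python cryptography package. Key management, and the RMS-style public/private key handling of unstructured documents, are outside its scope.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# The key stays under your own control; only ciphertext goes to the cloud store.
key = Fernet.generate_key()
f = Fernet(key)

record = b'{"customer_id": 1234, "iban": "DE00 0000 0000 0000 0000 00"}'
ciphertext = f.encrypt(record)           # this is what is uploaded or synchronised
print(f.decrypt(ciphertext) == record)   # True: only the key holder can read the data
```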

However, while these approaches may protect against leakage, they don’t let you ensure that your data is being processed in a trusted environment. What is needed is a way to enable you to control where your workload is being run in a secure and trusted way. This control needs to be achieved in a way that doesn’t add extra security concerns – for example, allowing you to control where your data is must not allow an attacker to find your data more easily.

Two years ago NIST published a draft report IR 7904 Trusted Geolocation in the Cloud: Proof of Concept Implementation. The report describes the challenges that this poses and sets out a proposed approach that meets these challenges and which could be implemented as a proof of concept.   The US based cloud service provider Virtustream recently announced that its service now supports this capability. They state “This capability allows our customers to specify what data centre locations that their data can be hosted at and what data centres cannot host their data. This is programmatically managed with our xStream cloud orchestration application.”

The NIST document describes three stages that are needed in the implementation of this approach (see the sketch after this list for an illustration of the resulting placement check):

  1. Platform Attestation and Safer Hypervisor Launch. This ensures that the cloud workloads are run on trusted server platforms. To achieve this you need to:
    1. Configure a cloud server platform as being trusted.
    2. Before each hypervisor launch, verify (measure) the trustworthiness of the cloud server platform.
    3. During hypervisor execution, periodically audit the trustworthiness of the cloud server platform.
  2. Trust-Based Homogeneous Secure Migration. This stage allows cloud workloads to be migrated among homogeneous trusted server platforms within a cloud.
    1. Deploy workloads only to cloud servers with trusted platforms.
    2. Migrate workloads on trusted platforms to homogeneous cloud servers on trusted platforms; prohibit migration of workloads between trusted and untrusted servers.
  3. Trust-Based and Geolocation-Based Homogeneous Secure Migration. This stage allows cloud workloads to be migrated among homogeneous trusted server platforms within a cloud, taking into consideration geolocation restrictions.
    1. Have trusted geolocation information for each trusted platform instance.
    2. Provide configuration management and policy enforcement mechanisms for trusted platforms that include enforcement of geolocation restrictions.
    3. During hypervisor execution, periodically audit the geolocation of the cloud server platform against geolocation policy restrictions.
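To make the third stage concrete, here is a minimal sketch of the kind of placement check a cloud scheduler could apply before launching or migrating a workload. The platform records, location tags and policy are hypothetical; in the NIST approach the trust and geolocation values would come from hardware-rooted attestation rather than from self-declared fields.

```python
from dataclasses import dataclass

@dataclass
class Platform:
    name: str
    attested_trusted: bool   # result of platform attestation (stage 1)
    geolocation: str         # attested geolocation tag of the platform (stage 3)

@dataclass
class WorkloadPolicy:
    allowed_locations: set   # e.g. jurisdictions permitted to host the data

def eligible_platforms(platforms, policy):
    """Return only platforms that are both trusted and in an allowed geolocation."""
    return [
        p for p in platforms
        if p.attested_trusted and p.geolocation in policy.allowed_locations
    ]

platforms = [
    Platform("host-a", attested_trusted=True,  geolocation="DE"),
    Platform("host-b", attested_trusted=True,  geolocation="US"),
    Platform("host-c", attested_trusted=False, geolocation="DE"),
]
policy = WorkloadPolicy(allowed_locations={"DE", "NL"})
print([p.name for p in eligible_platforms(platforms, policy)])  # ['host-a']
```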
This is an interesting initiative by Virtustream and, since it is implemented through their xStream software which is used by other CSPs, it is to be hoped that this kind of functionality will be more widely offered. When using a cloud service a cloud customer has to trust the CSP. KuppingerCole’s advice is trust but verify.  This approach has the potential to allow verification by the customer.



A Haven of Trust in the Cloud?

Nov 11, 2014 by Mike Small

In September a survey published in Dynamic CISO showed that “72% of Businesses Don’t Trust Cloud Vendors to Obey Data Protection Laws and Regulations”. Given this lack of trust by their customers, what can cloud service vendors do?

When an organization stores data on its own computers, it believes that it can control who can access that data. This belief may be misplaced, given the number of reports of data breaches from on-premise systems; but most organizations trust themselves more than they trust others. When the organization stores data in the cloud, it has to trust the cloud provider, the cloud provider’s operations staff and the legal authorities with jurisdiction over the cloud provider’s computers. This creates many serious concerns about moving applications and data to the cloud, especially in Europe and in particular in geographies like Germany where there are very strong data protection laws.

One approach is to build your own cloud, where you have physical control over the technology but can still exploit some of the flexibility that a cloud service provides. This is the approach being promoted by Microsoft. In October Microsoft, in conjunction with Dell, announced their “Cloud Platform System”. This is effectively a way for an organization to deploy Dell servers running the Microsoft Azure software stack on premise. Using this platform, an organization can build and deploy on-premise applications that are Azure cloud ready. At the same time it can see for itself what goes on “under the hood”. Then, when the organization has built enough trust, or when it needs more capacity, it can easily extend the existing workload into the cloud. This approach is not unique to Microsoft – other cloud vendors also offer products that can be deployed on premise where there are specific needs.

In the longer term Microsoft researchers are working to create what is being described as a “Haven in the Cloud”.  This was described in a paper at the 11th USENIX Symposium on Operating Systems Design and Implementation.  In this paper, Baumann and his colleagues offer a concept they call “shielded execution,” which protects the confidentiality and the integrity of a program, as well as the associated data from the platform on which it runs—the cloud operator’s operating system, administrative software, and firmware. They claim to have shown for the first time that it is possible to store data and perform computation in the cloud with equivalent trust to local computing.

The Haven prototype uses the hardware protection proposed in Intel’s Software Guard Extensions (SGX)—a set of CPU instructions that can be used by applications to isolate code and data securely, enabling protected memory and execution. It addresses the challenges of executing unmodified legacy binaries and protecting them from a malicious host. It is based on “Drawbridge”, another piece of Microsoft research, which is a new kind of virtual-machine container.

The question of trust in cloud services remains an important inhibitor to their adoption. It is good to see that vendors are taking these concerns seriously and working to provide solutions. Technology is an important component of the solution, but it is not, in itself, sufficient. In general computers do not breach data by themselves; human interactions play an important part. The need for cloud services to support better information stewardship, and for cloud service providers to create an information stewardship culture, is also critical to creating trust in their services. From the perspective of the cloud service customer my advice is always trust but verify.



CESG Draft Cloud Security Principles and Guidelines

Sep 27, 2014 by Mike Small

UK CESG, the definitive voice on the technical aspects of Information Security in UK Government, has published draft versions of guidance for “public sector organizations who are considering using cloud services for handling OFFICIAL information”. (Note that the guidelines are still at a draft stage (BETA) and CESG is requesting comments.) There are already many standards that exist or are being developed around the security of cloud services (see: Executive View: Cloud Standards Cross Reference – 71124), so why is this interesting?

Firstly, there is an implied prerequisite that the information being held or processed has been classified as OFFICIAL. KuppingerCole’s advice is very clear: the first step to cloud security is to understand the risk by considering the business impact of loss or compromise of data. CESG publishes a clear definition of OFFICIAL, which is the lowest level of classification and covers “ALL routine public sector business, operations and services”. So, to translate this into business terms, the guidelines are meant for cloud services handling day-to-day operational services and data.

Secondly, the guidelines are simple, clear and concise, and simple is more likely to be successful than complex. There are 14 principles that apply to any organization using cloud services. The principles are summarized as follows:

  1. Protect data in transit
  2. Protect data stored against tampering, loss, damage or seizure. This includes consideration of legal jurisdiction as well as sanitization of deleted data.
  3. A cloud consumer’s service and data should be protected against the actions of others.
  4. The CSP (service provider) should have and implement a security governance framework.
  5. The CSP should have processes and procedures to ensure the operational security of the service.
  6. CSP staff should be security screened and trained in the security aspects of their role.
  7. Services should be designed and developed in a way that identifies and mitigates security threats.
  8. The service supply chain should support the principles.
  9. Service consumers should be provided with secure management tools for the service.
  10. Access to the service should be limited to authenticated and authorized individuals.
  11. External interfaces should be protected.
  12. CSP administration processes should be designed to mitigate risk of privilege abuse.
  13. Consumers of the service should be provided with the audit records they need to monitor their access and the data.
  14. Consumers have responsibilities to ensure the security of the service and their data.
Thirdly there is detailed implementation advice for each of these principles.  As well as providing technical details for each principle it describes six ways in which the customer can obtain assurance.  These assurance approaches can be used in combination to increase confidence.   The approaches are:
  1. Service provider assertions – this relies upon the honesty, accuracy and completeness of the information from the service provider.
  2. Contractual commitment by the service provider.
  3. Review by an independent third party to confirm the service provider’s assertions.
  4. Independent testing to demonstrate that controls are correctly implemented and objectives are met in practice. Ideally this and 3 above should be carried out to a recognised standard. (Note that there are specific UK government standards here but for most commercial organizations these standards would include ISO/IEC 27001, SOC attestations to AICPA SSAE No. 16/ ISAE No. 3402 and the emerging CSA Open Certification Framework)
  5. Assurance in the service design - A qualified security architect is involved in the design or review of the service architecture.
  6. Independent assurance in the components of a service (such as the products, services, and individuals which a service uses).
These guidelines provide a useful addition to the advice that is available around the security of cloud services.  They provide a set of simple principles that are easy to understand.  These principles are backed up with detailed technical advice on their implementation and assurance.  Finally they take a risk based approach where the consumer needs to classify the data and services in terms of their business impact.

KuppingerCole has helped major European organizations to successfully understand and manage the real risks associated with cloud computing. We offer research and services to help cloud service providers, cloud security tool vendors, and end user organizations. To learn more about how we can help your organization, just contact sales@kuppingercole.com.



Microsoft OneDrive file sync problems

Sep 01, 2014 by Mike Small

A number of users of Microsoft’s OneDrive cloud storage system have reported problems on the Microsoft community relating to synchronizing files between devices. So far, I have not seen an official response from Microsoft. This can be very disconcerting, so here are some suggestions for affected users. These worked for me but, in the absence of a formal response from Microsoft, I can offer no cast-iron guarantees.

What is the problem? It appears that files created on one device are synced to another device in a corrupt state. This only seems to affect Microsoft Office files (Word, Excel, PowerPoint etc.), which have been created or updated since around August 27th. It does not appear to affect other types of files such as .pdf, .jpg and .zip, for example. When the user tries to access the corrupt file, they get a message of the form “We’re sorry, we can’t open the <file> because we found a problem with its contents”.

This problem does not affect every device, but it can be very disconcerting when it happens to you! The good news is that the data appears to be correct on the OneDrive cloud and – if you are careful – you can retrieve it.

Have I got the problem? Here is a simple test that will allow you to see if you have the problem on your device (see the sketch after this list for a way to automate a similar check):

  1. Create a simple Microsoft Office file and save it on the local files store of the device. Do not save it on the OneDrive system.
  2. Log onto OneDrive https://onedrive.live.com/ using a browser and upload the file to a folder on your OneDrive.
  3. Check the synced copy of the file downloaded by the OneDrive App onto your device. If the synced file is corrupted, you have the problem!
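Since modern Office files (.docx, .xlsx, .pptx) are ZIP containers, a corrupted synced copy will usually fail to open as an archive. The following sketch automates that check across a folder; the OneDrive path is only an example and should be adjusted to your own sync location.

```python
import zipfile
from pathlib import Path

SYNCED_FOLDER = Path.home() / "OneDrive"       # adjust to your local sync folder
OFFICE_SUFFIXES = {".docx", ".xlsx", ".pptx"}  # Office Open XML files are ZIP archives

for path in SYNCED_FOLDER.rglob("*"):
    if path.suffix.lower() in OFFICE_SUFFIXES:
        try:
            with zipfile.ZipFile(path) as archive:
                status = "corrupt" if archive.testzip() is not None else "ok"
        except zipfile.BadZipFile:
            status = "corrupt"
        print(f"{status}: {path}")
```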
What can I do? Do not panic – the data seems to be OK on the OneDrive cloud. Here is how I was able to get the data back onto my device:
  1. Log onto OneDrive https://onedrive.live.com/ using a browser and download the file to your device, replacing the corrupt copy.
  2. Do NOT delete the corrupt file on your device - this will send the corrupt version to the recycle bin. It will also cause the deletion of the good version on other devices.
  3. It is always a good idea to run a complete malware scan on your devices. If you have not done so recently, now is a very good time. I did that but no threats were detected.
  4. Several people, including me, have followed the advice on how to troubleshoot sync problems published by Microsoft – but this did not work for them or for me.
  5. I did a complete factory reset on my Surface RT – this did not help. Many other people have tried this also to no avail.
Is there a work around? I have not yet seen a formal response from Microsoft, so here are some things that all worked for me:
  1. Accept the problem and whenever you find a corrupt file perform a manual download as described above.
  2. Protect your Office files using a password – this causes the files to be encrypted, and it appears that password-protected files are not corrupted. In any case KuppingerCole recommends that information held in cloud storage should be encrypted.
  3. Use WinZip to zip files that are being changed. It seems that .zip files are not being corrupted.
  4. Use some other cloud storage system or a USB to share these files.
This example illustrates some of the downsides of using a cloud service. Cloud services are very convenient when they work, but when they don’t work you may have very little control over the process to fix the problem. You are completely in the hands of the CSP (Cloud Service Provider). If you are using a service for business, access to the data you are entrusting to the CSP may be critical to your business operations. One of the contributors to the Microsoft support community described how, because he was unable to work, he was not getting paid – a graphic illustration of the problem.

KuppingerCole can offer research, advice and services relating to securely using the cloud. In London on October 7th KuppingerCole will hold a Leadership Seminar on Risk and Reward from the Cloud and the Internet of Things. Attend this seminar to find out how to manage these kinds of problems for your organization.

Update September 3rd, 2014

An update – the program manager at OneDrive (Arcadiy K) responded to the Microsoft Community and apologized.

“We’ve found the cause of the issue and we believe we have made sure that no new files will be affected. Any files you sync moving forward should work fine and you should no longer encounter the corruption issues described in this thread. Please let us know if you find otherwise.”

I have tested this on my Windows RT 8.1 device and I can confirm that it is fixed. Interestingly, there have been no Microsoft updates (or any other changes except a virus signature update) to my device.

Microsoft have just announced the rollup update for August 2014. Included under “Fixed issues” in this is: August 2014 OneDrive reliability update for Windows RT 8.1 and Windows 8.1.

Some folk are still having problems trying to clean up the mess from the previous errors. I would advise reading the thread on the support forum for suggestions on how to recover from these.



Cloud Provider Assurance

Aug 05, 2014 by Mike Small

Using the cloud involves an element of trust between the consumer and the provider of a cloud service; however, it is vital to verify that this trust is well founded. Assurance is the process that provides this verification. This article summarizes the steps a cloud customer needs to take to assure that a cloud service provides what is needed and what was agreed.

The first step towards assuring a cloud service is to understand the business requirements for it. The needs for cost, compliance and security follow directly from these requirements. There is no absolute assurance level for a cloud service – it needs to be just as secure, compliant and cost effective as dictated by the business needs – no more and no less.

The needs for security and compliance depend upon the kind of data and applications being moved into the cloud. It is important to classify this data and any applications in terms of their sensitivity and regulatory requirements. This helps the procurement process by setting many of the major parameters for the cloud service, as well as the needs for monitoring and assurance. Look at Advisory Note: From Data Leakage Prevention (DLP) to Information Stewardship – 70587.

Use a standard process for selecting cloud services that is fast, simple, reliable, standardized, risk-oriented and comprehensive. Without this, there will be a temptation for lines of business to acquire cloud services directly without fully considering the needs for security, compliance and assurance. For more information on this aspect see Advisory Note: Selecting your cloud provider - 70742.

Take care to manage the contract with the cloud service provider. An article on negotiating cloud contracts from Queen Mary University of London provides a comprehensive list of the concerns of organizations adopting the cloud and a detailed analysis of cloud contract terms. According to this article, many of the contracts studied provided very limited liability, inappropriate SLAs (Service Level Agreements), and a risk of contractual lock in. See also - Advisory Note: Avoiding Lock-in and Availability Risks in the Cloud - 70171.

Look for compliance with standards; a cloud service may have significant proprietary content and this can also make the costs of changing provider high. Executive View: Cloud Standards Cross Reference – 71124 provides advice on this.

You can outsource the processing, but you can’t outsource responsibility – make sure that you understand how responsibilities are divided between your organization and the CSP. For example, under EU Data Protection laws, the cloud processor is usually the "data processor" and the cloud customer is the "data controller". Remember that the "data controller" can be held responsible for breaches of privacy by a "data processor".

Independent certification is the best way to verify the claims made by a CSP. Certification of the service to ISO/IEC 27001 is a mandatory requirement. However, it is important to check that what is certified is relevant to your needs. For a complete description of how to assure cloud services in your organization see Advisory Note: Cloud Provider Assurance - 70586.

This article was originally published in the KuppingerCole Analysts’ View Newsletter.



