Blog posts by Warwick Ashford

Working Securely at Home During the Pandemic

As more people are working from home than ever before, there is an increasing demand for communication services. But security needs to be a key consideration as businesses adapt to a new way of working, as my colleagues John Tolbert, Matthias Reinwarth, and Alexei Balaganski have pointed out in their recommendations on responding to the COVID-19 pandemic.

The move to cloud is obvious

For many organizations, meeting the challenges presented by the pandemic means making a quick move to the cloud, but as Matthias points out, this must be managed properly with security in mind.

AWS, which places a great deal of emphasis on security and claims that all its services are secure out of the box, is inevitably seeing a huge spike in demand for its cloud-based communication services, but is well-positioned to meet the change in demand and usage patterns.

AWS reports reductions in demand from some customers and increases in others, depending on how those organizations are being impacted by the pandemic. This is easily managed for AWS, which is able to scale in both directions as demand requires.

As noted by my colleagues, organizations should seriously consider the security implications of employees using their own, potentially malware-infested, laptop and desktop computers when working from home.

Remote desktops a good option

In the light of the risk of malware on employee laptops and desktops, organizations should consider using a remote desktop. According to AWS, it is seeing an increase in demand for its WorkSpaces service, a secure desktop-as-a-service solution for Windows or Linux.

This approach makes sense during the pandemic: organizations do not need to provide laptops and desktops to all employees, because those who have their own equipment can use it to access a remote desktop, without the malware and other security concerns. The service can also be deployed without delay; according to AWS, WorkSpaces can be up and running in as little as five minutes.

The approach inserts a logical gap between the employees’ laptops and the enterprise environment because the processor and operating system are provided by the supplier of the remote desktop service.

The location of AWS data centers in Dublin, Frankfurt and Paris ensures that there are no latency problems within Europe.

Secure meetings

The recent security warnings about vulnerabilities in the Windows client of the Zoom video conferencing app have underlined the importance of choosing a secure video conferencing option.

AWS is offering a three-month free trial of its new Chime Professional communications service, which AWS uses internally. The service is designed with regulations such as the EU’s General Data Protection Regulation (GDPR) in mind. Chime Professional allows users to choose where the communications bridge is located, and the service is designed so that no traffic will leave the region of the chosen bridge location.

Critical infrastructure

In addition to capacity provided by regional data centers, AWS is considered part of critical national infrastructure in many European countries, which means that governments have a vested interest in providing support wherever it may be needed.

Due to compliance with German cyber security legislation, Amazon Elastic Compute Cloud (EC2), CloudFront content delivery network and Route 53 domain name service (DNS) have official recognition as critical infrastructure in Germany.  

AWS does not anticipate any limits or restrictions regarding the availability of AWS services or restrictions on AWS usage as a result of COVID-19. The AWS Cloud is built for customers to scale up as needed, so they can continue to use AWS as normal.

New AWS security capabilities

Access Analyzer and Amazon Detective, two innovations announced at the AWS re:Invent conference in Las Vegas in December 2019, are now generally available.

Access Analyzer is a new Identity and Access Management (IAM) capability for Amazon S3 (Simple Storage Service) to make it easy for customer organizations to review access policies and audit them for unintended access.

Access Analyzer is a feature of AWS accounts, offered at no additional charge, that provides a single view across all access policies to determine whether any have been misconfigured to allow unintended public or cross-account access.
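To illustrate the kind of misconfiguration such a review catches, the sketch below flags bucket policy statements that allow access to any principal. It is a deliberate simplification of what Access Analyzer’s automated reasoning does (the real service also reasons over conditions, cross-account principals and much more), and the policy and bucket name are illustrative:

```python
import json

def finds_public_access(policy_json):
    """Flag Allow statements that grant access to everyone (Principal "*")."""
    findings = []
    for stmt in json.loads(policy_json).get("Statement", []):
        principal = stmt.get("Principal")
        is_public = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        if stmt.get("Effect") == "Allow" and is_public:
            findings.append(stmt.get("Sid", "<no Sid>"))
    return findings

policy = """{
  "Version": "2012-10-17",
  "Statement": [
    {"Sid": "PublicRead", "Effect": "Allow", "Principal": "*",
     "Action": "s3:GetObject", "Resource": "arn:aws:s3:::example-bucket/*"}
  ]
}"""
print(finds_public_access(policy))  # → ['PublicRead']
```

A real review would run against policies fetched from the account rather than an inline string, but the principle is the same: enumerate statements and ask who can reach the resource.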

The newly available Amazon Detective security service is designed to make it easy for customers to conduct faster and more efficient investigations into security issues across their workloads.

Amazon Detective helps security teams conduct investigations by automatically analyzing and organizing data from AWS CloudTrail and Amazon Virtual Private Cloud (VPC) Flow Logs into a graph model that summarizes resource behaviors and interactions across a customer’s AWS environment.
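The graph model can be pictured with a toy aggregation. The sketch below rolls flow records up into edges describing who talked to whom, on which ports, and how much; the field names and addresses are simplified illustrations, not the exact flow-log schema, and Detective of course does this at vastly larger scale:

```python
from collections import defaultdict

# Toy records in the spirit of VPC Flow Logs: (srcaddr, dstaddr, dstport, bytes)
records = [
    ("10.0.1.5", "10.0.2.9", 443, 1200),
    ("10.0.1.5", "10.0.2.9", 443, 900),
    ("10.0.3.7", "10.0.2.9", 22, 400),
]

# Aggregate into a graph: nodes are addresses, edges summarize the traffic
# between each pair, which is the shape of a behavior graph.
graph = defaultdict(lambda: {"bytes": 0, "flows": 0, "ports": set()})
for src, dst, port, nbytes in records:
    edge = graph[(src, dst)]
    edge["bytes"] += nbytes
    edge["flows"] += 1
    edge["ports"].add(port)

for (src, dst), e in sorted(graph.items()):
    print(f"{src} -> {dst}: {e['flows']} flows, {e['bytes']} bytes, ports {sorted(e['ports'])}")
```

An analyst looking at such a summary can quickly spot, say, a host that suddenly starts talking to a new destination on an unusual port.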

Amazon Detective’s visualizations are designed to provide the details, context, and guidance to help analysts determine the nature and extent of issues identified by AWS security services like Amazon GuardDuty, Amazon Inspector, Amazon Macie, and AWS Security Hub, to enable security teams to begin remediation quickly.

It is good that security is an integral part of all AWS services, and that AWS is continually improving existing services as well as adding new ones that further enhance its security offerings. These services will now appeal to a whole new market as organizations look for ways to keep working.

For more on security solutions, reviews and comparisons, see our research. For actionable guidance, our team of advisors can assist you with developing tactics and strategies.

Mitigate Citrix Vulnerability in Face of PoC Exploits

Despite a Citrix warning in mid-December of a serious vulnerability in Citrix Application Delivery Controller (ADC) and Citrix Gateway (formerly NetScaler and NetScaler Gateway), thousands of companies have yet to put in place the recommended mitigations.

In the meantime, several proof of concept (PoC) exploits have been published on GitHub, making it extremely easy for attackers to gain access to networks and impersonate authorized users.

Thousands of Citrix systems still vulnerable

Initial estimates put the number of vulnerable systems at 80,000 in 158 countries. Researchers reported on 8 January that scans showed the number was probably around 60,000 and that 67% did not have mitigations enabled, including high value targets in finance, government and healthcare.

Any company that uses either of the affected Citrix products should therefore implement the mitigation measures as soon as possible to reduce the risk of unauthenticated attackers using the PoC exploits to carry out arbitrary code execution on their systems.

Citrix “strongly urges” affected customers to apply the provided mitigation and recommends that customers upgrade all vulnerable appliances to a fixed version of the appliance firmware when released.
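For reference, the published mitigation (Citrix advisory CTX267679) amounts to a responder policy on the appliance that rejects requests abusing the /vpns/ directory traversal. The commands below convey its gist for a standalone appliance, but the exact steps, including variants for clusters and HA pairs, should always be taken from the advisory itself:

```
enable ns feature responder
add responder action respondwith403 respondwith "\"HTTP/1.1 403 Forbidden\r\n\r\n\""
add responder policy ctx267027 "HTTP.REQ.URL.DECODE_USING_TEXT_MODE.CONTAINS(\"/vpns/\") && (!CLIENT.SSLVPN.IS_SSLVPN || HTTP.REQ.URL.DECODE_USING_TEXT_MODE.CONTAINS(\"/../\"))" respondwith403
bind responder global ctx267027 1 END -type REQ_OVERRIDE
save config
```

The policy blocks unauthenticated requests that reach the /vpns/ scripts directory, which is the path the published PoC exploits use.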

Window of opportunity for attackers

The first security updates are expected to be available on 20 January 2020 for versions 11.1 and 12.0. A fix for versions 12.1 and 13.0 is expected on 27 January, while a fix for version 10.5 is expected only on 31 January.

Given that PoCs have been published and various security research teams have reported evidence that attackers are scanning the internet for vulnerable appliances and attempting exploits, IT admins using affected Citrix products should not wait to implement mitigations to reduce the risk of compromise.

Mitigate and patch as soon as possible

When easily exploitable vulnerabilities are announced by suppliers, it is always a good idea to apply recommended mitigations and security updates as soon as they are available. The importance of this is underlined by the impact of attacks like WannaCry and NotPetya due to the failure of affected organizations to apply patches as soon as they were available.

Patching reduces the attack surface by ensuring that vulnerabilities are mitigated as quickly as possible. Many forms of ransomware exploit known vulnerabilities for which patches are available, for example. For more detail see KuppingerCole’s leadership brief: Defending Against Ransomware and advisory note: Understanding and Countering Ransomware.

Other related research includes:
Leadership Brief: Optimizing your Cybersecurity Spending
Leadership Brief: Penetration Testing Done Right
Leadership Brief: Responding to Cyber Incidents

Related blog posts include:
Akamai to Block Magecart-Style Attacks
Microsoft Partnership Enables Security at Firmware Level
API Security in Microservices Architectures

Regulatory Compliance a Potential Driver of Cloud Migration

The newly announced AWS offerings Access Analyzer, Amazon Detective and AWS Nitro Enclaves, discussed in my last blog post, further round out the AWS portfolio of security services and tools. These include Amazon GuardDuty, which continuously monitors for threats to accounts and workloads; Amazon Inspector, which assesses application hosts for vulnerabilities and deviations from best practices; Amazon Macie, which uses machine learning to discover, classify, and protect sensitive data; and AWS Security Hub, a unified security and compliance center.

These new security capabilities come hard on the heels of other security-related innovations announced ahead of re:Invent. These include a feature added to AWS IAM that helps organizations identify unused roles in AWS accounts by reporting the latest timestamp when role credentials were used to make an AWS request, so that unused roles can be identified and removed; a native feature called Amazon S3 Block Public Access to help customers use core services more securely; and the ability to connect Azure Active Directory to AWS Single Sign-On (SSO) once, manage permissions to AWS centrally in AWS SSO, and enable users to sign in using Azure AD to access assigned AWS accounts and applications.
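To illustrate how the role-last-used timestamp supports cleanup, the following sketch filters role entries of the shape the IAM API returns (a RoleLastUsed field with an optional LastUsedDate) for roles not used within a cutoff. It works on sample data rather than calling AWS, and the role names are hypothetical:

```python
from datetime import datetime, timedelta, timezone

def stale_roles(roles, max_age_days=90, now=None):
    """Return names of roles whose credentials have not been used recently.

    `roles` mimics the shape of entries returned by the IAM API; a role with
    no LastUsedDate has never been used and is also reported as stale.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=max_age_days)
    stale = []
    for role in roles:
        last_used = role.get("RoleLastUsed", {}).get("LastUsedDate")
        if last_used is None or last_used < cutoff:
            stale.append(role["RoleName"])
    return stale

now = datetime(2020, 1, 1, tzinfo=timezone.utc)
roles = [
    {"RoleName": "ci-deploy",
     "RoleLastUsed": {"LastUsedDate": datetime(2019, 12, 20, tzinfo=timezone.utc)}},
    {"RoleName": "old-migration",
     "RoleLastUsed": {"LastUsedDate": datetime(2019, 1, 5, tzinfo=timezone.utc)}},
    {"RoleName": "never-used", "RoleLastUsed": {}},
]
print(stale_roles(roles, now=now))  # → ['old-migration', 'never-used']
```

In practice the role list would come from the IAM API, and flagged roles would be reviewed before removal rather than deleted automatically.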

Increasing focus on supporting regulatory frameworks

Further underlining the AWS focus on security and compliance, the company recently announced 12 new partner integrations for its Security Hub service, which has been available in Europe since June 2019, and plans to announce a set of new features in early 2020 focused on supporting all major regulatory frameworks.

By making it easier for organizations using web services to comply with regulations, AWS once again appears to be shoring up the security reputation of cloud-based services as well as working to make security and compliance prime drivers of cloud migration.

While Security Hub integrates with three third-party Managed Security Services Providers (MSSPs), namely Alert Logic, Armor and Rackspace, and has more than 25 security partner integrations that enable sharing of threat intelligence, most of the tools announced at re:Invent are designed to work with other AWS services to protect AWS workloads.

Reality check: IT environments are typically hybrid and multi-cloud

The reality is that most organizations using cloud services have a hybrid environment and are working with multiple cloud providers, which is something AWS should consider supporting with future security-related services.

In the meantime, organizations that have a hybrid multi-cloud IT environment may want to consider other solutions. At the very least, they should evaluate which set of solutions helps them across their complete IT environment, on premises and across various clouds. Having strong security tools for AWS, for Microsoft Azure, for other clouds, and for on-premises environments helps on each of these platforms, but lacks support for comprehensive security across, and integrated incident management spanning, the whole IT environment.

KuppingerCole Advisory Services can help in streamlining the security tools portfolio with our “Portfolio Compass” methodology, but also in defining adequate security architectures.

If you want more information about hybrid cloud security, check the Architecture Blueprint "Hybrid Cloud Security" and make sure you visit our 14th European Identity & Cloud Conference. Prime Discount expires by the end of the year, so get your ticket now.

Breaches and Regulations Drive Better Security, AWS re:Invent Shows

The high proportion of cyber attacks enabled by poor security practices has long raised questions about what it will take to bring about any significant change. Finally, however, there are indications that the threat of substantial fines for contravening the growing number of data protection regulations and negative media exposure associated with breaches are having the desired effect.

High profile data breaches driving industry improvements

The positive effect of high-profile breaches was evident at the Amazon Web Services (AWS) re:Invent conference in Las Vegas, where the cloud services firm made several security-related announcements that were undoubtedly expedited, if not inspired, by the March 2019 Capital One customer data breach. That incident was a textbook example of a breach enabled by a cloud services customer not meeting its obligations under the shared responsibility model, which states that organizations are responsible for anything they run in the cloud.

While AWS was not compromised and the breach was traced to a misconfiguration of a Web Application Firewall (WAF) and not the underlying cloud infrastructure, AWS has an interest in helping its customers to avoid breaches that inevitably lead to concerns about cloud security.

It is therefore unsurprising that AWS has introduced Access Analyzer, an Identity and Access Management (IAM) capability for Amazon S3 (Simple Storage Service) that makes it easy for customer organizations to review access policies and audit them for unintended access. Users of these services are less likely to suffer data breaches that reflect badly on all companies involved and on the cloud services industry in general, something AWS is obviously keen to avoid.

Guarding against another Capital One type data breach

Access Analyzer complements preventative controls, such as Amazon S3 Block Public Access, which help protect against risks that stem from policy misconfiguration, widely viewed as the single biggest security risk in the context of cloud services. Access Analyzer provides a single view across all access policies to determine whether any have been misconfigured to allow unintended public or cross-account access, which would have helped prevent the Capital One breach.

Technically speaking, Access Analyzer uses a form of mathematical analysis called automated reasoning, which applies logic and mathematical inference to determine all possible access paths allowed by a resource policy to identify any violations of security and governance best practice, including unintended access.

Importantly, Access Analyzer continuously monitors policies for changes, meaning AWS customers no longer need to rely on intermittent manual checks to identify issues as policies are added or updated. It is also interesting to note that Access Analyzer has been provided to S3 customers at no additional cost, unlike most of the other security innovations, which represent new revenue streams for AWS.

On the security front, AWS also announced the Amazon Detective security service, currently available in preview, which is designed to make it easy for customers to conduct faster and more efficient investigations into security issues across their workloads.

In effect, Amazon Detective helps security teams conduct faster and more effective investigations by automatically analyzing and organizing data from AWS CloudTrail and Amazon Virtual Private Cloud (VPC) Flow Logs into a graph model that summarizes resource behaviors and interactions across a customer’s AWS environment.

Amazon Detective’s visualizations are designed to provide the details, context, and guidance to help analysts quickly determine the nature and extent of issues identified by AWS security services like Amazon GuardDuty, Amazon Inspector, Amazon Macie, and AWS Security Hub, to enable security teams to begin remediation quickly. Essentially an add-on to enable customers (and AWS) to get more value out of existing security services.

Hardware-based data isolation to address data protection regulatory compliance

Another capability due to be available in preview in early 2020 is AWS Nitro Enclaves, which is aimed at making it easy for AWS customers to process highly sensitive data by partitioning compute and memory resources within an instance to create an isolated compute environment.

This is an example of how data protection regulations are driving suppliers to support better practices by customer organizations by creating demand for such services. Although personal data can be protected using encryption, this does not address the risk of insider access to sensitive data as it is being processed by an application.

AWS Nitro Enclaves avoid the complexity and restrictions of either removing most of the functionality that an instance provides for general-purpose computing or creating a separate cluster of instances for processing sensitive data, protected by complicated permissions, highly restrictive networking, and other isolations. Instead, AWS customers can use AWS Nitro Enclaves to create a completely isolated compute environment to process highly sensitive data.

Each enclave is an isolated virtual machine with its own kernel, memory, and processor; organizations need only select an instance type and decide how much CPU and memory to designate to the enclave. There is also no persistent storage, no ability to log in to the enclave, and no network connectivity beyond a secure local channel.
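For a flavor of the workflow, the commands below sketch building and launching an enclave with the Nitro Enclaves CLI on an enclave-enabled EC2 instance. The image name and the CPU and memory figures are illustrative, not a recommendation:

```
# Package an application (built as a Docker image) into an enclave image file
nitro-cli build-enclave --docker-uri sensitive-app:latest --output-file sensitive-app.eif

# Launch it with dedicated CPU and memory carved out of the parent instance
nitro-cli run-enclave --eif-path sensitive-app.eif --cpu-count 2 --memory 3072

# Confirm the enclave is running
nitro-cli describe-enclaves
```

The parent instance then communicates with the enclave only over the local vsock channel, which is what enforces the isolation described above.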

An early adopter of AWS Nitro Enclaves is European online fashion platform Zalando, which is using the capability to achieve application and data isolation more easily and to protect customer data in transit, at rest, and while it is being processed.

AWS shoring up security in cloud services while adding revenue streams

The common theme across these security announcements is that they reduce the amount of custom engineering required to meet security and compliance needs, allow security teams to be more efficient and confident when responding to issues, and make it easier to manage access to AWS resources, which also harkens back to the Capital One breach.

In effect, AWS is continually making it easier for customers to meet their security obligations, protecting its own reputation as well as the reputation of the industry as a whole, to the point that organizations will not only trust and have confidence in cloud environments, but will increasingly see improved security as one of the main drivers for cloud migration.

AWS is also focusing on regulatory compliance as a driver rather than inhibitor of cloud migration. We will cover this in a blogpost tomorrow.


The First Step to Cloud Is Not Technical – an AWS Perspective on Cloud Migration

As usual, Amazon Web Services (AWS) is making a slew of announcements at its re:Invent conference in Las Vegas, and as expected, the key ones relate to making it easier for organizations to move workloads to the cloud, keep data secure, and get more value out of their data with services supported by Machine Learning.

However, one of the most interesting points made in the keynote by CEO Andy Jassy was not about the power of the cloud to transform business or revolutionize industry sectors, nor about the latest AWS server processor chips and services, but about the common non-technical barriers organizations have to overcome to move to the cloud, which every organization thinking about Digital Transformation should bear in mind.

Achieve business leadership alignment to drive cloud migration

The top observation is that leadership is essential. Digital Transformation of the business and the customer experience (which commonly involves moving workloads to the cloud) is most successful where there is strong support from the business leaders.

Leadership must be aligned on how and why the business needs to be transformed and must set aggressive goals for moving to the cloud. This means that one of the first and most important challenges for organizations to tackle is figuring out how to get the executive team aligned.

Set specific, aggressive targets to build momentum

AWS experience shows that setting specific goals forces everyone in the organization to commit to change. That in turn builds momentum within the organization with everyone driving towards achieving the goals that have been set and coming up with ideas for what can be done in the cloud.

Conversely, where organizations start by “dipping their toes” into cloud with experimentation, they tend to get stuck in this phase for an extended period of time without making any real progress. Only when Digital Transformation is driven from the top down is real progress made quickly.

Cloud is not difficult, but training is essential

After leadership, the next challenge is that typically most people in an organization do not have any experience or understanding of the cloud. Education and training are therefore an important first step, so that everyone in the organization understands how and why doing things in the cloud is different, and how that can benefit the business. While using the cloud is not difficult, it does require training.

It is important that organizations not attempt to move everything into the cloud at the same time. Instead, they should prioritize projects and draw up a methodical plan for moving workloads into the cloud, starting with the simplest and easiest first.

This approach avoids organizations getting paralyzed into inaction by trying to do too much at once. It also enables the organization to learn with the easiest transitions, which in turn makes it easier to tackle the more challenging workloads as people in the organization gain experience and confidence.

AWS Outposts: removing another obstacle to cloud migration

This approach is more likely to result in completing a greater number of cloud projects in a relatively short time and to build momentum for moving all remaining workloads into the cloud. And where there are things that simply cannot move, or not right away, AWS has just announced general availability of AWS Outposts, fully managed racks that allow organizations to run compute and storage on premises while connecting to AWS services in the cloud.

This was just one of many more announcements on the first day of re:Invent 2019, but the opening message was all about taking care of the non-technical aspects of cloud and the transformation goals of your business before considering the cloud services that will deliver the desired outcomes.

In short, get everyone in the leadership team to agree and get behind the why, then focus on the how and building momentum by training everyone to enable them to get up to speed.

Cloud migration similar to other complex IT projects

For all complex, heterogenous projects in IT with multiple stakeholders in the organization, including cloud migration projects, KuppingerCole Analysts recommend:

  • Knowing your stakeholders and getting their buy-in;
  • Understanding problem areas;
  • Defining the policies and processes;
  • Setting the right expectations about what you want to achieve;
  • Outlining a clear roadmap with defined goals;
  • Highlighting the quick wins; and
  • Ensuring you have the right resources on hand.

It is always important that the business understands the benefits of the project, as that will make it easier to get the buy-in and support of all the stakeholders. For the same reasons, it is important to make the purpose of the project clear, so that all stakeholders are aware not only of the benefits, but also of what needs to be done, what is expected and when. And while it is important not to try to do too much in the initial stages, it is equally important to identify quick wins from the outset and prioritize those to demonstrate value to the business in the early stages of the project.

Part of identifying quick wins is defining the goals and processes at the start - including responsibilities and accountabilities - to support the desired outcome of the project. This is also where the education piece, also mentioned by AWS, comes in, so that all stakeholders understand the processes and goals and have the tools and skills they need.

Understanding the problem areas and processes of the business is also key to the success of any IT project as this will be valuable in getting stakeholders on board as well as in setting the right goals and ensuring that you have the right resources and skill sets on hand for the project.

Continually measure progress and keep an eye on the future

Once the project is underway, KuppingerCole Analysts recommend continually measuring benefit/progress against the set of defined goals to demonstrate tangible success at every stage of the project.

Finally, keep an eye on emerging IT and business trends relevant to the project. Take them into account when planning your project and update your planning on a regular basis as new trends emerge.

Find out more on how to make your project a success, in this case applied to Identity and Access Management (IAM) projects, in this webinar podcast: How to Make Your IAM Program a Success and this point of view paper: One Identity - The Journey to IAM Success - 70226

Palo Alto Networks Continues to Bet on Security-as-a-Service

The market shift to cloud-based security services was highlighted at the Ignite Europe 2019 held by Palo Alto Networks in Barcelona, where the company announced a few product enhancements in an effort to round out its offerings to meet what it expects will be growing market demand.

A key element of its go-to-market strategy is a response to market demand to reduce the complexity of security and the number of suppliers by adding cloud-delivered software-defined wide area network (SD-WAN) and data loss prevention (DLP) capabilities to its Prisma Access product.

The move not only removes the need for separate SD-WAN suppliers but also rounds out Prisma Access to deliver converged networking and security capabilities in the cloud to address the limitations of traditional architectures.

Palo Alto Networks is not alone in going after this market, but it is the latest player in the so-called Secure Access Service Edge (SASE) market to add SD-WAN capabilities to combine edge computing, security and wide-area networking (WAN) into a single cloud-managed platform.

The move builds on core company strengths and is therefore logical. Failure to have done so would have missed a market opportunity and would have been surprising.

Data drives better threat detection

In line with the company’s belief that threat intelligence should be shared, that security operations centers (SOCs) need to be more data-driven to support current and future threats, and the company’s drive to enable greater interoperability between security products, the second key announcement at Ignite Europe 2019 is the extension of its Cortex XDR detection and response application to third-party data sources.

In addition to data from Palo Alto Networks and other recognized industry sources, Cortex XDR 2.0 (which automates threat hunting and investigations, and consolidates alerts) is designed to consume logs from third-party security products for analytics and investigations, starting with logs from Check Point, soon to be followed by Cisco, Fortinet and Forcepoint.

SD-WAN market shakeup

The expansion by Palo Alto Networks into the SD-WAN market makes commercial sense to round out its Prisma Access offering to meet market demands for simpler, easier, consistent security with fewer suppliers and support the growing number of Managed Security Service providers.

However, it also signals a change in the relationship between Palo Alto Networks and its SD-WAN integration partners and in the market overall, although executives are downplaying the negative impact on SD-WAN partners, saying relationships are continually evolving and will inevitably change as market needs change.

The big takeaway from Ignite Europe 2019 is that Palo Alto Networks is broadening its brands and capabilities as a cloud-based security services company, and that in future, selling through partners such as Managed Service Providers will be a critical part of its go-to-market strategy.

It will be interesting to see whether the bet by Palo Alto Networks pays off to steal a march on competitors also rushing to the market with similar offerings such as SD-WAN firm Open Systems, networking firm Cato Networks, IT automation and security firm Infoblox, and virtualization firm VMware.

Authentication and Education High on CISO Agenda

Multifactor authentication and end-user education emerged as the most common themes at a CISO forum with analysts held under the Chatham House Rule in London.

Chief information security officers across a wide range of industry sectors agree on the importance of multifactor authentication (MFA) to extending desktop-level security controls to an increasingly mobile workforce, with several indicating that MFA is among their key projects for 2020 to protect against credential stuffing attacks.

In highly targeted industry sectors, CISOs said two-factor authentication (2FA) was mandated at the very least for mobile access to corporate resources, with a special focus on privileged account access, especially to key databases.

Asked what IT suppliers could do to make life easier for security leaders, CISOs said providing MFA with everything was top of the list, along with full single sign on (SSO) capability, with some security leaders implementing or considering MFA for customer/consumer access to accounts and services.

The pursuit of improved user experience, along with more secure access, appears to have led some security leaders to standardize on Microsoft products and services that enable collaboration, MFA and SSO, reducing reliance on username/password combinations alone for access control.
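MFA implementations commonly layer a one-time password on top of the primary credential. As a minimal illustration of the mechanics, not any particular vendor’s product, here is an RFC 6238 time-based one-time password (TOTP) generator using only the Python standard library, checked against the RFC’s published test vector:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret, counter, digits=6):
    # HMAC-SHA1 over the big-endian counter, then dynamic truncation (RFC 4226)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret, for_time=None, step=30, digits=6):
    # TOTP = HOTP over the number of 30-second steps since the epoch (RFC 6238)
    t = int((time.time() if for_time is None else for_time) // step)
    return hotp(secret, t, digits)

# RFC 6238 Appendix B test vector: SHA-1, 8 digits, T = 59 seconds
print(totp(b"12345678901234567890", for_time=59, digits=8))  # → 94287082
```

The server and the user’s device share the secret and compute the same code independently, which is why a stolen password alone is not enough, the property that makes MFA effective against credential stuffing.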

Training end users

End user security education and training is another key area of attention for security leaders to increase the likelihood that any gaps in security controls will be bridged by well-informed users.

However, there is also a clear understanding that end users cannot be held responsible as a front line of defense: there needs to be a zero-blame policy to encourage end users to engage and participate in security; end users need to be supported by appropriate security controls and effective incident detection and response processes; and communication is essential to ensure end users understand the cyber threats they face at home and at work, as well as the importance of each security control.

Supporting end users

CISOs are helping to protect end users by implementing browser protections and URL filtering to prevent access to malicious sites, improving email defenses to protect users from spoofing, phishing and spam, introducing tools that make it easy to report suspected phishing, and conducting regular phishing simulation exercises to keep end users vigilant.

Some CISOs are also using the Domain-based Message Authentication, Reporting and Conformance (DMARC) protocol, which is designed to help verify the authenticity of the sender’s identity, to drive user awareness by highlighting emails that come from an external source.
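A domain publishes its DMARC policy as a DNS TXT record at _dmarc.&lt;domain&gt;, which receiving mail servers fetch and parse to decide how to treat unauthenticated mail. The sketch below parses such a record into its tags; the record, domain and reporting address are illustrative:

```python
def parse_dmarc(txt_record):
    """Split a DMARC TXT record into its tag=value pairs."""
    return dict(
        tag.strip().split("=", 1)
        for tag in txt_record.strip().rstrip(";").split(";")
    )

# Illustrative record, as it might be published at _dmarc.example.com
record = "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com; pct=100"
dmarc = parse_dmarc(record)
print(dmarc["p"])  # → quarantine
```

Here the `p` tag tells receivers to quarantine failing messages, and `rua` names the address that receives aggregate reports, the feedback loop that lets the publishing organization see who is sending mail in its name.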

Some security leaders believe there should be a special focus on board members and other senior executives in anti-phishing training and awareness because, while this group is likely to be most often targeted by phishing and spear phishing attacks, its members are less likely to be attuned to the dangers and the warning signs.

Some CISOs have also provided password managers to help end users choose and maintain strong, unique passwords, reducing the number of passwords that each person is required to remember.

Encouraging trends

It is encouraging that security leaders are focusing on better authentication by moving to MFA and that they understand the need to support end users, not only with security awareness and education, but the necessary security controls, processes and capabilities, including effective email and web filtering, network monitoring, incident detection and response, and patch management.

If you want to deep dive into this topic, be sure to read our Leadership Compass Consumer Authentication. For unlimited access to all our research, buy your KC PLUS subscription here.

Nok Nok Labs Extends FIDO-Based Authentication

Nok Nok Labs has made FIDO-certified multi-factor authentication – which seeks to eliminate dependence on password-based security – available across all digital channels by adding a software development kit (SDK) for smart watches to the latest version of its digital authentication platform, the Nok Nok S3 Authentication Suite.

In truth, the SDK is only for Apple watchOS, but it is the first – and currently only – SDK that does all the heavy lifting for developers seeking to enable FIDO-certified authentication via smart watches that do not natively support FIDO. Apple's strong position in the smart watch market (just over 50%) makes it a logical starting point, and SDKs for other smart watch operating systems are expected to follow.

This means that business-to-consumer organizations can now use the Nok Nok S3 Authentication Suite to enable strong, FIDO-based authentication and access policy controls for Apple Watch apps as well as mobile apps, mobile web and desktop web applications.

The new SDK, like its companion SDKs from Nok Nok, provides a comprehensive set of libraries and application program interfaces (APIs) that enable software developers to implement FIDO-certified multi-factor authentication based on public and private key pairs. Because the private key never leaves the authenticator – in this case, the smart watch – the approach is resistant to man-in-the-middle attacks.
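
The underlying pattern is challenge-response: the server sends a random challenge, the device signs it with a private key that never leaves the authenticator, and the server verifies the signature with the registered public key. The sketch below illustrates only that pattern, using textbook RSA with deliberately tiny primes for readability – real FIDO authenticators use hardware-backed, production-grade cryptography, not anything like this:

```python
import hashlib

# Toy key pair (textbook RSA, tiny primes for illustration only).
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, held only by the device

def sign(challenge: bytes) -> int:
    """Runs on the authenticator; the private exponent d never leaves it."""
    digest = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
    return pow(digest, d, n)

def verify(challenge: bytes, signature: int) -> bool:
    """Runs on the server, using only the public key (n, e)."""
    digest = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % n
    return pow(signature, e, n) == digest

challenge = b"random-server-nonce"   # fresh per login attempt
sig = sign(challenge)
print(verify(challenge, sig))        # True: signature matches this challenge
```

Because each challenge is fresh and only a signature over it crosses the network, an attacker who intercepts the exchange captures nothing reusable – unlike a password or bearer token.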

As global smart watch sales continue to grow, the devices are becoming an increasingly important channel for digital engagement, particularly with 24- to 35-year-olds. At the same time, smart watch usage has grown beyond fitness applications to include banking, productivity apps such as Slack, e-commerce such as Apple Pay, and home security such as Nest.

A further driver for the use of smart watch applications is that consumers often find it easier to access information on a watch without passwords or one-time passcodes, especially on smart watches like the Apple Watch that do not rely on having a smartphone nearby.

The move is a strategic one for Nok Nok because it not only satisfies customer requirements, but also fulfils one of the key goals for Nok Nok as a company and the FIDO Alliance as a whole.

For organizations using the S3 Authentication Suite, the new SDK will make it easier to offer applications to consumers on smart watches as a client platform in its own right, while meeting the security and privacy expectations of smart watch users and the requirements of global, regional and industry-specific regulations – especially in highly regulated industries such as telecommunications and financial services.

In addition, the SDK for smart watches gives end-user organizations an opportunity to simplify their backend infrastructure by using a single authentication method for all digital channels, enabled by a unified backend authentication infrastructure, thereby reducing cost through lower complexity and operational overhead.

From Nok Nok's point of view, the SDK delivers greater value to existing customers and is likely to win new ones as organizations, particularly in the financial services sector, seek to engage consumers across all available digital channels.

Enabling the same strong FIDO-backed authentication across all digital channels is also a key goal of Nok Nok, both as a company and as a founder member of the FIDO (Fast IDentity Online) Alliance.

The FIDO Alliance is a non-profit consortium of technology industry partners – including Amazon, Facebook, Google, Microsoft and Intel – working to establish standards for strong authentication to address the lack of interoperability among strong authentication devices as well as the problems users face with creating and remembering multiple usernames and passwords.

The FIDO Alliance plans to change the nature of authentication by developing specifications that define an open, scalable, interoperable set of mechanisms that supplant reliance on passwords to securely authenticate users of online services via FIDO-enabled devices.

The new S3 SDK from Nok Nok for Apple watchOS offers a stronger authentication alternative to solutions that typically store OAuth tokens or other bearer tokens in their smart watch applications. These tokens provide relatively weak authentication and need to be renewed frequently because they can be stolen.

In contrast, FIDO-based authenticators provide strong device binding for credentials, providing greater ease of use as well as additional assurance that applications are being accessed only by the smart watch owner (authorized user).

While being first to enable strong FIDO-based authentication on smart watches is commercially a strategic move for Nok Nok, the real significance of the new SDK for the Apple Watch is that it advances the IT industry's goal of achieving stronger authentication and reducing reliance on password-based security.

Akamai to Block Magecart-Style Attacks

Credit card data thieves, commonly known as Magecart groups, typically use JavaScript code injected into compromised third-party components of e-commerce websites to harvest data from shoppers to commit fraud.

A classic example was a Magecart group's compromise of natural language processing software from Inbenta Technologies, used by UK-based ticketing website Ticketmaster to answer user questions.

The Magecart group inserted malicious JavaScript into the Inbenta JavaScript code, enabling the cyber criminals to harvest all the customer credit card data submitted to the Ticketmaster website.

As a result, Ticketmaster is facing a £5m lawsuit on behalf of Ticketmaster customers targeted by fraud as well as a potential GDPR fine by the Information Commissioner’s Office, which is yet to publish the findings of its investigation.

A data breach at British Airways, linked to similar tactics and potentially to a Magecart group, resulted in the Information Commissioner's Office announcing in July 2019 that it was considering a fine of more than €200m for the company.

According to security researchers, the breach of Ticketmaster customer data was part of a larger campaign that targeted at least 800 websites.

This is a major problem for retailers, with an Akamai tool called Request Map showing that more than 90% of content on most websites comes from third-party sources, over which website owners have little or no control.

These scripts effectively give attackers direct access to website users, and once they are loaded in the browser, they can link to other malicious content without the knowledge of website operators.
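
A first step for any website operator is simply knowing which hosts its pages load scripts from, so that an unexpected source stands out. A minimal sketch of such an inventory check, using stdlib HTML parsing (the page markup, hostnames, and allowlist below are all made up for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Sketch: collect the hosts a page loads <script src=...> from and compare
# them against an allowlist of expected third parties. All hostnames are
# illustrative.
class ScriptSources(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.hosts.add(urlparse(src).netloc)

page = """
<html><body>
<script src="https://www.example-shop.com/js/app.js"></script>
<script src="https://cdn.example-analytics.com/track.js"></script>
<script src="https://evil.example-skimmer.net/mage.js"></script>
</body></html>
"""

allowlist = {"www.example-shop.com", "cdn.example-analytics.com"}
parser = ScriptSources()
parser.feed(page)
unexpected = parser.hosts - allowlist
print(sorted(unexpected))  # ['evil.example-skimmer.net']
```

A static check like this only covers scripts present in the served HTML; it cannot see scripts that other scripts load at runtime in the browser, which is precisely the gap behaviour-monitoring products aim to close.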

Current web security offerings are unable to address and manage this problem, and a Content Security Policy (CSP) alone is inadequate to deal with potentially thousands of scripts running on a website. Akamai is therefore bringing a new product to market dedicated to helping retailers reduce the risk posed by third-party links and elements on their websites, used for things like advertising, customer support and performance management.
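
To see why a CSP alone struggles at this scale: the policy must enumerate every legitimate script host up front, and any host not listed is blocked. A minimal sketch of composing such a header (the hostnames are hypothetical):

```python
# Sketch: building a Content-Security-Policy script-src directive from an
# allowlist of script hosts. Hostnames are illustrative. The operational
# problem is keeping this list accurate when pages pull scripts from dozens
# of third parties that change without notice.
script_hosts = [
    "'self'",                              # scripts served from our own origin
    "https://cdn.example-analytics.com",   # hypothetical analytics provider
    "https://pay.example-psp.com",         # hypothetical payment provider
]
csp = "script-src " + " ".join(script_hosts)
print(csp)
# script-src 'self' https://cdn.example-analytics.com https://pay.example-psp.com
```

The header is easy to emit; the hard part is governance. With hundreds of third-party scripts that themselves load further scripts, the allowlist either grows too permissive to be protective or too strict to keep the site working.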

The new service, dubbed Page Integrity Manager, has completed initial testing and is now entering the beta phase with up to 25 volunteer customers handling a range of different data types.

The aim of Akamai Page Integrity Manager is to enable website operators to detect and stop third-party breaches before their users are impacted. The service is designed to discover and assess the risk of new or modified JavaScript, control third-party access to sensitive forms or data fields using machine learning to identify relevant information, enable automated mitigation using policy-based controls, and block bad actors using Akamai threat intelligence to improve accuracy.

The service works by inserting JavaScript into customer web pages to analyze all content the browser receives from the host organization and third parties, identifying and blocking any scripts that try to access and exfiltrate financial or other personal data (form-jacking), and notifying the website operator.

Third-party JavaScript massively increases the attack surface and ramps up the risk for website operators and visitors alike, with no practical and effective way for operators to detect the threat and mitigate the risk. That is set to change with the commercial availability of Akamai's Page Integrity Manager, expected in early 2020.

Microsoft Partnership Enables Security at Firmware Level

Microsoft has partnered with Windows PC makers to add another level of cyber attack protection for users of Windows 10 to defend against threats targeting firmware and the operating system.

The move is in response to attackers developing threats that specifically target firmware as the IT industry has built more protections into operating systems and connected devices – a trend that appears to have been gaining momentum since security researchers at ESET found that Russian espionage group APT28 – also known as Fancy Bear, Pawn Storm, Sofacy Group, Sednit, and Strontium – was exploiting firmware vulnerabilities to distribute the LoJax malware.

The LoJax malware, which targeted European government organizations, exploited a firmware vulnerability to effectively hide inside the computer's flash memory. As a result, the malware was difficult to detect and able to persist even after an operating system reinstall, because whenever the infected PC booted up, the malware would re-execute.

In a bid to gain more control, as Apple has, over the hardware on which its Windows operating system runs, Microsoft has worked with PC and chip makers on an initiative dubbed "Secured-core PCs", applying the security best practices of isolation and minimal trust to the firmware layer. The aim is to protect Windows devices from attacks that exploit the fact that firmware has a higher level of access and higher privileges than the Windows kernel, which attackers can use to undermine protections such as secure boot and other defenses implemented by the hypervisor or operating system.

The initiative appears to be aimed at industries that handle highly sensitive data, including personal, financial and intellectual property data, such as financial services, government and healthcare, rather than the consumer market. However, consumers using new high-end hardware like the Surface Pro X and HP's Dragonfly laptops will benefit from an extra layer of security that isolates encryption keys and identity material from Windows 10.

According to Microsoft, Secured-core PCs combine identity, virtualization, operating system, hardware and firmware protection to add another layer of security underneath the operating system. Firmware attacks are prevented by using new hardware Dynamic Root of Trust for Measurement (DRTM) capabilities from AMD, Intel and Qualcomm to implement Microsoft’s System Guard Secure Launch as part of Windows Defender in Windows 10.

This effectively removes trust from the firmware. Although Microsoft introduced Secure Boot in Windows 8 to mitigate the risk posed by malicious bootloaders and rootkits that relied on Unified Extensible Firmware Interface (UEFI) firmware, the firmware itself is trusted to verify the bootloaders, which means that Secure Boot on its own does not protect against threats that exploit vulnerabilities in that trusted firmware.

The DRTM capability also helps to protect the integrity of the virtualization-based security (VBS) functionality implemented by the hypervisor from firmware compromise. VBS relies on the hypervisor to isolate sensitive functionality from the rest of the OS, which helps to protect that functionality from malware that may have infected the normal OS, even with elevated privileges, according to Microsoft. Protecting VBS is critical, Microsoft adds, because it is used as a building block for important operating system security capabilities such as Windows Defender Credential Guard, which protects OS credentials from malicious use, and Hypervisor-protected Code Integrity (HVCI), which ensures that a strict code integrity policy is enforced and that all kernel code is signed and verified.

It is worth noting that the Trusted Platform Module 2.0 (TPM 2.0) is one of the device requirements for Secured-core PCs, used to measure the components involved in the secure launch process, which Microsoft claims can help organizations enable zero-trust networks using System Guard runtime attestation.
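
The measurement the TPM performs follows a simple "extend" pattern: each boot component is hashed into a Platform Configuration Register (PCR), so the final register value attests to the exact sequence of components that ran. A rough sketch of that operation (the component names are illustrative, and a real TPM holds the registers in hardware):

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    """PCR_new = SHA-256(PCR_old || SHA-256(component)) -- order matters."""
    return hashlib.sha256(pcr + hashlib.sha256(component).digest()).digest()

# PCRs start zeroed at power-on; each boot stage is measured in turn.
pcr = bytes(32)
for component in [b"firmware", b"bootloader", b"os-kernel"]:
    pcr = extend(pcr, component)

print(pcr.hex())  # changing or reordering any component yields a different value
```

Because the chain is one-way, malware cannot tamper with an earlier stage and then forge measurements to hide the change, which is what makes the final PCR value usable as attestation evidence.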

ESET has responded to its researchers' UEFI rootkit discovery by introducing a UEFI Scanner to detect malicious components in firmware, and some chip manufacturers are aiming to do something similar with dedicated security chips. Microsoft's Secured-core PC initiative, however, is aimed at blocking firmware attacks rather than just detecting them, and it is cross-industry, involving a wide range of CPU architectures and original equipment manufacturers (OEMs), which means the firmware defense will be available to all Windows 10 users regardless of the PC maker and form factor they choose.

It will be interesting to see what effect this initiative has in reducing the number of successful ransomware and other BIOS/UEFI or firmware-based cyber attacks on critical industries. A high success rate is likely to see commoditization of the technology and result in availability for all PC users in all industries.
