Blog posts by Anmol Singh
Security has seldom been a focus for device manufacturers, who have historically taken their own approach to securing IoT devices. Most devices in enterprise, consumer or industrial IoT continue to be designed to perform specific functions, and security remains a neglected theme in the majority of product development lifecycles. The proprietary protocols these devices operate on are shaped primarily by the purpose they are built to serve and offer very limited or no interoperability. With the increasing convergence of IT and OT towards IoT, the lack of a common operating framework and of shared security principles poses serious challenges for device manufacturers as well as consumers.
In an increasingly connected world, with an explosion of networked and cloud-enabled devices ranging from home appliances to medical devices to consumer electronics, creating and maintaining device and user identities, managing the relationships between the various entities, and ensuring the integrity of devices has remained a constant challenge for consumers as well as for security leaders. The industry has seen the emergence of several standards from governing bodies and consortiums, but we still lack appropriate mechanisms defining how the Identity of Things (IDoT) should be specified, standardized and deployed across operating networks and entities. Besides the need for verifying the identity and establishing the trust levels of the various entities operating in an IoT environment – devices, people, applications, cloud services and gateways – there is a need to manage the Identity of Things throughout the lifecycle of those things.
An effective authentication and identity framework for IoT devices should be able to provide appropriate protection against cyber threats throughout the distinct operational life-cycle stages of a device depicted here in the figure:
Authentication for IoT devices is different from, and considerably lighter-weight than, the user authentication methods prevalent today, owing to the potential resource constraints of the devices, the bandwidth of the networks they operate within, and the nature of interaction with the devices.
A lack of established industry standards for IoT authentication has led vendors to develop proprietary authentication methods. Since many IoT devices are resource-constrained, with low computing power and storage capacity, existing authentication methods are not good candidates due to their significant bandwidth and computational requirements. There is a growing need to evaluate and streamline the methods adopted for device and service authentication over constrained IoT networks. It is important to analyze and use the factors essential for verifying the identity of ‘things’ so as to establish the desired level of trust in the device identity without overburdening the fit-for-purpose computing abilities of the IoT device. While there is increased adoption of adaptations of standard authentication methods such as PKI, OAuth and OIDC to serve the required scope and scale of IoT use-cases, we also see the use of standards from the OMA DM, LwM2M and TR-069 specifications for establishing secure communication between constrained nodes in IoT networks.
The resource-constrained nature of many IoT endpoints severely limits their ability to sustain prevalent authentication methods, which has further driven the adoption of proprietary authentication methods that do not conform to trust requirements and offer very limited or no interoperability. This is, however, changing rapidly as the industry moves from treating security as an afterthought to including it as part of the system design process. The majority of embedded systems today implement device authentication methods that rely predominantly on a software-based approach. Popular examples include the use of a MAC (Message Authentication Code) to enable secure key exchange for device authentication over constrained IoT protocols, as well as lightweight adaptations of the PKI and OAuth2 protocols to match the scope and scale of IoT use-cases.
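As a minimal sketch of the MAC-based approach mentioned above, a constrained device can prove possession of a pre-shared secret through an HMAC challenge-response exchange. All names here are illustrative assumptions, and a real deployment would layer this over a secure key-provisioning process:

```python
import hashlib
import hmac
import secrets

# Hypothetical pre-shared key, provisioned to the device at manufacture time.
PRE_SHARED_KEY = secrets.token_bytes(32)

def issue_challenge() -> bytes:
    """Server side: generate a fresh random nonce to send to the device."""
    return secrets.token_bytes(16)

def device_response(key: bytes, challenge: bytes) -> bytes:
    """Device side: compute an HMAC-SHA256 over the challenge with the shared key."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify_device(key: bytes, challenge: bytes, response: bytes) -> bool:
    """Server side: recompute the MAC and compare in constant time."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = issue_challenge()
response = device_response(PRE_SHARED_KEY, challenge)
assert verify_device(PRE_SHARED_KEY, challenge, response)
```

Because only a hash computation is required on the device side, this pattern stays within the computational budget of most constrained endpoints; its security, however, rests entirely on keeping the pre-shared key out of an attacker's reach.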
The software implementation of IoT authentication has notable cost and maintenance advantages, but the protection offered by these methods is severely restricted by the security of the embedded OS and the coding practices of the embedded-system developer. Largely ineffective against common software-based attacks and physical device tampering, software implementations of IoT authentication methods are known to offer limited or no protection against next-generation IoT threats that exploit specific device functions such as remote administration, device provisioning and the boot sequence.
Hardware-based security approaches such as Hardware Root of Trust (HRoT) and Trusted Execution Environment (TEE) are fast becoming an industry-wide standard for securing desktops, tablets and mobile phones, and have found increased relevance for securing IoT devices. A hardware-based root of trust offers on-chip security functions including key generation, integrity checks and attestation, which are executed in an isolated hardware environment and therefore offer effective protection against physical attacks. The Trusted Platform Module (TPM), a prime example of an HRoT implementation, delivers high-trust authentication for IoT devices when combined with software-based PKI. Beyond securing digital credentials, TPM trust measurements provide secure control over the boot sequence of IoT devices, validating the authenticity of each device as it comes up on the IoT network.
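The TPM trust measurements described above follow a simple hash-chain construction: each boot stage is measured into a Platform Configuration Register (PCR) before it runs, and a verifier recomputes the chain from known-good measurements. The sketch below models only the extend operation, with hypothetical stage names; a real TPM performs this in isolated hardware:

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """TPM-style PCR extend: new_pcr = SHA-256(old_pcr || measurement)."""
    return hashlib.sha256(pcr + measurement).digest()

# Hypothetical boot stages, measured in order before each is executed.
boot_stages = [b"bootloader-v1.2", b"kernel-4.19", b"rootfs-2024-01"]

pcr = b"\x00" * 32  # PCRs start zeroed at power-on
for stage in boot_stages:
    pcr = extend(pcr, hashlib.sha256(stage).digest())

# A verifier holding the expected "golden" measurements recomputes the
# same chain and compares it against the PCR value quoted by the device.
golden = b"\x00" * 32
for stage in boot_stages:
    golden = extend(golden, hashlib.sha256(stage).digest())

assert pcr == golden  # tampering with any stage changes every later PCR value
```

The value of the chain is that it is order- and content-sensitive: replacing even one boot stage yields a different final PCR, so a remote verifier can detect the modification without inspecting the device directly.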
The choice of an appropriate authentication method for a given set of IoT devices is largely driven by the identity assurance requirements that vary from device to device, depending primarily on the operating environment, device lifecycle stages (as depicted in the figure above) and the impact of potential compromise through unauthorized access. IoT security designers and architects should make use of defined metrics that correlate the trust level(s) offered by available device authentication methods to the trust requirements of IoT devices. Hardware-based authentication methods such as TPM are most suited for IoT use-cases where the requirement of establishing higher levels of trust (equivalent to FIPS 140-2, level 3 to 4) in the device identity is paramount.
Strong authentication methods that provide a higher level of trust in the device identity are not a viable option for many IoT use-cases, where concerns over additional component cost and increased device size restrict the adoption of TPM or TEE. In such cases, a considerable loss in the strength of the authentication method, and hence in the associated level of assurance, is to be expected. IoT security architects should work to find the right balance and realize an appropriate trade-off in the level of trust offered by authentication methods for each category of IoT devices in operation.
Today, organizations are capturing trillions of bytes of data every day on their employees, consumers, services and operations through multiple sources and data streams. As organizations explore new ways to collect more data, the increased use of a variety of consumer devices and embedded sensors continue to fuel this exponential data growth. Large pools of data, often referred to as data lakes, are created as a result of this massive data aggregation, collection and storage – which remains the easiest of all processes in a Big Data and BI value chain.
What’s concerning is the widespread ignorance among data owners, data privacy officers and security leaders of any defined scope for the collection and use of this data. Very frequently, not only is the scope for use of this data poorly defined, but the legal implications that might arise from its non-compliant use remain unknown or are openly ignored.
An example that recently made the news was Facebook’s storage of millions of user passwords in clear text. There was no data breach involved, nor were the passwords abused, but ignoring the fundamentals of data encryption puts Facebook in open defiance of cybersecurity basics. The absence of controls restricting users’ access to sensitive customer data further violated data privacy and security norms by allowing the passwords to be freely accessed, and potentially abused, by some 20,000 Facebook employees.
It is important for data owners, privacy officers and security leaders to know what data they have in order to classify, analyze and protect it. Obviously, you can’t protect what you don’t know you have. It is therefore necessary for data leaders to maintain a continually updated catalogue of data assets, data sources, and the data privacy and residency regulations that the data elements in their possession directly attract.
Most Big Data environments comprise massive sets of structured, unstructured and semi-structured data that can’t be processed through traditional database and software techniques. The distributed processing of this data puts it at risk, as the interactions between the distributed nodes are not secured. A lack of visibility into the information flows, particularly for unstructured data, leads to inconsistent access policies.
Business Intelligence platforms, on the other hand, increasingly offer capabilities such as self-service data modeling, data mining and dynamic data content sharing – all of which only exacerbates the problem of understanding the data flows and complying with data privacy and residency regulations.
Most data security tools, including database security and IAM tools, address only part of the problem and have their own limitations. With the massive collection of data through multiple data sources, including third-party data streams, it becomes increasingly important for CIOs, CISOs and CDOs to implement effective data security and governance (DSG) for Big Data and BI platforms to gain the required visibility and an appropriate level of control over the data flowing through enterprise systems, applications and databases.
Some security tools and technologies that are commonly in use and can be extended to certain components within a Big Data or BI platform are:
- Database Security
- Data Discovery & Classification
- Database & Data Encryption
- UBA (User Behaviour Analytics)
- Data Masking & Tokenization
- Data Virtualization
- IGA (Identity Governance & Administration)
- PAM (Privileged Access Management)
- Dynamic Authorization Management
- DLP (Data Leakage Prevention)
- API (Application Programming Interface) Security
Each of these technologies has specific limitations in addressing the broader security requirements of a Big Data and BI platform. However, using them wisely and selectively for the right Big Data and BI components can reduce the risks of data espionage and misuse arising from those components, thereby contributing to the overall security posture of the environment.
Data governance for Big Data and BI is fast becoming an urgent requirement, yet it has largely been absent from existing IGA tools. These tools provide basic access governance, mostly for structured data, but lack built-in capabilities to support the complex access governance requirements of massive unstructured data, and they do not support the multitude of data dimensions required for driving authorization and access control, including access requests and approvals at a granular level.
It is therefore recommended that security leaders work with application and data owners to understand the data flows and authorization requirements of the Big Data and BI environments. Besides practicing standard data sanitization and encryption, security leaders are advised to evaluate the right set of existing data security technologies to meet the urgent Big Data and BI security requirements and build on additional security capabilities in the long term.
We, at KuppingerCole, deliver our standardized Strategy Compass and Portfolio Compass methodology to help security leaders assess their Big Data and BI security requirements and identify the priorities. The methodology also helps leaders provide ratings to available security technologies based on these priorities – eventually providing strong and justifiable recommendations for use of the right set of technologies. Please get in touch with our sales team for more information on relevant research and how we can help you in your plans to secure your Big Data and BI environment.
Broadcom, after being blocked by the Trump administration earlier this year from acquiring Qualcomm on national security grounds, has decided to acquire CA Technologies – one of the greatest shifts in acquisition strategy, from a semiconductor business to an IT software and solutions business. The proposed Qualcomm acquisition by the once Singapore-based Broadcom carried the likelihood of several 5G patents passing beyond US control.
The CA Technologies acquisition still puts over 1,200 patents, and mission-critical software deployed by CA Technologies at US government sites, into Broadcom’s hands, and yet it appears to have received a green light from the Trump administration. Defying acquisition basics, with little or no commercial synergy, the move fully satisfies Broadcom’s stated objective of acquiring ‘established mission-critical technology businesses’ and could be considered one of the most ambitious acquisitions of this size and scale in recent times. Not to be forgotten is Intel’s acquisition of McAfee, which didn’t work out for the company due to the limited synergies between McAfee’s endpoint protection business and Intel’s core hardware strategy, and finally resulted in the divestment of McAfee after seven years of rough marriage.
CA Technologies is itself built on a series of smaller acquisitions in almost every segment of IT software – ranging from IT operations management, application performance, mainframes, DevOps, IT security and automation to analytics. CA Technologies has, however, had a good overall success rate in driving product and roadmap integrations to achieve the expected synergies from its past acquisitions. Broadcom should consider using some of the CA management’s expertise, gathered over more than a decade, to drive this acquisition towards a successful business integration. There is no similar business unit at Broadcom that delivers IT software or services, which should make it easier for CA Technologies to continue operating under the larger umbrella without any immediate shift in operating strategy.
The dissimilarity of the businesses and customer bases will offer only limited cross-sell opportunities from this acquisition in the short to mid-term. However, CA Technologies’ recurring, profitable bookings should bring stability through increased cash flow for Broadcom in the short term, helping to accommodate the expected fluctuations in its business from the uncertainties around the recent (though still proposed and under review) US trade tariffs against semiconductor goods manufactured in China.
Besides mainframes, which remain the majority revenue stream, and some other areas such as IT project & portfolio management, CA Technologies has invested significantly in building its IT security portfolio over the last decade – Netegrity, IDFocus, Eurekify, Arcot, Layer 7, Xceedium, IdMLogic and Veracode, most of them within or adjacent to the Identity and Access Management (IAM) domain. CA’s aggressive acquisition strategy has kept innovation out of the company’s door for a long time, and with Broadcom’s acquisition of CA Technologies there is little hope that innovation will be the key to revenue generation for the new entity anytime in the near future. Through numerous acquisitions, CA’s Identity and Access Management portfolio has taken a bumpy ride over the past decade, but despite all the challenges and long-term ramifications, its excellent IAM product and engineering teams have ensured a seamless absorption of acquired products into its IAM and broader security software portfolio.
While uncertainties will continue to loom over the acquisition objectives and the alignment of synergies for some time, it will be interesting to see how Broadcom decides to nurture CA’s enterprise software and services business, and where that leads its still very well-positioned IAM product line.
BOMGAR, owned by PE firm Francisco Partners, has announced that it has acquired Avecto, a UK-based Endpoint Privilege Management (EPM) company. The move, coming within six months of BOMGAR’s acquisition of Lieberman Software, clearly signals its quest to strengthen its position in the PAM market by offering a full-featured PAM suite.
Originally a provider of remote support solutions, BOMGAR offered remote session management capabilities for a while until it acquired Argentina-based Pitbull Software in late 2015, entering the PAM market with its password management technology. Since then, BOMGAR has been on an acquisition spree to expand its portfolio of PAM technologies and compete more effectively against the market leaders.
Avecto has been a market leader in the niche Endpoint Privilege Management (EPM) market. Its flagship product, Avecto Defendpoint, manages the threats associated with local administrative rights on Windows and macOS endpoints by offering controlled and monitored escalation of admin privileges. Defendpoint also offers effective application whitelisting and sandboxing capabilities for enhanced endpoint protection, which has positioned it uniquely in the market with almost twice the number of managed endpoints of its closest competitor. For a couple of years before acquiring Viewfinity in late 2015, CyberArk embedded Defendpoint, licensed through an OEM agreement with Avecto, to sell a more complete PAM solution and compete against the then-leading EPM product, BeyondTrust PowerBroker for Windows.
Endpoint Privilege Management (EPM) has become one of the fastest-growing sub-segments of the PAM market, closing in on approximately 28% year-over-year growth. With EPM capabilities, PAM solutions are poised to offer an effective second line of defense for endpoint threat protection in the coming years. The increased demand for better EPM capabilities embedded in PAM solutions has led several market-leading vendors to acquire or develop their own EPM capabilities in the recent past: CyberArk, for example, acquired Viewfinity, and Thycotic acquired Arellia.
At KuppingerCole, we define EPM solutions to primarily offer three distinct technologies:
- Application Control: This allows organizations to control which applications are allowed to run on an endpoint. It is usually achieved through application whitelisting, in which only known-good applications are placed on a pre-approved list and allowed to run. Application control provides effective protection against shadow IT challenges for most organizations.
- Sandboxing: This technology isolates the execution of unknown applications or programs by restricting the resources they can access (e.g., files, registries). Also known as application isolation, it provides effective protection against cyberattacks by confining the execution of malicious programs and limiting their means to cause harm.
- Privilege Management: This technology encompasses both user and application privilege management. For users, it deals with controlled and monitored elevation to local admin privileges; for applications, with exception- or policy-based elevation of administrative rights so that known and approved applications can execute successfully.
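The application-control approach described above can be sketched as a hash-based whitelist check. This is a minimal illustration with hypothetical names; real EPM products use signed policy files, publisher certificates and kernel-level enforcement rather than a userspace lookup:

```python
import hashlib
from pathlib import Path

# Hypothetical whitelist: SHA-256 digests of approved application binaries,
# in practice populated from a centrally managed, signed policy.
APPROVED_HASHES: dict[str, str] = {}

def file_digest(path: Path) -> str:
    """Hash a binary in chunks so large executables don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def is_allowed(path: Path) -> bool:
    """Allow execution only if the binary's digest is on the approved list."""
    return file_digest(path) in APPROVED_HASHES
```

Hashing the full binary means any modification, however small, removes it from the whitelist, which is what makes this approach effective against tampered or unknown executables.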
Avecto Defendpoint offers a good mix of these EPM technologies to provide effective endpoint protection against a range of cyber threats. The acquired EPM capabilities make a natural fit for BOMGAR, offering strong cross-sell opportunities in the short to mid-term. While integration under a common PAM platform should begin soon, no immediate changes are expected to either product line. In the short term, Avecto will continue to operate under the rebranded entity ‘Avecto, a BOMGAR company’ until it is fully integrated into the BOMGAR organization over the remainder of 2018.
BOMGAR’s approach of obtaining additional PAM capabilities through acquisitions is expected to bring rapid growth and deliver quick synergies, but it also carries the risks of integration failures and the long-term effects of dampened organic growth. No doubt, the Lieberman Software and now Avecto acquisitions place BOMGAR on the list of the top five PAM vendors by revenue, but not necessarily on the list of market leaders for technology innovation. As the PAM market continues to evolve, consolidation is inevitable; however, a stronger vendor focus on completeness of features rather than innovation can stifle healthy market growth by failing to deliver on the opportunities that innovation creates.
While a clear integration roadmap for Lieberman Software is still awaited, the acquisition of Avecto adds to the workload of the product and engineering teams that must deliver an integrated PAM platform to realize the potential of these acquisitions. Given BOMGAR’s good track record of delivering growth and profitability as well as driving operational excellence, we expect it to steer clear of such challenges in the short to mid-term by delivering on the actual synergies created by these acquisitions.
CyberArk, an overall leader in privilege management according to the KuppingerCole Leadership Compass on Privilege Management, announced yesterday that it has acquired certain assets of Vaultive, a privately held, US-based Israeli cloud security provider.
Data encryption has emerged as a key inhibitor for organizations seeking to adopt cloud services. Most cloud providers today offer their own encryption to ensure that data in transit and at rest remains unreadable if a breach occurs. However, as organizations adopt multiple SaaS applications, the varied encryption standards and inconsistent key management practices of cloud providers can quickly lead to a complex environment with little visibility into, or control of, the keys.
While most privilege management products today can help with credential vaulting and the monitoring of shared administrative access to cloud platforms (including SaaS, IaaS and PaaS), they are largely ineffective against the risk of privileged credentials being compromised at the cloud provider’s end. Some cloud access security brokers (CASBs) can mitigate such risks by offering data encryption capabilities that separate the encryption of data at rest, and its key management, from the cloud provider. However, CASBs lack privileged account management capabilities and usually do not support on-premises systems. Organizations requiring complete control of privileged access across cloud platforms therefore have no option but to integrate a CASB’s capabilities with their privilege management solution. CyberArk’s acquisition of Vaultive is aimed primarily at solving this challenge for its customers.
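The key-separation pattern that such customer-controlled encryption relies on is commonly called envelope encryption: each object is encrypted under a per-object data key, and that data key is wrapped with a key-encryption key (KEK) the customer retains, so the provider stores only ciphertext and wrapped keys. The sketch below is illustrative only, showing the wrapping round-trip with stdlib primitives; a real implementation would use a standard algorithm such as AES key wrap, not an HMAC-derived keystream:

```python
import hashlib
import hmac
import secrets

# Hypothetical customer-held key-encryption key; it never leaves the
# customer's side, which is the whole point of the pattern.
customer_kek = secrets.token_bytes(32)

def wrap_key(kek: bytes, key_material: bytes, nonce: bytes) -> bytes:
    """XOR the 32-byte data key with a keystream derived from the KEK and a
    per-object nonce. XOR is its own inverse, so the same call unwraps.
    (Illustrative only; use AES-KW or similar in practice.)"""
    stream = hmac.new(kek, nonce, hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(key_material, stream))

# The data key encrypts one object; the provider stores only the wrapped
# form plus the nonce, never the KEK or the plaintext data key.
data_key = secrets.token_bytes(32)
nonce = secrets.token_bytes(16)
wrapped = wrap_key(customer_kek, data_key, nonce)
assert wrap_key(customer_kek, wrapped, nonce) == data_key
```

With this arrangement, revoking the provider's access to all of a customer's data reduces to withholding a single key, which is why control over the KEK is the crux of cloud data encryption offerings.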
Vaultive is a cloud data encryption platform that helps organizations retain control of their encryption keys, providing end-to-end encryption of data across cloud platforms. CyberArk, with its existing capabilities for managing privileged access to cloud platforms, can benefit from Vaultive’s data encryption capabilities to:
- assure its customers of exclusive administrative access to the cloud while retaining control over the entire data lifecycle
- extend its privilege management capabilities beyond administrative access to privileged business users of SaaS applications
- build finer-grained privileged access control for cloud environments using context-aware access policies from Vaultive
While only time will tell how well CyberArk integrates and promotes Vaultive’s cloud data security platform within its privileged account and session management capabilities for the cloud, this acquisition reflects a conscious and well-thought-out decision to offer a one-stop cloud security solution for its customers.
AI for the Future of your Business: Effective, Safe, Secure & Ethical
Everything we admire, love, need to survive, and that brings us further in creating a better future with a human face is, and will be, a result of intelligence. Synthesizing and amplifying our human intelligence therefore has the potential to lead us into a new era of prosperity like we have not seen before, if we succeed in keeping AI safe, secure and ethical. Since the very beginning of industrialization, and even before, we have been striving to structure our work in a way that it becomes accessible for [...]