Blog posts by Sebastian Rohr
Just recently my Strong Authentication report was published, and now there is one vendor fewer in scope: the French-American card and token giant GEMALTO announced that it has acquired the niche player TODOS: http://www.todos.se/index.php/media/archives/gemalto_acquires_e-banking_specialist_todos_ab/ Todos has some very interesting tokens, but I am pretty sure that Gemalto was mainly after Todos' IP around online-banking security. Unknown to most of the world, it is Todos (or now Gemalto) that owns the technology many secure online-banking solutions are based upon. Hopefully, Gemalto does not mess up those solutions too (remember the debit/credit card frenzy that broke loose when the Gemalto chips on many German cards suddenly failed once the year 2010 came around and could not handle the date?). Being a victim of this bug myself, I strongly hope the product scope and expertise of Todos will remain intact within Gemalto - I have deep respect for the achievements of the Swedish experts!
Whether you want to place a bid at eBay, check your bank balance online or your credit rating at Schufa or Experian, or access your corporate SAP account: instead of asking you to please enter your user name and password, chances are the system nowadays will demand some other method of authentication such as a token or a smartcard, or it may offer to scan your finger or iris. The procedures may differ, but the reasons behind them are the same: companies want to protect themselves from rampant online fraud. And it's not just banks that are starting to deploy so-called "two-factor" or "redundant" security to their customers. The big question, though, remains: are these systems really safer? Or to put it another way: are two better than one in the complex world of IT security?
Our gut feeling says yes, but theories sometimes misfire. Take for instance a customer with multiple bank accounts who wants to be able to access all of them while on the road. He may have to lug a load of authentication hardware around wherever he goes. Oh, and don't forget the corresponding reader devices!
On the other hand, operators of an online shop or internet service often opt for proprietary solutions, forcing the user to go through the learning process anew every time he wants to use a new service. There may be safety in confusion, but don't bet on it.
As a rational person, one with an average sense of risk-awareness and an adequately developed understanding of the technology issues, you may sometimes ask yourself whether all this is strictly necessary. After a while, though, we all tend to just give up and submit to life's complexity.
But in fact, there are already alternatives available. Just take the highly flexible authentication systems on the market today, which generally run under the name "Versatile Authentication Service Platform", or VASP. Most of them allow for the simultaneous use of various authentication mechanisms such as user name/password, one-time passwords, certificate-based verification via smartcards, grid cards, challenge-response systems, biometric systems or any combination of the above.
This goes especially for shared applications that perform transaction-based or context-based risk analysis during authentication, something that can be of great benefit to the companies involved. In this case, authentication aims at matching the level of access allowed to the individual transaction or task. True strong authentication is optional in such cases, since slightly weaker (and cheaper) methods may suffice. Such systems are usually able to escalate the authentication process by requesting additional bona fides if more sensitive data is needed.
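As a hedged illustration of such risk-based escalation, here is a minimal Python sketch. The method strengths, risk weights and thresholds are invented for the example and are not taken from any specific VASP product.

```python
# Hypothetical sketch of risk-based step-up authentication.
# All names and numbers (METHOD_STRENGTH, RISK_WEIGHTS, thresholds)
# are illustrative, not from any real product.

# Authentication methods, ordered from weakest to strongest.
METHOD_STRENGTH = {"password": 1, "grid_card": 2, "otp_token": 3, "smartcard": 4}

# Risk score that each context factor contributes to a transaction.
RISK_WEIGHTS = {"new_device": 2, "foreign_ip": 2, "sensitive_data": 3}

def required_strength(context):
    """Map the transaction's risk score to a minimum authentication strength."""
    score = sum(RISK_WEIGHTS[f] for f in context if f in RISK_WEIGHTS)
    if score >= 5:
        return 4   # high risk: demand the strongest factor
    if score >= 3:
        return 3   # medium risk: OTP token or better
    return 1       # low risk: password alone suffices

def step_up_needed(current_method, context):
    """True if the session must escalate to a stronger factor."""
    return METHOD_STRENGTH[current_method] < required_strength(context)

print(step_up_needed("password", []))                  # low-risk task: False
print(step_up_needed("password", ["sensitive_data"]))  # must escalate: True
```

The point of the sketch is the shape of the decision, not the numbers: the platform only asks for the stronger (and more expensive) factor when the risk model says the transaction warrants it.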
In order to successfully deploy this kind of system, organizations need to model risks and thresholds on experience from previous transactions. If they don't, helpdesks will be inundated with calls from frustrated users and customers complaining about being unable to read email while on vacation or access their corporate data from their home offices.
A sensible, well thought-out and above all highly flexible authentication strategy should be part of any responsible IT strategy. The tools and tokens are out there for IT departments and system integrators to create strong authentication systems that don't create new problems or make life miserable for the user. But since no universal system is yet available, the best approach seems to be end-point centralization using a VASP. Choosing the right solution may depend on the number of users, the type of applications involved and the available budget, but once in place it can go far towards reducing complexity and increasing security - and not just in theory, but in actual practice.
Coming from a network security background, "IPSec 3DES VPNs" seemed to me for quite a long time to be the solution for secure data transfer between business partners. Over the years, with more experience, I naturally found out that this was not the solution for all the use-cases and scenarios these crazy folks called "customers" came up with. Nonetheless, when SSL VPNs became en vogue I hesitated to join the choir of supporters. While I fully understand and support the idea of a more flexible, more application- or user-centric approach due to the gain in usability, I still love my "old VPN client" when connecting to company resources.
During the last 13 months, two projects kept me busy that changed my personal perception of what one may need to be happy regarding secure access to resources and secure file transfer. One of them is largely related to "Cloud Computing" as such, and to using/processing company data which is not stored inside my brick-and-mortar, perimeter-secured, firewall-protected company server but somewhere on the "internet". Making sure only the right person with the right credential accesses this data makes me want to use strong authentication, but few Cloud service providers offer such an additional layer of protection.
The other project was based on very Information Society 1.0 processes: the need to secure and protect the personal subscriber information for periodicals and daily newspapers that is exchanged between the publisher and the logistics service provider who manages the delivery of the above-mentioned print products - even if the subscriber is on vacation in Spain or recently moved to a new address. These transfers are conducted between separate systems, distributed all over Europe. As most of these application systems are built individually, no real data standard is established. As the number of parties involved is high and participants change frequently, classic VPNs are out of the question (and possibly "too expensive"). Thus, the need to protect the data transfer (yes, it is based on FTP!) is obvious. Well, have you ever tried to create a solution that acts both as a server AND a client and supports FTP, SFTP, FTPS and other cryptic siblings of the FTP protocol? No? Well, you should not!
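To give a feel for why juggling all these FTP dialects gets messy, here is a minimal Python sketch of the client-side protocol dispatch such a gateway needs. The handler functions are stubs and all names are illustrative; a real implementation would wrap actual client libraries (e.g. the standard library's ftplib.FTP for plain FTP and ftplib.FTP_TLS for FTPS, plus an SSH library for SFTP), and would need a matching server-side personality for each protocol as well.

```python
# Illustrative sketch only: one dispatch table instead of N hard-coded
# code paths per partner. The handlers are stand-ins for real clients.

class TransferError(Exception):
    """Raised when a partner requests a protocol we do not speak."""

def plain_ftp(host, path):
    return ("ftp", host, path)    # would wrap ftplib.FTP (unencrypted legacy)

def ftps(host, path):
    return ("ftps", host, path)   # would wrap ftplib.FTP_TLS (FTP over TLS)

def sftp(host, path):
    return ("sftp", host, path)   # would wrap an SSH/SFTP client library

HANDLERS = {"ftp": plain_ftp, "ftps": ftps, "sftp": sftp}

def transfer(scheme, host, path):
    """Pick the right client for the partner's protocol and run the transfer."""
    try:
        handler = HANDLERS[scheme]
    except KeyError:
        raise TransferError(f"unsupported protocol: {scheme}")
    return handler(host, path)

print(transfer("sftp", "partner.example", "/subscribers/today.csv"))
```

Even this toy version shows the problem: every partner brings its own dialect, credentials and quirks, and the gateway has to speak all of them correctly, in both directions.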
The "cure"? As I am a big fan of hardware-based, a.k.a. token-based, strong authentication mechanisms, vendors of non-hardware-based mechanisms usually have a hard time convincing me that their product briefings are worth my attention. MultiFactor's Garret Grajek was one of those CTOs whom I gave a hard time until I finally arranged an appointment for a briefing. What can I say? The approach of using soft certificates as a second factor for authentication, combined with out-of-band (a.k.a. SMS-based) messaging during registration of a computer/session, did impress me - because it was so simple and straightforward! Especially for me, who uses multiple devices in parallel to access e.g. my mail, registering my personal computer at home or my client's laptop in the customer network to access Outlook Web Access really did the trick. OK, the downside is that I still need to log in with my AD credentials - but that is something I criticized about Entrust's GRID authentication scheme as well (which I love, because it is such a low-priced alternative to OTP tokens). Back to my project experience with outsourcing and "Cloud Services": MultiFactor has now launched a nice extension which makes this approach available for use with services such as SalesForce.com and GoogleApps by leveraging federation technology. Now, I have to admit, this is something one can hardly achieve with one's own smartcard- or token-based authentication technology - especially not if one frequently changes the machine used. I guess if this approach can be tied into an authentication strategy and could be supported by one of the Versatile Authentication Platform solutions, I could become a full supporter of these ominous "soft tokens".
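A rough sketch of what such an out-of-band registration flow can look like, assuming nothing about MultiFactor's actual implementation: a one-time code goes out over the second channel (SMS), and only after the user echoes it back is a soft certificate issued for that device. send_sms() and the returned certificate string are stand-ins, not a real API.

```python
# Illustrative out-of-band (SMS) device registration flow.
# send_sms() and issue of the "soft certificate" are stand-ins.

import secrets

PENDING = {}   # phone number -> expected one-time code (single attempt)

def send_sms(phone, text):
    print(f"SMS to {phone}: {text}")   # stand-in for a real SMS gateway

def start_registration(phone):
    """Generate a one-time code and push it over the second channel."""
    code = f"{secrets.randbelow(10**6):06d}"   # 6-digit one-time code
    PENDING[phone] = code
    send_sms(phone, f"Your registration code is {code}")
    return code   # returned here only so the example can complete the flow

def complete_registration(phone, code):
    """Issue a soft certificate only if the echoed code matches; one try only."""
    if PENDING.pop(phone, None) != code:
        return None
    return f"soft-cert-for-{phone}"   # stand-in for certificate enrollment

code = start_registration("+49-170-0000000")
print(complete_registration("+49-170-0000000", code))
```

The charm of the scheme is exactly what impressed me in the briefing: no hardware to ship, yet the attacker needs both the first-factor credentials and control of the second channel.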
Still, this does not directly help with my friends' subscriber data, which needs to be updated daily. Fortunately, last Friday I had a briefing with nuBridges, a vendor of data protection tools that target both data at rest and data in motion. For the data-at-rest part, through tokenization, scrambling and obfuscation, data - especially sensitive information such as credit card numbers - can be altered and stored in such a way that unique identification is still possible but leaked data would essentially be worthless. I won't go into too much detail on this, but my experience with outsourcing and out-tasking applications that also handle payment transactions tells me that there is some need for this. I was far more interested in their secure data transfer solution, called nuBridges Exchange. Again, without going into too much technical detail, this solution provides a nice standard off-the-shelf product to handle multiple parties exchanging large quantities of files in a secure way. Besides support for all varieties of secure file transfer protocols, the most important aspect is the streaming capability of the solution. The files in transfer are not stored on the receiving end of the transfer connection but rather streamed onwards to a protected internal storage system. As the receiving server sits in between two firewalls and the "inbound streaming" transmission through the internal firewall is initiated by the control server inside the secured area, no open ports need to be added to the internal firewall system. As the time for a first briefing is usually insufficient to go into much detail, I was unable to investigate the architecture and implementation further, but the management interface, the report dashboard and the availability of a self-service portal for the business partners made a rather good overall impression.
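The tokenization idea can be sketched in a few lines of Python. This is purely illustrative and not nuBridges' actual mechanism: the real card number stays inside a vault, and downstream systems only ever see a random token that still identifies the record uniquely.

```python
# Minimal, illustrative tokenization vault: the real PAN (card number)
# never leaves the vault; everything downstream works with the token.

import secrets

class TokenVault:
    def __init__(self):
        self._by_token = {}   # token -> PAN (the protected mapping)
        self._by_pan = {}     # PAN -> token (keeps tokens stable)

    def tokenize(self, pan):
        """Return a stable random token for a card number (PAN)."""
        if pan in self._by_pan:          # same PAN always yields same token,
            return self._by_pan[pan]     # so unique identification still works
        token = secrets.token_hex(8)     # random: worthless if leaked
        self._by_token[token] = pan
        self._by_pan[pan] = token
        return token

    def detokenize(self, token):
        """Only the vault can map a token back to the real PAN."""
        return self._by_token[token]

vault = TokenVault()
t = vault.tokenize("4111111111111111")
print(t, "->", vault.detokenize(t))
```

The design point is the asymmetry: every system outside the vault can match, join and report on the token, but a leaked database of tokens reveals nothing about the actual card numbers.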
I am looking forward to investigating these solutions further, and I will certainly also take a closer look at their Exchange Network service - especially as protecting credit card data at the point of sale and between PoS and central merchant systems seems to be attracting the attention of auditors lately.
What do you think about protecting data transfer and authentication/authorization strategies in a Cloud-environment? Let me know!
I recently took the chance to investigate the virtualization market a bit deeper, namely the market for Virtual Desktops, as I have been used to server virtualization and its different flavors for some time. While server virtualization was pretty straightforward with regard to approach and deployment, and those systems - once deployed - had little to no influence on how one runs one's environment from a management perspective, Desktop Virtualization does seem to put some new obstacles in the way when it comes to identities, access to resources and the management thereof.
While most large vendors like VMware, Microsoft and Citrix are eager to round out their offerings with tools for deployment management, load balancing and session brokering, up to live-streaming of virtualized applications into the equally virtualized desktops, the access to, usage of and separation of resources sometimes is not really that well thought through. As an example, it scared the hell out of me that "security", as a catch-all term, was highlighted as the differentiator between the "Professional" and "Platinum" editions of one vendor's offering. Say what?
How long have we personally, how long has the community preached that "security" needs to be integrated right from the start, should be basic and mandatory, and not an additional feature you have to pay a premium for? Even if this may in detail refer to features one will only reap benefits from when deploying a massive enterprise-scale solution, the decision to use and deploy the necessary security barriers and segregations should remain with the customer and should not be "suppressed" by licensing schemes.
One thing that really gave me the shivers, though, was the idea of identity management within the virtualization technology: if you strip the OS from the machine, then strip the user profile from the OS and finally strip the applications from that, to mix and match them all together at run-time, one does not only have to take care of the traditional "who has access to what" question but also has to make sure that the "on-the-fly" provisioning of applications to the virtual desktops, and the access rights within those, can be managed properly. While I am totally PRO desktop virtualization regarding software management, efficiency and especially updates and patches, I have yet to find a virtualization engineer who could explain to me in detail how this whole monster is handled identity-wise...
I recently bought a very expensive high-end Sony VAIO VGN-Z31 and was more than surprised, and downright angry, when I found out they had disabled the "VT" support of the Intel CPU, making it almost useless when it comes to virtualization with Virtual PC, VMware Workstation, Xen or whatever your favourite hypervisor is.
With their latest set of EFI updates (EFI being the successor to the classic BIOS), they have finally given in to the numerous customer complaints, all coming from power users and professionals who were upset to have just spent 2,000-3,000 €/$ on a machine that basically left them without support for virtualization.
Vaio customers, rejoice! Check the update sources for your machine, and hopefully you will find a matching update. For all others: check out the "reverse engineered" hacks for activating VT... Happy VMwaring
Sebastian PS: off to get that SQL Server running...
I guess it has become unpopular in some societies to read printed news, but I really enjoy reading WELT KOMPAKT, a smaller printed form factor of the well-known daily WELT. Today, the more or less entertaining "Internet" section had a lead article called "Safe in the Web 2.0" ("Sicher im Web 2.0") by author Peter Zschunke. Eager to learn more about how "the general public" is informed about the dangers that lurk in the web, I read the mid-size article, featuring a James Bond-like shot of what seems to be a Security Operations Center. My interest turned into surprise, ending in a sort of rage, when I finished the article. It takes quite some time and effort to make me angry, but I instantly - for the first time in my life - wrote a letter to the author and the editors, which went like this: Dear Sir or Madam, dear Mr. Zschunke!
I read with initial interest, and later with growing bewilderment, what Welt Kompakt printed as an editorial contribution in its Internet section. To me, this very one-sided coverage, which unfortunately shows little journalistic quality, sounds more like an advertorial than like good research and comprehensive information. The format and length may excuse the fact that only a fraction of the data security and privacy issues in Web 2.0 can be covered here. But then seriously telling the reader that the company RSA has "the solution on the shelf" and could practically "conjure these problems away", if only the social networkers would finally get up out of their armchairs? I consider that not just incorrect, I consider it dangerous! Especially since "RSA" is really the company name and not the product, and you are, I assume, actually talking about a combination of the enVision product line with other tools. At least mentioning a few comparable technologies or vendors such as Novell, ArcSight, CA etc. would have served neutrality well... RSA's products and solutions are certainly recognized and effective, both for analyzing (mis)behavior and for access protection and encryption. But, to paraphrase Bruce Schneier: "If you think technology can solve your problems, you understand neither the technology nor your problems."
The problem with the very one-sided coverage remains: the real work lies in improving the concept of social networks, their data collection and data management, and in better educating users. In my opinion, your article actually stands in the way of educating users, since it senselessly calls for technology where plain common sense would be a far better protection against abuse. This article leaves a very stale aftertaste with me.
There is nothing wrong with a good advertorial or product-related story, but this was so blatantly one-sided, I just could not resist! I would love to discuss this with all of you - feel free to comment, mail or call me!
#SAPTechEd - SAP Netweaver & GRC Identity Management During the last 30 months I was rather critical of SAP's approach to positioning and further developing the technology acquired from Norwegian MaXware in 2007. The visit to SAP TechEd 2009 in Vienna showed, through several technical presentations and direct interviews with people such as Keith Grayson, that SAP did a really good job of not only integrating MaXware into the Netweaver group but also coming up with a sound strategy for moving forward with the whole offering. Besides the fact that the Business Objects GRC systems still have some valuable functionality as a provisioning tool for complex environments, the capabilities regarding "Netweaver to SAP application" provisioning can now safely be called "unparalleled" in the market. If you have access to the SDN platform, make sure to get your hands on the numerous slides in the SIMxyz track of TechEd. You can learn how to easily implement SAP Netweaver Identity Management, integrate with SAP Business Objects GRC and much more. As pointed out above, the joint deployment of the "standard provisioning engine" and the GRC one does have some benefits, especially if the Compliant User Provisioning (CUP) features are needed due to strong GRC requirements. It was stressed in the sessions that such a design needs to be planned very carefully and that cross-competence teams should be in charge, to get all requirements and stakeholders represented in the final architecture. Regarding third-party system integration, the ongoing standardization plays into SAP's hands; Keith and I discussed the growing relevance of SPML and SAML 2.0, the latter of which, by the way, has now been tested and certified to work with SAP identity management solutions and might find its way into the core product in the future. More and more provisioning targets are becoming easier to integrate, as the corresponding ISVs now see openness towards IAM solutions as a benefit.
To sum up the impressions: Keith and all the others did a great job of "turning around a skeptical analyst". I am positive that the current setup and strategy will result in a good position for SAP in the ever-changing Enterprise Identity Management market.
I already pointed out my personal satisfaction with the recently announced cooperation between SAP and Novell in the GRC market. This morning I had the opportunity to discuss the whole approach with Jay Roxe of Novell and Ranga Bodla of the SAP GRC group, both operating out of the US. Besides my enthusiasm about seeing something materialize that I had suggested would be beneficial (every once in a while, analysts DO show that they are human, too!), our views on the business opportunities, market pull and demand for GRC in general were almost identical. First, let's check the market pull: both companies said they received multiple requests from existing customers for insight on how to couple the more business-GRC-oriented SAP solutions with Novell's more IT-GRC-oriented SIEM tool Sentinel. As open APIs were already available and Novell had its products on the path to SAP certification, taking the next step and analyzing the related business opportunity was only a matter of weeks. The joint approach, beyond using and testing the APIs, was then tested by a large consulting and system integration company in their labs. It looks like when there is a proven market, everybody is interested in providing a solution. Second, the demand for end-to-end GRC solutions: as KuppingerCole indicated during last year's GRC event in Frankfurt, more general and broader-oriented solutions would be necessary and on offer soon. Only 10 months later, not a single product but a joint solution IS available! SAP and Novell beat our projections, and I guess it will take another 6-9 months before we either see another co-op or even a merger between two niche players to offer a competing solution or product. Third, the business opportunity: SAP, being the Business Intelligence provider they are, was quickly able to provide Novell with numbers on SAP GRC customers, and quite a few hundred of them were identified as possible candidates to be addressed for a joint deployment.
Vice versa, existing Novell customers with SAP deployments turned out to be of a significant magnitude, so both groups form a considerable target. We at KuppingerCole can only second that both the identified customers and the remaining "white space" in the market would benefit from a joint and integrated deployment - the former generating added value almost instantly, the latter reaping the benefits from the (expectedly) available best practices generated by the early adopters. General perspective: KuppingerCole sees its own projections and analysis fulfilled ahead of time! SAP and Novell now have a considerable head start in the market and thus have the potential to counter offerings from Enterprise GRC vendors such as BWise, OpenPages or Mega, due to the breadth and depth of the combined solution. If you would like further insight into which GRC approach makes sense for you, feel free to contact us and make sure to attend our upcoming related webinars: http://www.kuppingercole.com/webinars
Communication & Collaboration - that is what email is all about, or should be. The GoogleWave concept mimics snail mail and a wiki at the same time, while being both a protocol and an application. The demo looks like a cooperative instant-message chat, but displayed character by character, giving an almost face-to-face chat impression... Anyone who has used OneNote online before may be used to seeing the joint changes of multiple participants in one document - but it is amazing to see even uploads of photos and other material into the wave in the blink of an eye. To see somebody add a Google map into the wave and have it adjust to show the right location IS amazing!
Let us put it like this: as a digital nomad and "never in my own office" worker, I want this, and I want it NOW! Now for Enterprise 2.0: adding SAP's business process design tool Gravity to Wave enables cooperative work on new process designs inside the Wave. Redesigning processes to accommodate changes caused e.g. by Mergers & Acquisitions now becomes easier thanks to real-time collaboration between subject matter experts. Cool user experience...
Again, sorry for bothering you with non-IAM information, but this is highly interesting for those looking into business GRC. Just now, Nokia, SAP and Giesecke+Devrient announced a joint venture called Original1, which will offer SaaS solutions for anti-piracy and anti-counterfeiting projects. The goal is to enable customs officers, supply-chain service providers and possibly wholesale customers to check and verify whether a certain batch or delivery is actually original product or counterfeit merchandise. The solution will leverage technology from all three vendors, comprising SAP ERP back-end information, Nokia mobile device extensions for on-site reading/scanning of products, and G+D technology to secure the process steps and information. The company will be led by Claudia Alsdorf as CEO and will be located in Frankfurt, Germany. As to specific requirements, the solutions will be technology-agnostic and available on devices and systems not offered by the contributing parties. Target customers will be brand owners and vendors of high-value or high-risk products, e.g. luxury goods, pharmaceuticals and the like.