Identity Management – Process or Technology?
One line of thinking has been that the major cause of identity theft and data loss is poor process and that strengthening the process is the key approach. Strong processes are indeed required but a strong process can be undermined by a weakness in technology.
The electronic identity of someone depends upon the process for establishing that identity. Even biometrics depends upon the identity of the person being confirmed through a process or paper trail.
However, the mechanism for proving the identity (authentication) needs to be chosen according to the risk. Traditionally this risk was fixed by the circumstances under which the identity is used – for example, to access email internally. A password is cheap but relatively weak; stronger forms like smart cards are expensive. The RSA SecurID token was a nice compromise.
Wrongly assessing the risk, or choosing the wrong technology, undermines the process. The recent closure of the European Carbon Trading Market is an example of what happens when this goes wrong. Most operations at Europe’s 30 registries for greenhouse-gas emissions were suspended on Jan. 19 after a Czech trader reviewing his $9 million account found “nothing was there.” The EU estimates permits worth as much as 29 million euros ($39 million) may be missing. Was this process or technology?
Now that systems are regularly accessed via the internet, for example by mobile employees or through adoption of the Cloud, a more resilient technology is needed. An emerging solution is “versatile authentication”, where multiple factors – such as the location of the request, the time, and the value of the transaction – are taken into account. A versatile approach can be quickly reconfigured in response to a new vulnerability, demanding further proof of identity.
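To make the idea concrete, here is a minimal sketch of how such risk-based ("versatile") authentication might work. The factor names, weights, and thresholds are illustrative assumptions, not taken from any real product; the point is that the policy is data, so it can be retuned quickly when a new threat appears.

```python
# Hypothetical sketch of "versatile" (risk-based) authentication.
# All factors, weights, and thresholds below are illustrative assumptions.

KNOWN_LOCATIONS = {"office", "home"}

def risk_score(request):
    """Combine contextual signals into a single risk score."""
    score = 0
    if request["location"] not in KNOWN_LOCATIONS:
        score += 40  # request from an unfamiliar location
    if not 8 <= request["hour"] <= 18:
        score += 20  # outside normal business hours
    if request["transaction_value"] > 10_000:
        score += 40  # high-value transaction
    return score

def required_factors(score):
    """Map the risk score to an authentication requirement.
    These thresholds can be reconfigured when a new vulnerability emerges."""
    if score >= 60:
        return ["password", "one_time_code", "phone_call"]
    if score >= 30:
        return ["password", "one_time_code"]
    return ["password"]

# A risky request triggers step-up authentication:
request = {"location": "internet_cafe", "hour": 23, "transaction_value": 50_000}
print(required_factors(risk_score(request)))
```

A fixed password check, by contrast, treats the midnight $50,000 transfer and the routine office login identically.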
During the 1980s and 1990s the value of sharing information through “Groupware” was very high and the need for security was downgraded. The normal access mechanism implemented in most environments is called “Discretionary Access Control” or DAC. In this model, if you have legitimate access to some information, you have discretion over what you do with it: you can copy it, print it, e-mail it, and so on. This makes it easy for someone who has access to steal or misuse information. During the 1970s a stronger form of access control was invented called “Mandatory Access Control” or MAC. In this model data is tagged so that only people authorized to access it are able to, and it is not possible for one person to copy the data to give it to another unauthorised person. This approach has now been reinvented under the name of Data Loss Prevention and Digital Rights Management technology.
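The difference can be sketched in a few lines. The labels and clearance levels below are illustrative assumptions; the essence of MAC is that the system, not the data owner, enforces the rules, including the rule that stops a legitimate reader re-labelling sensitive data downward to leak it.

```python
# Minimal sketch of Mandatory Access Control label checks.
# Level names and ordering are illustrative assumptions.

LEVELS = {"public": 0, "confidential": 1, "secret": 2}

def can_read(user_clearance, data_label):
    """A user may only read data at or below their clearance level."""
    return LEVELS[user_clearance] >= LEVELS[data_label]

def can_relabel(old_label, new_label):
    """Copying secret data into a public document ("writing down") is
    forbidden even for users cleared to read the data -- this is the
    rule DAC lacks, and what stops a legitimate reader leaking it."""
    return LEVELS[new_label] >= LEVELS[old_label]

print(can_read("secret", "confidential"))   # True: cleared to read
print(can_relabel("secret", "public"))      # False: blocked leak
```

Under DAC, the `can_relabel` check simply does not exist: once you can read the file, you can mail it anywhere.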
Many organizations have poor processes for identifying valuable information and poor technology to prevent that information from leaking. See the recent example of a former Goldman Sachs programmer who stole key intellectual property. http://www.bloomberg.com/news/2011-03-16/ex-goldman-programmer-aleynikov-s-conviction-is-upheld-by-trial-judge.html
Abuse of Privilege
The infrastructure upon which cloud computing is built needs to be managed and maintained. To perform these tasks the servers, platforms and applications need powerful administrator accounts. These accounts are used by the Cloud Service provider to perform essential administration, yet they represent a potential risk because they allow powerful actions, such as bypassing normal access controls to read application data, and changing or erasing entries in the system log. Managing the identity of these administrators is a critical issue for information security in the Cloud.
Distributed systems technology has an inherent weakness - the privileged accounts. Many organizations do not have processes in place to compensate for this. See the recent example of an administrator who held the City of San Francisco to ransom. http://www.pcworld.com/businesscenter/article/148469/it_admin_locks_up_san_franciscos_network.html
Privilege Management (PxM) technology is an emerging solution to manage this weakness.
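As a rough illustration of the PxM idea, here is a hypothetical sketch of a broker that mediates privileged actions. The class, policy, and log format are illustrative assumptions: the two controls shown are that an administrator cannot approve their own privileged action, and that the audit trail is hash-chained so an entry cannot be silently erased - addressing exactly the log-tampering risk described above.

```python
# Hypothetical sketch of a Privilege Management (PxM) control.
# The broker, approval rule, and log format are illustrative assumptions.
import hashlib

class PrivilegeBroker:
    def __init__(self):
        self._log = []               # append-only audit trail
        self._prev_hash = "0" * 64   # seed for the hash chain

    def _append_log(self, entry):
        # Chain each entry to the previous one, so erasing or editing
        # any record breaks the chain and is detectable.
        digest = hashlib.sha256((self._prev_hash + entry).encode()).hexdigest()
        self._log.append((entry, digest))
        self._prev_hash = digest

    def run_privileged(self, admin, action, approved_by):
        """Allow a privileged action only with independent approval."""
        if approved_by == admin:
            self._append_log(f"DENIED {admin}: self-approval for {action}")
            return False
        self._append_log(f"ALLOWED {admin}: {action}, approved by {approved_by}")
        return True

broker = PrivilegeBroker()
print(broker.run_privileged("alice", "reset_db_password", approved_by="bob"))
print(broker.run_privileged("alice", "clear_system_log", approved_by="alice"))
```

Real PxM products add credential vaulting and session recording on top of this, but the principle is the same: privilege is brokered and witnessed, never held outright.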
Bottom line - strong process always needs to be backed by good technology. Many of the technologies in use today have significant weaknesses and vendors need to work to remove these.