Matthias offers a critical analysis of the EU's NIS2 Directive's intricate demands, drawing attention to the limitations of one-size-fits-all solutions. He advocates for customized compliance plans, underscoring the unique challenges across various entities, with special attention to the constraints faced by SMEs. The future transposition of this EU directive into national regulation adds yet another layer of complexity.
Essential strategies such as comprehensive risk evaluations, continuous educational efforts, and advanced incident management protocols are emphasized as crucial for effective compliance, integrating cybersecurity deeply into the organization's core values beyond just adherence. The talk concludes with a perspective that views NIS2 compliance as a dynamic goal necessitating enduring dedication and flexible approaches.
When PSD2 takes effect, banks across the European Union will be required to expose their core banking functions to these TPPs via APIs. It is imperative that banks begin now to build and lock down APIs in preparation for PSD2. We will take a look at the Open Banking APIs as well as some other competing API offerings, and discuss API security methods.
Strong Customer Authentication (SCA) is the second core technical requirement of PSD2. Banks and TPPs alike must provide mechanisms for at least two-factor authentication for their customers; risk-adaptive authentication is preferred. Additionally, PSD2 states that financial transaction processors must employ user behavioral analytics for higher assurance beyond the SCA requirements.
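To make the idea of risk-adaptive authentication concrete, here is a minimal, purely illustrative sketch: a handful of risk signals are combined into a score that decides whether a second factor must be requested. The signal names and thresholds are assumptions for illustration, not values taken from the PSD2 RTS (apart from the commonly cited 30 EUR low-value ceiling for remote payments).

```python
# Hypothetical sketch of risk-adaptive step-up authentication: a
# transaction's risk score decides whether a second factor is required.
# Signals and thresholds are illustrative assumptions, not RTS rules.

def risk_score(amount_eur: float, new_device: bool, foreign_ip: bool) -> float:
    """Combine a few illustrative risk signals into a score in [0, 1]."""
    score = 0.0
    if amount_eur > 30:      # above the RTS low-value exemption ceiling
        score += 0.4
    if new_device:           # customer is on a device not seen before
        score += 0.3
    if foreign_ip:           # request originates from an unusual location
        score += 0.3
    return min(score, 1.0)

def requires_second_factor(amount_eur: float, new_device: bool, foreign_ip: bool) -> bool:
    """Step up to a second factor whenever the combined risk is non-trivial."""
    return risk_score(amount_eur, new_device, foreign_ip) >= 0.4
```

In a real deployment the score would come from a behavioral analytics engine rather than three hand-picked flags, but the step-up decision point looks much the same.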
Banks will soon have to comply with the Revised Payment Service Directive, commonly called "PSD2." The directive will introduce massive changes to the payments industry and radically alter the user experience for customers of European banks by allowing third party payment service providers (TPP) to access their account information to provide various innovative financial services. But to mitigate risk, banks and TPPs must address the core regulatory technical requirements outlined by PSD2.
On January 13th, 2018 a new set of rules for banking came into force that open up the market by allowing new companies to offer electronic payment services. On November 27th, 2017 the European Union published a press release and a draft Regulatory Technical Standard (RTS) on strong customer authentication.
On the one hand, the press release says that "thanks to PSD2 consumers will be better protected when they make electronic payments or transactions because the RTS makes strong customer authentication (SCA) the basis for accessing one's payment account, as well as for making payments online". However, the RTS explicitly stops short of preventing Payment Service Providers (PSPs) from using the customer's account credentials or from imposing redirection to the Account Service Provider for authentication.
This session will discuss the security implications of this RTS on the use of proven industry standards such as OpenID and SAML as part of secure authentication for open banking.
PSD2 and the Open Banking Standard are regulatory mandates being applied to the banking industry by the European Banking Authority (EBA) and Competition & Markets Authority (CMA) across Europe and in the UK respectively. The regulations require that banks operating across the region expose open APIs to allow other banks and third parties to access the data they hold on customers, when the customer has given their explicit consent. Designed to improve choice for customers, create more competition and stimulate innovation in the finance sector, the introduction of 'open banking' in the UK and across the EU will transform banking as we know it.
So good morning, good afternoon once again, ladies and gentlemen, and welcome to this KuppingerCole webinar. I would now like to hand over to our first speaker, Dave Kearns, Senior Analyst at KuppingerCole. Thank you. Good morning, good afternoon, good evening, ladies and gentlemen. Regulatory compliance: many think that this has been the major driving force behind the spread of identity management and security management since the beginning of the 21st century.
Many people trace this back to the financial problems we had at the end of the 20th century and governments mandating standards for security and identity, which then drove the industry. But today we don't want to talk about government-mandated compliance standards.
Instead, we're actually going to talk about something that's even bigger than that, because it encompasses a worldwide compliance standard, and it's a standard for those who rely on credit card transactions for their business. What we want to talk about is what's usually referred to as PCI DSS. PCI is the Payment Card Industry, and we'll define that for you in a minute. And DSS is the Data Security Standard, which is the standard that has to be complied with.
Now, I'm not going to tell you a whole lot about what the DSS is, because we have a couple of acknowledged experts in that field to tell you about it. What I want to do today is to set the stage for their discussion by going over the history: what we have, and why we're at the point we are today. The PCI DSS standard was created to increase the controls around cardholder data to reduce credit card fraud. PCI, the payment card industry, covers credit cards, charge cards, debit cards, and so on: those pieces of plastic that we use all the time.
We know that credit card fraud is one of the biggest instances of so-called identity theft that we see these days. So it's important that we acknowledge that and that we do something about it. And that's what the PCI DSS standard is all about.
Now, we need to go back a ways to understand how we got here. Thirty or 35 years ago, or even before that, 50 years ago, when the charge card industry (Diners Club, American Express) first started, there were few instances of fraud and there really wasn't much done about it. Then, as charge cards came in, say 30 or 35 years ago, retailers had these imprinters; they had paper receipts with interleaved carbons.
They would put the receipt in the imprinter, put the embossed card in the imprinter, imprint the receipt, write out the financial transaction, give a copy of the receipt to the customer, keep a copy for themselves, and send a copy off to their payment processor. And the carbons were dumped in the trash. To ensure that a card was valid, the credit card companies would send out a booklet weekly, listing all of the bad credit cards. It was a blacklist.
You would check that, and if the transaction was over a certain monetary amount, then you would have to call to get approval and speak to a human being who would check an up-to-the-minute list. This did combat fraud in certain ways, until we realized that all those carbons we were throwing away contained an awful lot of personally identifiable information about the users, which could be used to commit fraud. We got rid of the carbons and then moved on to some other things. Coming forward towards the end of the 20th century, plastic card use had grown exponentially.
We have magnetic stripe cards now that are connected to electronic terminals. We have catalog sales, both by mail and by telephone. All of these are handled by transaction processors who have the data sent to them over telephone lines. They validate it, they verify it, and they tell the retailer whether or not he can go ahead and process the transaction. And then they keep that information. The vulnerability here was found to be those wires in between the order and the fulfillment: people would tap those lines and find out what was going on.
And we come up to current history, where now we've cut out a lot of those middlemen. Customers now go online, either from a computer or some sort of mobile device, connect somehow to this large cloud, and transmit their financial information, which then shows up in some big storage facility somewhere. And now the storage facility has become the great vulnerability, because it's storing all of this data that would be useful for identity fraud or credit card fraud, and it is subject to frequent attack.
So all the major credit card, charge card, and debit card issuers, or brands, had developed over the years security methods that they imposed on the people who were handling, issuing, honoring, and processing their cards. In the early part of the 21st century, they realized that there was a lot of duplicative effort here. More to the point, it was 95% duplicative. That means that you couldn't use the same set of standards to handle American Express cards that you used to handle MasterCards; there was a slight difference.
So the major brands (American Express, Discover Financial Services, JCB, MasterCard, and Visa) got together and formed the PCI Security Standards Council, the payment card industry's standards body. They did this so that they could consolidate the standards they were using to protect the information that was being transmitted. They came together and, on December 15th, 2004, released version 1.0 of the Data Security Standard covering these transactions.
Then in September of 2006, the standard was revised to version 1.1, which clarified some of the things in version 1.0 and made some minor revisions. In October 2008, version 1.2 was released, which didn't change any of the requirements but did enhance the clarity, improve the flexibility, and address some of the risks and threats that had evolved since release 1.0 came out in 2004. Finally, in August 2009, version 1.2 was superseded by version 1.2.1. This again made some minor corrections and created more clarity and consistency among the standards and enforcement documents.
So all of that brings us up now to version 2.0, which was released on October 28th, 2010, took effect on January 1st, 2011, and has now finished its nine-month implementation period. That means that at this point, the PCI DSS version 2.0 requirements must be implemented by all entities that process, store, or transmit account data. And there are three main steps that people must follow. There's the assessment step: everyone who handles payment card transactions must familiarize themselves with the DSS and must examine their own situation to make sure they are complying with it.
The council does issue a self-assessment questionnaire, which can be used to make sure that you are following through with the requirements. If any vulnerabilities are found, or if you were out of compliance at any point, then there is a remediation step: you have to go ahead and make corrections. And finally, there's a reporting step: you have to report regularly to your issuing banks and to the brand holders of the cards. There are two levels of compliance: merchants and service providers have to be validated.
This often means having to bring in certified auditors to make sure that you are in compliance with what's going on. This isn't required for banks and other issuing authorities, but they have other steps that they have to take. So that brings us to where we are today, which is talking about version 2.0: what's in it, when do you have to have these steps implemented, and how can you best do it?
And to tell you about that, we've got two acknowledged authorities in this field. Tom Arnold is with Payment Software Company, a firm specializing in payment systems security and compliance for companies that accept and process consumer payments; in other words, people who have to implement version 2.0. His experience includes over 20 years as a VP of engineering and chief software architect, and he's a CISSP and an ISSMP. Joining him is Dr. Torsten George, who has more than 17 years of global experience in promoting security software and network equipment products.
They're going to tell you about version 2.0, what you need to do, and perhaps show you some ways that you can ensure that you're compliant while not having to spend an arm and a leg to get there. So with that, I'm going to turn it over to Tom and Torsten. Thank you, Dave, for laying the foundation for today's webinar. We really appreciate that. So in the next 30 minutes, what we want to do, Tom and I, is really take you through the PCI DSS challenges and give you a little bit of background on PCI and its principles.
Then we'll really focus in on the key changes in PCI DSS 2.0. You might have read about it in the media, but not really a lot of the details of what comes your way, and there are quite a few challenges that you will be facing if you haven't started yet. We'll also look at the impact that these changes create. Then we will introduce you to a concept of managing PCI DSS as a life cycle, and we will share with you a case study. At the end, we will be doing a Q&A session with you, where you can really take advantage of Tom joining us here today.
So Tom, why don't you tell us a little bit more about your company? Good morning, I'm Tom Arnold, and I'm one of the principals at Payment Software Company, or PSC. We're one of the few, or select few, companies that is globally certified to perform assessments and audits on the Payment Card Industry Data Security Standard and the Payment Application Data Security Standard. We are an approved scanning vendor, and we perform penetration testing as well out of our security laboratory.
And we perform several other types of security audits, including audits against the ISO standards, the Verified by Visa audits for 3-D Secure, and audits related to the handling of checks. Our experience dates back to the earliest days of the PCI standard, and we have been doing this since 2003. Do you want to tell us a little about Agiliance? Sure. So Agiliance has been around for more than six years. We're a private company focusing on IT governance, risk and compliance, and security risk management solutions.
We have seen tremendous growth, and our innovative technology has earned three patents and many industry accolades. For instance, we're rated as the highest-strength player in the market by Gartner as well as Forrester. But more importantly, our success has resulted in the fact that we're serving many Global 2000 companies and government agencies when it comes to helping them with compliance, especially today's topic, PCI compliance. So let's go a little bit more into detail. What will you be taking away from today? As we all know, we're currently dealing with PCI in a very manual fashion.
It's a point-in-time approach that we're taking. The QSA kind of takes on the role of a project manager: he's running back and forth, trying to get the information from all the stakeholders that are involved, but it's a rather cumbersome and slow process. What we have seen in the field is that it takes about 90 to 180 days to complete the overall audit preparation process. And a lot of times the knowledge stays within a very small group, and if they were to leave your organization, you would have to start all over again.
So what we would like to introduce you to today is really more of a life cycle concept: something where you break down the process into different phases, consisting of scoping, gap analysis, remediation, certification, and maintenance, and really do this as an ongoing, change-driven process to be continuously compliant. And we will learn today that the changes PCI 2.0 brings to the table will drive you towards this type of approach, because they're quite significant changes. So what are the other things that you will be learning today?
We will establish the challenges of managing PCI DSS, we will discover the key changes in the PCI DSS 2.0 standard, and you will start understanding how these changes might impact your organization as it relates to IT spending and security. And then we will go back to really talk about approaching the overall process as a life cycle management process rather than as a point-in-time approach. So when we talk about PCI DSS challenges, we all know it's a very costly process.
According to a Gartner study, level 1 merchants spend on average $2.1 million on the PCI process; level 2 to level 4 merchants still spend $1.1 million. And it's a very project-driven approach, very resource-intensive. We have heard from many people we talk with that for a longer period of time they don't see their family, because they're spending the time preparing and gathering the evidence required to be successful with the audit. It's a very slow process that you have to follow.
And it also potentially has a low compliance impact, meaning that at the end of the day, what you're doing might not necessarily improve your security posture. The overall process creates a lot of audit fatigue, so a lot of people are jumping ship because they feel stressed out and overloaded with work, and you still get a lot of pressure from the card industry to really meet the requirements. There are more and more companies that get fined for not being compliant. But if you take all of this into account, you would have to say: hey, okay, we spend a lot of money.
We should be doing very well. The numbers show that this is not the case. The Verizon 2010 payment card industry report showed that at the initial Report on Compliance, only 22% of companies achieved compliance. And after the certification, this level drops off dramatically: after 60 days, only 18% of companies are still compliant. And among companies where a breach investigation has been done, the numbers are even more worrisome: only 11% of those companies were still in compliance. This is again a reflection of the fact that it's a more project-driven approach.
Once you have your certification, you applaud yourself, pat yourself on the shoulder, probably rest a little bit, and then prepare again for the next audit cycle. But what we really tell you is that you have to see this as an ongoing, continuous process to really maintain compliance levels. So let's dive more into the details, and I'll hand over to Tom, who really has the deep knowledge on PCI. Thank you, Torsten. We're going to take a step back here; keep in mind the life cycle, because we're going to come back to it.
But let me talk a little bit about what the PCI Data Security Standard is all about, how the standard basically is laid out, and how it works. First off, the PCI Data Security Standard is a core set of 12 principles. People always think of these 12 principles; they look at them and they think, oh gee whiz, this is all there is to it. Well, in the PCI Data Security Standard, as Dave said earlier, in the 1.2.1 standard there were 253 individual requirements that required compliance.
And in order for an organization to be compliant with the PCI Data Security Standard, the passing grade is 100%. So you had to be compliant with all 253 requirements for 1.2.1; under the 2.0 standard, there are 298 individual requirements as sub-requirements of these 12 principles. Quickly, on the 12 principles: number one is firewall configuration. These are the requirements that cover our perimeter security controls and our network security controls. Number two is vendor-supplied defaults; it covers our server, system, and component controls.
So this is the hardening of our systems; these are our configuration standards in those areas. Number three is the protection of stored cardholder data. This is the one that PCI is most remembered for, and it deals with encryption and key management and how the primary account number, the PAN, is actually stored when it's being maintained. One thing I want you to remember is that all of these principles that you see, and the 298 requirements, all focus in on 16 digits: the primary account number, nothing but the primary account number. Number four deals with security while the card number is in transit. So when it's being communicated from the EMV device to the point of sale, through the point of sale, back to the processor or to the processing systems, it's how that data is secured as it's being transmitted. Number five is our anti-malware section. It's not just anti-virus anymore, because that would suggest just the PC environment; the Unix environments and the server environments need to be protected from rootkits and malware. Number six is the security of our software development and our secure systems development.
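Since everything here revolves around those 16 digits, a small illustration may help: PCI DSS's display rule for the PAN permits showing at most the first six and last four digits. A minimal masking sketch (the function name and star character are illustrative choices, not from the standard):

```python
def mask_pan(pan: str) -> str:
    """Mask a primary account number for display, keeping only the
    first six and last four digits visible (the PCI DSS display limit)."""
    digits = pan.replace(" ", "")
    if len(digits) < 13:
        raise ValueError("not a plausible PAN")
    # Everything between the first six and last four digits is masked.
    return digits[:6] + "*" * (len(digits) - 10) + digits[-4:]
```

Storage of the full PAN is a different matter entirely: there the standard requires rendering it unreadable, for example via strong encryption with proper key management, which masking does not replace.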
It is all about our software development life cycles, our change control, our production control: all of those areas. Number seven really deals with authorization. It covers our authorization systems: who is authorized to see cardholder data and to handle it, and what procedures must they follow while handling it. Number eight, unique IDs, is really our password and authentication systems. Things like our two-factor authentication for remote access to the network are in here.
Things like the requirements for passwords to change every 90 days, and the length and complexity of passwords, are also in this section. Number nine is our physical security controls. We want to protect the systems that are involved, and for retailers that includes the protection of the point-of-sale environment in the retail stores as well. Number ten is our logical access controls; it's basically our section on logging. This is where we capture the logs of all the events that are going on in our systems, and this is where we analyze whether or not we have a problem.
It's not just a question of performance monitoring; it's also the question: is anyone trying to attack our systems? Number eleven is the requirement for the security department to test security controls and to test systems. This is our scanning, our vulnerability assessments, our search every quarter for rogue wireless devices and rogue devices. And then finally, number twelve: we have our security policies.
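As a rough illustration of the requirement-8 password rules just mentioned (rotation every 90 days, plus length and complexity), a policy check might look like the sketch below. The exact thresholds, minimum length 7 with both letters and digits, reflect PCI DSS 2.0's sub-requirements as we understand them, but the function and field names are hypothetical, so consult the standard directly.

```python
from datetime import date

# Illustrative password-policy check for PCI DSS requirement 8.
# Thresholds: minimum length 7, letters plus digits, rotated within 90 days.

def password_compliant(password: str, last_changed: date, today: date) -> bool:
    long_enough = len(password) >= 7
    has_letter = any(c.isalpha() for c in password)
    has_digit = any(c.isdigit() for c in password)
    fresh = (today - last_changed).days <= 90   # 90-day rotation window
    return long_enough and has_letter and has_digit and fresh
```

A real implementation would also enforce history (no reuse of recent passwords) and lockout after failed attempts, which sit in the same section of the standard.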
Now, when you overlay this across our life cycle, let's look at what the impacts and changes in PCI 2.0 are within that life cycle, starting early on in the scoping phase with the definition of scope. In the past, we looked at scope as being any systems or technologies that stored, processed, or transmitted credit card or cardholder data. Now it's people, process, and technology that store, process, or transmit cardholder data, which is a significant change. We'll talk more on the scoping area. The QSA must sign off and validate that the scoping process was complete.
The QSA is our term for our outside auditor; that is the assessor who's looking at whether or not the organization is compliant. In the gap analysis, governance of policies, evidence of execution of policies, and the controls over virtual environments are in here. There's one hundred percent asset coverage required for systems that are within our cardholder data environment and any systems that are connected to the cardholder data environment.
So that administrator's workstation that connects and has administrative access to the database server that is actually storing our cardholder data: guess what, that administrative workstation is now within the cardholder data environment and required to comply with all 298 of the requirements. Then finally, in remediation, there is the question of risk modeling, the question of the vulnerability assessments that are going on, and the collection of information related to the risks of the organization. Now let's look specifically at some of the changes related to scoping.
There's a change in approach and responsibility. In the past, the qualified security assessor would come in, analyze the organization, and do discovery work to try to figure out where the cardholder data is in the organization. We called this acceptance channel analysis, as a technical term, and we would begin to identify what systems needed to be evaluated for compliance with the standard. Under 2.0, there has been a significant change.
In essence, this is a quote directly from the standard: "at least annually and prior to the annual assessment," the assessed entity (that would be you out there) should confirm the accuracy of their PCI DSS scope "by identifying all locations and flows of cardholder data and ensuring they are included within the scope." Then there are many more definitions that actually take place. The process for scoping is basically this: the entity assesses, identifies, and documents the existence of cardholder data. This is a non-trivial piece of work.
It involves a lot of search, a lot of discovery, and a lot of analysis with regard to the life cycle of handling this data. Cardholder data can be found in accounting organizations. It can be found in sales audit. It can be found in audit organizations. It can also be found in network sniffing devices: devices that are designed specifically to enhance the performance of your own internal network can suddenly have cardholder data within them. Once all the locations of cardholder data are identified and documented, the entity uses the results to verify the scope.
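A minimal sketch of what such a discovery scan might look like is below. It is hypothetical: real discovery tools also crawl file shares, databases, and packet captures, and a Luhn checksum alone does not prove a digit run is a PAN, it only cuts down false positives.

```python
import re

# Hypothetical cardholder-data discovery sketch: find 13-16 digit runs
# in text and keep only those that pass the Luhn checksum.

PAN_CANDIDATE = re.compile(r"\b\d{13,16}\b")

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum used by payment card numbers."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_pan_candidates(text: str) -> list[str]:
    """Return digit runs in the text that could plausibly be PANs."""
    return [m for m in PAN_CANDIDATE.findall(text) if luhn_valid(m)]
```

Running something like this over logs, exports, and email archives is one way to surface the surprise locations of cardholder data the speaker describes.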
So what we're saying here is that there's specific, scientifically based testing that takes place, where systems that are believed to be out of scope are actually analyzed to see if they contain data or have access to the cardholder data environment. Finally, the entity considers any cardholder data found to be in scope, documents that scope, and retains all the evidence that shows how they performed the scoping and the scoping analysis. The entity then makes an assertion with regard to the scope.
It must also explain how any network segmentation isolates systems that are not in scope; isolation via network segmentation is one of the new terms that has come in. So if we're saying that a system is out of scope, it is either physically or logically isolated, and for the technical people on the phone, that is basically Layer 2 / Layer 3 isolation from the cardholder data environment. And then the entity must have evidence that fully supports any conclusions they make with regard to scoping.
At the end of the scoping effort, the qualified security assessor is now only required to review the results and document their agreement with the assertion the entity is actually making. Those are the pieces for scoping. Now we're going to talk a little bit about the impact of PCI DSS 2.0 on IT operations, so this is just about the technical operations of the systems themselves.
First off, there's an expansion of existing requirements. Testing procedures have been added to the standard to replace what were bulleted items in the testing procedures. There's been significant rewording. As a good example, in the logging sections, where you're told to retain your logs for at least one year, they added the word that the logs have to be immediately accessible. Now that word has a lot of variance as far as definition: what is immediate? The average person on the street would believe immediate is the response to a Google query.
When you're doing a search online, that's fairly immediate. So how those logs are actually stored, and how those logs can actually be retrieved so that they are in fact immediately available, is a very difficult problem. There are a number of places in the standard where changes have occurred with regard to not just rewording but also clarification and definition, and the Security Standards Council did release a document just this last September 22nd that is called the reporting requirements.
These are the requirements for PCI DSS 2.0 that are put upon the QSA, or the assessor, and they explain how the assessor should interpret the standard and what types of evidence the assessor has to actually collect and specifically represent in the report. This provides an excellent tool for analyzing and evaluating what they mean in the standard when they define and change wording, so it's a very important document to include along with the standard itself. There are limits on sampling for actual testing.
This has been an area where we, as a security assessment company, have actually been working with the Security Standards Council to try to get them to reduce the mandates for full surveys of environments. For instance, where database accounts and database access are concerned, they want us to survey all database systems in the organization. That's very difficult to actually achieve when you're a large telecommunications or global company, and we've made our case to them.
And sampling is now allowed in several of the areas, but it must be noted that the sampling has to be statistically relevant in those areas. There's redefinition of past requirements, there are many, many clarifications of terms that are included in the standard, and there's much greater emphasis on people and process. Let me underscore that: it is all about people, process, and technology that store, process, or transmit cardholder data. And when we talk about cardholder data, we're talking about the 16 digits of the primary account number, the PAN, as we refer to it.
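For the "statistically relevant" sampling mentioned here, one common (though not PCI-mandated) approach is the standard sample-size formula with a finite-population correction. The confidence level and margin of error below are illustrative assumptions, not values prescribed by the standard.

```python
import math

# Illustrative sample-size calculation for auditing a population of systems.
# z = 1.96 corresponds to 95% confidence; p = 0.5 is the most conservative
# assumed proportion; e is the tolerated margin of error. These choices are
# assumptions for illustration, not values mandated by PCI DSS.

def sample_size(population: int, z: float = 1.96, p: float = 0.5, e: float = 0.1) -> int:
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / population)     # finite-population correction
    return math.ceil(n)
```

For example, auditing 500 database servers under these assumptions would call for a sample of roughly 81 systems, far fewer than a full survey, while still being defensible to an assessor.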
There are also new requirements. There is a major requirement that sunrises, as we say, on June 30th, 2012, which basically converts the standard into somewhat of a risk-based approach. This requirement calls for a risk analysis on a routine basis, specifically monthly, for new vulnerabilities that might affect or impact the systems. That risk analysis then has to be applied within the areas of server configuration, within the areas of software testing, and within scanning and vulnerability analysis.
It also applies within the areas specifically regarding time, and the time systems used to synchronize time clocks on our systems. There is the introduction of a requirement for metrics to be included within any vulnerability analysis, also sunrising on June 30th, 2012, and those metrics relate specifically to the use of something called the Common Vulnerability Scoring System. And then there's a given threshold: based on the CVSS, or Common Vulnerability Scoring System, any vulnerability identified with a score greater than four would have to be addressed by the organization.
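In practice, that threshold amounts to a simple filter over scan results. A hypothetical sketch (the record layout and CVE identifiers are illustrative, not from any real scan):

```python
# Hypothetical triage of scan findings against the CVSS threshold the
# standard describes: anything with a base score above 4.0 must be addressed.

def must_address(findings: list[dict]) -> list[dict]:
    """Return the findings whose CVSS base score exceeds 4.0."""
    return [f for f in findings if f["cvss"] > 4.0]

findings = [
    {"id": "CVE-2011-0001", "cvss": 7.5},
    {"id": "CVE-2011-0002", "cvss": 2.6},
    {"id": "CVE-2011-0003", "cvss": 4.3},
]
urgent = must_address(findings)   # keeps the 7.5 and 4.3 findings
```

Feeding monthly scan output through a filter like this, and tracking remediation of everything it returns, is one way to make the new risk-analysis requirement an ongoing process rather than an annual scramble.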
These are all components related to the ongoing maintenance of compliance and that life cycle, specifically in the operations area. As an example, in the requirement 3 area, which deals with stored cardholder data, we now include chip-equivalent data. And this is becoming a very significant piece, specifically in the European environment, where EMV chip-and-PIN exists.
And there are specific details in the standard now covering the chip-equivalent data related to RFID use and near-field communications, which are areas that are coming both within the United States and within Europe right now, especially as we see mobile devices coming into play. And we're now seeing transit systems that allow near-field communications; all that data is also included within the PCI Data Security Standard. Then there's two-factor authentication.
Well, there's always been a reference to individual certificates and the use of a PKI within the organization, as a good example. There are some areas here where the reference to individual certificates is actually missing in the standard, if you notice, and so specifically how this gets complied with has an important impact on our IT operations organization. Audit logs must be available for at least one year, and this is our word "immediately" that I mentioned earlier.
And finally we hit 11.1. Requirement 11.1 is testing for the presence of wireless access points and detecting unauthorized wireless access points on a quarterly basis. As you get deeper into this requirement, it starts discussing all rogue devices, and how they define wireless is another important point: it is not just devices that use 802.11, as an example. It also includes Bluetooth, and it includes things like whether a GSM card has been inserted into a point-of-sale system. So there is a physical inspection component to this as well.
It also includes USB devices that might be installed on point-of-sale systems, which is an attack vector that has actually been used. So it is much more than just wireless scanning; it is the detection of rogue devices. Moving further into the IT operations impacts: the primary impacts affect retailers with multiple locations. In our experience to date with large retail organizations, if an organization is not ready for the PCI 2.0 impact on IT operations, the overall impact of PCI 2.0 can be up to a hundred percent greater than the prior year's audit impact.
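The rogue-device check above boils down to comparing what is discovered, across every interface type, against an authorized inventory. The sketch below is a toy illustration; the device names, types, and the inventory set are all invented.

```python
# Illustrative sketch of requirement 11.1's rogue-device idea: findings from
# 802.11, Bluetooth, GSM, or USB sweeps are all checked the same way against
# an authorized-device inventory. Names and inventory are invented.

AUTHORIZED = {"ap-store-01", "ap-store-02", "pos-usb-keyboard"}

def find_rogues(discovered):
    """Flag any discovered device (of any interface type) not in inventory."""
    return sorted(d["name"] for d in discovered if d["name"] not in AUTHORIZED)

discovered = [
    {"name": "ap-store-01", "type": "802.11"},      # known access point
    {"name": "unknown-bt-headset", "type": "bluetooth"},
    {"name": "gsm-modem-7", "type": "gsm"},          # e.g. card in a POS system
]

print(find_rogues(discovered))
```

In practice the discovery feed would combine wireless scans with the physical inspection results the talk mentions; the comparison logic stays the same.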
It is that significant, because suddenly all of those retail sites have numerous requirements that they now have to comply with. There are IT staffing and headcount issues for scoping and segmentation; the additional work required for scoping and for dealing with network segmentation issues is very significant. There is readiness, there is significant CapEx for upgrades of networking equipment and infrastructure, and there are OpEx expenses involved as well. On the assessment side, the collection of evidence for the assessment can take twice as long for a service provider.
As you heard from Dave earlier, there are service providers: companies like processors or payment gateways that might be aggregating merchant transactions. The impact of PCI 2.0 that we are seeing for them is around 27 percent greater than the past year under PCI 1.2.1. For eCommerce merchants, it is about the same as for service providers, at about 20 percent greater than 1.2.1. For retailers, as I said, it can be a hundred to 120 percent greater as far as the requirements for collection of evidence and the performance of the assessment.
And that is a huge motivator for why a life cycle approach to performing this assessment is extremely important when you are getting into PCI 2.0. It is also why you should start now: because the impact is so much higher. The reporting requirements on the QSA, along with the quality assurance requirements, are very significant, and right now the generation of even a Report on Compliance is taking about double the time it took in the prior year. So it can take up to three weeks to actually generate the reports, and the reports right now are averaging about 260 pages each, even for a small organization. You can see the budget impact: there is CapEx and OpEx impact, and it can be two to three times higher than the prior year's engagement. On security operations, the strongest piece of advice we have for companies that work with us is that the security organization needs to focus its activities on security; compliance is basically a result of building a strong security system.
I get personally very frustrated when I walk into an organization and see that the security officer has been handed the title of compliance officer, and now his job and metrics are all measured on compliance, not security. That is a real problem, because as they focus on compliance, it turns into a project, which is what Torsten talked about earlier. And when you have that project mentality going into this, the likelihood of maintaining compliance and a strong security posture is almost zero: you are going to fall out of compliance, and you are going to fall short on the security front.
We would much rather the organization focus on good security practices. Following the ISO standards, ISO 27001 and all of its family of requirements, is very good; following PCI as well is very good. But it is the strong security practices in their operations that pay the IT security dividend we always like to remind people of: if you focus on security, you will end up with much improved reliability of systems, much greater availability, easier maintainability, trust through integrity, and confidence through privacy.
On the security part, there are the threat and vulnerability management components. We have to verify that our system configuration standards are updated. Requirement 2.2.b specifically references Section 6.2; this is the requirement that will sunrise, meaning it becomes effective as of June 30th, 2012, and it requires that risk assessments be performed on a routine, monthly basis.
In essence, our configuration standards are being updated based on new vulnerabilities that might impact the organization. Time synchronization technology, and its implementation, is extremely important within the organization. An attacker loves to mess with time synchronization, because if they can mess with the clock on a system, it will throw off the logging, throw off our detection systems, and throw off our ability to identify whether we have had a problem and how the problem occurred.
So this scan is now associated specifically with the risk assessment. Scanning is extremely important, and I remind people that performing our internal scans is not just a quarterly event; it is also any time we change the systems. If we are thinking security in our organization, any time our systems are changed, we are rescanning, testing for vulnerabilities, and updating our vulnerability database. According to 11.2.3.b, we are basically updating our vulnerability database based on any new vulnerabilities that come into play with the requirement.
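The scan-after-any-change discipline can be sketched as a small trigger. This is a hedged illustration: `scan_system()` is a stand-in for a real scanner integration, and the CVE identifiers are invented.

```python
# Sketch of "rescan after any change": a change event triggers a scan, and
# any newly observed findings are recorded in the vulnerability database.
# scan_system() is a placeholder for a real internal scanner; data is invented.

vuln_db = {"CVE-2011-1111"}  # previously known findings

def scan_system(system):
    """Stand-in for a real internal scan; returns invented findings."""
    return {"CVE-2011-1111", "CVE-2012-2222"}

def on_system_change(system):
    """Rescan after a change and record any newly observed vulnerabilities."""
    findings = scan_system(system)
    new = findings - vuln_db   # only what we have not seen before
    vuln_db.update(new)
    return sorted(new)

print(on_system_change("pos-server-3"))
```

Wiring a hook like this into the change-management process is what turns scanning from a quarterly event into part of the life cycle.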
Okay, thanks, Tom. Quite overwhelming, and Tom only hit the tip of the iceberg. We will give you a link later on to a white paper Tom wrote, where he goes in depth on all the changes that are current in PCI DSS 2.0. But as you can see, just focusing on the hundred percent asset coverage that is now required: even for a smaller company, that is an overwhelming task. Even if you have 50 to hundreds of assets, you are now talking about having to go through hundreds or thousands of pages of logs to really get on track.
And that is very difficult to do with a project-based approach. So again, with the new requirements, what we are advocating is that you follow a life cycle approach, where you divide your process into stages defined by scoping, gap analysis, remediation and certification, but then also a maintenance phase. Meaning, if you add a new asset, that asset immediately triggers a scoping assessment, and it triggers an assessment of whether there are any gaps related to that asset. So you are not waiting until you are ready to prepare for the audit.
You are immediately starting the process, which shortens the time and streamlines the overall process. So it is all about automation and collaboration, of course. If you work in a life cycle concept, it is really easier to maintain your records and to communicate with each other. This is something that is very important going forward. But what does it really mean for you? Let us take a look at each individual phase.
When we talk about automating scoping, we are really talking about addressing the challenges around the large asset base you have to deal with, where a lot of companies do not even have a clue about their current inventory and are always struggling to update it when it comes time for the PCI audit. You have to understand that you are sharing assets and that there are different hierarchies; then virtualization definitely throws a curveball at you. How should the scoping process be handled? And then there are the new requirements.
If you automate this process as part of a life cycle concept, you have a scalable, asset-centric risk management database that really contains your inventory. You assess once and comply many times, not just for PCI; PCI has overlaps with other regulations, and you can take advantage of that. And then, as I mentioned, you can have automated scoping triggered by database changes.
So if you add a new asset, that immediately triggers a scoping process. If you follow an automated approach, what we have seen in the field is that you can reduce the time you would normally spend, around 30 days, down to 10 days. So let us take a look at gap analysis; gap analysis and remediation fall into the same bucket, they go together.
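The add-an-asset-triggers-scoping idea can be sketched as an event-driven check. This is a simplified illustration, assuming an in-memory inventory; the in-scope rule used here (the asset handles cardholder data, or is directly connected to a system that does) is a deliberate simplification of real PCI scoping, and all names are invented.

```python
# Hedged sketch of automated scoping triggered by inventory changes.
# The scoping rule and all asset names are simplified/invented for illustration.

inventory = {}
pci_scope = set()

def add_asset(name, handles_chd, connected_to=()):
    """Register an asset and immediately re-run the scoping check."""
    inventory[name] = {"chd": handles_chd, "links": set(connected_to)}
    rescope()

def rescope():
    """Recompute scope: CHD systems plus anything directly connected to one."""
    pci_scope.clear()
    chd_systems = {n for n, a in inventory.items() if a["chd"]}
    pci_scope.update(chd_systems)
    for name, asset in inventory.items():
        if asset["links"] & chd_systems:   # connected to a CHD system
            pci_scope.add(name)

add_asset("pos-terminal-1", handles_chd=True)
add_asset("branch-fileserver", handles_chd=False)
add_asset("payment-gateway", handles_chd=False, connected_to={"pos-terminal-1"})
print(sorted(pci_scope))
```

The point is the trigger, not the rule: every inventory change immediately re-evaluates scope instead of leaving it to audit-preparation time.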
Here again, the challenges are around the large number of assessments. You have to think about duplication of effort across assets and assessments; every year you conduct this, there is a lot of duplication. Then there is a lot more evidence required nowadays under the new standard, so just handling that really weighs you down in terms of resources and time. When you automate the gap analysis process, you can conduct unlimited automated assessments and control checks.
You can leverage pre-built connectors and pre-built surveys that make it easier to collaborate and gather the evidence. So it is no longer the QSA running around like crazy asking people to send them the evidence; people get the survey, and you, as the person managing the process, can now see the status applied to it. You can escalate it, there is a workflow to it, and it is all centralized, with an audit trail. And you can reuse all of the information across owners, assets and assessments year by year, which is very important.
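A minimal sketch of that survey workflow, with status tracking, escalation, and an audit trail, might look like the following. The status names, SLA, and dates are invented for the example.

```python
# Toy workflow tracker for evidence-gathering surveys: statuses, SLA-based
# escalation, and an audit trail. All names, dates and rules are invented.

from datetime import date, timedelta

surveys = {}
audit_trail = []

def send_survey(owner, control):
    """Issue an evidence-request survey to a control owner."""
    surveys[(owner, control)] = {"status": "sent", "sent_on": date(2012, 5, 1)}
    audit_trail.append(("sent", owner, control))

def escalate_overdue(today, sla_days=14):
    """Escalate any survey still unanswered past the SLA."""
    for (owner, control), s in surveys.items():
        if s["status"] == "sent" and today - s["sent_on"] > timedelta(days=sla_days):
            s["status"] = "escalated"
            audit_trail.append(("escalated", owner, control))

send_survey("store-manager-ny", "req-9.1 physical access")
escalate_overdue(today=date(2012, 5, 20))
print(surveys[("store-manager-ny", "req-9.1 physical access")]["status"])
```

Keeping every transition in `audit_trail` is what lets the evidence be reused and defended across assessments year over year.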
That really reduces a lot of time: in the field, a reduction from 90 days down to 30 days, which is dramatic. And when we talk about remediation, again, there is now the notion of risk-based remediation. You have to look at a huge number of logs and see how they relate to risk; you have to correlate them back to the business criticality of the asset, which a lot of times is not possible if you just rely on vulnerability scanners. Then you need special approval for compensating controls. All of this together really puts a lot of strain on your organization.
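The correlation just described, scanner severity weighted by business criticality, can be sketched as a simple ranking. This is an assumption-laden illustration: the multiplicative scoring, the criticality weights, and the findings are all invented, not a prescribed PCI method.

```python
# Hedged sketch of risk-based remediation: the same scanner finding ranks
# higher on a business-critical asset. Weights and findings are invented.

CRITICALITY = {"payment-switch": 3.0, "intranet-wiki": 1.0}

def prioritize(findings):
    """Order findings by CVSS score times asset criticality, highest first."""
    return sorted(
        findings,
        key=lambda f: f["cvss"] * CRITICALITY.get(f["asset"], 1.0),
        reverse=True,
    )

findings = [
    {"asset": "intranet-wiki", "cvss": 7.5},   # severe, but low-value asset
    {"asset": "payment-switch", "cvss": 5.0},  # moderate, but critical asset
]

print([f["asset"] for f in prioritize(findings)])
```

Note how the lower raw CVSS score comes first once business criticality is factored in, which is exactly the context a bare vulnerability scanner lacks.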
So if you automate that, if you use technology to gather the evidence and streamline your remediation efforts, you can really automate the whole evidence, incident and exception management process. That helps you especially when it comes to vulnerabilities, because we all know the security team is responsible for finding a vulnerability, and then they throw it over the fence to the IT team.
And then they pray for the best, because it most likely disappears into a black hole. If you automate the process and bring those two groups together, then suddenly you have better collaboration, but you also have better visibility. That visibility, and the timely handling of vulnerabilities, allows you to stay compliant. Going back to the numbers we mentioned earlier: within 60 days of certification, most companies drop down to an 18 percent compliance level. That is shocking.
So if you have a mindset where, on an ongoing basis, you remediate, collaborate, and address the problems right then and there, that ensures you stay in compliance, which is very important considering the fines attached to PCI 2.0. And then the evidence repository is really mapped back to the requirements, which helps you save a lot of time here, too. So for gap analysis and remediation together: a reduction from 90 days down to 30 days if you automate that process.