As Christmas shoppers spend away and data breaches keep hitting the headlines, the Payment Card Industry’s security council is charged with keeping customers’ data safe.
By Miya Knights, 12 Dec 2008 at 11:14
The Payment Card Industry Data Security Standard (PCI DSS) and the global forum formed to administer it, the PCI Security Standards Council (PCI SSC), pre-dated the biggest security breaches that have come to mark a new era of unprecedented cyber criminal activity.
Since card operators Visa, MasterCard, American Express, Discover and JCB aligned their individual data security policies and created PCI DSS in 2004, the likes of TK Maxx, Cotton Traders and numerous government departments have proven the need for such regulation.
But the PCI DSS has risen up the corporate agenda since the card brands introduced the threat of fines, and of losing the ability to process card payments, with a June 2007 deadline for those found to be non-compliant.
The standard is intended to create an additional level of protection for consumers by ensuring that merchants meet minimum levels of security when they store, process and transmit cardholder data. And the PCI council is charged with regulating PCI DSS and communicating its importance to any organisation handling credit card data anywhere in the world.
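To illustrate the kind of minimum control the standard mandates for stored and displayed cardholder data, PCI DSS requires that a primary account number (PAN), when displayed, be masked so that at most the first six and last four digits are visible. The sketch below is not from the standard's text; it simply demonstrates that rule in Python:

```python
def mask_pan(pan: str) -> str:
    """Mask a card number for display, showing at most the first six
    and last four digits, as PCI DSS requires for displayed PANs."""
    digits = pan.replace(" ", "")
    # Everything between the first six and last four digits is hidden.
    return digits[:6] + "*" * (len(digits) - 10) + digits[-4:]

# Example with a well-known Visa test number (not a real card):
print(mask_pan("4111 1111 1111 1111"))  # 411111******1111
```

Masking governs only display; the standard separately requires that the full PAN be rendered unreadable wherever it is stored.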
IT PRO spoke to PCI SSC general manager Bob Russo about the challenges faced in raising the data security agenda.
IT PRO: 2007 was a big year for PCI DSS, with the passing of the payment card operators’ final deadline for compliance. What’s been going on this year?
Russo: It’s been just as busy. We released version 1.2 of the standard in October. Just prior to its release, we had our North America community meeting, which attracted 625 attendees and actually included quite a few representatives from Europe. There were a couple of days’ good debate about the development of the standard, given that we’re in a two-year cycle.
Next year will be a feedback year on how the implementation of version 1.2 has gone. And we also talked about our new QA [quality assurance] programme and got a lot of feedback on that, having kicked it off in October to maintain the quality of PCI assessments as well.
Then we had our first European meeting in Brussels with well over 200 people attending. I would say there is a lot more uptake in Europe on the standard. In fact, they are running, not walking, to comply. Reaction to the new version was good. It doesn’t really contain any surprises, but instead includes a lot of clarifications, so organisations looking to stay up to date don’t have to go back to square one to remain compliant.
It’s interesting that you observe organisations are ‘running’ to be compliant. How do you propose they keep up if, as you say, the standard is on a two-year development cycle?
My guess is that the next release in 2010 will be a 2.0. But there are a couple of things we’re doing to make sure it develops in line with the capabilities of our stakeholders. Starting in January, we’re launching research into how the standard’s specification should embody emerging technologies, like end-to-end encryption, virtualisation and secure payment tokens, that might fall outside its current scope, making it easier to comply.
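The appeal of the secure payment tokens Russo mentions is scope reduction: if downstream systems hold only a meaningless token rather than the card number, those systems can fall outside the hardest parts of the standard. The sketch below shows the general principle with a hypothetical in-memory vault; a real token vault would be a hardened, access-controlled service, and the names here are illustrative, not any vendor's API:

```python
import secrets

# Hypothetical token vault. In production this mapping would live in a
# hardened service; downstream systems see only the token, never the PAN.
_vault = {}

def tokenize(pan: str) -> str:
    """Replace a card number with a random, non-reversible token."""
    token = secrets.token_hex(8)  # random value, carries no card data
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    """Recover the card number; only the vault can do this."""
    return _vault[token]
```

A system storing `tokenize("4111...")` in its order database holds nothing a thief can use, which is exactly the scope-narrowing effect the council's study is examining.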
The study that the council is commissioning will also look into making the standard more robust and will be a major piece of what version 2.0 will be. It will help determine what can be added to or deleted from the standard to take account of new systems’ functionality, as well as how any revision might impact that new functionality.
For example, there are specific sections in the standard that set out how credit data is to be stored. But it has to be decided whether, if the data is being stored in a certain way using particular technologies, those technologies would be sufficient to deal with the threats to its security.
We also introduced the Payment Application (PA) DSS. And before the end of the year, we’ll be releasing two additional controls to the existing PED [PIN entry device] standard around unattended payment terminals and hardware or software host security modules.
Having had the opportunity to get feedback on the current release of the standard from merchants and card payment companies, what have been the areas that have attracted the most debate?
I wouldn’t say we’ve had any debate, so much as clarifications, as version 1.2 sought to do, along with the combination and simplification of some of the forms that have to be completed. There were some clarifications on timings and on what security components are in or outside of its scope, such as routers and firewalls. But any organisation handling sensitive data has to use the security features of both. And the standard applies just as much to paper media as it does to electronic media, as another example.
Another area that was discussed was the fact that a lot of merchants have gone down the WEP security route for their wireless networks. But events at TJX and other companies have proven WEP password security is not as secure as it used to be and so we’ve set a deadline of 31 March 2009, after which there should be no new installations of WEP security. And by June 2010, there should be no WEP installations at all.
Well, I’m sure you can imagine that there were a few that weren’t too happy about that, especially as a lot of major merchants have spent a lot of time and money on their wireless networks. But even they, perhaps grudgingly, understand that WPA and WPA2 wireless security standards are far stronger. And the deadlines for transition should give everybody enough time to get ready.
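A merchant preparing for the WEP deadlines Russo describes would start by inventorying its access points. The fragment below is a purely illustrative check against a hypothetical inventory format, flagging networks that would fall foul of the 31 March 2009 cut-off for new WEP installations and the June 2010 ban on any WEP use:

```python
# Hypothetical access-point inventory; the field names are assumptions,
# not any real management tool's schema.
access_points = [
    {"ssid": "store-pos", "security": "WEP"},
    {"ssid": "store-office", "security": "WPA2"},
]

# WEP must be gone entirely by June 2010 under PCI DSS v1.2,
# so flag every access point still using it.
non_compliant = [ap["ssid"] for ap in access_points
                 if ap["security"].upper() == "WEP"]

print(non_compliant)  # ['store-pos']
```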
So, if you are finding overall agreement over the specifications of the standard, how easy has it been to get businesses to take the threat of non-compliance seriously?
Lots of companies I meet that are getting compliant are trying to deal with not having any security standards in place at all. They are using PCI DSS as a springboard to get security on the business agenda.
And in the largest, Tier 1 retailers, they have been using legacy systems that were installed 10 to 15 years ago. You have to remember that what was available in security terms was quite a bit less than is available now. Retrofitting these security technologies is a very delicate thing to do and costs quite a bit, perhaps even more so in making sure it doesn’t cause any problems to the business.
This is reflected by the fact that we’re looking at developing the qualified assessor programme to be a first line of support for merchants. This is exactly what the PCI council wants, why we train them and why we’ve introduced a process of remediation for assessors as well.
As for the threat of fines, I can’t comment on that as the card brands are in charge of that side of regulation. Thankfully, it hasn’t come to that. But merchants are beginning to understand that the potential damage to their brand if they are involved in a security breach could far outweigh the cost of a fine. And they are realising compliance is becoming a differentiator – that consumers can feel safer shopping with them.
How do you see the progress of PCI DSS efforts in Europe going specifically?
Europe is a little more boisterous than the US, but then it is further along in implementing the EMV chip. That’s succeeded in lowering fraud at the counter with chip and PIN. But that’s also basically succeeded in moving fraud over to CNP (card-not-present) transactions. I also think they’re not shy in addressing any issues they are facing in complying with the standard.
Generally, I think European merchants have also done a lot more work on developing their transactional systems. Within the study I mentioned that we’re launching, we’re calling the EMV chip an emerging technology. But then you guys in Europe are using it every day. I remember back in the beginning of the roll out of PCI DSS, I heard merchants in the UK saying that they’d already jumped through hoops to become compliant with chip and PIN and done stuff to make their systems more secure that we hadn’t in the US. And that’s great, but the security issues are still there. One new technology doesn’t solve the issue. And it’s just one example that reflects the work that needs to be done to make sure the standard is as robust as possible.
You’ve mentioned a major study that the council is launching in the New Year. How will it be conducted and what will it involve?
I can’t say too much about its methodology as the study is now in RFP [request-for-proposal] stage, so its scope may change. But suffice it to say, it will very strongly focus on those emerging technologies I mentioned earlier to see how they affect, or don’t affect, the scope of the standard.