Tokenization is a very hot topic among Gartner clients who have to comply with PCI DSS. After all, by not storing electronic cardholder data, 'most' enterprises are eligible for a greatly reduced set of PCI requirements as contained in SAQ (Self Assessment Questionnaire) A, B or C.
The problems with tokenization are many, but in my opinion they don't come close to outweighing the benefits. (Please see our recently published Toolkit, "Checklist of Tokenization of Card Data for PCI Compliance.") The biggest problem, in my opinion, is the lack of standards for tokenization, meaning that if you tokenize using one vendor's solution, you'll have to 'retokenize' with any replacement vendor solution that you may eventually switch to. That's not too big a problem, however, if you choose a vendor with a sound methodology and track record. For now it's up to the implementing enterprise to determine what constitutes a sound methodology.
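To make that lock-in concrete, here is a minimal sketch of what a token vault does; it is illustrative only, not any vendor's actual API. A random surrogate value is handed to downstream systems while the real card number stays in a tightly controlled store. Because there is no standard for what a token looks like or how the vault behaves, a replacement vendor's tokens won't line up with the old ones, hence the need to retokenize.

```python
import secrets

class TokenVault:
    """Illustrative token vault: swaps a card number (PAN) for a random
    surrogate and keeps the real value in a secured store. The token format
    here (16 random digits) is arbitrary -- which is the point: with no
    standard, every vendor picks its own scheme."""

    def __init__(self):
        self._pan_by_token = {}   # in practice: encrypted, access-controlled storage
        self._token_by_pan = {}

    def tokenize(self, pan: str) -> str:
        if pan in self._token_by_pan:            # return the same token for a repeat PAN
            return self._token_by_pan[pan]
        token = "".join(secrets.choice("0123456789") for _ in range(16))
        self._pan_by_token[token] = pan
        self._token_by_pan[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._pan_by_token[token]         # only systems still in PCI scope may call this


vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(token)                     # downstream systems store and use this, not the PAN
print(vault.detokenize(token))   # recovering the PAN stays inside the vault's scope
```

The downstream systems that hold only tokens fall out of the storage requirements; whatever holds the vault (whether in-house or at the vendor) stays squarely in scope.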
So what are the ‘authorities’ up to that will help standardize and formalize the strong trend toward tokenization?
Visa recently published a draft of its "Best Practices for Tokenization" Version 1.0 and solicited stakeholder feedback and comments by August 31. Lots of people have issues with the draft; I'm not one of them. Maybe I'm reading a different draft. It's pretty basic and doesn't help or hurt much. I think its main (positive) attribute is that it formalizes recognition of tokenization as a viable method for securing cardholder data.
The PCI Security Standards Council, on the other hand, hasn't come out with anything formal yet on tokenization and won't be doing that any time soon. (I don't expect anything on it in the updated PCI DSS standard coming to us this Fall.) They've left that to a Special Interest Group (they also have SIGs that look at EMV/chip and point-to-point, a/k/a end-to-end, encryption). The SIG has no deadline for coming up with anything on tokenization but will likely publish something by the end of 2010. Is that going to offer practical advice? Probably. And that's a good thing, but it will simply be guidance, not an enforceable standard.
More importantly, the simplicity of the SAQ process is about to end, and that will make the reward of tokenization (a greatly reduced audit) harder to achieve. The PCI Council is planning to come up with a Q&A that each card acceptor (e.g., a retailer) fills out to describe its card-handling environment, and out will pop a customized SAQ tailored to that environment. Again, there is no deadline or delivery date for this revised SAQ process, but I expect it to happen within 12 months, and probably much sooner. I also expect it will be done in phases, with incremental additions and changes to the SAQ process starting with the release of the new PCI DSS standard this Fall.
Presumably, if you don't store cardholder data, you don't have to protect data at rest, so hopefully we can count on that continuing. But data at rest for PCI compliance purposes will probably come to mean (and has already come to mean) data held in memory, even for a fleeting moment, so watch out for the nuances.
Life and PCI compliance are not going to get easier. Things will probably get harder and more complicated when it comes to implementing tokenization in order to reduce the scope of PCI audits. It's still worth undertaking, but with so many cooks in the kitchen, expect a rougher ride.