
The Politics of Encryption in PCI DSS (part 2)

by Ramon Krikken  |  April 17, 2012  |  2 Comments

On my last post, commenter Randall Gamby noted that “of course [tokenization is encryption].” I wholeheartedly agree. Unfortunately, the current PCI guidance does not, and cannot, support this notion (and, because of this, people who build and/or implement tokenization cannot treat it that way either without creating a tokenization catch-22). When we look at the PCI SSC statement on the scope of encrypted data, we see the following:

“However, encrypted data may be deemed out of scope if, and only if, it has been validated that the entity that possesses encrypted cardholder data does not have the means to decrypt it.”

This takes us to a problematic decision point: if all tokenization is indeed encryption, having a tokenization system in-house doesn’t reduce scope. If it is not, then encrypted tokens remain fully in scope even though using a unique key per token makes them functionally equivalent to random tokens. Neither outcome is sensible (and the proposed re-examination of encrypted tokens to carve out exceptions doesn’t make sense to me either under the scope guidance).
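
To make that functional equivalence concrete, here is a minimal Python sketch. It is purely illustrative: the helper names and the toy one-time-key scheme are mine, not from the PCI guidance or any vendor product. Design A issues a random token and keeps the PAN in a vault; Design B encrypts the PAN under a unique per-token key and keeps that key in a vault. Either way, reversing the token requires access to a protected store the merchant controls.

import os
import secrets

# Illustrative only: the helper names and the toy one-time-key scheme are my
# own, not anything from the PCI guidance or a real tokenization product.

# Design A: random token, with the PAN kept in a protected vault (lookup table).
vault = {}  # token -> PAN; this mapping is what must be protected

def tokenize_random(pan: str) -> str:
    token = secrets.token_hex(8)
    vault[token] = pan
    return token

def detokenize_random(token: str) -> str:
    return vault[token]

# Design B: "encrypted token" with a unique one-time key per token. The
# protected store now holds per-token keys instead of the PANs themselves.
key_vault = {}  # token id -> one-time key

def tokenize_encrypted(pan: str) -> str:
    key = os.urandom(len(pan))
    token_id = secrets.token_hex(8)
    key_vault[token_id] = key
    ciphertext = bytes(p ^ k for p, k in zip(pan.encode(), key))
    return token_id + ":" + ciphertext.hex()

def detokenize_encrypted(token: str) -> str:
    token_id, ct_hex = token.split(":")
    key = key_vault[token_id]
    return bytes(c ^ k for c, k in zip(bytes.fromhex(ct_hex), key)).decode()

# In both designs, reversing a token requires access to a protected store that
# the merchant controls, so treating one as in scope and the other as out of
# scope is hard to justify on technical grounds.
print(detokenize_random(tokenize_random("4111111111111111")))
print(detokenize_encrypted(tokenize_encrypted("4111111111111111")))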

I much appreciated it when the PCI SSC loosened its requirements around encryption key rotation. After all, is it really sensible to have to rotate backup tape encryption keys every year? But more change is needed: change to ensure that any reversible data transformation system is examined with the right level of architectural and algorithmic scrutiny, so we can focus on what matters in practice.

Tags: code-book  cryptography  encryption  insanity  pci  pci-dss  politics  tokenization  

Ramon Krikken
BG Analyst
2 years at Gartner
15 years IT industry

Ramon Krikken is a Research VP on the Gartner for Technical Professionals Security and Risk Management Strategies team. He covers software/application security; service-oriented architecture (SOA) security; and structured and unstructured data security management, including data masking, redaction, and tokenization.


Thoughts on The Politics of Encryption in PCI DSS (part 2)


  1. Tokenization is a catch-22 for companies that keep the actual sensitive data (the full card number) in-house.

    If a merchant uses their payment processor’s token (and doesn’t store full card info in-house), it greatly reduces the potential impact/risk for the merchant.

  2. True, Michael: using a processor or gateway tokenization solution makes the scoping part easier. But it’s (as usual) a tradeoff between security and other aspects: not everyone will be content using a single processor or gateway, because of performance, uptime, and continuity concerns. Not an easy choice for the larger merchants.



Comments are closed
