Gartner Blog Network


What exactly makes a “secure tokenization” algorithm?

by Ramon Krikken  |  October 21, 2010  |  1 Comment

With tokenization being heralded as a PCI DSS knight in shining armor, many vendors are eager to rapidly develop and ship tokenization products and services. It is not encryption, it is more secure because there is no key to steal, and it enables you to decrease PCI DSS scope, or so the story goes … for now, because there is no guidance yet (other than the less-than-ideal “Visa Best Practices for Tokenization Version 1.0”) on what makes a tokenization design acceptably secure. Sure, we can design the architecture with standardized secure storage, secure channels, proper authentication and authorization, and auditing and logging, but the choice of tokenization algorithm can also matter a great deal.
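To make the basic pattern concrete, here is a minimal sketch of the token-vault approach described above. The names (TokenVault, tokenize, detokenize) and the 128-bit token format are illustrative assumptions on my part, not any particular vendor's API:

```python
import secrets

# A minimal sketch of the token-vault pattern: the sensitive value
# (e.g., a PAN) is replaced by a surrogate token, and the only way
# back is a lookup in a protected mapping store.

class TokenVault:
    def __init__(self):
        self._token_to_value = {}  # the "code book": token -> original value
        self._value_to_token = {}  # reverse map so one value gets one token

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(16)  # 128 bits from a CSPRNG
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
# Downstream systems store and pass around `token`; only the vault,
# which stays in PCI DSS scope, can map it back to the PAN.
assert vault.detokenize(token) == "4111111111111111"
```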

I discuss this topic in much more detail in my “Data Masking: Run-Time Data Aliasing” document, but the heart of the matter is that tokenizing data and storing the mappings in a database is cryptography: in essence, it creates a code book, and the strength of the code matters. All too often I hear “it’s not encryption: there is no mutual information between the original and the token” cited as the only requirement for a “secure token,” but it’s not as simple as that. A FIPS-validated random number generator (RNG) is theoretically the best choice, while optimizations (to reduce storage overhead, improve performance, and support distributed token systems) can easily weaken security. And although the industry is working on guidelines for credit card processing, it remains to be seen whether the optimizations that work for credit card number tokenization also work for other data types.
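To see why the algorithm choice matters, consider a sketch contrasting a full-strength random token with a format-preserving “optimization” that keeps the BIN and last four digits of a card number intact. The format rule and function names here are assumptions for illustration, not any specific product's design:

```python
import math
import secrets

def random_token() -> str:
    """Full-strength token: 128 bits from a CSPRNG."""
    return secrets.token_hex(16)

def format_preserving_token(pan: str) -> str:
    """Token that keeps the first 6 and last 4 digits of a 16-digit PAN.
    Only the middle 6 digits are randomized (an illustrative, common
    credit-card optimization, not a specific product's design)."""
    middle = "".join(secrets.choice("0123456789") for _ in range(6))
    return pan[:6] + middle + pan[-4:]

# Entropy comparison: the random token draws from 2**128 values, but
# the format-preserving token draws from only 10**6 values (under 20
# bits) for any given BIN / last-four combination.
print(f"random token space:            2^128 (128 bits)")
print(f"format-preserving token space: 10^6 (~{math.log2(10**6):.1f} bits)")
```

For any given BIN and last-four combination, the format-preserving token space collapses to a million values, small enough to enumerate, which is exactly the kind of weakening that needs to be analyzed before a token can be called secure.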

So in short: without good standards to validate these algorithms, buyer beware! Ask vendors what their token-generating algorithms are, and subject anything other than a strong random number generator to careful security analysis. Hopefully we’ll soon see good guidance from the PCI council and standards bodies. And if you are a Gartner IT1 customer, feel free to set up a dialogue to discuss the tokenization solutions you may want to evaluate.


Ramon Krikken
BG Analyst
2 years at Gartner
15 years IT industry

Ramon Krikken is a Research VP on the Gartner for Technical Professionals Security and Risk Management Strategies team. He covers software/application security; service-oriented architecture (SOA) security; and structured and unstructured data security management, including data masking, redaction, and tokenization.


Thoughts on What exactly makes a “secure tokenization” algorithm?


  1. Interesting article. I was under the obviously mistaken view that the encryption process was already safe; I had not even heard of tokenization before. I guess I had better do some more reading up on the subject.


