Tokenization, the Newest Horse – err, Camel – in the Stable
As the old saying goes, “a camel is a horse designed by a committee.” This saying perfectly describes the recently published PCI DSS Tokenization Guidelines from the PCI SSC. While the original intent of the document was a noble one, the final version fell way short.
We at Shift4 had four goals in mind when we created tokenization, and I’ll be going over each of them. It took six years for tokenization to go from concept to widespread adoption, and then on to committee. (It took five of those years for the PCI SSC to even acknowledge the concept of tokenization, but that is another story for another day.) It then took less than one year for the committee to obliterate three of our original four goals and release the abomination that I will be referring to as “PCI Tokenization,” as opposed to TrueTokenization®, which refers to the original definition. (By the way, TrueTokenization is a registered trademark of Shift4 Corporation – we learned from our mistake of not registering “tokenization,” thereby allowing others to redefine the term.)
Goal #1 – Improve Security
Tokenization should improve security within the payment environment by reducing or eliminating entirely the use of primary account numbers (PAN) and cardholder data (CHD) within various components in the merchant environment. This, by far, was the number one goal in the creation of tokenization. What is the easiest way to keep CHD out of the hands of bad guys? Easy: don’t store it. At the time tokenization was introduced into the payments realm, the biggest vulnerability contributing to CHD breaches was data storage. Tokenization addressed this by substituting non-valuable tokens for the CHD it was protecting.
While strong encryption can address this vulnerability, it requires proper key management, which can be both complex and costly. Tokens, on the other hand, are not mathematically related to the data they protect, so they require no cryptographic protection of their own and eliminate the need for keys and key management, greatly simplifying the process.
TrueTokenization defines a token as not mathematically related to the CHD it is protecting. This means that a token cannot be any form of encrypted CHD or a hash of the CHD. By using a token not mathematically related to the data it is protecting, there is no way to decrypt the token to determine the PAN.
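To make that distinction concrete, here is a minimal sketch in Python (illustrative names only, not Shift4’s actual implementation): the token is pure random data, and the only link back to the PAN is a lookup table held by the tokenization provider.

```python
import secrets

# Illustrative sketch: a token with no mathematical relationship to the PAN.
# The vault (token -> PAN) lives only with the tokenization provider.
_vault = {}

def tokenize(pan: str) -> str:
    # Generate a random token; nothing about it is derived from the PAN,
    # so there is no computation that can reverse it.
    token = secrets.token_hex(16)
    _vault[token] = pan
    return token

def detokenize(token: str) -> str:
    # Recovering the PAN requires access to the vault itself;
    # no algorithm maps the token back to the PAN.
    return _vault[token]
```

Note that tokenizing the same PAN twice yields two unrelated tokens, which is exactly why there is nothing for an attacker to “decrypt.”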
On the other side of the coin, PCI Tokenization allows a token to consist of encrypted CHD or hashed CHD. Encrypted data can always be decrypted if the keys are ever determined or otherwise compromised. Hashes are one-way functions (meaning that hashed data cannot be decrypted the way encrypted data can), but they are vulnerable to compromise via a rainbow table – a big, ugly, precomputed table of all the possible (or at least likely) input values and their hashes – and a simple look-up.
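A toy sketch of that look-up attack (hypothetical names, and a deliberately tiny search space for illustration): because PAN structure is well known, an unsalted hash of a PAN can be reversed simply by enumerating candidates – a rainbow table just precomputes this same enumeration.

```python
import hashlib

def tino_token(pan: str) -> str:
    # A hash-based "token": deterministic, unsalted -- and therefore guessable.
    return hashlib.sha256(pan.encode()).hexdigest()

def crack(token: str, known_prefix: str, unknown_digits: int):
    # Enumerate every candidate PAN sharing the known prefix (e.g. the BIN)
    # until the hash matches. A rainbow table precomputes exactly this.
    for n in range(10 ** unknown_digits):
        pan = f"{known_prefix}{n:0{unknown_digits}d}"
        if tino_token(pan) == token:
            return pan
    return None

# Toy demo: only the last 4 digits unknown -- recovered almost instantly.
stolen = tino_token("4111111111110123")
assert crack(stolen, "411111111111", 4) == "4111111111110123"
```

A real attacker faces more unknown digits, but the PAN space is still small and structured enough (known BINs, Luhn check digit) that this attack is practical against unsalted hashes.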
Goal #2 – Help with PCI Compliance
Shift4 is a strong believer of compliance through security, meaning that compliance should be a byproduct of security and not the other way around. Focusing primarily on compliance instead of true security is a recipe for failure. Provided that Goal #1 is properly defined and met, tokenization should help merchants with PCI compliance by limiting and/or removing components that transmit, process, or store CHD.
With TrueTokenization, since tokens are not mathematically related to CHD, the number of exposure points is reduced: the only connection between a token and the CHD it is protecting is a reference held by the token provider.
With PCI Tokenization, tokens may be decryptable, so scope (and compliance) is harder to determine – hence the disclaimer that only acquirers and the card brands can make scoping decisions. (They, no doubt, will in turn refer merchants to QSAs, but again that’s a topic for another day.)
Goal #3 – KISS (Keep It Simple, Stupid)
We felt that the less complex the concept of TrueTokenization, the less chance of holes being introduced to exploit it. We also felt that the best way to promote this new concept was to keep it as simple and as streamlined as possible so the average “non-techie” could easily understand it.
With TrueTokenization, this simplicity was possible because the nature of the tokens (not mathematically derived) means there is no concern for the “strength” of the token or how resilient it is to unauthorized decryption efforts.
With PCI Tokenization, merchants must research how each solution generates its tokens. For example, tokenization solutions based on encryption must demonstrate that strong encryption is being used, and their key management must be analyzed to verify PCI compliance; solutions based on hash values must use properly salted hashes. In either case, the keys and salt values must be protected, as both are points of vulnerability for an attacker.
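To illustrate that extra machinery (illustrative names only, not any vendor’s actual API): a hashing-based scheme has to mix a secret salt or key into every hash, for example via an HMAC, and that secret then needs exactly the generation, storage, and rotation discipline that tokenization was meant to eliminate.

```python
import hashlib
import hmac
import secrets

# The secret key must be protected like an encryption key -- the very
# key-management burden tokenization was supposed to remove.
SECRET_KEY = secrets.token_bytes(32)

def salted_token(pan: str, key: bytes = SECRET_KEY) -> str:
    # Keyed hash (HMAC-SHA256): without the key, an attacker cannot
    # precompute a table of PAN hashes; with it, every "token" can be
    # reversed by brute force over the PAN space.
    return hmac.new(key, pan.encode(), hashlib.sha256).hexdigest()
```

The scheme stands or falls with the secrecy of that key, which is why such solutions still require the key-management analysis described above.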
We call these encryption or hashing-based products TINO, “Tokenization In Name Only,” as they really are not based on anything even resembling tokenization. They are existing solutions that have been rebranded to ride the wave of success created by tokenization.
Goal #4 – Promote the Concept to All
Rather than keeping the concept of tokenization a Shift4 proprietary solution, we recognized its potential to gain widespread adoption as an industry standard. With this in mind, we freely released it to the public domain.
Of the original four goals, this is the only one the PCI SSC left intact. They did keep the goal of widespread, standard adoption of tokenization; unfortunately, they didn’t provide a standard (or guidelines) that can actually accomplish it.
As Yoda would ask, “Happened, what the hell did?” The PCI committee process went something like this (I’ll do my best to be concise):
- Originally, tokens were defined as not mathematically related to the PAN. Some TINO vendors squawked.
- To appease them, PCI added mathematically related tokens. Shift4 squawked.
- Trying to make everyone happy, PCI changed the wording to state that mathematically related tokens may fall within PCI scope (this was the original definition of “high value” tokens). TINO vendors (and possibly QSAs) squawked.
- PCI changed the definition of “high value” to include any token that can be used as a payment instrument — which boils down to all tokens within the payment environment. Now, only QSAs can determine the true value and scope of all tokens. While Shift4 wanted to squawk here, there was really no point because they announced it just one day before they released the document, giving council members no chance to question and/or fix it.
There is Still Hope
Unfortunately, all tokenization solutions now fall under the PCI Tokenization definition, even though TrueTokenization still honors the original four goals and really ought to remain out of scope. While PCI Tokenization has muddied the tokenization waters, Shift4 will be publishing a TrueTokenization guideline in the very near future. We cannot turn back the tides of ridiculous TINO solutions that have flooded the marketplace, but it is our hope that this document at least helps merchants understand what tokenization can do for them, and what they ought to be looking for in a potential solution. Stay tuned.