August 12, 2011

Tokenization Guidelines Missed the Mark

This morning, the Payment Card Industry Security Standards Council (PCI SSC) published an “information supplement” entitled PCI DSS Tokenization Guidelines. The document was intended to standardize the rapidly growing tokenization landscape and to give official word on how much benefit tokenization could bring to merchants striving to comply with PCI DSS requirements.
It missed the mark.

We rely on the Security Standards Council to publish standards. At no point in the 23-page document does the PCI SSC publish anything that could even remotely be construed as a standard. Even the title of the document weakens the term to “guideline,” and the irresolute text contained therein barely warrants being called that.

On page 20, we find this little gem: “Additionally, tokens that can be used to initiate a transaction might be in scope for PCI DSS” (emphasis added).

Might? Wasn’t the whole purpose of this document to take what “might” be true and determine what really is true?

It gets better (or worse): “…merchants should therefore consult with their acquirer and/or the Payment Brands directly to determine specific requirements for tokens that can be used as payment instruments.”

What was released today was not an industry standard, and it was not a guideline. It was an eloquently worded, poorly veiled passing of the buck from the PCI SSC to individual acquirers and QSAs.

And it is the QSAs and the PCI Council who stand to profit from these “guidelines,” as more merchants will be required to validate obviously secure solutions with QSAs in order to comply with this new document.

Where did they go wrong? Well, as the old saying goes, “a camel is a horse designed by committee.” This document is certainly a camel – it’s not sleek, it’s not pretty, and it’s relatively full of spit.

When we introduced tokenization to the industry in 2005, we chose not to patent the technology or copyright the name, because we felt it held great promise for the industry as a whole. We likewise held high hopes when we were invited to participate in the special interest group on tokenization. However, as the group came together to draft this guideline, we quickly realized that its members did not all share such altruistic purposes.

Each group member brought his or her own agenda, and many overtly promoted patented and/or copyrighted “tokenization” technologies to the group – knowing that if their idea were included, they stood to profit. Seeking compromise, the SSC forced these dissimilar pieces into the document and created not just a camel, but a crippled one at that.

In the coming weeks we will take the opportunity on this blog to point out the particular flaws in the document. We will discuss the possible ramifications of its adoption as well as potential agendas behind its creation. We invite your comments and opinions, which you are free to submit on this page, or via e-mail to [email protected].

Watch this space for coming updates.