June 15, 2011
Is Tokenization a Fad?
Yesterday, Protegrity CTO Ulf Mattsson published a blog post titled “Is Tokenization just a Fad?” The post was his response to an unnamed “key executive” who had recently posed that question. Mattsson’s ultimate premise was that tokenization is a powerful and useful weapon in the IT and InfoSec professionals’ arsenal, and that it is certainly not a fad. I agree with this. However, there was one glaring error in his response with which I take issue.
“Just a few years ago,” Mattsson wrote, “the market was characterized by largely immature solutions. …” This is patently untrue, and it plays directly into the unnamed executive’s argument that tokenization is a flash-in-the-pan trend.
Shift4 Corporation introduced tokenization to the industry in 2005, and by 2007 (just a few years ago) we had already processed a billion tokenized transactions; that is hardly the mark of an immature solution. To date, we have processed more than four billion. This is not a short-lived technology; it has continually gained traction over the past six years and is rapidly becoming an industry standard.
Now that I’ve cleared that up, let me address the original “fad” question. First, what is a fad? Merriam-Webster defines a fad as “a practice or interest followed for a time with exaggerated zeal : craze.”
Zeal? Craze? Do those terms sound characteristic of your security professionals? As a general rule, I would say that IT security professionals tend to be some of the most equable, pragmatic individuals in an organization. These folks are not prone to adopting newfangled, unproven technologies without first engaging in serious research. (In fact, in my experience they are the people most likely to point out the fallacies in a new idea and make a point of exposing its weaknesses.)
If tokenization were a fad, the groundswell behind it would not be coming from these naturally skeptical IT professionals, but it is. The excitement surrounding tokenization is coming from the rank-and-file security professionals: those uniquely capable of understanding the technical aspects of tokenization and recognizing its true power to protect sensitive data.
This executive’s dismissal of tokenization as a fad is, in my opinion, akin to the 1901 declaration of another executive, the president of the Michigan Savings Bank: “The horse is here to stay but the automobile is only a novelty, a fad.” Just as the automobile revolutionized personal transportation, tokenization has the capacity to revolutionize the way we protect cardholder data. Certainly, there will be ups and downs and growing pains (anyone remember the Edsel?), but I believe the technology will continue to evolve and gain acceptance until it truly becomes the standard for data protection.
Rapid growth and acceptance, however, may be the greatest threat tokenization faces. As the term gains popularity, we are seeing more organizations scrambling to release solutions of their own. These often ill-conceived and incomplete offerings, which we call TINO, or “tokenization in name only,” use the name “tokenization” but do not provide users with the level of data protection or the simplification of PCI compliance that the true solution promises. If these TINO providers are able to convince the PCI Security Standards Council that their mathematically derived PAN encryptions are tokens, we may well see the demise of tokenization.
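The distinction is worth making concrete. The sketch below is a hypothetical illustration only, not any vendor’s actual implementation: a true token is a random surrogate that can only be resolved back to a card number through a secured vault lookup, while a TINO-style “token” is mathematically derived from the PAN itself and is therefore reversible by anyone who learns the scheme and its key.

```python
import secrets

# Hypothetical vault mapping random tokens back to PANs. In a real
# deployment this would be a hardened, access-controlled data store.
vault = {}

def tokenize(pan: str) -> str:
    """True tokenization: the token is random and bears no
    mathematical relationship to the PAN it stands in for."""
    token = secrets.token_hex(8)  # random surrogate value
    vault[token] = pan
    return token

def detokenize(token: str) -> str:
    # Recovering the PAN requires access to the vault itself.
    return vault[token]

# A TINO-style approach, by contrast, derives the "token" from the
# PAN (here, a toy digit-shift transform standing in for encryption).
# Knowing the scheme and key inverts every token; no vault needed.
def tino_token(pan: str, key: int = 7) -> str:
    return "".join(str((int(d) + key) % 10) for d in pan)

def tino_reverse(token: str, key: int = 7) -> str:
    return "".join(str((int(d) - key) % 10) for d in token)

pan = "4111111111111111"
t = tokenize(pan)
print(detokenize(t) == pan)                  # lookup, not math
print(tino_reverse(tino_token(pan)) == pan)  # reversible by design
```

The point of the sketch: a breach that exposes true tokens exposes nothing, because the tokens carry no recoverable card data; a breach that exposes TINO tokens plus the derivation key exposes every PAN.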
Ultimately, the only way I foresee tokenization fading into “fad-hood” is if the bastardized solutions distort the original intent of tokenization and sully its good name with their less-than-adequate security. If we can ensure that future solutions remain true to the original intent of tokenization, and are not just renamed and slightly modified encryption methodologies, tokenization will continue to gain acceptance and will prove a valuable, long-term solution to organizations around the world.
A fad? I think not.