Ulf Mattsson, Protegrity: Visa Best Practices for Tokenization - A Hot Topic for PCI DSS

August 2010 by Marc Jacob

Data tokenization is a very hot topic among companies that have to comply with PCI DSS. By not storing electronic cardholder data, most enterprises become eligible for a greatly reduced set of PCI requirements. Visa published a draft of its “Best Practices for Tokenization” Version 1.0, and I applaud Visa for making what is overall a very positive move. My view is that they may have rushed this document to publication and, in the process, ignored several key issues. Visa implies a “one-size-fits-all” architectural solution that in many ways exemplifies the old approach to tokenization rather than today’s approach. The difference is significant and underscores just how much tokenization has evolved in recent years.

All old tokenization solutions are based on the simple concept of a large and dynamic table of token/credit card pairs. While this is an obvious and reasonable approach, it has its disadvantages and issues with respect to performance, scalability, and availability. The core obstacle with the traditional tokenization approach is that the token lookup table is so large and dynamic that it’s hard to manage. Solving the availability and scaling needs of large enterprises requires the complex replication or synchronization of the token tables.
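To make the traditional model concrete, here is a minimal sketch (hypothetical Python, not taken from any vendor's product) of such a dynamic token vault: every new card number adds a row, and the ever-growing token-to-PAN table has to be replicated to every site that needs to detokenize.

```python
import secrets

class DynamicTokenVault:
    """Traditional tokenization: one large, growing table of token/PAN pairs."""

    def __init__(self):
        self.pan_to_token = {}   # grows with every distinct card number seen
        self.token_to_pan = {}   # must be replicated to every site that detokenizes

    def _new_token(self, length: int) -> str:
        return "".join(secrets.choice("0123456789") for _ in range(length))

    def tokenize(self, pan: str) -> str:
        if pan in self.pan_to_token:
            return self.pan_to_token[pan]
        token = self._new_token(len(pan))
        while token in self.token_to_pan:          # avoid collisions
            token = self._new_token(len(pan))
        self.pan_to_token[pan] = token
        self.token_to_pan[token] = pan             # this new pair must now be synchronized
        return token

    def detokenize(self, token: str) -> str:
        return self.token_to_pan[token]
```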

New distributed tokenization approaches, on the other hand, eliminate these performance, scalability, and availability issues. The new approach pre-generates static token tables that can be installed in multiple locations, so there is no need for real-time synchronization or replication. New tokenization can perform over two hundred thousand transactions per second on a small configuration. This is the approach being implemented by major credit card companies and retailers today. The Visa Best Practices document simply does not take into account the performance, scalability, and availability issues that we are all aware of today.
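By contrast, here is a drastically simplified illustration of the pre-generated idea (my own sketch, not Protegrity's actual algorithm, and not secure as written): every site is installed with the same static digit-substitution tables, so any site can tokenize or detokenize on its own with nothing to replicate at run time.

```python
import random

def build_static_tables(seed: int, positions: int = 16):
    """Pre-generate one random digit-substitution table per PAN position.

    Sites installed with the same seed derive identical tables, so there is
    no run-time replication or synchronization between them.
    """
    rng = random.Random(seed)
    forward, reverse = [], []
    for _ in range(positions):
        shuffled = list("0123456789")
        rng.shuffle(shuffled)
        fwd = {str(i): shuffled[i] for i in range(10)}
        forward.append(fwd)
        reverse.append({v: k for k, v in fwd.items()})
    return forward, reverse

def tokenize(pan: str, forward) -> str:
    return "".join(forward[i][d] for i, d in enumerate(pan))

def detokenize(token: str, reverse) -> str:
    return "".join(reverse[i][d] for i, d in enumerate(token))

# Two sites built from the same seed agree without ever talking to each other.
fwd_a, rev_a = build_static_tables(seed=2010)
fwd_b, rev_b = build_static_tables(seed=2010)
assert detokenize(tokenize("4111111111111111", fwd_a), rev_b) == "4111111111111111"
```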

Take, for example, Visa’s requirement for token distinguishability. This requirement is quite logical, because real problems could arise if it were not possible to distinguish between real card data and tokens representing card data. It does, however, complicate systems that process card data: all systems would need to be modified to correctly identify real and tokenized data, and they might also need to take different actions depending on which of the two they are handling. So, although a logical requirement, it is also one that could cause real performance issues.
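One common way to satisfy the distinguishability requirement (my illustration; the Visa draft does not mandate this particular mechanism) is to generate tokens that deliberately fail the Luhn check every real PAN passes, so downstream systems can tell the two apart with a single test.

```python
def luhn_valid(number: str) -> bool:
    """Standard Luhn check; every real PAN passes it."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:           # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def looks_like_token(value: str) -> bool:
    """If tokens are generated so that they fail the Luhn check,
    this single test distinguishes tokens from real card numbers."""
    return not luhn_valid(value)
```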

In addition, Visa’s position is that it is the responsibility of the tokenization system to determine whether data should be sent as tokens or in clear text. I believe this should be the responsibility of the data management system, of which tokenization is a key feature/subset.

In terms of the single-use versus multi-use token issue, the multi-use case seems like an accident waiting to happen. There are better ways to secure PAN data than encryption or hashing. I think the best path is simply to use a randomly assigned number. If the output is not generated by a mathematical function applied to the input, it cannot be reversed to regenerate the original PAN data; the only way to discover PAN data from such a token is a lookup in the token server.
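A sketch of why a randomly assigned token is safer than a hash (hypothetical code, with a made-up issuer BIN): because the space of account numbers behind a known 6-digit BIN is small, an unsalted hash of a PAN can be reversed by exhaustive search, whereas a random token reveals nothing without the token server's lookup table.

```python
import hashlib
import secrets

BIN = "411111"  # hypothetical issuer BIN assumed known to the attacker

def hash_token(pan: str) -> str:
    # Deterministic: the "token" is a mathematical function of the PAN.
    return hashlib.sha256(pan.encode()).hexdigest()

def crack_hash(target: str) -> str | None:
    """Dictionary attack: with the 6-digit BIN known, only the remaining
    10 digits of a 16-digit PAN need to be enumerated (the Luhn check
    digit would shrink the space by another factor of ten)."""
    for rest in range(10 ** 10):
        candidate = BIN + f"{rest:010d}"
        if hash_token(candidate) == target:
            return candidate
    return None

def random_token(pan: str, vault: dict) -> str:
    # Non-deterministic: the token carries no information about the PAN,
    # so the only way back is a lookup in the token server's vault.
    token = "".join(secrets.choice("0123456789") for _ in range(16))
    vault[token] = pan
    return token
```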

Visa’s position also ignores several other issues, including authentication. Consider the following statements from the document:

“Only authenticated entities shall be allowed access to the tokenization system” – this does not address authorized entities (individuals), need-to-know, or separation of duties.

“The tokenization system must implement monitoring to detect malfunctions or anomalies and suspicious activities in token-to-PAN mapping requests. Upon detection, the monitoring system should alert administrators and actively block token-to-PAN requests or implement a rate limiting function to limit PAN data disclosure” – presumably the “token-from” (PAN-to-token) requests are still allowed during an alert status, which may open the door to a dictionary attack or other entropy analysis by a sophisticated adversary.
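As an aside, a minimal sketch of the kind of rate limiting this point describes (the class name and thresholds are my own, not Visa's): token-to-PAN requests above a per-requester threshold are blocked and an alert is raised, while nothing in the stated requirement applies the same control to the opposite, PAN-to-token direction.

```python
import time
from collections import defaultdict

ALERT_THRESHOLD = 100    # hypothetical: max token-to-PAN lookups per requester per window
WINDOW_SECONDS = 60

class DetokenizationMonitor:
    """Rate-limits token-to-PAN requests and alerts on suspicious volume."""

    def __init__(self):
        self.history = defaultdict(list)     # requester -> recent request timestamps

    def allow(self, requester: str) -> bool:
        now = time.time()
        recent = [t for t in self.history[requester] if now - t < WINDOW_SECONDS]
        if len(recent) >= ALERT_THRESHOLD:
            self.alert(requester, len(recent))
            self.history[requester] = recent
            return False                     # block further PAN disclosure
        recent.append(now)
        self.history[requester] = recent
        return True

    def alert(self, requester: str, count: int) -> None:
        print(f"ALERT: {requester} issued {count} token-to-PAN requests "
              f"in the last {WINDOW_SECONDS} seconds")
```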

“In order to limit / eliminate storage of PAN data, the tokenization system should not provide PAN data to a token recipient (e.g., a merchant)” – since the PAN enters the authorization system via the merchant, this implies that the merchant erases the PAN and relies solely on the token. However, it does not address the authorization flow, in which the merchant must provide the PAN to the acquirer for processing, so to me this implies a specific architecture where the tokenizer sits between the merchant and the acquirer. That sounds suspiciously like an FDR or ISO architecture for mid-tier and smaller merchants, and it may not fit larger merchants.
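If I read the architecture that way, the flow would look roughly like this (a hypothetical sketch; the function names are mine): the in-line tokenization service forwards the PAN to the acquirer and returns only a token plus the authorization result, so the merchant never stores the PAN.

```python
import secrets

def acquirer_authorize(pan: str, amount: float) -> str:
    """Stand-in for the acquirer's authorization decision."""
    return "approved"

def inline_tokenizer(pan: str, amount: float, vault: dict) -> tuple[str, str]:
    """Hypothetical tokenization service sitting between merchant and acquirer:
    it forwards the PAN onward and hands the merchant back only a token."""
    token = "".join(secrets.choice("0123456789") for _ in range(16))
    vault[token] = pan
    return token, acquirer_authorize(pan, amount)

def merchant_checkout(pan: str, amount: float, vault: dict, merchant_db: list) -> str:
    token, result = inline_tokenizer(pan, amount, vault)   # the PAN leaves the merchant here
    merchant_db.append({"token": token, "amount": amount, "result": result})
    return result                                          # only the token is retained
```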

“The tokenization platform should allow for chargeback and refund processing without the need for the merchant to retain or have access to full PAN” – this is another restriction on how a merchant now needs to handle post-authorization processing, but it does not clearly address the actual authorization process, nor clearing and settlement. A merchant may be unable to comply because of its service provider’s operating rules. IMHO, Visa should address the service providers and not force such liability onto the merchants.

“Any cryptographic keys used by the tokenization system must be managed in accordance with PCI DSS” – yet the DSS itself does not properly address key management and contains several rather glaring errors, such as requiring keys (undefined as to which keys) to be changed annually. This statement is weak.

