Ulf Mattsson, CTO Protegrity: Data Breaches, Compliance and the New Frontier for Data Protection

December 2010 by Ulf Mattsson, CTO, Protegrity Corporation

Data security in today’s business world is a classic Catch-22. We need to protect both data and the business processes that rely on that data, but to do so we must move from a reactive, fear-driven (or compliance-driven) mode to a proactive data security plan. Meeting Payment Card Industry Data Security Standard (PCI DSS) compliance is important, but it is equally important to understand that compliance does not equal security: PCI DSS was intended to be the floor, not the ceiling. This article discusses a newer method for protecting the entire data flow across systems in an enterprise while minimizing the need for cryptographic services.

Data Breaches

The Verizon Business RISK team, in cooperation with the United States Secret Service (USSS), publishes an annual Data Breach Investigations Report. The purpose of the report is to study the common elements and characteristics found in data breaches. Over six years, the combined Verizon Business RISK team and USSS dataset has grown to span more than 900 breaches and over 900 million compromised records.

As in previous years, the 2010 report showed that nearly all compromised data was taken from servers and online applications: 98% of breached records came from servers, with hacking and malware the dominant attack methods. Financial Services, Hospitality, and Retail still comprised the “Big Three” industries, accounting for 33%, 23%, and 15% of breaches, respectively. The targeting of financial organizations is hardly shocking, as financial records are the criminal’s nearest approximation to actual cash. An astounding 94% of all records compromised in 2009 were attributed to Financial Services.

Financial firms hold large volumes of sensitive consumer data for long periods of time and, because of this, fall under more stringent regulation and reporting requirements. Yet 79% of the financial firms whose data had been breached had failed to meet PCI DSS compliance, the minimum security measure. Organizations have therefore been searching for a solution that protects the business from endpoint to endpoint while easily meeting compliance.

Encryption vs. Tokenization

End-to-end encryption can protect sensitive data fields throughout most of their lifecycle, from capture to disposal, providing the strongest protection of individual data fields. Therefore, end-to-end encryption and its next of kin, tokenization, are very practical approaches for protecting data as it moves between the high-risk areas of a solution. While there is no silver bullet for the data security and compliance woes of large enterprise organizations, all eyes are on tokenization right now.

Tokenization differs from encryption in that it is based on randomness, not on a mathematical formula: a token is a randomly generated surrogate value with no computable relationship to the original data. Encryption also brings compliance obligations around key management, key rotation, algorithm selection, and so on, all of which are moot with tokens. Next generation tokenization offers a faster, more secure solution and uses less computing power than encryption. PCI DSS also requires annual key rotation, which means encrypted data must be periodically re-encrypted, while tokenized data can be left untouched for a lifetime.

Currently two forms of tokenization are available: first generation and next generation. First generation solutions are built on the simple concept of a large, dynamic table of token/credit-card pairs. While this is an obvious and reasonable approach, it has disadvantages in performance, scalability, and availability. The core obstacle of the traditional approach is that the token lookup table becomes so large and dynamic that it is hard to manage. Next generation tokenization addresses these issues: it scales out across multiple parallel instances, dramatically increases performance and availability, supports centralized or distributed deployment, eliminates token collisions, and covers PCI, PHI, and PII data.
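The first-generation, table-based approach described above can be sketched in a few lines of Python. This is a toy illustration, not any vendor's product: the class name, in-memory dictionaries, and retry loop are all assumptions made for clarity. It shows the token/PAN pair table, the purely random (non-mathematical) token generation, and the collision problem that such tables must handle.

```python
import secrets

class TokenVault:
    """Toy first-generation token vault: a lookup table of token/PAN pairs.

    A real product would persist, secure, and replicate this table; here
    everything is kept in memory purely to illustrate the mechanism.
    """

    def __init__(self):
        self._token_to_pan = {}  # token -> original PAN
        self._pan_to_token = {}  # PAN -> token, so repeat cards reuse one token

    def tokenize(self, pan: str) -> str:
        """Return a random surrogate for the PAN, creating it if needed."""
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        while True:
            # Random digits of the same length as the PAN: no mathematical
            # relationship to the original value, unlike encryption.
            token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
            if token != pan and token not in self._token_to_pan:
                break  # retry on the (rare) collision with an existing token
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        """Recover the PAN; only authorization and settlement need this."""
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
```

Downstream systems store and process only `token`; the PAN is recoverable solely through the vault, which is why shrinking and guarding that single lookup table becomes the central engineering problem at scale.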

When next generation tokenization is applied strategically to enterprise applications, confidential data management and PCI audit costs are reduced and the risk of a security breach is minimized. Security is immediately strengthened because the number of potential targets for would-be attackers shrinks: the authentic primary account number (PAN) is required only at authorization and settlement. Studies have shown that annual audits average $225K for larger payment card acceptors, and next generation tokenization reduces audit costs dramatically by eliminating the need for encryption keys.

Given financial institutions’ need for high availability, high performance, scalability, and quick response times in data security, tokenization is a perfect solution for this sector. These companies need to protect data inside their firewalls, not just data in transit on the wire. Tokenizing sensitive data, including PANs and social security numbers, is a cost-effective end-to-end solution that meets PCI compliance.

Unfortunately, the PCI Security Standards Council (SSC) has not yet developed standards for tokenization, nor will it include tokenization in PCI DSS 2.0. In an attempt to fill this void, Visa published its “Best Practices for Tokenization” Version 1.0 on July 14. Be careful: the draft implies a one-size-fits-all architecture open enough to permit botched implementations, including encryption pretending to be tokenization and home-grown schemes that lack security requirements, whereas random-based tokenization is the only true end-to-end solution.

Conclusion

With attacks coming in many different forms and from many different channels, consumers, merchants and financial institutions must gain a better understanding of how criminals operate. Only then will they have a better chance of mitigating the risks and recognizing attacks before they do serious damage. Understanding the nature of both data theft and available protection options can help organizations of all types better anticipate where criminals may exploit the system, so they can put appropriate preventive measures in place.

A holistic solution for data security should be based on centralized data security management that protects sensitive information throughout the entire flow of data across the enterprise, from acquisition to deletion. While technologies such as tokenization and encryption cannot assure 100% security, they are proven to dramatically reduce the risk of credit card data loss and identity theft. Next generation tokenization, in particular, has the potential to help businesses protect sensitive data in a much more efficient and scalable manner, allowing them to lower the costs associated with compliance in ways never before imagined.

