


Ulf Mattsson: Lower-level encryption’s dirty little secrets

March 2009 by Ulf Mattsson, CTO, Protegrity Corporation

Forget about curious but good-natured hackers. Forget about dangers from not-very-skilled script kiddies. The security stakes are much higher now. As our economy continues to stagger like a drunk on his way home from a bar, we’re going to find that profit-driven data breaches will escalate. If there was ever a good time to play with ineffective encryption schemes, this isn’t it. Yet some of the biggest names in information technology and security—RSA, Oracle, IBM and Microsoft—are currently pushing the use of lower-level encryption at the storage-device or file-system level. It’s a mystery to me why these industry leaders are advocating a flawed approach that will not protect data against the ever-increasing number of sophisticated attacks.

Lower-level encryption can sneakily drop-kick even the most well-intentioned enterprise out of compliance and into disarray. You’re sure you’re doing the right thing (after all, IBM said it was the way to go), but all of a sudden you’re listening to a compliance auditor explaining that your enterprise has failed in the separation of duties, data protection and key management areas, and now you need to do a lot of expensive, time-consuming remediation.

We all know it’s better to focus on maintaining strong data security than to center our protection efforts on ever-changing compliance requirements. A secure system is virtually always a compliant system. And as you’ve probably guessed by now, I strongly believe that it’s much better to use end-to-end encryption to protect sensitive data while it is in transit and at rest, internally and externally, because part of the sensitive data field (or the whole field) is continuously protected by a transparent encryption wrapper.
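To make the idea of a transparent encryption wrapper concrete, here is a minimal sketch of protecting a single sensitive field so it stays encrypted in transit and at rest, with an integrity check on top. All names are illustrative, and the hash-derived keystream is a teaching stand-in only; a real deployment would use a vetted authenticated cipher such as AES-GCM, backed by proper key management.

```python
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by hashing key + nonce + counter.
    Illustrative only -- stands in for a real, vetted cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_field(key: bytes, plaintext: bytes) -> bytes:
    """Wrap one sensitive field; the result can flow through every tier."""
    nonce = secrets.token_bytes(16)  # fresh per field, so equal values differ
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # tamper detection
    return nonce + ct + tag

def decrypt_field(key: bytes, blob: bytes) -> bytes:
    """Unwrap a field; refuses anything that was modified along the way."""
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("field tampered with")
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, nonce, len(ct))))
```

The point of the sketch is the shape, not the cipher: the field is sealed once, every intermediate system handles only the opaque blob, and the integrity tag means a modified field is rejected rather than silently accepted.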

Continuous protection is critical because, even if we hate to admit it, IT experts are humans. We’re going to make mistakes and leave some attack vector open, some app unpatched, some bit of something unsecured. The recently released 2008 Verizon Business Data Breach Investigations Report, based on four years of research and forensic examinations of more than 500 companies that suffered a significant data breach, indicated that most breaches result from a mixture of events rather than a single issue, and it noted that human error often directly or indirectly contributed to the success of the breach. (While the phrase “data breach” conjures up visions of an intentional attack, accidental, no-evil-intended breaches are quite common.) Malicious code was cited as contributing to the success of nearly one-third of the data breaches under investigation.

Contrary to years past, when coders with bad intentions primarily released self-replicating malware with highly obvious effects to boost their fame among their peers, the focus has now shifted to stealth (the longer malware lingers, the more data it can collect) and targeted distribution. A key point to remember when you’re mulling over the benefits of lower-level encryption is that partial end-to-end encryption of data fields can protect against some attacks by malicious code.
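Partial field encryption can be sketched along these lines: only the middle digits of a card number are transformed, leaving the first six and last four in the clear so routing and receipt display keep working, while malicious code that scrapes the database or a log file never sees a full card number. The digit-shifting scheme below is a hypothetical illustration, not real format-preserving encryption, which would use a vetted algorithm; the function names are mine.

```python
import hashlib
import hmac

def protect_pan(pan: str, key: bytes) -> str:
    """Protect only the middle digits of a card number (illustrative sketch)."""
    head, middle, tail = pan[:6], pan[6:-4], pan[-4:]
    pad = hmac.new(key, b"pan-middle", hashlib.sha256).digest()
    # Shift each middle digit by a key-derived amount; output is still digits,
    # so length checks and column types in legacy systems are unaffected.
    cipher = "".join(str((int(d) + pad[i]) % 10) for i, d in enumerate(middle))
    return head + cipher + tail

def unprotect_pan(token: str, key: bytes) -> str:
    """Reverse the shift for the few callers authorized to see the full value."""
    head, middle, tail = token[:6], token[6:-4], token[-4:]
    pad = hmac.new(key, b"pan-middle", hashlib.sha256).digest()
    clear = "".join(str((int(d) - pad[i]) % 10) for i, d in enumerate(middle))
    return head + clear + tail
```

The design point is that the protected value is format-compatible with the original, so the wrapper really is transparent: applications that only need the first six or last four digits run unchanged against protected data.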

Any security system is only as strong as its weakest link, and that is what attackers, inside and outside of the enterprise, look for. Indeed, they’re looking for those weak links everywhere, up to the application and even client level, and down to the system internals and driver level.

The conventional risk model used in IT security is that of a linked chain: The system is a chain of events, where the weakest link is found and made stronger. Sounds good, but it fails to solve the problem. The strengthening of any link, even if made much stronger, does not guarantee a less vulnerable system. The system is just dependent on the next weakest link. Worst-case scenario: The newly hardened link may produce new weak links due to, for example, interoperability issues with other system parts.

Further, such solutions are actually based on the illogical presumption that "no part will fail at any time"; if a critical part fails, the system fails. In short, there is an inevitable single point-of-failure: that weakest link. Making the link stronger will not make the single point-of-failure go away. At most, it may shift it. So layers of security, including integrated key management, identity management and policy-based enforcement, as well as encryption of data throughout the entire life cycle, are essential for a truly secure environment for sensitive data.

There is a slew of research indicating that advanced attacks against internal data flows (transit, applications, databases and files) are increasing, and many successful attacks were conducted against data that the enterprise did not know was on a particular system. Storage-layer encryption doesn’t provide the comprehensive protection that we need to protect against these attacks. And if you think you know exactly where every bit of personally identifiable data resides on your company’s systems, you either work for a very small company, or (congratulations!) you’re Mr. or Ms. Wizard.

There are other problems to consider if you’re thinking of using lower-level encryption. SAN/NAS encryption can result in questionable PCI compliance, and separation of duties is impossible to achieve. File encryption doesn’t protect against database-level attacks, and here again we have the separation of duties issue to worry about. How are you going to effectively and easily keep administrators from seeing what they don’t need to see with file-level encryption?

Native column-level encryption can create a world of security and compliance traumas, including a lack of key management capabilities, all sorts of interesting interoperability issues (especially with the point solutions you so blithely deployed during yet another time crunch) and, yes, separation of duties. Combining low-level encryption with essential auditing tasks, such as database activity monitoring and logging, creates additional problems: access to data is monitored but not blocked, log volumes raise scalability issues, and none of it helps catch the bad guys. There are also product-specific limitations to consider: central key generation is not supported by DB2 V9.5, Oracle 11g, SQL Server 2005, Informix 10, Sybase 15 or Teradata 2.6, and central key generation for column-level encryption is not supported by SQL Server 2008.

End-to-end encryption is an elegant solution to a number of messy problems. It’s not perfect; field-level end-to-end encryption can, for example, break some applications, but its benefits in protecting sensitive data far outweigh these correctable issues. And the capability to protect at the point of entry helps ensure that the information will be both properly secured and appropriately accessible when needed at any point in its enterprise information life cycle.

End-to-end data encryption can protect sensitive fields in a multi-tiered data flow from storage all the way to the client requesting the data. The protected data fields may be flowing from legacy back-end databases and applications via a layer of Web services before reaching the client. If required, the sensitive data can be decrypted close to the client after validating the credential and data-level authorization.
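A minimal sketch of that flow, with every name hypothetical: the ciphertext travels unchanged through the back-end database, the application tier and the Web-service layer, and decryption happens only at the edge, after the requester’s data-level authorization has been validated. Unauthorized callers get a masked value instead of the cleartext.

```python
MASK = "************"

def is_authorized(requester: str, field: str) -> bool:
    """Data-level policy check; a real system would consult a policy engine."""
    policy = {("billing-app", "card_number")}  # illustrative policy table
    return (requester, field) in policy

def deliver_field(requester: str, field: str, ciphertext: str, decrypt) -> str:
    """Decrypt a protected field close to the client, and only if allowed.
    Every tier upstream of this call handled nothing but ciphertext."""
    if not is_authorized(requester, field):
        return MASK  # unauthorized callers never trigger decryption
    return decrypt(ciphertext)
```

Because the authorization check sits in front of the decrypt call, a compromised middle tier, or an administrator of one, has nothing to steal but ciphertext; the weakest link in the chain no longer exposes the data itself.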

I’m always eager to try out the next new thing in security technology. I’m even open to fixing things that aren’t broken, just to see what happens. But some ideas are obviously flawed and not worth deploying. Lower-level encryption is one of those ideas, and I believe it’s a dangerous devolution in data protection.
