The standard information-security mantra is to protect sensitive data where it resides. But with the number of security breaches being publicized these days, I say we should move quickly to remove sensitive data wherever it is not required.
This is not a new train of thought, I know, but the cost of non-compliance keeps growing. Financial damage can be insured against; reputational damage cannot.
In a previous article, I wrote about the need to complement industry-standard encryption with a process called tokenization. While encryption hides the actual data in a manner that is reversible, tokenization replaces the sensitive data with a tag, or token, that preserves only the format or schema of the original.
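To make the format-preserving idea concrete, here is a minimal sketch of tokenizing a card number in Python. The function name and the choice to keep the last four digits are my own illustrative assumptions; a production tokenizer would also guarantee token uniqueness and keep the token-to-card mapping in a secure vault.

```python
import secrets
import string

def tokenize_pan(pan: str) -> str:
    """Replace a card number (PAN) with a random, format-preserving token.

    Illustrative sketch only: keeps separators and the last four digits
    (as commonly shown on receipts) and substitutes every other digit
    with a random one, so the token has the same shape as the original.
    """
    digits = [c for c in pan if c.isdigit()]
    keep_tail = set(range(len(digits) - 4, len(digits)))  # preserve last 4
    out, i = [], 0
    for c in pan:
        if c.isdigit():
            out.append(c if i in keep_tail else secrets.choice(string.digits))
            i += 1
        else:
            out.append(c)  # preserve formatting characters such as dashes
    return "".join(out)

token = tokenize_pan("4111-1111-1111-1234")
print(token)  # same length and dash positions, last four digits intact
```

Because the token matches the original's format, downstream systems that validate field length or layout keep working without ever seeing the real number.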
Tokenization in the payment card industry
The Payment Card Industry Data Security Standard (PCI DSS) clearly states that any piece of infrastructure with network access to systems that process or store cardholder data is "in scope" for PCI compliance. This means the scope of an annual compliance audit could include essentially every device on your network. Many software companies have taken on the tokenization challenge.
Originally, they provided APIs and libraries for developers to embed tokenization into applications, or to bolt it onto existing ones. These did little, though, to reduce the scope of your PCI compliance, and in many cases they raised the complexity of the environment. Next came tokenization broker appliances, housed in your data centre to communicate with your point-of-sale and payment-processing systems. Although this reduces the scope and complexity of your PCI environment, it still leaves a large amount of it "in scope", and the "crown jewels" remain onsite, albeit in a very robust data vault.
With a tokenization solution outsourced via a SaaS model, sensitive data such as credit card numbers is never stored in your systems. There is nothing to steal during a breach. Full stop. Let someone else carry the burden of PCI compliance.
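The client side of such an outsourced gateway can be sketched in a few lines. The endpoint shape, field names, and token format below are hypothetical assumptions for illustration, not any vendor's actual API; the gateway here is a local stand-in so the round trip can be shown without a network call.

```python
import json
import secrets

def make_tokenize_request(pan: str) -> str:
    """Build the JSON body a point-of-sale client would POST to the
    gateway. The card number goes to the gateway over the wire and is
    never written to local disk. (Field names are illustrative.)"""
    return json.dumps({"action": "tokenize", "pan": pan})

def fake_gateway(request_body: str) -> str:
    """Stand-in for the SaaS side: issues a random token and would keep
    the card number in its own vault (discarded here, as this is a sketch)."""
    json.loads(request_body)  # the provider, not you, handles the real PAN
    return json.dumps({"token": "tok_" + secrets.token_hex(8)})

resp = json.loads(fake_gateway(make_tokenize_request("4111111111111234")))
print(resp["token"])  # only this opaque token is stored locally
```

The point of the design is visible in the response: nothing resembling a card number ever comes back, so your databases, logs, and backups hold only opaque tokens.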
Toronto-based company incorporates tokenization to deter hackers
Toronto’s Blueline Data has taken on the challenge by creating a novel tokenization gateway solution that covers not only web and point-of-sale transaction systems but telephony and unified-communications infrastructure as well. In fact, you can define any type of digital data sequence to be protected for SOX, HIPAA, OSFI or any other regulatory requirement and tokenize it too. They call their strategy "assurance through deterrence": by removing sensitive data from an environment, they deter would-be attackers from investing in advanced persistent attacks.
The PCI DSS covers six areas of protection with 12 specific requirements; Blueline's offering addresses seven of those requirements across five areas.
The Blueline environment itself is subject to PCI audit and complies with the DSS 3.0 requirements. It offers a unique, low-risk approach to protecting your IT assets, such as financial records, intellectual property, employee details and data entrusted to you by customers or third parties. The combined benefit, the company says, is high security at low cost. Their approach of format-preserving, diskless tokenization at the perimeter essentially creates what they call a "zero vector of attack" computing environment: easy to operate but not easy to exploit.
I believe their forward-thinking initiative of extending tokenization to non-traditional channels of data flow sets them apart from competitors in this market. I'm eager to watch this company flourish amid the weekly disclosures of sensitive data breaches.
Resources:
SANS: Six Ways to Reduce PCI DSS Audit Scope by Tokenizing Cardholder Data
Blueline Services: Data Tokenization (http://bluelinex.com/resources/blp204_pci_compliance_sheet.pdf)
Securosis: Understanding and Selecting a Tokenization Solution