
Tokenization – security’s knight in shining armor?

August 19, 2015

Tokenization translates as replacing sensitive data elements with non-sensitive equivalents. Because the resulting elements have no exploitable value, the procedure is supposed to reduce risk. To retrace the steps back to the original sensitive data, the tokenization system provides a mapping procedure. This reverse procedure requires both the tokenization system and the authority to request tokens, thereby ensuring access control.

Tokens can be cryptographic, non-cryptographic or hybrid. End-to-end encryption (certified point-to-point encryption, P2PE) is often used to protect the data in transit during these procedures.
Tokenization can also be irreversible, when the data cannot be converted back to its original form.
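
To make the mapping idea concrete, here is a minimal Python sketch of both variants. It is illustrative only, not any vendor's implementation; the TokenVault class, the authorized flag and the salt value are assumptions made for the example.

```python
import hashlib
import secrets


class TokenVault:
    """Reversible tokenization: random tokens mapped back to the originals."""

    def __init__(self):
        self._token_to_value = {}  # the mapping lives only inside the system

    def tokenize(self, value: str) -> str:
        # The token is random, so it reveals nothing about the original.
        token = secrets.token_hex(16)
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str, authorized: bool) -> str:
        # Reversing the mapping requires both the tokenization system and
        # the authority to request it - this is the access-control step.
        if not authorized:
            raise PermissionError("caller is not authorized to detokenize")
        return self._token_to_value[token]


def irreversible_token(value: str, salt: bytes) -> str:
    # Irreversible variant: a salted one-way hash, with no mapping back.
    return hashlib.sha256(salt + value.encode()).hexdigest()


vault = TokenVault()
token = vault.tokenize("4111111111111111")
print(vault.detokenize(token, authorized=True))  # recovers the original value
```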

Tokenization reduces the amount of sensitive data a company retains, along with the associated costs and complexity, while enhancing security. It can protect various types of PII (personally identifiable information).

PCI compliance offers retailers tokenization as an alternative to expensive end-to-end encryption systems. Small and mid-sized businesses usually adopt tokenization via an external service provider: responsibility for data safety moves onto that provider, which issues the merchant only the tools necessary for its transactional activity. When tokenization is bought as a product instead, the vendor must account for the creation, distribution and maintenance of that product.

Benefits of tokenization

Tokens consist of random numbers or alphanumeric characters and may replace the original data only partially, e.g. part of an actual card number. The retailer frees its environment of card data, while the tokens still match card owners with transactions. The procedure supports an array of operations, from one-time authorization, capture and settlement to recurring and subscription billing or re-authorization. Unlike encryption, tokenization does not leave organizations with the challenge of safeguarding encryption keys alongside the sensitive data.
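
As a rough illustration of partial replacement, the sketch below keeps the last four digits of a card number and randomizes the rest; the function name and alphabet are arbitrary choices made for the example.

```python
import secrets
import string

ALPHABET = string.ascii_uppercase + string.digits


def tokenize_pan(pan: str) -> str:
    # Keep the last four digits so a transaction can still be matched to a
    # card owner; replace everything else with random characters.
    random_part = "".join(secrets.choice(ALPHABET) for _ in range(len(pan) - 4))
    return random_part + pan[-4:]


print(tokenize_pan("4111111111111111"))  # e.g. 'Q7KD0ZP2M1VA1111'
```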

The main declared benefit is making payment card data less interesting to hackers, or altogether meaningless.

Mobile payments especially benefit from tokenization. The wide variety of payment options and the multitude of terminals that can initiate transactions increase the security challenges. Companies like Gemalto, SimplyTapp, Sequent and Visa have invested in or are currently working on tokenization. Strengthening security through this method also increases flexibility.

Other notable benefits are instant use and minimal pushback.

Cloud-based applications also benefit from tokenization, by avoiding the transmission of sensitive data. The randomly generated values that replace the original data pose fewer security risks when stored in the cloud, while the original data remains behind the company's firewall. High-value or confidential data remain safe while their tokens are "delegated" to cloud-based operations.
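
The pattern might look like the sketch below, where upload_to_cloud is a hypothetical stand-in for a real cloud API call and the record values are invented:

```python
import secrets

vault = {}  # token -> original value; this mapping stays behind the firewall


def tokenize(value: str) -> str:
    token = secrets.token_urlsafe(16)
    vault[token] = value
    return token


def upload_to_cloud(record: dict) -> None:
    # Stand-in for a real cloud API call; only tokenized data is transmitted.
    print("sent to cloud:", record)


record = {"customer": "Jane Doe", "ssn": tokenize("078-05-1120")}
upload_to_cloud(record)  # the SSN itself never leaves the premises
```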

The PCI Security Standards Council (PCI SSC) has issued technical guidelines on tokenization. Meeting these guidelines considerably improves a company's standing in PCI evaluations. The document covers any tokenization product that replaces the primary account number (PAN), whether it is delivered as a hardware device or as a service. Tokenization drastically reduces the scope of systems for which PCI compliance must be proved – the systems that handle card data become less numerous and more manageable.
Anyone interested can access these best practices on the PCI SSC website.

Tokenization criticism

Apple Pay's tokenization integration (2014) was enthusiastically received, but it also generated discussions about existing vulnerabilities.

Token generation is a critical issue, especially when tokens act as a proxy for PANs. If the token system is compromised, an attacker automatically gains access to all the PANs behind the recycled tokens. The solution might be a second form of authentication for every re-use of a token, or dynamically re-issuing tokens, as in the sketch below.
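
A minimal sketch of the dynamic re-issue idea, assuming single-use tokens; the SingleUseTokens class and its issue/redeem methods are invented for illustration:

```python
import secrets


class SingleUseTokens:
    """Each token is valid for exactly one use; redeeming it retires the
    token and hands back a fresh one for the next transaction."""

    def __init__(self):
        self._live = {}  # currently valid token -> PAN

    def issue(self, pan: str) -> str:
        token = secrets.token_hex(16)
        self._live[token] = pan
        return token

    def redeem(self, token: str):
        # pop() retires the token, so a stolen or replayed token fails.
        pan = self._live.pop(token)  # KeyError on re-use: replay rejected
        return pan, self.issue(pan)


service = SingleUseTokens()
first = service.issue("4111111111111111")
pan, second = service.redeem(first)  # "first" is now dead
# service.redeem(first) would raise KeyError - the replay is rejected
```

Retiring each token at redemption means a stolen token becomes worthless after its first legitimate use, which also limits the replay problem discussed below.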

Another criticized aspect is that the organization becomes locked in to a particular payment processor. The solution resides with the tokenization service – supporting multiple payment processors is preferable from the retailer's perspective.

Replaying tokens is another concern – when tokens are reused, attackers may find their work easier and the security risks increase (a single-use scheme like the one sketched above is one mitigation).

Compared to encryption, tokenization also requires a bigger investment and more resources. Even leaving the actual tokenization algorithms aside (assuming tokenization is handled via SaaS), storing the original data requires secure on-site systems, or paying specialists for this supplementary service.

Enhancing tokenization security

Experts believe that tokenization needs strong guidelines and regulations to ease its wide-scale adoption. Standardizing tokenization and ensuring that user data never leaves the user's possession would act as a combined deterrent to attacks.

Improving token generation and usage renders this security measure more powerful.

An innovative solution launched recently combines biometric authentication with tokenization: the original data safely remains with its owner, while a tokenized version represents the owner across the Internet. HYPR Corp. announced its biometric tokenization platform in July 2015, and this type of authentication is projected to reach over a billion devices by the end of 2016.
The new technology is aimed at banks, healthcare organizations and a range of other industries. It answers both the security concerns around biometrics and some of tokenization's own vulnerabilities (securing the original data).

Payment systems that use tokenization are highly attractive as a security solution because they promise enhanced privacy protection, reduced fraud and cost savings for companies. The cloud can be accessed without major risks, transactions gain a safety net and attackers are discouraged.

In other words, tokenization might qualify as cyber-security's knight in shining armor when it comes to saving sensitive data. It only has to polish its armor and find a way to rescue various cyber targets simultaneously. Unified international regulations would answer the first major request, while multi-processor support could address the latter concern.

Enterprises expect tokenization to go beyond standardization into excellence, since the industry would be trusting this procedure with its most important data.
Until it earns that vote of excellence, tokenization is steadily developing – and more and more security vendors are incorporating it into their offerings.