Tokenization as a service
Advances in technology have penetrated our day-to-day lives and impacted many areas such as healthcare, finance, payments, shopping and driving, to name just a few. Automation is no longer a concept restricted to technology labs but has already entered the mainstream in the form of automated teller machines, automatic cars and automatic call recorders. Receiving personalized alerts on your watch warning you about overeating, lack of physical activity, or a rise in blood pressure or heart rate is no longer a fantasy from a James Bond movie. Today the world is not just connected but connected more meaningfully than it was a decade ago.
Though technology has been making life easier and simpler, its misuse, especially by hackers and cyber criminals, at times eclipses the advantages. Millions of data records, including customers' personal and financial details and employees' payroll information, are being stolen from employers as well as from service providers such as financial institutions and retailers. Technology experts and leaders have been working tirelessly to make data storage and transfers more secure.
These relentless efforts have resulted in new hardware and software products, such as Intel's 'secured intelligence inside' or Apple's well-advertised Apple Pay tokenization. These products indicate how seriously industry leaders take secure data transfers without compromising ease of operation and speed. While hardware security has clear boundaries and remains under the complete control of the manufacturer, this is not the case with software. Let us take a closer look at tokenization, which is extensively used as a software security tool today.
In the field of data security, tokenization is, to put it simply, the substitution of sensitive data with a non-sensitive stand-in, or token. The token is meaningless to trespassers and unauthorized users; only after reaching its authorized destination can it be mapped back to the original identity or message.
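Conceptually, a tokenization service boils down to a vault that maps randomly generated tokens to the sensitive values they stand in for. The Python sketch below is purely illustrative (the class and method names are invented for this example); a production system would add encryption at rest, access control and auditing.

```python
import secrets


class TokenVault:
    """Illustrative token vault: maps random tokens to sensitive values."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # A cryptographically random token carries no information
        # about the value it replaces.
        token = secrets.token_hex(16)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only a party with access to the vault can recover the original.
        return self._vault[token]


vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")
assert token != "4111 1111 1111 1111"
assert vault.detokenize(token) == "4111 1111 1111 1111"
```

Note that, unlike encryption, the token here is not derived from the original value at all, so intercepting it reveals nothing; the mapping lives only in the vault.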
Not a new phenomenon
As history indicates, tokenization has been around for centuries. Kings of yore and political leaders in different parts of the world used encrypted messages to share secret information, a method that served as a suitable form of tokenization in its era. Later, trading with gold and silver coins gave way to paper-based trading, where a paper token such as a cheque or draft represented the transaction amount and could be exchanged for actual money. In fact, today's currency notes also began as promissory notes representing an equivalent amount of money and slowly became a form of money in their own right. Casino chips are an example of money changing identity in the context of a game. These examples, however, represent only a change in the identity of the original amount without hiding the actual value of the money.
In the digital era, tokenization has achieved far greater sophistication, creating an unrecognizable data identity for sensitive information. This sensitive information can be the identity of a person, an account number, transaction details, a combination of these, or even a string of information units such as a uniform resource locator (URL). A common example of such ciphering is the 256-bit encryption (for instance, AES-256) that web browsers such as Internet Explorer use over secure connections. It converts the URL and the information that travels with it into a string of bytes from which recovering the original in any useful timeframe is extremely difficult.
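In payments, a common variant of this idea is a format-preserving token: a value that looks like a card number, so existing systems can still process it, but whose digits are random apart from, say, the last four. The sketch below uses an invented function name and makes no claim to PCI compliance; it only illustrates the principle.

```python
import secrets


def format_preserving_token(card_number: str, keep_last: int = 4) -> str:
    """Replace all but the last few digits with random digits, preserving
    length and spacing so downstream systems still accept the value.
    Illustrative only; not a compliant tokenization scheme."""
    total_digits = sum(ch.isdigit() for ch in card_number)
    digits_seen = 0
    out = []
    for ch in card_number:
        if ch.isdigit():
            digits_seen += 1
            if digits_seen <= total_digits - keep_last:
                out.append(str(secrets.randbelow(10)))  # random digit
            else:
                out.append(ch)  # keep the trailing digits for reference
        else:
            out.append(ch)  # preserve spaces and dashes as-is
    return "".join(out)


masked = format_preserving_token("4111 1111 1111 1111")
```

Because the token preserves the original's shape, it can flow through legacy validation and storage unchanged, while the real number stays only with the party that issued the token.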
A few key standards currently being used in the financial industry are:
· The Payment Card Industry Data Security Standard (PCI DSS) provides "an actionable framework for developing a robust payment card data security process including prevention, detection and appropriate reaction to security incidents."
· American National Standards Institute (ANSI) oversees the creation, dissemination and use of various norms and guidelines that directly impact businesses in almost all sectors.
· Europay, MasterCard and Visa (EMV) is a global standard for the inter-operation of integrated circuit ('chip') cards with payment terminals.
Institutions such as RSA, VISA, Mastercard, EMVCo and American Express already have their own tokenization mechanisms, which are proven in the financial systems domain. Today, these players act as aggregators for various banks in the transaction banking space. They are also creating innovative techniques to make financial data transfers increasingly secure.
With the growing penetration of the 'Internet of Things' (IoT), non-financial sectors are also inclined towards creating an ecosystem for secure communication. The growth of cloud-based solutions has further fueled the need for secure data transfers. Strong, secure tokenization techniques are becoming a focus for businesses across sectors.
The future of tokenization
The changed outlook towards security, privacy and data protection has prompted business leaders to think of tokenization as a separate service. Businesses are looking for more efficient means of data security, and if leading security experts provide Tokenization-as-a-Service (TaaS), it can end their quest for information safety. Security providers can focus their tokenization efforts along high-level market segments, offering it as a service for financial and non-financial industries alike. Aggregators like VISA, MasterCard and American Express are well positioned to host TaaS for businesses because they can ensure:
· Focused efforts to develop secure algorithms
· Cost saving by avoiding the re-invention of the wheel
· Faster penetration and greater adoption with the existing customer base
· Central risk mitigation
The VISA chief executive has already signaled the company's intention to act as an aggregator for token services. Given the growing awareness of data security, it will not be long before aggregators start creating specialized Tokenization-as-a-Service offerings to address the concerns of business.