Tokenization on the Node
A non-formula-based data security solution provides
greater protection against cyber-criminals.
Data security is the ultimate game
of cat and mouse—organizations
invest countless resources
researching, developing and producing top-of-
the-line security platforms for today’s
problems, but cyber-criminals are already
devising tomorrow's exploits. Ultimately, there is no single
silver bullet for keeping databases protected,
but a technology at the forefront of innovation
is tokenization. Recent advances have
made it a legitimate alternative to encryption
for the enterprise data warehouse.
Tokenization is the process of protecting sensitive
data by replacing it with alias values, or tokens,
that are meaningless to anyone who gains
unauthorized access to the information.
At a basic level, tokenization is based on
randomness. Unlike encryption, which
relies on a potentially vulnerable mathematical
algorithm, tokenization assigns alias values
completely at random. Cyber-criminals can
recover the original data only by obtaining the
tokens, then breaking into the tokenization
server and accessing the protected lookup
tables. Even if cyber-criminals hack into a
system, they can do nothing with the data
tokens unless they also have the lookup tables.
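To make those mechanics concrete, here is a minimal, illustrative Python sketch of table-based tokenization. The TokenVault class and its in-memory storage are assumptions made for the demonstration, not any vendor's API; a production system would keep the lookup table inside a hardened token server.

import secrets

class TokenVault:
    """Maps sensitive values to random tokens via a protected lookup table."""

    def __init__(self):
        self._token_to_value = {}  # the protected lookup table
        self._value_to_token = {}  # reverse index so equal values reuse one token

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_hex(8)  # random; no mathematical link to the value
        while token in self._token_to_value:  # avoid collisions
            token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Recovering the value requires access to the lookup table itself.
        return self._token_to_value[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")
print(t)                    # random hex, meaningless on its own
print(vault.detokenize(t))  # original value, lookup table required

Stealing the token alone accomplishes nothing; only the lookup table ties it back to the data.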
Tokenization has remarkable flexibility.
Since a token, by definition, will look like the
original data value, it can travel inside application
databases and components without
modifications, resulting in greatly increased
transparency. This reduces remediation
costs to applications, databases and other
areas where sensitive data lives because the
tokenized information will match the original’s
type, length and other properties.
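As a hedged illustration of that transparency, the Python sketch below builds a token that preserves the original's length, character classes and separators, so existing schemas and applications accept it unchanged. Keeping the last four digits in the clear is a common convention assumed here, not a requirement of tokenization itself.

import secrets
import string

def format_preserving_token(card_number: str, keep_last: int = 4) -> str:
    digits = [c for c in card_number if c.isdigit()]
    n_random = len(digits) - keep_last
    new_digits = [secrets.choice(string.digits) for _ in range(n_random)]
    new_digits += digits[-keep_last:]  # keep the trailing digits readable
    out, i = [], 0
    for c in card_number:  # re-insert separators so the layout matches exactly
        if c.isdigit():
            out.append(new_digits[i])
            i += 1
        else:
            out.append(c)
    return "".join(out)

print(format_preserving_token("4111-1111-1111-1234"))
# e.g. 8302-9174-5561-1234 -- same type, length and layout as the original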
The tokenization process can be applied
to payments, such as payment card industry
transactions; healthcare-related sensitive
data, including Health Insurance Portability
and Accountability Act information; and
personally identifiable information (PII)
such as email addresses, phone numbers
and dates of birth. When factoring in compliance
costs, tokenization is a more
cost-effective solution than other security
options, because it creates better separation
between the data and the security system.
This can take the sensitive data out of scope,
eliminating the compliance requirement and
its associated costs.
For organizations that need to have the
actual data stored within their company,
tokenization greatly reduces the extent and
use of sensitive information, dramatically
reducing compliance costs.
First-generation tokenization solutions are
characterized by large token servers whose lookup
tables grow continuously as more data is tokenized. The
next generation of tokenization is defined by
a small system footprint that compresses and
normalizes the randomized data. This implementation
can be fully distributed and does not
require any resource-intensive data replication or
synchronization. Users benefit from a lower-cost,
less-complex, higher-performing data security
solution that guarantees no collisions.
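One way to picture the small-footprint idea, under the simplifying assumptions below, is a compact, pre-generated random substitution table that every token server shares. Because each copy of the table is identical and each per-position table is a permutation, every server derives the same token independently, with no replication or synchronization and no possibility of collisions. The table count and mixing scheme here are illustrative choices, not a specific vendor's algorithm.

import random

# Generated once from a strong entropy source and copied to every token
# server; a fixed seed is used here only to keep the demo reproducible.
rng = random.Random(42)
TABLES = [rng.sample(range(10), 10) for _ in range(8)]  # 8 small random digit maps

def tokenize_digits(value: str) -> str:
    # Each per-position table is a permutation, so distinct inputs always
    # produce distinct tokens -- no collisions by construction.
    return "".join(str(TABLES[i % len(TABLES)][int(d)])
                   for i, d in enumerate(value))

def detokenize_digits(token: str) -> str:
    inverses = [[t.index(v) for v in range(10)] for t in TABLES]
    return "".join(str(inverses[i % len(inverses)][int(d)])
                   for i, d in enumerate(token))

t = tokenize_digits("411111111234")
print(t, detokenize_digits(t))  # round-trips with no central server involved

The whole "server" is a few small tables rather than an ever-growing store of value/token pairs, which is what allows it to be distributed cheaply.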
Small-footprint tokenization offers advantages
over first-generation methods:
- Latency. Token servers with small
footprints enable the distribution of
the tokenization process so that operations
can be executed in parallel and
closer to the data. Latency is therefore
eliminated or greatly reduced, depending
on the deployment approach used (see
the sketch after this list).
- Replication. The smaller footprint
allows the creation of farms of
token servers that are based on
inexpensive commodity hardware.
This permits any required scaling
without the need for complex or
expensive replication processes,
which eliminates collisions.
- Versatility. Any number of data categories,
including credit card numbers
and medical records, can be tokenized
without the challenges that previous
tokenization methods impose. More
data types can also benefit from the
transparent properties that tokens offer.
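The latency point in the first bullet can be sketched as follows: with the same small static table on every worker, tokenization runs in parallel, close to the data, with no round trip to a central token server. The pool size and batch split are arbitrary choices for the demo, and tokenize_digits is the illustrative routine from the previous sketch.

from concurrent.futures import ProcessPoolExecutor
import random

rng = random.Random(42)  # every worker re-derives the same shared table
TABLES = [rng.sample(range(10), 10) for _ in range(8)]

def tokenize_digits(value: str) -> str:
    return "".join(str(TABLES[i % len(TABLES)][int(d)])
                   for i, d in enumerate(value))

def tokenize_batch(batch):
    return [tokenize_digits(v) for v in batch]

if __name__ == "__main__":
    records = [f"{n:012d}" for n in range(1000)]
    batches = [records[i::4] for i in range(4)]  # spread work across 4 workers
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(tokenize_batch, batches))
    print(sum(len(r) for r in results), "records tokenized in parallel")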
Tokenization can apply to all types of
enterprises and is an especially good fit for
the Teradata Database. When deployed on
the node, small-footprint tokenization can
be fully distributed across nodes and AMPs
on a Teradata system.
Protection for Today and Tomorrow
With its transparency characteristics and
ability to protect sensitive information,
tokenization will become a more popular
security solution, driven by standards from
the Payment Card Industry (PCI) Security
Standards Council and by data protection
requirements that extend beyond payment
data. Standards will ultimately help set best
practices and govern tokenization implementation.
While tokenization doesn’t
solve every data security conundrum, it
will protect data today and for many years
to come, since there is no encryption
algorithm or key that must be refreshed
periodically to keep pace with the growing
threat of malicious attacks.