Data masking has a very specific function: replacing sensitive data with similar, non-sensitive data in IBM i, SQL Server, Oracle, MySQL, and other environments. It is important to emphasize that the token manager guarantees the same operability as the original data.
The tokenization system receives the confidential data, which is protected and stored centrally. The system then generates a unique token and associates it with the previously stored confidential data. The token is placed in the development database, where it stands in for the confidential data it represents in all operations.
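The flow above can be sketched as a minimal token vault. This is an illustrative sketch, not a production design: the class name, the random-token scheme, and the in-memory store are all assumptions made for the example; a real vault would use an encrypted, access-controlled database.

```python
import secrets

class TokenVault:
    """Minimal sketch of a tokenization vault (hypothetical, for illustration).

    Confidential values are stored centrally in the vault; the development
    database only ever sees the token.
    """

    def __init__(self):
        self._token_to_value = {}   # central, protected store
        self._value_to_token = {}   # ensures one token per value

    def tokenize(self, sensitive_value: str) -> str:
        # Reuse the existing token so the mapping stays one-to-one.
        if sensitive_value in self._value_to_token:
            return self._value_to_token[sensitive_value]
        # The token is random, so it carries no information about the value.
        token = secrets.token_hex(8)
        self._token_to_value[token] = sensitive_value
        self._value_to_token[sensitive_value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the original value.
        return self._token_to_value[token]
```

In all downstream operations the token replaces the confidential value; only a call back to the vault can recover the original.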
Another feature of the token is that it must remain operable while the data it represents stays confidential, minimizing the impact on the development and test processes that handle it and on the databases that store it. A typical example of confidential data to tokenize is a payment card number.
Tokenization replaces sensitive data with dummy data that preserves the characteristics of the original data.
It maintains entity relationships across the different databases where the confidential data appeared, so joins and foreign keys continue to work.
It helps meet PCI DSS requirements, since live card data no longer resides in the development and test environments.
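Preserving entity relationships across databases follows from making the tokenization deterministic: the same confidential value always maps to the same token, so a foreign key masked in one database still matches the masked key in another. The sketch below illustrates this with a hypothetical shared key and two toy data sets; the function and key names are assumptions for the example.

```python
import hmac
import hashlib

def deterministic_token(value: str, key: bytes) -> str:
    # Same input and same key -> same token, in every database.
    return hmac.new(key, value.encode(), hashlib.sha256).hexdigest()[:16]

# Hypothetical key shared by the masking jobs of both databases.
key = b"shared-masking-key"

# Two databases referencing the same customer ID:
customers = {"C-1001": "Alice"}                    # database 1
orders = [("O-1", "C-1001"), ("O-2", "C-1001")]    # database 2, FK to database 1

masked_customers = {deterministic_token(cid, key): name
                    for cid, name in customers.items()}
masked_orders = [(oid, deterministic_token(cid, key)) for oid, cid in orders]

# The foreign-key relationship still holds after masking:
assert all(cid in masked_customers for _, cid in masked_orders)
```

Determinism is a deliberate trade-off: it preserves referential integrity, but the key must be protected, since anyone holding it can recompute tokens for guessed values.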