tokenization

Tokenization is a process by which a piece of sensitive data, such as a credit card number, is replaced by a surrogate value known as a token. The sensitive data itself generally still needs to be stored securely in one centralized location, with strong protections around it, so that it can be referenced later. The security of a tokenization approach therefore depends both on the security of those stored sensitive values and on the algorithm and process used to generate the surrogate values and map them back to the originals.
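As a rough illustration of the vault-based approach described above, the sketch below uses a hypothetical TokenVault class (not from any particular product or library). A plain in-memory dictionary stands in for the secure centralized store, and Python's secrets module supplies random surrogate values that carry no information about the originals:

import secrets

class TokenVault:
    """Minimal sketch of vault-based tokenization: sensitive values are
    swapped for random surrogate tokens, and the originals are kept in
    one central store. A real system would use an encrypted,
    access-controlled database rather than an in-memory dict."""

    def __init__(self) -> None:
        self._store: dict[str, str] = {}  # token -> original sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # A random token reveals nothing about the value it replaces.
        token = secrets.token_hex(16)
        self._store[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Mapping a token back requires access to the central store.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # a well-known test card number
print(token)                    # surrogate value, safe to pass downstream
print(vault.detokenize(token))  # original recovered only via the vault

Note that because the tokens are random rather than derived from the sensitive value, an attacker who obtains only the tokens learns nothing; the vault itself remains the single point that must be strongly protected.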
