  1. Tokenization (data security) - Wikipedia

    To protect data over its full lifecycle, tokenization is often combined with end-to-end encryption to secure data in transit to the tokenization system or service, with a token replacing the original …

  2. What is tokenization? - IBM

    In data security, tokenization is the process of converting sensitive data into a nonsensitive digital replacement, called a token, that maps back to the original. Tokenization can help protect …

  3. What is Tokenization? Definition, Working, and Applications

    Mar 28, 2023 · Tokenization is defined as the process of hiding the contents of a dataset by replacing sensitive or private elements with a series of non-sensitive, randomly generated …

  4. What is tokenization? | McKinsey

    Jul 25, 2024 · Tokenization is the process of creating a digital representation of a real thing. Tokenization can also be used to protect sensitive data or to efficiently process large amounts …

  5. Data Tokenization - A Complete Guide - ALTR

    Aug 11, 2025 · Tokenization is a data security technique that replaces sensitive information—such as personally identifiable information (PII), payment card numbers, or …

  6. What is data tokenization? The different types, and key use cases

    Apr 16, 2025 · Data tokenization as a broad term is the process of replacing raw data with a digital representation. In data security, tokenization replaces sensitive data with randomized, …

  7. What is Tokenization? - TechTarget

    Feb 7, 2023 · Tokenization is the process of replacing sensitive data with unique identification symbols that retain all the essential information about the data without compromising its security.

  8. Tokenization Explained: What Is Tokenization & Why Use It? - Okta

    Sep 2, 2024 · Tokenization involves protecting sensitive, private information with something scrambled, which users call a token. Tokens can't be unscrambled and returned to their …

  9. What Is Tokenization? - Akamai

    In the world of data security and payment processing, tokenization is the practice of protecting sensitive data by replacing it with a token — a unique and nonsensitive string of symbols …

  10. What is Data Tokenization? [Examples, Benefits & Real-Time …

    Jul 9, 2025 · Data tokenization is a method of protecting sensitive information by replacing it with a non-sensitive equivalent — called a token — that has no exploitable meaning or value …
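The results above converge on the same mechanism: sensitive data is replaced by a random, non-sensitive token, and only a protected mapping (often called a token vault) can turn the token back into the original. A minimal sketch of that idea, assuming a simple in-memory vault (the `TokenVault` class and the `tok_` prefix are illustrative names, not taken from any of the sources):

```python
import secrets


class TokenVault:
    """Illustrative vault-based tokenization: sensitive values are swapped
    for random tokens, and the vault holds the only mapping back."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so equal inputs yield equal tokens.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, so it has no mathematical relationship to
        # the original value (unlike ciphertext, there is no key to attack).
        token = "tok_" + secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only a caller with access to the vault can recover the original.
        return self._token_to_value[token]


vault = TokenVault()
card = "4111 1111 1111 1111"
token = vault.tokenize(card)
```

Downstream systems can store and pass `token` freely; a breach of those systems exposes only random strings, while the original card number is recoverable solely through the vault's `detokenize`.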