Tokenization

Aug 5

Short description of Tokenization

Where data security is concerned, tokenization is the process by which a sensitive piece of information is replaced with a non-sensitive element, called a token, that has no extrinsic or exploitable meaning or value outside the system that issued it, making the underlying data essentially not worth stealing.
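As a rough illustration of the idea, the sketch below models a hypothetical token vault: the sensitive value (here, a card number) is kept inside the secure system, and callers receive only a random token with no mathematical relationship to the original. The `TokenVault` class and its method names are assumptions for this example, not a real API.

```python
import secrets

class TokenVault:
    """Minimal illustrative token vault (a sketch, not a production design)."""

    def __init__(self):
        # token -> sensitive value; lives only inside the secure system
        self._vault = {}

    def tokenize(self, sensitive_value: str) -> str:
        # The token is random, so it carries no exploitable meaning
        # or value outside this vault.
        token = secrets.token_hex(16)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only the system holding the vault can recover the original.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
assert token != "4111111111111111"          # token reveals nothing
assert vault.detokenize(token) == "4111111111111111"
```

A stolen token is useless to an attacker because, unlike an encrypted value, it cannot be reversed; the mapping back to the sensitive data exists only in the vault.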




Copyright © 2022, United Thinkers LLC