In data security, tokenization is the process of replacing a sensitive piece of information with a non-sensitive substitute, called a token, that has no exploitable or extrinsic meaning or value outside the specific system in which it was created, making the underlying data essentially not worth stealing.
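The idea can be sketched with a minimal vault-based example: a random token stands in for the sensitive value, and only the vault can map it back. The class and names below are hypothetical illustrations, not a production design; real tokenization systems use hardened, audited vaults or vaultless cryptographic schemes.

```python
import secrets

class TokenVault:
    """Illustrative token vault: maps random tokens to sensitive values."""

    def __init__(self):
        self._vault = {}  # token -> original sensitive value

    def tokenize(self, sensitive: str) -> str:
        # The token is random, so it has no exploitable relationship
        # to the original value it replaces.
        token = secrets.token_hex(8)
        self._vault[token] = sensitive
        return token

    def detokenize(self, token: str) -> str:
        # Only a system with vault access can recover the original.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
# The token reveals nothing about the card number; a thief who
# steals it alone gets no usable data.
original = vault.detokenize(token)
```

Because the mapping lives only in the vault, a stolen token outside that system is just a meaningless string.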