Understanding Tokenization for Data Security with Leon Bian
Techstrong TV Interviews
Leon Bian, vice president and head of product for data security at Capital One Software, explains why tokenization has become a critical tool for securing sensitive data while keeping it readily accessible for legitimate use.
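To illustrate the general idea behind tokenization (not Capital One's specific product), here is a minimal sketch of a token vault: sensitive values are swapped for random, meaningless tokens, and only systems with vault access can map a token back to the original value. The `TokenVault` class, the `tok_` prefix, and the sample card number are all illustrative assumptions, not details from the interview.

```python
import secrets


class TokenVault:
    """Toy token vault: replaces sensitive values with random tokens
    and keeps the mapping so authorized callers can detokenize."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse an existing token so equal values map to equal tokens;
        # deterministic tokens preserve joins and analytics on tokenized data.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = "tok_" + secrets.token_hex(8)
        self._value_to_token[value] = token
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only callers with access to the vault can recover the original.
        return self._token_to_value[token]


# Illustrative use with a made-up card number:
vault = TokenVault()
t1 = vault.tokenize("4111-1111-1111-1111")
t2 = vault.tokenize("4111-1111-1111-1111")
assert t1 == t2  # stable token for the same value
assert vault.detokenize(t1) == "4111-1111-1111-1111"
```

Because the token is random rather than derived from the value, a leaked token reveals nothing about the underlying data, which is how tokenization can keep data both secure and usable downstream.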