March 30, 2026, 18:02 GMT | Insight
More capable and personalized AI assistants are pushing companies to innovate on privacy frameworks and adapt to consumers’ expectations for seamless experiences, Google legal chief Kent Walker said Monday, adding that market forces will drive competition on privacy safeguards as much as on product quality.
AI assistants are becoming more capable and personalized, and companies will need to adapt their privacy frameworks to keep pace with consumers’ expectations for a frictionless AI experience, Google legal chief Kent Walker said Monday in the opening keynote at the world’s largest gathering of privacy professionals in the US.*
Walker kicked off his address with a promotional video for Gemini Personal Intelligence, a product he touted as offering a more “tailored search experience” that integrates into people’s daily lives.
Soon these sorts of AI-powered assistants will become ubiquitous, he said, but “the trust part is always key, because above all, people tell us that they want to feel as if they're in the driver's seat.”
Users expect a seamless experience, Walker said. They want privacy protections, but they don’t want the friction that often accompanies notices, consents and dashboards.
“That means providing a level of protection that's tailored to what they want to do when they want to do it, to ensure that information isn't used in ways they don’t [want]. We have to carefully assess when friction makes sense and when it doesn’t. We need to get creative with ways of delivering transparency and control without overwhelming people,” Walker said.
He said companies need to understand users’ expectations around privacy and consider how to embed protections in ways that support, rather than disrupt, the AI experience.
That shift may require rethinking users’ “reasonable expectations” of privacy, which Walker said have always been varied and evolving. Market pressure, not just the threat of regulation, will drive companies to compete on demonstrable privacy safeguards as much as they do on quality.
Walker’s call to anchor privacy in consumer expectations could be at odds with the EU AI Act, which imposes risk-based obligations on AI systems, rather than the fluid, market-driven model Walker seemed to be outlining.
Privacy-enhancing technologies, software tools that allow organizations to use data while minimizing exposure of personal information, may help bridge that gap. The EU’s General Data Protection Regulation and other EU rules already encourage techniques such as pseudonymization, encryption and data minimization to protect user data.
Regulators can make it worthwhile for businesses to adopt privacy-enhancing technologies by offering regulatory incentives “so that they feel it's worth the investment,” Walker said.
“We need to work together to build a future where data protection frameworks adapt to the new uses that we're seeing,” he said. “Where our privacy rules can translate into this new generation of expectations, the new generation of tools, the new generation of capabilities.”
*IAPP Global Summit 2026: Privacy-AI Governance, Washington, DC; March 30-April 2, 2026.
Please email editors@mlex.com to contact the editorial staff regarding this story, or to submit the names of lawyers and advisers.