Empowering the Edge: Synaptics' Vikram Gupta on AI Innovation and Strategic Google Collaboration

In an interview from CES 2025, Vikram Gupta, Chief Product Officer at Synaptics, discusses the company's strategic collaboration with Google to enhance Edge AI processors, emphasizing the importance of context-aware computing and AI-native platforms for improving user experience and efficiency.


16 Jan, 2025

In an insightful conversation at CES, Vikram Gupta, Chief Product Officer and SVP GM of IoT Processors at Synaptics, sheds light on the company's groundbreaking collaboration with Google. This partnership aims to revolutionize edge AI technology by integrating Google's open ML core into Synaptics' Astra processor line. Gupta elaborates on the benefits of AI at the edge, such as enhanced security, reduced latency, and lower energy costs, and discusses how context-aware computing is set to transform user interactions with technology across various applications.

Video transcript:

Wevolver: Good to meet you again, Vikram. Could you introduce yourself and tell us about the recent collaboration with Google?

Vikram Gupta: Certainly. I'm Vikram Gupta, Chief Product Officer and SVP GM of IoT Processors at Synaptics. Last week, we announced a collaboration with Google, where we are integrating an open ML core from Google into our Edge AI processors in the Astra line. This integration aims to provide a tailored AI solution within our processors, focusing on multimodal, power-efficient applications.

Wevolver: Why should AI be implemented at the edge?

Vikram Gupta: AI at the edge offers several benefits. It improves response times by processing data locally rather than sending it back to the cloud. This not only enhances user experience but also increases security and privacy. Additionally, it reduces energy costs associated with cloud computing.

Wevolver: Can you explain what context-aware computing means for end-users and which devices might see the most enhancements?

Vikram Gupta: Context-aware computing enables devices to understand their environment and react accordingly, much like humans do. This capability allows devices to process information locally and respond intelligently, which is crucial for applications like security systems where immediate action is required. It can also help conserve energy by reducing unnecessary operations.

Wevolver: What makes the Synaptics Astra AI-native platform compelling for edge AI?

Vikram Gupta: Launched in April in Nuremberg, the Astra platform is AI-native, designed from the ground up to fully leverage AI capabilities. This approach differentiates our processors by emphasizing AI integration in both hardware and software, enhancing our customers' experience with AI features.

Wevolver: What other technologies are you showcasing at CES?

Vikram Gupta: Besides our processors, we're highlighting our advancements in wireless technologies under the Veros brand, along with our traditional touch and display technologies used in mobile, PC, and automotive products. Our wireless solutions are designed for reliable connectivity and interoperability across devices.

Wevolver: What are the next big steps for Synaptics in edge AI and context-aware computing?

Vikram Gupta: We're advancing our collaboration with Google, optimizing their ML core for our SoCs. This year, we'll introduce new devices that leverage this technology, catering to a broad spectrum of applications, from wearables to high-power electronics.

Wevolver: Thank you for your time, Vikram. Best of luck here at CES.

Vikram Gupta: Thank you. It's been great.
