Gemini Live now lets you talk to Google’s AI using your camera on Pixel 9, Galaxy S25

Google is stepping up its AI game. The company has rolled out a powerful update to Gemini Live, letting users have real-time, two-way conversations with its AI using their phone’s camera or screen. Think of it like FaceTiming with AI—only it’s not just watching, it’s understanding what it sees.
Initially launched in March for Gemini Advanced subscribers, this feature is now being opened up to more users, starting with Pixel 9 and Samsung Galaxy S25 owners.
The upgrade is powered by Project Astra, Google DeepMind’s futuristic AI agent that responds to video, audio, and text—all in the moment. First shown at Google I/O 2024, Astra wowed audiences by answering questions in real time based on what the camera captured.
In the demo, it recognized places like London’s King’s Cross just from a window view, fixed code, improved circuit diagrams, made clever rhymes about crayons, and even found a lost pair of glasses—all through live video input.
Gemini Live supports natural conversations in over 45 languages, making it feel more like talking to a friend than typing into a search bar. Want to ask about something on your screen or get help with what’s in front of you? Just point your phone and talk.
This time around, Google took care to head off past criticism that it had staged AI demos. The company clarified that the Astra demo was recorded in just two continuous takes, with no editing tricks involved.
With Gemini Live, AI is getting a lot more personal—and a whole lot more powerful.