Google is using a clever bit of ultrasound tech to get you into meetings faster, and with fewer chances of embarrassing audio feedback. The company is rolling out automatic conference room detection to Google Meet on Android and iOS.
The feature already existed on laptops, but bringing it to phones makes it far more practical for real-world meetings. The idea is simple. When you walk into a meeting room and open Meet through the app or via Gmail, your phone can sense that it is physically inside a supported conference room and prompt you to join the ongoing call in the right mode.
Here is how the ultrasound tech actually works
Behind the scenes, Google Meet relies on silent ultrasonic signals emitted by compatible meeting room hardware. These sounds are inaudible to humans but can be detected by your phone’s microphone. Once your phone picks up the signal, Meet recognises that you are in that specific room and surfaces a Companion mode option automatically.
Companion mode is the key here. It lets you join the meeting from your phone without turning on your microphone or speaker, which helps avoid echo, feedback, or talking over the room system.
This feature requires microphone access on your phone, since that is how the ultrasonic detection works. For organisations, Google Workspace admins remain in control. They can enable or disable the feature at the room level, and the system still allows manual check-in if automatic detection does not work for any reason.
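To make the idea concrete, here is a minimal sketch of how a phone could listen for an inaudible room beacon. It uses the Goertzel algorithm, a cheap way to measure signal energy at a single frequency. The 18.5 kHz beacon frequency, the comparison against a quiet reference frequency, and the simulated microphone capture are all illustrative assumptions; Google has not published the details of Meet's actual signalling.

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    """Estimate signal power at target_freq using the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)  # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# Simulated 0.1 s mic capture: a quiet 18.5 kHz beacon (hypothetical frequency,
# inaudible to humans) mixed with louder audible room noise at 440 Hz.
RATE = 48_000
BEACON = 18_500
samples = [
    0.2 * math.sin(2 * math.pi * BEACON * t / RATE)   # ultrasonic beacon
    + 0.8 * math.sin(2 * math.pi * 440 * t / RATE)    # audible speech/noise
    for t in range(4_800)
]

# The phone "hears" the room if the beacon bin holds far more energy than a
# nearby reference frequency where no beacon is expected.
in_room = goertzel_power(samples, RATE, BEACON) > goertzel_power(samples, RATE, 17_000)
```

Because Goertzel only inspects one frequency bin, a phone can run this kind of check continuously on short microphone buffers without the cost of a full spectrum analysis, which is one plausible reason this style of detection suits always-on proximity sensing.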

Google has been steadily improving Meet with user-friendly updates, from quicker virtual touch-ups during calls to tighter controls that block unwanted lurkers and give hosts more say over who can enter meetings.
For anyone stuck in back-to-back hybrid meetings, the latest update is less about flashy innovation and more about smoothing out everyday hiccups. It removes a few small but annoying steps and makes joining meetings feel a little more convenient.