Chrome can now instantly caption audio and video on the web

Google is expanding its real-time captioning feature, Live Captions, from Pixel smartphones to anyone using the Chrome browser, as first spotted by XDA Developers. Live Captions uses machine learning to generate captions on the fly for videos and audio that don't have them, making the web much more accessible to anyone who is deaf or hard of hearing.

When enabled, captions automatically appear in a small movable box at the bottom of your browser whenever you watch or listen to content in which people are talking. The words show up after a short delay, and with fast or stuttering speech you'll notice errors. But overall, the feature is just as impressive as it was when it first appeared on Pixel phones in 2019. Captions appear even when the audio is muted or the volume is turned down, making them a way to "read" videos or podcasts without disturbing the people around you.

Live Captions transcribing audio from a podcast player

In initial tests by some of us here at The Verge, Chrome's Live Captions worked on YouTube videos, Twitch streams, podcast players, and even music streaming services like SoundCloud. However, Live Captions in Chrome appears to work only in English, which is also the case on mobile devices.

Live Captions can be enabled in the latest version of Chrome under Settings, then the "Advanced" section, then "Accessibility." (If you don't see the feature, try manually updating and restarting your browser.) When you turn it on, Chrome will quickly download some speech recognition files, and captions should appear the next time your browser plays audio in which people are talking.

Live Captions was first introduced in the Android Q beta, but until now it has been exclusive to select Pixel and Samsung phones. Now that it's in Chrome, Live Captions will be available to a much wider audience.
