UPDATED: April 29, 2021 17:03 IST
Google has announced updates for Google Assistant that will help it recognise and pronounce people’s names more accurately, especially names that are less common. This should make commands such as sending a text or making a call more reliable. Google further noted that, with new machine learning technology in place, Assistant will respond to alarm and timer tasks with nearly 100 per cent accuracy.
Over the next few days, Google will allow users to teach Google Assistant to recognise and enunciate the names of their contacts the way they pronounce them. “Assistant will listen to your pronunciation and remember it, without keeping a recording of your voice. This means Assistant will be able to better understand you when you say those names, and also be able to pronounce them correctly,” Google noted in a blog post. The feature will be available first in English, with more languages to follow soon.
Further, Google Assistant is gaining the ability to handle more than one timer and alarm at the same time. Google noted that Assistant’s NLU (natural language understanding) models will help it respond accurately when a command is imprecise or a conversation gets complex.
“We fully rebuilt Assistant’s NLU models so it can now more accurately understand context while also improving its “reference resolution” — meaning it knows exactly what you’re trying to do with a command. This upgrade uses machine learning technology powered by state-of-the-art BERT, a technology we invented in 2018 and first brought to Search that makes it possible to process words in relation to all the other words in a sentence, rather than one-by-one in order,” Google explained. For now, these updates are available for alarms and timers on Google smart speakers in English in the US.
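To illustrate the idea Google describes — processing each word in relation to all the other words rather than one-by-one — here is a toy, self-contained sketch of the self-attention mechanism that underlies BERT. This is not Google’s implementation: the word vectors are made-up two-dimensional numbers chosen purely for illustration, and a real model would use learned, high-dimensional embeddings with learned query/key/value projections.

```python
import math

def softmax(xs):
    # Numerically stable softmax: turns raw similarity scores into
    # positive weights that sum to 1.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(vectors):
    # For each token, compute similarity scores against EVERY token in
    # the sentence, then output a weighted average of all token vectors.
    # This is the core idea behind contextual models like BERT: each
    # word's representation depends on all the other words.
    dim = len(vectors[0])
    outputs = []
    for query in vectors:
        scores = [sum(q * k for q, k in zip(query, key)) for key in vectors]
        weights = softmax(scores)
        mixed = [sum(w * vec[i] for w, vec in zip(weights, vectors))
                 for i in range(dim)]
        outputs.append(mixed)
    return outputs

# Toy 2-D "embeddings" for the words in "set a second timer" (invented values):
tokens = ["set", "a", "second", "timer"]
vecs = [[1.0, 0.0], [0.1, 0.1], [0.5, 0.9], [0.9, 0.2]]
contextual = self_attention(vecs)
```

In this sketch the ambiguous word “second” (ordinal vs. unit of time) ends up represented by a blend of all four word vectors, weighted by similarity, rather than being processed in isolation — a miniature version of the contextual understanding the quote describes.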
Google noted that it is also extending BERT technology to users’ conversations. Assistant uses previous interactions, along with an understanding of what is currently displayed on a user’s smartphone or smart display, to respond to follow-up questions, letting users have a more natural, back-and-forth conversation. Assistant will also understand questions that refer to what users are looking at on their smartphone or tablet screens, even when the queries are incomplete.