Google Assistant’s new capabilities leave us speechless
While Siri lags slightly behind in the voice assistant race, Google intends to outrun Alexa with a new update that makes Google Assistant much more practical, notably by removing the need to say “Ok Google” every time, and that’s not all.
One look and Google Assistant responds
Google’s goal is for us to communicate with Google Assistant the same way we do with another person. In any case, the company is getting closer and closer by giving the assistant a few more neurons.
During Google I/O, the brand’s annual developer conference, almost all Google services received improvements, starting with the voice assistant. It gets two major innovations, the first of which is inspired by human “eye contact”.
This function is called “Look and Talk”. As the name suggests, it lets the user activate the assistant just by looking at it. Google presents the option as intended for the Nest Hub Max, the only device to support it for now.
During the presentation, the presenter steps in front of the device and pauses for two short seconds (the time the camera needs to recognize them). They make their request, and after a few more seconds, Google Assistant responds.
Google placed a strong emphasis on privacy during its presentation, stating that the video used to recognize the person is processed directly on the device. No one else has access to it, but this is probably not the case for audio queries, which Google often analyzes to improve its AI and serve targeted advertising.
Key words to avoid saying “Ok Google”
Of course, the ideal is to have no particular action to perform and simply ask Google whatever comes to mind. This already exists on the Pixel 6 to answer a call, and on Google Nest to create a timer.
The functionality is now extending to the Nest Hub Max, where it will soon be possible to make new requests simply by asking:
- Create a timer or alarm (and cancel them)
- Ask the time
- Turn the lights on or off, or change their intensity
- Ask what the weather is like
This remains quite basic for the moment, and is limited to a stationary device: it is hard to imagine a smartphone constantly activating, given how often we say these phrases in other contexts.
Google Assistant understands when you hesitate
When talking to a voice assistant, you had better speak clearly and fluently; otherwise, the computer’s response tends to be: “Sorry, I didn’t understand.”
Google shows it is ahead of the others in mastering AI with a smarter version of the Assistant, which will arrive with a later version of the Google Tensor chip. The first generation of this in-house processor powers the Pixel 6, 6 Pro, and 6a.
On the Google I/O stage, the presenter manages to ask Google Assistant to play a song, pausing for a long time in the middle of the sentence, and giving only an incomplete version of the artist’s name.
The assistant responds anyway by playing the corresponding music, having correctly interpreted the hesitation in the sentence and identified the artist.
It’s quite uncanny, and we hope we won’t have to wait too long before we can enjoy it.
Voice is still the most natural way to perform an action, but the technology remains far from perfect. Interpreting the complexities of human language isn’t easy, and there’s still a long way to go before you can ask your voice assistant anything, however you phrase it.
By RommB, Journalist, jeuxvideo.com