Google has run an interesting demonstration of Android XR's simultaneous translation capability. In the demo, people speaking Hindi and Persian addressed a person wearing Android XR glasses, and an English translation of their speech was displayed in real time in the wearer's field of view.
The AI capabilities of Android XR
Regarding the capabilities of these XR glasses, Google has emphasized that they will be a suitable platform for its Gemini AI assistant. The Google glasses prototypes include a camera, microphone, and speaker so that the assistant can help the user interpret their surroundings. Display features of the glasses include photography, live navigation, and real-time language translation, in line with what Google has gradually shown in Android XR test builds over recent months.
Google follows Meta's success
Google appears to be directly following Meta's successful strategy in the smart-glasses market. That matters because Meta has seen significant success with its Ray-Ban smart glasses: the company announced in February that it has sold more than 5 million units to date and positions them as ideal hardware for interacting with AI assistants.




