Apple has come up with creative ideas for how users interact with the Apple Vision Pro. From familiar inputs such as a keyboard, trackpad, and game controller to the user's own voice, each has a place in the overall experience and in how a person uses the device.
At the same time, Apple has shown particular interest in controlling the device with the hands and eyes, aiming to strengthen the user's connection with Apple Vision Pro and make the experience more personal. In this context, many questions arise that remain unanswered.
To answer them, we turned to Andrew Hart, a developer with extensive experience in augmented reality, to learn more about the device's gestures.
Hart shared his take on Apple Vision Pro in a series of tweets referencing the developer session, covering how users interact with content on the headset. Apple has introduced a new form of interaction that sets its product apart from other headsets, but that does not mean working with the device is limited to these new gestures, nor that the habits users have built up over years on smartphones and tablets become useless here.
To make Apple Vision Pro feel comfortable, Apple builds these gestures on familiar user-experience patterns rather than on new and complex interaction methods. Gestures such as tapping or pinching to zoom mean the user spends less time learning them and more time enjoying the interaction with the headset.
In the developer session, Apple showed how to zoom into a photo simply by looking at it while making a two-finger pinch gesture and moving the hands at the same time. This pattern is already ingrained in many apps and smartphones, so there is nothing new to learn.
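As a rough illustration of why this feels familiar to developers as well as users: on visionOS, standard SwiftUI gestures are targeted by where the user looks and performed with the hands, so an ordinary magnify gesture already behaves as "look at the photo and pinch to zoom." The sketch below is an assumption-laden example, not Apple's Photos implementation; the view and asset names are placeholders.

```swift
import SwiftUI

// Minimal sketch of look-and-pinch zooming on visionOS.
// SwiftUI gestures on Vision Pro are driven by gaze plus hand input,
// so a plain MagnifyGesture gives this behavior without extra code.
struct ZoomablePhoto: View {
    var imageName: String                      // hypothetical asset name
    @State private var baseScale: CGFloat = 1.0
    @GestureState private var pinch: CGFloat = 1.0

    var body: some View {
        Image(imageName)
            .resizable()
            .scaledToFit()
            .scaleEffect(baseScale * pinch)    // live zoom while the pinch is in progress
            .gesture(
                MagnifyGesture()
                    .updating($pinch) { value, state, _ in
                        state = value.magnification
                    }
                    .onEnded { value in
                        baseScale *= value.magnification
                    }
            )
    }
}
```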
Apple also noted that app windows and other items are placed in the center of the user's field of view, which prevents neck strain while locating them. In addition, writing tools can be moved quickly from one spot to another with a simple gesture, a type of interaction we already use every day on computers and smartphones.
Judging by the accounts of people who have tried the Apple Vision Pro's gesture controls, Apple has succeeded in putting these ideas into its new product, and gestures of this kind are not found on any other company's headset.
Ben Sin, a writer for XDA Developers who attended the Apple event, shared his hands-on experience and described the headset's operating system as easy and enjoyable to use. In his view, one of the things that sets this headset apart from others is its new interaction patterns, which, while familiar to users, bring a fresh style of interacting with software content.
Apple is aware that prolonged hand interaction causes fatigue, but it argues that these gestures help create a new kind of user experience. And while hand gestures are central and new, Apple has also included eye-control accessibility options that can be used when needed.
Perhaps one of the most significant points about the Apple Vision Pro is the lack of tactile feedback: unlike other virtual reality headsets, it does not ship with physical controllers. This matters because when we interact with objects in the real world, we both see and touch them.
The absence of tactile feedback is probably one of the hardest obstacles when building for the operating system's spatial environment, and Apple said at the developer session that it intends to compensate for it through other sensory cues. A good example is the keyboard in visionOS: each key is lit to create a more natural feel, and its glow brightens or dims as the finger approaches or moves away.
In addition, when the Enter key is pressed, a visual effect appears as if the Send button had been physically pushed, and a sound plays to confirm the press and the send action.
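To give a sense of how this kind of feedback substitutes for touch, here is a minimal sketch of a single on-screen key, assuming a plain SwiftUI button rather than Apple's system keyboard. On visionOS, any view can opt in to a system hover highlight that appears as the user's gaze or finger approaches, which is the ingredient standing in for the missing tactile sensation; the key label and action below are placeholders.

```swift
import SwiftUI

// Minimal sketch of a single "key" with hover and press feedback on visionOS.
// This is not Apple's system keyboard; it only shows the same ingredients:
// a hover highlight as the finger or gaze approaches, plus a visible pressed
// state. A confirmation sound could be played from the action closure.
struct KeyCap: View {
    var label: String
    var onPress: () -> Void

    var body: some View {
        Button(action: onPress) {
            Text(label)
                .font(.title2)
                .frame(width: 64, height: 64)
        }
        .buttonStyle(.bordered)   // gives a visible pressed state
        .hoverEffect()            // system glow as the user's finger or gaze nears the key
    }
}
```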
– Comparison of Apple Vision Pro and Meta Quest Pro; Which one should we consider?
These details are only a small part of the Apple Vision Pro's capabilities, and more information will surface before release. Apple has brought together its own hardware, software, and tools in a package that is the result of several years of work; now it is the developers' turn to shine by producing new apps for the headset and persuading people to buy it.
Apple Vision Pro is expensive compared to other headsets, and few expect it to be an overnight success. In recent years, major companies such as Sony, HTC, and Meta have shown interest in virtual reality headsets and brought their own innovations to market; Apple has entered the field later, but its track record in hardware and software could help its new product sell.