A closer look at Apple’s Vision Pro keyboard and other controls

An Apple developer session has offered an in-depth look at the many ways users will (eventually) control the company’s new Vision Pro headset, including a virtual keyboard you’ll be able to type on in mid-air. It comes to us thanks to the “Design for spatial input” session, in which two members of Apple’s design team walk prospective developers through best practices for designing apps for the new platform.

Apple seems keen for users to mainly interact with the headset by simply looking at UI elements and making small hand gestures with their arms relaxed on their lap. But in its developer session, Apple designer Israel Pastrana Vicente admits that “some tasks are better suited to interact directly,” which can involve reaching out and touching UI elements (a feature Apple refers to as “direct touch”). There’s also support for using physical keyboards and trackpads or game controllers.
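For developers, the practical upshot is that both input styles funnel into the same gesture system: a gaze-targeted pinch and a direct fingertip tap arrive in an app as the same event. Here’s a minimal SwiftUI sketch of handling a spatial tap on a view, under the assumption that the app doesn’t need to distinguish the two (the `TapCanvas` name is illustrative, not an Apple API):

```swift
import SwiftUI

// Minimal sketch: on visionOS, a gaze-plus-pinch and a direct fingertip tap
// both arrive as the same spatial tap gesture, so one handler covers both.
struct TapCanvas: View {
    @State private var lastTap: CGPoint = .zero

    var body: some View {
        Rectangle()
            .fill(.blue.opacity(0.3))
            .gesture(
                SpatialTapGesture().onEnded { value in
                    // `value.location` is the tap point in the view's coordinates.
                    lastTap = value.location
                }
            )
    }
}
```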

So let’s talk about the Vision Pro’s virtual keyboard. Apple designer Eugene Krivoruchko explains that it’s important to offer plenty of visual and audio feedback while typing on it, to make up for the “missing tactile information” a real peripheral would provide. “While the finger is above the keyboard, buttons display a hover state and a highlight that gets brighter as you approach the button surface,” Krivoruchko explains. “It provides a proximity cue and helps guide the finger to target. At the moment of contact, the state change is quick and responsive, and is accompanied by matching spatial sound effect.”
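The gaze-and-hover highlight comes for free on visionOS via SwiftUI’s .hoverEffect() modifier, but the proximity-driven brightening Krivoruchko describes can be approximated in a custom button style. The sketch below is illustrative only: the `proximity` value is a hypothetical stand-in for the finger-distance signal Apple’s system keyboard uses, which isn’t exposed to third-party apps.

```swift
import SwiftUI

// Hypothetical sketch of a proximity-highlighted key, modeled on the behavior
// Apple describes for the system keyboard. `proximity` is an assumed input
// (0 = finger far away, 1 = touching); visionOS does not expose this signal.
struct ProximityKeyStyle: ButtonStyle {
    var proximity: Double

    func makeBody(configuration: Configuration) -> some View {
        configuration.label
            .padding(12)
            .background(
                RoundedRectangle(cornerRadius: 8)
                    // The highlight gets brighter as the finger approaches.
                    .fill(.white.opacity(0.15 + 0.5 * proximity))
            )
            // At the moment of contact, the state change is quick and responsive.
            .scaleEffect(configuration.isPressed ? 0.95 : 1.0)
            .animation(.easeOut(duration: 0.08), value: configuration.isPressed)
            // System gaze/hover feedback on visionOS.
            .hoverEffect()
    }
}
```

A key built as `Button("G") { … }.buttonStyle(ProximityKeyStyle(proximity: p))` would then brighten as `p` rises; the matching spatial sound effect on contact would come from playing a short audio resource at the key’s position, which RealityKit’s spatial audio handles on visionOS.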

There’s also support for voice input, with the same developer session noting that focusing your eyes on the microphone icon in the search field will trigger a “Speak to Search” feature. That’ll likely draw audio data from the six microphones built into the Vision Pro.
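“Speak to Search” itself is a system feature, but apps can build the same pattern (stream microphone audio into a recognizer and drop the transcript into a search field) with Apple’s Speech framework. A minimal, illustrative sketch, with authorization prompts and error handling trimmed:

```swift
import Speech
import AVFoundation

// Minimal sketch: stream mic audio into SFSpeechRecognizer and report
// partial transcriptions as they arrive. Requires the user to grant
// speech-recognition and microphone permissions first.
final class SpeechSearch {
    private let recognizer = SFSpeechRecognizer()  // recognizer for the current locale
    private let audioEngine = AVAudioEngine()

    func startDictation(onText: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true

        // Feed microphone buffers into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        _ = recognizer?.recognitionTask(with: request) { result, _ in
            if let result { onText(result.bestTranscription.formattedString) }
        }
    }
}
```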

Direct touch can also be used to interact with other system elements. There’s the ability to tap and scroll as though you’re interacting with a touchscreen, and one Apple demo shows the wearer making a pen motion in midair to write a word and draw a heart shape in Markup. Although the primary input is the user’s hand, Krivoruchko explains that the system also uses eye tracking to augment the gestures. “You control the brush cursor with your hand, similar to a mouse pointer, but then if you look to the other side of the canvas and tap, the cursor jumps there landing right where you’re looking. This creates a sense of accuracy and helps to cover the large canvas quickly,” the designer says.
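Apple doesn’t expose raw gaze data to third-party apps, so the snippet below is purely a hypothetical model of the hybrid behavior Krivoruchko describes: relative hand motion nudges the brush cursor like a mouse pointer, while a tap snaps it to the gaze point when the user is looking somewhere far from the cursor. The `jumpThreshold` parameter is invented for illustration.

```swift
import Foundation
import CoreGraphics

// Hypothetical model of the gaze-assisted brush cursor described in the
// session. None of this maps to a public visionOS API; it only illustrates
// the decision logic.
struct BrushCursor {
    var position: CGPoint = .zero
    let jumpThreshold: CGFloat = 120  // assumed distance (points) before a tap snaps to gaze

    // Relative hand movement drives the cursor, mouse-style.
    mutating func applyHandDelta(_ delta: CGVector) {
        position.x += delta.dx
        position.y += delta.dy
    }

    // On tap: if the user is looking far from the cursor, jump to the gaze point.
    mutating func handleTap(gazePoint: CGPoint) {
        if hypot(gazePoint.x - position.x, gazePoint.y - position.y) > jumpThreshold {
            position = gazePoint
        }
    }
}
```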

We still have plenty of questions about how Apple’s expensive new Vision Pro headset is going to work in practice (in particular the potential to pair it with motion controllers from other manufacturers), but between our hands-on experience and developer sessions like these, the experience is starting to come into focus.
