Wednesday, December 11, 2024
Apple Vision Pro, Patents

Future Apple Vision Pro users may be able to share augmented reality environments

FIG. 5C illustrates example user interfaces for sharing augmented reality environments between electronic devices.

Apple has filed for a patent (number US 20240312161 A1) for “methods for presenting and sharing content in augmented reality environments.” It shows that Apple wants multiple users of the Vision Pro and any follow-up devices to be able to share AR environments. 

About the patent filing

In the patent filing, Apple notes that the development of computer systems for augmented reality (AR) has increased significantly in recent years. Example augmented reality environments include at least some virtual elements that replace or augment the physical world. Input devices, such as cameras, controllers, joysticks, touch-sensitive surfaces, and touch-screen displays for computer systems and other electronic computing devices are used to interact with virtual/augmented reality environments. 

Example virtual elements include virtual objects such as digital images, video, text, icons, and control elements such as buttons and other graphics. However, Apple says that methods and interfaces for interacting with environments that include at least some virtual elements (e.g., applications, AR environments, mixed reality environments, and virtual reality environments) are “cumbersome, inefficient, and limited.” The tech giant cites, for example, systems that provide insufficient feedback for performing actions associated with virtual objects, systems that require a series of inputs to achieve a desired outcome in an AR environment, and systems in which manipulation of virtual objects is “complex, tedious and error-prone.” Such systems, Apple says, “create a significant cognitive burden on a user, and detract from the experience with the virtual/augmented reality environment.”

Apple says there’s a need for computer systems with improved methods and interfaces for providing computer-generated experiences that make interaction with those systems more efficient and intuitive for a user. Such methods and interfaces optionally complement or replace conventional methods for providing extended reality experiences to users. One of those is “gaze detection” on Macs and iPads. In other words, you could control some user interface elements with your eyes. And in a shared 3D and/or AR environment, multiple users could control objects via gaze detection.
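To make the gaze-detection idea concrete, here is a minimal sketch (in Python, purely for illustration; none of this reflects Apple’s actual implementation) of one common way gaze selection can work: model the gaze as a ray from the eye, and select the virtual object that lies closest to that ray within a small angular tolerance.

```python
import math

# Hypothetical sketch of gaze-based selection: the gaze is a ray from
# the eye (origin) along a direction; an object is "looked at" when
# the angle between the ray and the object is within a small tolerance.

def pick_by_gaze(origin, direction, objects, max_angle_deg=3.0):
    """Return the name of the object nearest the gaze ray, or None.

    objects: dict mapping name -> (x, y, z) center position.
    """
    norm = math.sqrt(sum(c * c for c in direction))
    d = tuple(c / norm for c in direction)
    best, best_angle = None, max_angle_deg
    for name, center in objects.items():
        to_obj = tuple(c - o for c, o in zip(center, origin))
        dist = math.sqrt(sum(c * c for c in to_obj))
        if dist == 0:
            continue
        # Angle between gaze direction and the direction to the object.
        cos_a = sum(a * b for a, b in zip(d, to_obj)) / dist
        cos_a = max(-1.0, min(1.0, cos_a))
        angle = math.degrees(math.acos(cos_a))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

objects = {"button": (0.0, 0.0, -2.0), "slider": (1.0, 0.0, -2.0)}
print(pick_by_gaze((0, 0, 0), (0, 0, -1), objects))  # button
```

In a shared environment, each user’s device would run this kind of test against the same set of objects, so that any participant’s gaze can drive a selection.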

Another input method would be motion and gesture detection. A Mac, iPad, or Apple Glasses could be equipped with hand-tracking components.

Summary of the patent filing

Here’s Apple’s abstract of the patent filing: “A first electronic device with one or more processors, memory, one or more cameras, and a display generation component captures, with the one or more cameras, an image of a second electronic device that includes position information displayed via a display generation component of the second electronic device. 

“The position information indicates a location of the second electronic device within an augmented reality environment that includes a physical environment in which the first electronic device and the second electronic device are located. The first electronic device, after capturing the image of the second electronic device that includes the position information, displays, via the display generation component of the first electronic device, one or more virtual objects within the augmented reality (AR) environment using the position information captured from the second electronic device.”
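The mechanism the abstract describes can be sketched numerically: the second device reports where it sits in the shared AR frame, the first device observes where that same device sits in its own local frame, and the difference yields the transform for placing shared virtual objects. The sketch below (Python, translation only; the names, frames, and the simplification to a pure offset are my assumptions, not Apple’s protocol) shows the arithmetic.

```python
# Hedged sketch of the idea in the patent abstract: device B displays
# its coordinates within the shared AR environment; device A captures
# that, compares it with where it observed B in its own local frame,
# and derives the offset that maps shared-frame points into A's frame.
# Rotation is ignored here for simplicity.

def frame_offset(observed_in_a, reported_in_shared):
    """Translation mapping shared-frame coords into A's local frame."""
    return tuple(a - s for a, s in zip(observed_in_a, reported_in_shared))

def to_local(point_shared, offset):
    """Re-express a shared-frame point in A's local frame."""
    return tuple(p + o for p, o in zip(point_shared, offset))

# Device A sees device B at (2, 0, -3) in its own frame; B reports it
# sits at (5, 0, 1) in the shared AR frame.
offset = frame_offset((2.0, 0.0, -3.0), (5.0, 0.0, 1.0))

# A virtual object anchored at (6, 1, 1) in the shared frame renders at:
print(to_local((6.0, 1.0, 1.0), offset))  # (3.0, 1.0, -3.0)
```

A full implementation would also recover orientation from the captured image, but the core of “using the position information captured from the second electronic device” is this alignment of the two coordinate frames.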

Dennis Sellers
Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.