
Apple patent involves viewing ‘extended reality’ on Macs, iPhones, iPads, Apple Glasses, Apple Cars, more

FIG. 1 illustrates an example system architecture including various electronic devices that may implement the subject system.

Apple has been granted a patent (number 11,380,097) for “visualization of non-visible phenomena.” It involves viewing “extended reality” content on Macs, iPhones, iPads, the rumored “Apple Glasses,” and even an Apple Car.

About the patent

Extended reality (XR) technology aims to bridge the gap between virtual environments and the physical environment by providing an enhanced physical environment that’s augmented with computer-generated content. As a result, that computer-generated content appears to be part of the physical environment as perceived by the user.

In the patent, Apple notes that there are many different types of electronic systems that enable a person to sense and/or interact with various XR environments. Examples include head mountable systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person’s eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablets, and desktop/laptop computers. 

Apple’s idea is to provide an XR system that displays a virtual representation of non-visible features of a physical environment, such that a user of the XR system perceives the non-visible features at the location of the non-visible features in the physical environment. 

For example, responsive to detecting a non-visible feature of a physical environment, the device may display a visualization of that feature overlaid on the view of the physical environment at a location that corresponds to the detected feature. The non-visible features may correspond to, for example, electromagnetic signals such as Wi-Fi signals, airflow from an HVAC system, temperatures of physical objects, fluids or gases, an audible fence created for a pet (e.g., using ultrasonic pitches), sounds generated by a musical instrument, and/or hidden physical objects such as objects with known locations that are obscured from view by other physical objects.
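To make the idea concrete, here is a minimal Swift sketch of how such detections might be modeled and mapped to visualizations. All type and property names are assumptions for illustration only; the patent does not disclose an implementation.

```swift
import simd

// Hypothetical categories of non-visible phenomena mentioned in the patent text.
enum NonVisibleFeature {
    case wifiSignal(strengthDBm: Float)
    case airflow(velocity: simd_float3)
    case surfaceTemperature(celsius: Float)
    case hiddenObject(label: String)
}

// A detected feature paired with its estimated position in the physical environment.
struct Detection {
    let feature: NonVisibleFeature
    let worldPosition: simd_float3   // meters, in the device's world coordinate space
}

// Pick an illustrative visualization for each kind of feature, anchored at its location.
func visualizationHint(for detection: Detection) -> String {
    switch detection.feature {
    case .wifiSignal(let dBm):
        return "Concentric rings at \(detection.worldPosition), intensity scaled from \(dBm) dBm"
    case .airflow(let velocity):
        return "Particle streaks following \(velocity)"
    case .surfaceTemperature(let celsius):
        return "Heat-map patch tinted for \(celsius) °C"
    case .hiddenObject(let label):
        return "Outline of '\(label)' drawn at its known, obscured location"
    }
}
```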

Summary of the patent

Here’s Apple’s abstract of the patent: “Implementations of the subject technology provide visualizations of non-visible features of a physical environment, at the location of the non-visible features in the physical environment. The non-visible features may include wireless communications signals, sounds, airflow, gases, subsonic and/or ultrasonic waves, hidden objects, or the like. 

“A device may store visual contexts for visualizations of particular non-visible features. The device may obtain a depth map that allows the device to determine the location of the non-visible feature in the physical environment and to overlay the visualization on a user’s view of that location. In this way, the non-visible feature can be visualized at its correct location, orientation, direction and/or strength in the physical environment.”
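As a rough, hypothetical sketch of the depth-map step described in the abstract, the snippet below back-projects a pixel where a non-visible feature was detected into a 3D point, using a per-pixel depth buffer and camera intrinsics. The names and structure are assumptions, not Apple’s implementation.

```swift
import simd

// Hypothetical container pairing a depth map with the camera intrinsics used to capture it.
struct DepthMap {
    let width: Int
    let height: Int
    let depths: [Float]              // row-major depth per pixel, in meters
    let intrinsics: simd_float3x3    // pinhole camera intrinsics (fx, fy, cx, cy)

    // Back-project a pixel coordinate into a 3D point in camera space so an
    // overlay can be anchored where the non-visible feature was detected.
    func unproject(x: Int, y: Int) -> simd_float3? {
        guard x >= 0, x < width, y >= 0, y < height else { return nil }
        let depth = depths[y * width + x]
        guard depth > 0 else { return nil }
        let fx = intrinsics[0][0], fy = intrinsics[1][1]
        let cx = intrinsics[2][0], cy = intrinsics[2][1]
        return simd_float3((Float(x) - cx) * depth / fx,
                           (Float(y) - cy) * depth / fy,
                           depth)
    }
}
```

Transforming that camera-space point by the device’s current camera pose would yield a world-space anchor for the overlay, which is the sense in which the abstract says the visualization appears at the feature’s correct location, orientation, and direction.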

Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.