
Apple’s rumored extended reality headset may allow users to view invisible things

FIG. 1 illustrates an example system architecture including various electronic devices that may implement visualization of non-visible phenomena.

Apple has filed a patent application (number 20220292821) for “visualization of non-visible phenomena.” It would allow the rumored “Apple Glasses,” an extended reality headset, to show users things that are invisible to the naked eye.

About the patent filing

Such a feature could serve, among other things, as a safety tool. So what is extended reality? Extended reality technology aims to bridge the gap between virtual environments and the physical environment by augmenting the physical environment with computer-generated content. As a result, content that isn’t actually part of the physical environment appears, to the user, to be part of it.

According to the patent filing, the non-visible features may correspond to, for example, electromagnetic signals such as Wi-Fi signals, airflow from an HVAC system, temperatures of physical objects, fluids or gases, an audible fence created for a pet (e.g., using ultrasonic pitches), sounds generated by a musical instrument, and/or hidden physical objects, such as objects with known locations that are obscured from view by other physical objects.
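To make that list concrete, here is a minimal Swift sketch of how those categories might be modeled in software. Everything below (the type names, cases, and rendering hints) is hypothetical illustration, not anything taken from Apple’s filing.

```swift
// Hypothetical model of the phenomenon categories named in the filing.
// All names here are invented for illustration.
enum NonVisiblePhenomenon {
    case wirelessSignal(frequencyGHz: Double)     // e.g. Wi-Fi
    case airflow(velocityMetersPerSecond: Double) // e.g. HVAC output
    case surfaceTemperature(celsius: Double)
    case gasConcentration(partsPerMillion: Double)
    case ultrasonicFence(pitchHz: Double)         // e.g. a pet fence
    case instrumentSound(fundamentalHz: Double)
    case hiddenObject(label: String)              // known but obscured

// One plausible mapping from a phenomenon to how it might be drawn.
func visualizationHint(for phenomenon: NonVisiblePhenomenon) -> String {
    switch phenomenon {
    case .wirelessSignal:     return "animated concentric rings"
    case .airflow:            return "particle streamlines"
    case .surfaceTemperature: return "false-color heat overlay"
    case .gasConcentration:   return "tinted volumetric cloud"
    case .ultrasonicFence:    return "glowing boundary wall"
    case .instrumentSound:    return "expanding wavefronts"
    case .hiddenObject:       return "ghosted outline"
    }
}
```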

If the HMD, smartglasses, or other eyewear detects a non-visible phenomenon via specialized sensors, the device can then render it so the user sees, for example, the boundary of an ultrasonic pet fence or the spread of a dangerous gas within their field of view.
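Continuing the hypothetical sketch above, that detect-then-visualize flow might reduce to filtering sensor readings and turning the strong ones into overlays. The SensorReading and Overlay types, and the intensity threshold, are invented stand-ins, not anything from the filing.

```swift
// Readings from (hypothetical) specialized sensors are filtered by
// strength and turned into overlays for the user's view.
struct SensorReading {
    let phenomenon: NonVisiblePhenomenon
    let position: SIMD3<Float> // where in the room, in meters
    let intensity: Double      // normalized 0...1
}

struct Overlay {
    let position: SIMD3<Float>
    let hint: String
    let opacity: Double
}

// Only readings above a threshold are worth drawing; opacity tracks strength.
func overlays(from readings: [SensorReading],
              threshold: Double = 0.2) -> [Overlay] {
    readings
        .filter { $0.intensity >= threshold }
        .map { Overlay(position: $0.position,
                       hint: visualizationHint(for: $0.phenomenon),
                       opacity: $0.intensity) }
}
```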

Summary of the patent filing

Here’s Apple’s (technical) abstract of the patent filing: “Implementations of the subject technology provide visualizations of non-visible features of a physical environment, at the location of the non-visible features in the physical environment. The non-visible features may include wireless communications signals, sounds, airflow, gases, subsonic and/or ultrasonic waves, hidden objects, or the like. 

“A device may store visual contexts for visualizations of particular non-visible features. The device may obtain a depth map that allows the device to determine the location of the non-visible feature in the physical environment and to overlay the visualization on a user’s view of that location. In this way, the non-visible feature can be visualized at its correct location, orientation, direction and/or strength in the physical environment.”
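The depth-map step is the interesting part: a per-pixel depth map lets the device turn a 2D position in the camera image into a 3D point where an overlay can be anchored. The sketch below uses standard pinhole-camera unprojection; the DepthMap and Intrinsics types are hypothetical stand-ins, not Apple API.

```swift
// Hypothetical depth-map placement: convert a pixel (x, y) plus its depth
// into a 3D point in the camera frame, using standard pinhole unprojection.
struct DepthMap {
    let width: Int
    let height: Int
    let depths: [Float] // meters, row-major
    func depth(atX x: Int, y: Int) -> Float { depths[y * width + x] }
}

struct Intrinsics {          // simplified pinhole camera model
    let fx: Float, fy: Float // focal lengths, in pixels
    let cx: Float, cy: Float // principal point, in pixels
}

// The returned point is where the overlay for the non-visible feature
// would be anchored in the user's view.
func unproject(x: Int, y: Int,
               depthMap: DepthMap,
               intrinsics k: Intrinsics) -> SIMD3<Float> {
    let z = depthMap.depth(atX: x, y: y)
    let px = (Float(x) - k.cx) / k.fx * z
    let py = (Float(y) - k.cy) / k.fy * z
    return SIMD3(px, py, z)
}
```

On real hardware the depth map would presumably come from LiDAR or stereo cameras, but the placement math is the same idea either way.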

About Apple Glasses

When it comes to Apple Glasses, the rumors are abundant. Such a device will arrive in mid-to-late 2023. Or maybe 2024. It will be a head-mounted display. Or may have a design like “normal” glasses. Or it may eventually be available in both. The Apple Glasses may or may not have to be tethered to an iPhone to work. Other rumors say that Apple Glasses could have a custom-built Apple chip and a dedicated operating system dubbed “rOS” for “reality operating system.”

Dennis Sellers
Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.