The upcoming Vision Pro may be able to see the invisible — and ‘smell’ objects

Forget virtual reality, extended reality, and mixed reality. Apple has been granted a patent (US 11715301 B2) for “Visualization of Non-Visible Phenomena” that suggests the company wants the upcoming Apple Vision Pro to “see” the invisible and, perhaps, to “smell” as well.

The invisible in question: electrical currents, radio signals from Wi-Fi, and other “non-visible phenomena.” Even more interesting is Apple’s idea that the Vision Pro could be equipped with features that let it generate scents.

The Vision Pro is Apple’s US$3,499 (and up) “spatial computer.” It’s due in early 2024, though it will apparently only be available in limited quantities at first.

About the patent

In the patent, Apple notes that extended reality technology aims to bridge the gap between virtual environments and the physical environment by augmenting that environment with computer-generated content. As a result, content that isn’t actually part of the physical environment appears, to the user, to be part of it.

Implementations of the subject technology described in Apple’s patent provide an extended reality (XR) system that displays a virtual representation of non-visible features of a physical environment, so that a user of the XR system perceives those features at their actual locations in the physical environment.

FIG. 6 illustrates a flow chart of an example process for providing computer-generated visualizations of non-visible phenomena.

For example, a device may detect, and/or receive information regarding, one or more non-visible features within a direct or pass-through field of view of a physical environment, and display a visualization of those features at their correct locations in the physical environment.

When the device detects a non-visible feature of a physical environment, it may display a visualization of that feature overlaid on the view of the physical environment at the location that corresponds to it. The non-visible features may correspond to, for example, electromagnetic signals such as Wi-Fi signals, airflow from an HVAC system, temperatures of physical objects, fluids or gases, an audible fence created for a pet (e.g., using ultrasonic pitches), sounds generated by a musical instrument, and/or hidden physical objects such as objects with known locations that are obscured from view by other physical objects.
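To make that concrete, here’s a minimal sketch of what such an overlay could look like on Apple’s platforms. The NonVisibleFeature type is hypothetical (the patent describes no public API), and only the RealityKit anchoring calls are real; treat it as an illustration under those assumptions, not as Apple’s implementation.

import RealityKit

// Hypothetical detected feature: a world-space position plus a signal strength.
// This stands in for the patent's "detected non-visible feature"; no such
// public API exists.
struct NonVisibleFeature {
    let worldPosition: SIMD3<Float>
    let strength: Float // e.g., normalized Wi-Fi signal strength, 0...1
}

// Overlay a simple sphere at each detected feature's real-world location.
func visualize(_ features: [NonVisibleFeature], in scene: RealityKit.Scene) {
    for feature in features {
        // Anchor the visualization at the feature's position in the physical environment.
        let anchor = AnchorEntity(world: feature.worldPosition)

        // Scale the marker with signal strength so stronger signals look bigger.
        let radius = 0.02 + 0.08 * feature.strength
        let sphere = ModelEntity(
            mesh: .generateSphere(radius: radius),
            materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
        )
        anchor.addChild(sphere)
        scene.addAnchor(anchor)
    }
}

The point is simply that once a feature’s world position is known, placing a marker there is ordinary anchored rendering; the patent’s novelty is in detecting and locating the invisible signal, not in drawing it.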

What’s really interesting (to me, anyway) is the patent’s statement that non-visible features can be added to a physical environment to provide scent experiences for Vision Pro users. For example, a device such as the Vision Pro could be fitted with an artificial scent device: hardware configured to release one or a combination of gases, vapors, or particulates that mimic one or more predefined scents.

The non-visible features added to the physical environment to trigger a scent experience may be detected by the Vision Pro, causing the artificial scent device to generate a corresponding scent. For example, a tea shop that links fruits and/or seasons to a scent might generate (e.g., using non-visible light and/or ultrasonic signals) non-visible depictions of fruit that can be visualized by a Vision Pro user passing by or inside the shop.

The detection of the non-visible depictions of fruit can also trigger the artificial scent device to generate a scent corresponding to the depicted fruit, in one or more implementations. That raises the question of how many folks will wear a Vision Pro when they’re out and about in public. But that’s a topic for another article.
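Setting the people-watching aside, the detect-then-emit pairing the patent describes reduces to a simple mapping. The sketch below is entirely hypothetical: Apple ships no scent hardware or scent API, so ScentEmitter and the fruit-to-scent table are invented purely to illustrate the flow.

// Hypothetical interface to the patent's "artificial scent device," which
// releases gases, vapors, or particulates mimicking predefined scents.
protocol ScentEmitter {
    func emit(scent: String, durationSeconds: Double)
}

// Hypothetical marker decoded from a non-visible depiction (e.g., the tea
// shop's ultrasonic or non-visible-light signal encoding a fruit).
enum DetectedDepiction: String {
    case strawberry, peach, yuzu
}

// Mirror the patent's flow: detection of a depiction triggers the emitter.
func handle(_ depiction: DetectedDepiction, with emitter: ScentEmitter) {
    // In this toy mapping, the enum's raw value doubles as the scent name.
    emitter.emit(scent: depiction.rawValue, durationSeconds: 5.0)
}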

Summary of the patent

Here’s Apple’s abstract of the patent: “Implementations of the subject technology provide visualizations of non-visible features of a physical environment, at the location of the non-visible features in the physical environment. The non-visible features may include wireless communications signals, sounds, airflow, gases, subsonic and/or ultrasonic waves, hidden objects, or the like. A device may store visual contexts for visualizations of particular non-visible features. 

“The device may obtain a depth map that allows the device to determine the location of the non-visible feature in the physical environment and to overlay the visualization on a user’s view of that location. In this way, the non-visible feature can be visualized at its correct location, orientation, direction and/or strength in the physical environment.”
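The depth-map step is the one piece of the abstract that maps onto standard math: given the camera intrinsics, a pixel plus its depth value back-projects to the 3D point where the visualization should be anchored. A sketch of that pinhole-model calculation follows; the intrinsics values are placeholders, not Vision Pro specifications.

import simd

// Back-project a pixel (u, v) with depth d (meters) into camera-space 3D
// coordinates using the standard pinhole model:
//   X = (u - cx) * d / fx,  Y = (v - cy) * d / fy,  Z = d
// fx, fy, cx, cy come from the camera intrinsics; the defaults below are placeholders.
func unproject(u: Float, v: Float, depth: Float,
               fx: Float = 1000, fy: Float = 1000,
               cx: Float = 960, cy: Float = 540) -> SIMD3<Float> {
    let x = (u - cx) * depth / fx
    let y = (v - cy) * depth / fy
    return SIMD3<Float>(x, y, depth)
}

// Example: a Wi-Fi source detected at pixel (1200, 400), 2.5 meters away.
let featurePosition = unproject(u: 1200, v: 400, depth: 2.5)

Once that 3D point is known, anchoring the overlay there is the same step sketched earlier.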

Dennis Sellers

Dennis Sellers is the editor/publisher of Apple World Today. He’s been an “Apple journalist” since 1995 (starting with the first big Apple news site, MacCentral). He loves to read, run, play sports, and watch movies.
