Apple has applied for a patent (number 20210118238) for “registration between actual mobile device position and environmental model.” It involves the rumored “Apple Glasses,” a virtual reality/augmented reality/mixed reality head-mounted display (HMD).
This patent involves the use of an iPhone with such a device. In other words, the HMD would use the Apple smartphone as its engine.
The iPhone may be mounted on the side of the Apple Glasses opposite the side that holds the display. Enhanced reality applications would capture what’s currently within the camera’s view in real time and present it on the HMD display.
This means the iPhone user can hold up the smartphone in front of their field of vision and use the device’s graphical display as a sort of camera viewfinder. The display shows the user everything the camera can see. Of course, enhanced reality applications go further than simply presenting a view of the real world to the device’s user. They seek to register a projected model of the world with live video of the world.
Enhanced reality applications are frequently capable of overlaying a real-world view with information about the people and objects within that view. For example, an enhanced reality application might overlay a particular building within a view with the name of the corporation whose offices the building houses. As another example, it might overlay a particular street within a view with its name.
In order to overlay information accurately over a particular object within the real-world view, the enhanced reality application must have some way of determining which of the many pixels within the view correspond to that object. The real-world view might contain pixels corresponding to many different objects in many different locations.
If the overlaid information is placed in the wrong spot within the view, the user of the enhanced reality application may become confused or misinformed. Apple says that, for these reasons, enhanced reality applications should refer to a virtual model of the real world. Apparently, the idea is for apps on the iPhone to do the heavy lifting when it comes to calibrating the position of a three-dimensional model with the real-world environment represented by that model.
Here’s the summary of the patent data: “A user interface enables a user to calibrate the position of a three dimensional model with a real-world environment represented by that model. Using a device’s sensor, the device’s location and orientation is determined. A video image of the device’s environment is displayed on the device’s display. The device overlays a representation of an object from a virtual reality model on the video image. The position of the overlaid representation is determined based on the device’s location and orientation. In response to user input, the device adjusts a position of the overlaid representation relative to the video image.”
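The patent filing describes the idea at a high level rather than in code, but the workflow it summarizes — project a model object into the camera view using the device’s estimated pose, then let the user nudge the overlay to correct any misalignment — can be sketched in a few lines. The sketch below is a toy illustration, not Apple’s implementation: it uses a simplified pinhole camera with only a yaw rotation, and all function and parameter names are made up for the example.

```python
import math

def project_point(world_pt, cam_pos, cam_yaw, focal_px, screen_center):
    """Project a 3-D world point (x, y, z) into screen pixels for a
    camera at cam_pos with heading cam_yaw (radians), pinhole model."""
    dx = world_pt[0] - cam_pos[0]
    dy = world_pt[1] - cam_pos[1]
    dz = world_pt[2] - cam_pos[2]
    # Rotate the offset into the camera frame (yaw about the vertical axis).
    cx = math.cos(cam_yaw) * dx - math.sin(cam_yaw) * dz
    cz = math.sin(cam_yaw) * dx + math.cos(cam_yaw) * dz
    cy = dy
    if cz <= 0:
        return None  # point is behind the camera; nothing to overlay
    # Perspective divide, then shift to pixel coordinates.
    u = screen_center[0] + focal_px * cx / cz
    v = screen_center[1] - focal_px * cy / cz
    return (u, v)

def apply_user_calibration(pixel, offset):
    """Shift the overlay by the user's manual adjustment, in pixels —
    the 'adjusts a position of the overlaid representation' step."""
    return (pixel[0] + offset[0], pixel[1] + offset[1])

# A model point 10 m straight ahead lands at the screen center...
pix = project_point((0.0, 0.0, 10.0), (0.0, 0.0, 0.0), 0.0, 800.0, (640.0, 360.0))
# ...and the user can drag it a few pixels to line it up with the video.
calibrated = apply_user_calibration(pix, (12.0, -4.0))
```

In a real system the sensor pose would come from the device’s GPS, compass, and motion sensors, and the user’s correction would be fed back to refine the registration rather than stored as a raw pixel offset.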
When it comes to Apple Glasses, the device will arrive this year or in 2022, depending on which rumor you believe. The Sellers Research Group (that’s me) thinks Apple will at least preview it before the end of the year.
It will be a head-mounted display. Or it may have a design like “normal” glasses. Or it may eventually be available in both form factors. The Apple Glasses may or may not have to be tethered to an iPhone to work. Other rumors say that Apple Glasses could have a custom-built Apple chip and a dedicated operating system dubbed “rOS” for “reality operating system.”