Apple wants future devices to offer a variety of input features that work in tandem. The company has been granted a patent for “Multitouch Data Fusion.”
About the patent filing
The patent filing relates to systems that use multi-touch sensitive input devices alongside other input devices and, more particularly, to combining multi-touch input data with data from those other devices to increase the efficiency and performance of input operations.
In the patent filing, Apple notes that computing systems may have multiple input means. However, each input means typically operates independently of the others, in a non-seamless way: there is no synergy between them, and they don’t cooperate toward a common goal such as improving the input experience. Apple wants to change this.
The company’s idea is that, while the fingertip chording and movement data generated by multi-touch input devices provide a strong set of user controls, additional information from other sensing modalities, when combined or fused with that chording and movement data, can “significantly enhance the interpretative abilities of the electronic device and/or significantly improve the ease of use as well as streamline input operations for the user.”
Apple says there are several independent sensing modalities that, when fused with multi-touch and movement data, can provide enhanced performance and use of electronic devices. The sources of independent sensing data fall into three categories: (1) those that measure some aspect of the user’s body state, (2) those that measure data from the environment, which could include sensing data from other individuals, and (3) those that measure some aspect of the state of the electronic device.
For example, the results of voice recognition and speech understanding can be fused with multi-touch movement data in ways that significantly enhance device performance. Likewise, the contact size and contact separation in the touch data, along with finger identification data (such as from a camera), can let the multi-touch system make educated guesses about which finger produced each touch.
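To make that idea concrete, here’s a rough Swift sketch of how contact geometry and a camera-based classifier might be fused to label fingers. Every type, prior, and threshold below is my own illustration and isn’t drawn from Apple’s patent.

```swift
// Illustrative sketch only: fuse touch-contact geometry with a
// camera-based finger classifier. Types and priors are hypothetical.

enum Finger: CaseIterable { case thumb, index, middle, ring, little }

struct TouchContact {
    let contactSize: Double   // larger contact areas tend to be thumbs
    let separation: Double    // distance to the nearest other contact
}

struct CameraEstimate {
    let finger: Finger
    let confidence: Double    // 0...1 from a vision-based classifier
}

/// Guess which finger produced a contact by weighting a size-based prior
/// with the camera's per-finger confidence.
func identifyFinger(for contact: TouchContact,
                    cameraEstimates: [CameraEstimate]) -> Finger {
    func sizePrior(_ finger: Finger) -> Double {
        // Crude prior: large contact areas favor the thumb.
        let isLarge = contact.contactSize > 12
        if finger == .thumb { return isLarge ? 0.6 : 0.2 }
        return isLarge ? 0.1 : 0.2
    }

    // Pick the finger with the highest prior-weighted camera confidence,
    // treating fingers the camera didn't see as weak, uniform candidates.
    let best = Finger.allCases.max { a, b in
        let camA = cameraEstimates.first { $0.finger == a }?.confidence ?? 0.2
        let camB = cameraEstimates.first { $0.finger == b }?.confidence ?? 0.2
        return sizePrior(a) * camA < sizePrior(b) * camB
    }
    return best ?? .index
}
```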
Gaze vector data (the determination of where a user is looking) can be fused with touch data and/or objects appearing on a display to perform operations such as object movement or selection. Likewise, fusing device dynamics data (e.g., movement data) with multi-touch movement data can smooth out, through improved filtering, unintended finger motion caused by the conditions of travel (e.g., vibrations and jolts).
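As an entirely hypothetical illustration of that second point, here’s a short Swift sketch in which an accelerometer-style vibration reading controls how aggressively a touch trajectory is smoothed. The filter and its constants are my own; the patent doesn’t specify an algorithm.

```swift
// Hypothetical sketch of fusing device-motion data with touch movement:
// the bumpier the environment, the more the touch trajectory is filtered.

struct TouchSample { var x: Double; var y: Double }

final class MotionAwareTouchFilter {
    private var filtered: TouchSample?

    /// `vibrationMagnitude` would come from something like the device's
    /// measured user acceleration; higher values mean a bumpier ride.
    func smooth(_ raw: TouchSample, vibrationMagnitude: Double) -> TouchSample {
        // Calm environment -> follow the finger closely (alpha near 1);
        // heavy vibration -> trust the filtered history more (alpha small).
        let alpha = max(0.1, min(1.0, 1.0 - vibrationMagnitude))

        guard let previous = filtered else {
            filtered = raw
            return raw
        }
        let next = TouchSample(
            x: previous.x + alpha * (raw.x - previous.x),
            y: previous.y + alpha * (raw.y - previous.y)
        )
        filtered = next
        return next
    }
}
```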
Apple notes that biometric inputs include, but aren’t limited to, hand size, fingerprint input, body temperature, heart rate, skin impedance, and pupil size. Typical applications that might benefit from fusing biometric data with multi-touch movement data include games, security, and fitness-related activities. Apple adds that facial expressions conveying emotional state can also be fused advantageously with multi-touch movement data during creative activities such as music composition.
Summary of the patent filing
Here’s Apple’s abstract of the patent filing: “A method for performing multi-touch (MT) data fusion is disclosed in which multiple touch inputs occurring at about the same time are received to generate first touch data. Secondary sense data can then be combined with the first touch data to perform operations on an electronic device.
“The first touch data and the secondary sense data can be time-aligned and interpreted in a time-coherent manner. The first touch data can be refined in accordance with the secondary sense data, or alternatively, the secondary sense data can be interpreted in accordance with the first touch data. Additionally, the first touch data and the secondary sense data can be combined to create a new command.”
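Reading between the lines of the abstract, a minimal fusion step might look something like the Swift sketch below: pair a gesture with a recognized spoken word only when their timestamps fall within a small window, then map that pair to a single command. The half-second window and the command table are assumptions made for illustration, not details from the patent.

```swift
import Foundation

// Hypothetical sketch of time-aligning first touch data with secondary
// sense data (here, a recognized spoken word) and combining them into one
// command. The window and command mapping are invented for illustration.

struct TouchGesture { let name: String; let timestamp: TimeInterval }
struct SpokenWord   { let word: String; let timestamp: TimeInterval }

enum Command { case deleteSelection, moveObject, none }

/// Fuse a gesture with a spoken word if they occurred close enough in time.
func fuse(_ gesture: TouchGesture,
          with speech: SpokenWord,
          window: TimeInterval = 0.5) -> Command {
    guard abs(gesture.timestamp - speech.timestamp) <= window else { return .none }

    switch (gesture.name, speech.word) {
    case ("tap", "delete"): return .deleteSelection
    case ("drag", "move"):  return .moveObject
    default:                return .none
    }
}
```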