Nokia touch patent
Abstract
A method includes executing a gesture with a user-manipulated physical object in the vicinity of a device; generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to at least one object, such as an object displayed by the device.
Description
TECHNICAL FIELD
[0001]The teachings in accordance with the exemplary embodiments of this invention relate generally to user interfaces to electronic devices and, more specifically, relate to manually activated user input devices, methods, systems and computer program products.
BACKGROUND
[0002]Input devices employed in the converging multimedia electronics industry are becoming increasingly important. The human-computing terminal interface has long challenged systems designers, yet has not significantly evolved since the advent of the mouse several decades ago. This is a particularly challenging problem in the area of mobile and wireless devices, where the objectives of device miniaturization and usability directly conflict with one another. A natural and intelligent interaction between humans and computing terminals (CT) can be achieved if the simplest modalities, such as finger movement and/or user gestures, are used to provide basic input information to the CT (non-limiting examples of which can include multimedia terminals, communication terminals, display dominated systems (DDS) and devices, gaming devices and laptop computers).
[0003]Technology related to input devices has conventionally relied on a set of electro-mechanical switches (such as the classic keyboard). Such an approach requires a relatively large area for a set of switches (keyboard keys), which are usually dedicated to only one operation. A more advanced solution is offered by touch screen displays, where touch-sensitive switches are embedded into the display itself, such as in Active Matrix LCD with Integrated Optical Touch Screen (AMLCD) technology. In this approach the "single button" trend is evolving towards that of a "distributed sensor system" that may be embedded into the device and/or even directly into the display itself (AMLCD). The physical operation of such a sensor-based input device can be based on the mechanical movement of different materials, changes in electrical conductivity or capacitance, the influence of an electrostatic field, or optical properties (such as a finger shadow or reflection from the surface). Reference with regard to AMLCD technology may be made to the following documents: 56.3, W. den Boer et al., "Active Matrix LCD with Integrated Optical Touch Screen", SID 03 Digest (Baltimore, 2003), pgs. 1494-1497, and 59.3, A. Abileah et al., "Integrated Optical Touch Panel in a 14.1'' AMLCD", SID 04 Digest, v. 35, Issue 1, pgs. 1544-1547, both of which are incorporated by reference herein in their entireties.
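By way of a non-limiting illustration of the "distributed sensor system" concept, the sketch below assumes an in-display optical sensor grid delivering one frame of normalized per-pixel light readings, and locates a fingertip by the shadow it casts. The function and data shapes are hypothetical, not taken from the AMLCD references.

```python
import numpy as np

def detect_fingertip(frame: np.ndarray, threshold: float = 0.35):
    """Return the (row, col) centroid of the darkest region, or None.

    `frame` is a 2-D array of normalized light readings (0.0 = full
    shadow, 1.0 = ambient light) from an in-display optical sensor grid.
    A fingertip on or near the surface darkens a compact patch of pixels.
    """
    shadow = frame < threshold            # pixels darkened by a finger
    if not shadow.any():
        return None                       # nothing near the surface
    rows, cols = np.nonzero(shadow)
    return int(rows.mean()), int(cols.mean())

# Example: a synthetic 8x8 frame with a dark blob near the center.
frame = np.ones((8, 8))
frame[3:5, 3:5] = 0.1
print(detect_fingertip(frame))            # -> (3, 3)
```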
[0004]Reference may also be made to U.S. Pat. No. 7,009,663 B2 (Mar. 7, 2006), entitled "Integrated Optical Light Sensitive Active Matrix Liquid Crystal display", A. Abileah et al., and U.S. Pat. No. 7,053,967 B2 (May 30, 2006), entitled "Light Sensitive Display", A. Abileah et al. (both assigned to Planar Systems, Inc.), which are incorporated by reference herein in their entireties.
[0005]The current trend in the development of multimedia device equipment involves hardware miniaturization together with a demand to provide a large input capacity. If the input device can be miniaturized then more space can be allocated for the visualization component(s), particularly in display dominated concept (DDC) devices. The situation in gaming devices is even more challenging, since improvements in the input devices may provide new design freedom and additional game-related functionalities.
[0006]Examples of current user input devices include those based on touch-motion, as in certain music storage and playback devices, and certain personal digital assistant (PDA) and similar devices that are capable of recognizing handwritten letters and commands.
[0007]Also of interest may be certain structured light based systems, such as those described in U.S. Pat. No. 6,690,354 B2 (Feb. 10, 2004), entitled "Method for Enhancing Performance in a System Utilizing an Array of Sensors that Sense at Least Two Dimensions", Sze; U.S. Pat. No. 6,710,770 (Mar. 23, 2004), entitled "Quasi-Three-Dimensional Method and Apparatus to Detect and Localize Interaction of User-Object and Virtual Transfer Device", Tomasi et al.; and U.S. Pat. No. 7,050,177 B2 (May 23, 2006), entitled "Method and Apparatus for Approximating Depth of an Object's Placement Onto a Monitored Region with Applications to Virtual Interface Devices", Tomasi et al. (all assigned to Canesta, Inc.), which are incorporated by reference herein in their entireties.
SUMMARY OF THE EXEMPLARY EMBODIMENTS
[0008]The foregoing and other problems are overcome, and other advantages are realized, in accordance with the non-limiting and exemplary embodiments of this invention.
[0009]In accordance with one aspect thereof the exemplary embodiments of this invention provide a method that includes executing a gesture with a user-manipulated physical object in the vicinity of a device; generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to at least one object displayed by the device.
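As one concrete, non-normative reading of this aspect, assume the generated data has been reduced to a gesture position on the display; interpreting the data "as pertaining to at least one object displayed by the device" can then be pictured as a hit test against the displayed objects. All names below are illustrative.

```python
from dataclasses import dataclass

@dataclass
class DisplayedObject:
    name: str
    x: int       # bounding box on the display
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def interpret(gesture_position: tuple[int, int], objects: list[DisplayedObject]):
    """Relate gesture data to the displayed object it pertains to (hit test)."""
    px, py = gesture_position
    return next((o for o in objects if o.contains(px, py)), None)

icons = [DisplayedObject("mail", 0, 0, 32, 32), DisplayedObject("phone", 40, 0, 32, 32)]
print(interpret((50, 10), icons).name)   # -> "phone"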
[0010]In accordance with another aspect thereof the exemplary embodiments of this invention provide a computer program product embodied in a computer-readable medium, execution of the computer program product by at least one data processor resulting in operations that comprise, in response to a user executing a gesture with a user-manipulated physical object in the vicinity of a device, generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to information displayed to the user.
[0011]In accordance with a further aspect thereof the exemplary embodiments of this invention provide a device that comprises a unit to display information; an imaging system to generate data that is descriptive of the presence of a user-manipulated object when executing a gesture; and a data processor to interpret the data as pertaining to displayed information.
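One way to picture the three claimed units (display unit, imaging system, data processor) is as loosely coupled components, as in the hypothetical decomposition below; this is an assumed structure for illustration, not the patent's implementation.

```python
from typing import Protocol

class DisplayUnit(Protocol):
    def show(self, text: str) -> None: ...

class ImagingSystem(Protocol):
    # Presence data is assumed here to be a list of fingertip positions.
    def capture_presence_data(self) -> list[tuple[int, int]]: ...

class Device:
    """Wires the display, the imaging system, and the data processor together."""
    def __init__(self, display: DisplayUnit, imaging: ImagingSystem):
        self.display = display
        self.imaging = imaging

    def process(self) -> None:
        data = self.imaging.capture_presence_data()   # descriptive of the gesture
        if data:                                      # data processor interprets it
            self.display.show(f"gesture touched {len(data)} point(s)")

class ConsoleDisplay:
    def show(self, text: str) -> None:
        print(text)

class FakeImaging:
    def capture_presence_data(self) -> list[tuple[int, int]]:
        return [(12, 30)]

Device(ConsoleDisplay(), FakeImaging()).process()     # -> gesture touched 1 point(s)
```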
[0012]In accordance with a further aspect thereof the exemplary embodiments of this invention provide a method that includes, in response to a user employing at least one finger to form a gesture in the vicinity of a device, generating data that is descriptive of a presence of the at least one finger in forming the gesture; and interpreting the data as pertaining to at least one object that appears on a display screen.
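For instance, data descriptive of the presence of a finger forming a gesture might take the form of a time-ordered sequence of sampled positions. A minimal, assumed classifier distinguishing a tap from a directional swipe is sketched below; the names and thresholds are hypothetical.

```python
def classify_trace(points: list[tuple[int, int]], tap_radius: int = 5) -> str:
    """Classify a sampled fingertip trace as a tap or a swipe direction.

    Assumes display coordinates with y increasing downward.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if dx * dx + dy * dy <= tap_radius ** 2:
        return "tap"                               # finger barely moved
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

print(classify_trace([(10, 10), (11, 10), (10, 11)]))   # -> tap
print(classify_trace([(10, 10), (40, 12), (80, 14)]))   # -> swipe_right
```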
[0013]In accordance with a still further aspect thereof the exemplary embodiments of this invention provide an apparatus that includes a display to visualize information; a sensor arrangement that is responsive to the user executing a gesture with a user-manipulated physical object in the vicinity of a surface of the apparatus, the sensor arrangement having an output to provide data descriptive of the presence of the user-manipulated object when executing the gesture; and a unit having an input coupled to the output of the sensor arrangement and operating to interpret the data to identify the executed gesture, and to interpret the identified gesture as pertaining in some manner to visualized information.
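The final step of this aspect, interpreting the identified gesture as pertaining in some manner to the visualized information, could be as simple as a lookup from gesture identities to commands. The mapping below is purely illustrative.

```python
# Hypothetical mapping from identified gestures to UI commands.
COMMANDS = {
    "tap": "select",
    "swipe_left": "previous_page",
    "swipe_right": "next_page",
    "circle": "zoom_in",
}

def interpret_gesture(gesture: str) -> str:
    """Map an identified gesture onto a command acting on visualized information."""
    return COMMANDS.get(gesture, "ignore")   # unknown gestures are ignored

print(interpret_gesture("swipe_right"))      # -> next_page
```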
BRIEF DESCRIPTION OF THE DRAWINGS
[0014]The foregoing and other aspects of the teachings of this invention are made more evident in the following Detailed Description, when read in conjunction with the attached Drawing Figures, wherein:
[0015]FIG. 1A shows a device that incorporates a plurality of ultrasonic transducers (USTs) as user input devices;
[0016]FIG. 1B is a simplified block diagram of the device of FIG. 1A;
[0017]FIG. 2A shows a further exemplary embodiment of this invention where the USTs are incorporated into a device that embodies a mini-projector;
[0018]FIG. 2B is a simplified block diagram of the mini-projector device of FIG. 2A;
[0019]FIGS. 3A, 3B, collectively referred to as FIG. 3, FIGS. 4A-4D, collectively referred to as FIG. 4, FIGS. 5A, 5B, collectively referred to as FIG. 5, and FIG. 6 depict exemplary finger-based gestures that may be used to select various commands for execution in accordance with exemplary embodiments of this invention;
[0020]FIG. 7 shows the principles of the ultrasonic observation of finger distance;
[0021]FIGS. 8A-8D, collectively referred to as FIG. 8, show exemplary finger-based gestures that may be used to select various commands for execution in accordance with further exemplary embodiments of this invention;
[0022]FIG. 9 is a logic flow diagram depicting an exemplary finger detection process executed by the device shown in FIG. 10B, and that is suitable for capturing the finger-based gestures shown in FIGS. 8 and 10A;
[0023]FIG. 10A shows an example of the sensing of multiple points of simultaneous touch detected by the device of FIG. 10B;
[0024]FIG. 10B is a simplified block diagram of a device having a display capable of generating an image of one or more fingertips; and
[0025]FIG. 11 is a logic flow diagram that depicts a method in accordance with the exemplary embodiments of this invention.