WO2005018129A2 - Improved gesture recognition for pointing devices - Google Patents

Improved gesture recognition for pointing devices

Info

Publication number
WO2005018129A2
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
pointing device
data
historical record
force
Prior art date
Application number
PCT/IB2004/051418
Other languages
French (fr)
Other versions
WO2005018129A3 (en)
Inventor
Victor Marten
Aakar Patel
Original Assignee
Semtech Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Semtech Corporation filed Critical Semtech Corporation
Publication of WO2005018129A2 publication Critical patent/WO2005018129A2/en
Publication of WO2005018129A3 publication Critical patent/WO2005018129A3/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text


Abstract

A method and algorithm for determination of gestures on pointing devices is disclosed. The invention provides for means to make the probability of erroneous operations arbitrarily small and variable, suitable to the present task at hand. The invention is applicable to a variety of different types of pointing devices. Historical data are maintained regarding the performance of gestures by a human operator, and are analyzed to develop heuristic thresholds distinguishing whether or not the gesture occurred.

Description

IMPROVED GESTURE RECOGNITION FOR POINTING DEVICES
[1] This application claims priority from US appl. no. 60/481,240, filed August 15, 2003, which application is incorporated herein by reference for all purposes. Background of Invention
[2] Modern electronic devices often make use of a cursor seen on the display, or control some parameter or position of a mechanical actuator, by means of a pointing device. One or more buttons or switches are usually associated with a pointing device, controlling the required actions once the cursor or actuator is positioned in an appropriate location. However, for users it is highly advantageous not to employ such buttons or switches, and instead to effect the appropriate action utilizing the pointing device itself. This is desirable since the controlling digit or limb is already acting on the pointing device, and does not need to be moved to the button or switch (so that the operation can be conducted without the user actually looking at the pointing device), nor does a second digit or appendage need to come into play.
[3] Numerous prior-art apparatuses and patents describe means for achieving such functionality utilizing a variety of pointing devices. However, each method is typically applicable only to a specific pointing device type, and is usually sensitive to a multiplicity of "noise" sources, be it internal electrical noise, or aberrations in the functioning of the human digits or limbs. Summary of Invention
[4] The objective of the current invention is to provide a robust method and an algorithm for determination of various gestures carried out on the pointing device, simultaneously applicable to a multiplicity of the pointing device types.
[5] For example, gesture determination on a touch pad requires timely and reliable information that the touch event is taking place and the duration of such touch event.
[6] According to the current invention, the desired resistance to noise sources (e.g. probability of "false positive" activations and probability of "false negative" behavior) can be readily adjusted and can be made variable in time, if different degrees of noise resistance are required at different moments in time.
[7] The determination of a gesture is made using past history of the data, with the amount of data (length of observation) suitable to the desired resistance to noise. Variable threshold levels are utilized when the higher/lower type of comparisons are made. The values of the thresholds are determined and constantly adjusted in time based on the amount of noise (electrical, mechanical or human behavior), expected in the future and observed in the past (for an appropriate period of time and including the present conditions).
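The constantly adjusted thresholds of paragraph [7] can be sketched as follows. This is an illustrative implementation, not taken from the patent text: it tracks the recent noise level with an exponentially weighted mean and variance (the forgetting factor standing in for the "appropriate period of time"), and places the threshold a fixed number of standard deviations above the mean, so the threshold rises and falls with the observed noise.

```python
# Hypothetical sketch: an adaptive threshold that follows the observed noise.
class AdaptiveThreshold:
    def __init__(self, k=3.0, alpha=0.05):
        self.k = k          # noise margin, in standard deviations
        self.alpha = alpha  # forgetting factor: larger = adapts faster
        self.mean = 0.0
        self.var = 1.0

    def update(self, sample):
        # Exponentially weighted estimates of mean and variance:
        # old data gradually loses influence, as the text requires.
        delta = sample - self.mean
        self.mean += self.alpha * delta
        self.var = (1 - self.alpha) * (self.var + self.alpha * delta * delta)
        return self.threshold()

    def threshold(self):
        return self.mean + self.k * self.var ** 0.5

t = AdaptiveThreshold()
for s in [0.1, -0.2, 0.05, 0.15, -0.1]:   # quiet signal -> threshold settles low
    quiet = t.update(s)
for s in [2.0, -1.8, 2.2, -2.1, 1.9]:     # noisy signal -> threshold rises
    noisy = t.update(s)
```

With a quiet signal the estimated variance decays and the threshold drops; when large fluctuations appear, the variance term grows and the threshold moves out of the noise.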
[8] The algorithm can be further characterized, using the language of mathematics, as utilizing the cross-correlations of the parts of the data itself, and correlations of the data to variable templates based on the previous history of the data, human motor abilities, and specific mechanical configuration of the particular pointing device.
[9] Data from all mutually independent sources (for example X-position, Y-position, X-force, Y-force, Z-force, duration of Z-force event, Z-proximity, actions on other controls and/or keys/keyboard of the host device) are accumulated and factor into the final decision in order to reduce the possibility of erroneous and/or undesirable operation.
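One minimal reading of paragraph [9] is a conjunction over independently thresholded channels. The function below is a hypothetical sketch: the channel names follow the text, but the threshold values and the use of XY speed as a veto are assumptions made for illustration.

```python
# Illustrative fusion of mutually independent inputs into one decision:
# every channel must individually look like a touch before the logical
# "Processed Touch Event" is asserted.  Threshold values are invented.
def processed_touch_event(z_force, z_proximity, xy_speed,
                          z_force_min=0.5, z_prox_min=0.7, xy_speed_max=2.0):
    return (z_force > z_force_min and      # enough pressure
            z_proximity > z_prox_min and   # finger actually present
            xy_speed < xy_speed_max)       # fast motion suggests a swipe, not a tap
```

Requiring agreement across independent channels is what drives down the probability of an erroneous or undesirable operation: a noise spike on one channel alone cannot trigger the event.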
[10] The invention provides for means to prevent erroneous motion of the cursor or actuator while the determination of the gesture is taking place.
[11] The burden of logical and arithmetical calculations can be arbitrarily divided between the local controller of the pointing device and the driver processing the data from the pointing device, which is running on the host apparatus (if such exists).
[12] An arbitrarily small subset of the method and algorithm of the invention can be utilized, suitable to the operating conditions, specific mechanical configuration and desired functionality of a particular pointing device. In some such cases, operations may be equivalent to those previously described in prior art and/or patents.
[13] Brief Description of Drawings
[14] Figure 1 shows a representation of information flow in the pointing device.
[15] Figure 2 is a depiction of Probability Density Function ("PDF" or "Bell Curve") and Determination of Threshold Value.
[16] Figure 3 illustrates Typical Gesture on Touch Pad Hold.
[17] Figure 4 illustrates Typical Gesture on Touch Pad Click.
[18] Figure 5 illustrates Typical Gesture on Touch Pad Double Click.
[19] Figure 6 illustrates Typical Gesture on Touch Pad Click and Hold. Detailed Description
[20] Any variable that requires a threshold to detect activation or de-activation (such as Z-force or Z-proximity for detection of touch, for example) is processed and a histogram (a representation of Probability Density Function or PDF) is accumulated. The data in the histogram includes past history and the current conditions. As the time progresses, some old data is removed from the histogram, and new data is added. These operations are carried out using an optimized algorithm that does not require recalculation of the complete array of data.
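One common way to realize the "optimized algorithm" of paragraph [20], sketched here under assumptions not spelled out in the text, is a fixed-length ring buffer of samples whose bin counts are maintained incrementally: each new sample increments its bin and the sample falling out of the window decrements its bin, so the histogram never has to be rebuilt from the complete array of data.

```python
from collections import deque

# Sketch of a sliding-window histogram with incremental updates.
# The window length and bin edges are illustrative assumptions.
class SlidingHistogram:
    def __init__(self, window, num_bins, lo, hi):
        self.window = deque(maxlen=window)  # ring buffer of raw samples
        self.bins = [0] * num_bins
        self.lo, self.hi, self.num_bins = lo, hi, num_bins

    def _bin(self, x):
        i = int((x - self.lo) / (self.hi - self.lo) * self.num_bins)
        return min(max(i, 0), self.num_bins - 1)  # clamp out-of-range values

    def add(self, x):
        if len(self.window) == self.window.maxlen:
            oldest = self.window[0]          # about to be evicted by append
            self.bins[self._bin(oldest)] -= 1
        self.window.append(x)
        self.bins[self._bin(x)] += 1

h = SlidingHistogram(window=4, num_bins=10, lo=0.0, hi=1.0)
for x in [0.05, 0.15, 0.25, 0.35, 0.95]:     # fifth sample evicts the first
    h.add(x)
```

Each `add` is O(1), so the histogram tracks "past history and the current conditions" without recalculation.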
[21] Looking at Figure 2, once the median value 21 (V0) is determined, several values 22 around V0 are used to determine the shape of the Probability Distribution (with the Normal or Gaussian Distribution being the worst case). A threshold value 23 (VT) is then determined based on the acceptable error probability as described below.
[22] The total area under the PDF curve 20 includes all possible values of the variable and thus represents a probability of 1 (100%). A portion of the area 24 represents the probability of erroneous activation for a given threshold 23 (VT) for an arbitrarily selected case when the activation event requires the variable to exceed the threshold. The same considerations apply when the desired activation will result from the variable being smaller than the threshold, except the threshold value will probably be located to the left of the median value V0.
[23] Using Gaussian Distribution as the worst possible case, a value for threshold 23 can be calculated or selected from a table of values such that the probability of error is always smaller than or equal to the error represented by area 24.
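Paragraphs [21] through [23] amount to a standard one-sided threshold rule, which can be illustrated as follows. The sketch is an assumption-laden reading, not the patent's own implementation: the spread is estimated with the sample standard deviation, and under the Gaussian worst case the threshold is placed z standard deviations above the median, with z chosen so the tail probability does not exceed the acceptable error.

```python
import statistics

# Illustrative threshold selection under the Gaussian worst case.
def activation_threshold(samples, p_error):
    median = statistics.median(samples)
    sigma = statistics.stdev(samples)           # spread estimate (assumed choice)
    # z such that P(X > median + z*sigma) <= p_error for a Gaussian
    z = statistics.NormalDist().inv_cdf(1.0 - p_error)
    return median + z * sigma

# Hypothetical idle Z-force readings (noise only, no touch):
idle = [0.0, 0.1, -0.1, 0.05, -0.05, 0.12, -0.08, 0.02]
v_t = activation_threshold(idle, p_error=0.001)
```

Demanding a smaller error probability pushes the threshold further from the median, trading sensitivity for resistance to false activations, exactly the adjustable trade-off described in the summary.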
[24] When appropriate, the determination of the PDF may also be done during a training session when the user sets various parameters according to personal abilities, and the system learns the user's behavior.
[25] For example, looking at Figure 3 through Figure 6, a typical gesture on a touch pad may require setting of value thresholds for parameters Z-force and Z-proximity (which are used for creation of logical variable "Processed Touch Event") and time thresholds for time intervals designated 40, 41, 50, 51, 52, 60 and 61. The time thresholds may at first be set to default values, then they may be further refined during a training session, and then they may be even further refined during the actual operations of a pointing device. The algorithm for refinement is the same as described above (building of a histogram, checking the shape of the distribution, and calculating a threshold according to the required probability of error).
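The refinement loop of paragraph [25] can be illustrated for a single time threshold. The sketch below is hypothetical throughout: the default value, the "tap" gesture name, and the median-plus-margin rule are assumptions, standing in for the histogram-based procedure the text describes.

```python
import statistics

DEFAULT_TAP_LIMIT = 0.30  # seconds; assumed factory default

# Refine a tap-duration threshold from durations observed in a training
# session (or during actual operation), falling back to the default when
# there is not enough history to estimate a distribution.
def refine_tap_limit(observed_durations, k=3.0):
    if len(observed_durations) < 2:
        return DEFAULT_TAP_LIMIT
    m = statistics.median(observed_durations)
    s = statistics.stdev(observed_durations)
    return m + k * s   # margin scaled to this user's variability

training = [0.11, 0.09, 0.14, 0.10, 0.12, 0.13]  # this user's taps are quick
limit = refine_tap_limit(training)
```

For a quick-tapping user the refined limit falls well below the default, so slow accidental presses are less likely to register as taps, while all of the user's own training taps still fit under the threshold.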
[26] It will be appreciated that the benefits of the invention offer themselves with any of a multitude of types of pointing devices, among them force sticks, movable joy-sticks, FSR (force sensor resistance), and touch pads. Touch pad technologies with which the invention can offer its benefits include capacitive, resistive, infrared, or any other technology.
[27] Those skilled in the art will have no difficulty whatsoever devising myriad obvious variants and improvements of the invention, all of which are intended to be encompassed within the claims which follow.

Claims

[1] A method for use in recognizing a gesture made by a human at a human-input pointing device, the method comprising the steps of: performing the gesture a plurality of times; for each performance of the gesture, collecting items of data from the pointing device regarding the gesture; maintaining a historical record of the items of data; selecting one or more thresholds relating to the items of data from the historical record; and detecting the gesture by employing the one or more thresholds with respect to values detected at the pointing device.
[2] The method of claim 1 further comprising the step of deleting old data from the historical record.
[3] The method of claim 1 wherein the historical record comprises accumulated data from mutually independent sources selected from the set consisting of X-position, Y-position, X-force, Y-force, Z-force, duration of Z-force event, and Z-proximity.
[4] The method of claim 1 wherein the historical record is a histogram.
[5] The method of claim 4 wherein the histogram is a probability distribution function.
[6] The method of claim 1 comprising extracting from the historical record a median value, analyzing several values around the median value to determine the shape of a probability distribution function, and selecting the threshold value based upon a predetermined acceptable error probability.
[7] The method of claim 1 wherein the gesture is "hold".
[8] The method of claim 1 wherein the gesture is "click".
[9] The method of claim 1 wherein the gesture is "double click".
[10] The method of claim 1 wherein the gesture is "click and hold".
[11] The method of claim 1 wherein the performance of the gesture a plurality of times comprises a training session in which the user's behavior is learned.
[12] The method of claim 1 wherein the performance of the gesture a plurality of times comprises performing the gesture during the actual operations of a pointing device, and wherein the one or more thresholds are refined thereby.
[13] The method of claim 1 wherein the human-input pointing device is a force stick, a movable joy-stick, an FSR (force sensor resistance) device, or a touch pad.
[14] The method of claim 13 wherein the human-input pointing device is a touch pad and wherein the touch pad is capacitive, resistive, or infrared.
[15] Apparatus for use in recognizing a gesture made by a human at a human-input pointing device, the apparatus comprising: means responsive to performing the gesture a plurality of times for collecting items of data from the pointing device regarding the gesture; means responsive to the collecting items of data for maintaining a historical record of the items of data; means selecting one or more thresholds relating to the items of data from the historical record; and means detecting the gesture by employing the one or more thresholds with respect to values detected at the pointing device.
[16] The apparatus of claim 15 wherein the means responsive to the collecting items of data for maintaining a historical record of the items of data deletes old data from the historical record.
[17] The apparatus of claim 15 wherein the historical record comprises accumulated data from mutually independent sources selected from the set consisting of X-position, Y-position, X-force, Y-force, Z-force, duration of Z-force event, and Z-proximity.
[18] The apparatus of claim 15 wherein the means selecting one or more thresholds relating to the items of data from the historical record extracts from the historical record a median value, analyzes several values around the median value to determine the shape of a probability distribution function, and selects the threshold value based upon a predetermined acceptable error probability.
[19] The apparatus of claim 15 wherein the gesture is "hold".
[20] The apparatus of claim 15 wherein the gesture is "click".
[21] The apparatus of claim 15 wherein the gesture is "double click".
[22] The apparatus of claim 15 wherein the gesture is "click and hold".
[23] The apparatus of claim 15 wherein the first, second, third, and fourth means comprise a local controller of the pointing device or a driver processing data from the pointing device, or a combination thereof.
[24] The apparatus of claim 15 wherein the first, second, third, and fourth means comprise a local controller of the pointing device.
[25] The apparatus of claim 15 wherein the pointing device is a touch pad.
[26] The apparatus of claim 25 wherein the touch pad is capacitive, resistive, or infrared.
[27] The apparatus of claim 15 wherein the pointing device is a force stick, a movable joy-stick, an FSR (force sensor resistance) device, or a touch pad.
PCT/IB2004/051418 2003-08-15 2004-08-06 Improved gesture recognition for pointing devices WO2005018129A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US48124003P 2003-08-15 2003-08-15
US60/481,240 2003-08-15

Publications (2)

Publication Number Publication Date
WO2005018129A2 true WO2005018129A2 (en) 2005-02-24
WO2005018129A3 WO2005018129A3 (en) 2006-03-02

Family

ID=34193038

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2004/051418 WO2005018129A2 (en) 2003-08-15 2004-08-06 Improved gesture recognition for pointing devices

Country Status (1)

Country Link
WO (1) WO2005018129A2 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6363160B1 (en) * 1999-01-22 2002-03-26 Intel Corporation Interface using pattern recognition and tracking
US6791536B2 (en) * 2000-11-10 2004-09-14 Microsoft Corporation Simulating gestures of a pointing device using a stylus and providing feedback thereto


Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013069350A (en) * 2005-09-15 2013-04-18 Apple Inc System and method for processing raw data of track pad device
US11029838B2 (en) 2006-09-06 2021-06-08 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US9952759B2 (en) 2006-09-06 2018-04-24 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US11762547B2 (en) 2006-09-06 2023-09-19 Apple Inc. Portable electronic device for instant messaging
US11169690B2 (en) 2006-09-06 2021-11-09 Apple Inc. Portable electronic device for instant messaging
US9335924B2 (en) 2006-09-06 2016-05-10 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
WO2008030976A2 (en) * 2006-09-06 2008-03-13 Apple Inc. Touch screen device, method, and graphical user interface for determining commands by applying heuristics
KR101462363B1 (en) * 2006-09-06 2014-11-17 애플 인크. Touch screen device, method, and graphical user interface for determining commands by applying heuristics
WO2008030976A3 (en) * 2006-09-06 2009-11-26 Apple Inc. Touch screen device, method, and graphical user interface for determining commands by applying heuristics
US9372620B2 (en) 2007-01-07 2016-06-21 Apple Inc. Portable multifunction device, method, and graphical user interface for interacting with user input elements in displayed content
US11467722B2 (en) 2007-01-07 2022-10-11 Apple Inc. Portable electronic device, method, and graphical user interface for displaying electronic documents and lists
US10409461B2 (en) 2007-01-07 2019-09-10 Apple Inc. Portable multifunction device, method, and graphical user interface for interacting with user input elements in displayed content
US10228824B2 (en) 2007-01-07 2019-03-12 Apple Inc. Portable multifunction device, method, and graphical user interface for interacting with user input elements in displayed content
EP2570901A4 (en) * 2010-05-31 2016-07-20 Zte Corp Method and mobile terminal for automatically recognizing gesture
US8949735B2 (en) 2012-11-02 2015-02-03 Google Inc. Determining scroll direction intent
US11842044B2 (en) 2019-06-01 2023-12-12 Apple Inc. Keyboard management user interfaces
US11194467B2 (en) 2019-06-01 2021-12-07 Apple Inc. Keyboard management user interfaces
US11620046B2 (en) 2019-06-01 2023-04-04 Apple Inc. Keyboard management user interfaces

Also Published As

Publication number Publication date
WO2005018129A3 (en) 2006-03-02

Similar Documents

Publication Publication Date Title
CA2318815C (en) Method and apparatus for integrating manual input
JP4628785B2 (en) Personal data entry device with limited hand functions
US9489086B1 (en) Finger hover detection for improved typing
WO2005018129A2 (en) Improved gesture recognition for pointing devices
CN105607784B (en) A kind of method of adjusting sensitivity of touch screen, regulating device and terminal
JP2004013736A (en) Operation display device
Arai et al. Camera as mouse and keyboard for handicap person with troubleshooting ability, recovery, and complete mouse events
CN106951151A (en) Key assignments generation method, device and terminal
KR102454604B1 (en) Method and apparatus for gesture commends of touch-sensitive braille display device
KR20140102486A (en) Keyboard input system and the method using eye tracking
CN111367459B (en) Text input method using pressure touch pad and intelligent electronic device
CN107390998A (en) The method to set up and system of button in a kind of dummy keyboard
CN115237271A (en) Touch sensor, touch pad, method for identifying unexpected touch and computer
CN109634417A (en) A kind of processing method and electronic equipment
CN111831111A (en) Electroencephalograph with artificial intelligence as control device
US20220138356A1 (en) Access regulation of peripheral devices
Lam et al. More Errors vs. Longer Commands: The Effects of Repetition and Reduced Expressiveness on Input Interpretation Error, Learning, and Effort
CN107122044A (en) Input equipment and input method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase