US20160062543A1 - Touch control device and method - Google Patents

Touch control device and method Download PDF

Info

Publication number
US20160062543A1
Authority
US
United States
Prior art keywords
touchpad
area
operating object
coordinates
touch
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/936,376
Inventor
Wei-Kuo Mai
Shih Peng Huang
Chung-Jung Liou
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Elan Microelectronics Corp
Original Assignee
Elan Microelectronics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from TW101102526A (TWI451309B)
Application filed by Elan Microelectronics Corp filed Critical Elan Microelectronics Corp
Priority to US14/936,376
Assigned to ELAN MICROELECTRONICS CORPORATION reassignment ELAN MICROELECTRONICS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HUANG, SHIH PENG, LIOU, CHUNG-JUNG, MAI, WEI-KUO
Publication of US20160062543A1
Legal status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04808 Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Definitions

  • the present invention is related generally to a touch control device and, more particularly, to a touch control device and a control method thereof so that, when a touchpad is touched by an operating object, the corresponding coordinates of the operating object on a screen can be obtained by calculating with different sets of ratios depending on the touched position on the touchpad.
  • touch control operation is applicable not only to the small touch screens of the conventional touch-screen mobile devices such as mobile phones and satellite-based navigation devices, but also to operating systems that provide multi-touch functions, such as Microsoft's Windows 7 and Windows 8 and Apple Inc.'s iPhone OS.
  • the touch control operation environment has extended from portable devices to desktop devices, allowing users to perform various operations directly on large touch screens.
  • touch control devices other than touch screens (e.g., touchpads) have been devised for touch control operation.
  • touch control devices are typically designed only for controlling the cursor on a screen and are intended mainly as a substitute for the existing cursor controllers such as external mice or trackballs.
  • In contrast to touch screens, which can be used to give actuation instructions directly by a finger touch on the screens, the aforesaid touch control devices provide no such function when touched by a user's finger.
  • a touch control device capable of simulating the effect of a finger touch on a touch screen is desirable.
  • Still another object of the present invention is to provide a touch control device and a control method thereof so that different instructions can be executed according to the touch control action of an operating object on a touchpad.
  • the present invention provides a touch control device including an input element and a computation unit.
  • the input element has a touchpad and a control unit.
  • the touchpad includes a first area and a second area.
  • the control unit is connected to the touchpad and is configured for detecting the coordinates of an operating object on the touchpad.
  • the computation unit is connected to the control unit. If the touch by the operating object starts in the first area, the computation unit calculates the corresponding coordinates of the operating object on a screen according to the coordinates of the operating object on the touchpad and a first set of ratios. If the touch by the operating object starts in the second area, the computation unit calculates the corresponding coordinates of the operating object on the screen according to the coordinates of the operating object on the touchpad and a second set of ratios.
  • the present invention also provides a method for controlling a touch control device, and the method is carried out as follows. To begin with, a first area and a second area are defined on a touchpad. Then, it is determined whether a touch by an operating object starts in the first area or the second area. If the touch by the operating object starts in the first area, the corresponding coordinates of the operating object on a screen are calculated according to the coordinates of the operating object on the touchpad and a first set of ratios. If the touch by the operating object starts in the second area, the corresponding coordinates of the operating object on the screen are calculated according to the coordinates of the operating object on the touchpad and a second set of ratios.
  • the present invention also provides a touch control device including an input element and a computation unit.
  • the input element has a touchpad and a control unit.
  • the touchpad includes a first area and a second area, wherein the first area is a peripheral area of the touchpad.
  • the control unit is connected to the touchpad and is configured for detecting the movement of an operating object on the touchpad.
  • the computation unit is connected to the control unit. If the computation unit determines that the operating object has moved on the touchpad from the first area toward the second area, an instruction window is opened on a screen according to the movement of the operating object.
  • the present invention also provides a method for controlling a touch control device, and the method is carried out as follows. To begin with, a first area and a second area are defined on a touchpad, wherein the first area is a peripheral area of the touchpad. Then, the movement of an operating object on the touchpad is detected, and it is determined whether the operating object has moved on the touchpad from the first area toward the second area. If the operating object has moved on the touchpad from the first area toward the second area, an instruction window is opened on a screen according to the movement of the operating object.
  • the present invention also provides a method for controlling a touch control device, and the method includes determining the number of operating objects on a touchpad and determining the number of instruction items of an application program that is currently running. If the number of operating objects on the touchpad is greater than one and if the number of instruction items of the application program is greater than one, a virtual frame is defined on a screen. The corresponding coordinates of the operating objects in the virtual frame are calculated according to the coordinates of the operating objects on the touchpad and a set of ratios.
  • the present invention also provides a method for controlling a touch control device, and the method includes determining the number of operating objects on a touchpad. If there is only one operating object, it is then determined whether the operating object has double-clicked the touchpad. If the operating object has double-clicked the touchpad, a virtual touch control element is generated on a screen, and it is detected whether the operating object has displaced on the touchpad. If the operating object has displaced on the touchpad, a window page switching instruction is executed.
  • the present invention also provides a method for controlling a touch control device, and the method includes determining the number of operating objects on a touchpad. If there is only one operating object, it is then determined whether the operating object has performed a touch control action on the touchpad. If the operating object has performed a touch control action on the touchpad, a control window frame is defined on a screen, and the corresponding coordinates of the operating object in the control window frame are calculated according to the coordinates of the operating object on the touchpad and a set of ratios.
  • FIG. 1 is a system structure diagram of the touch control device in the first embodiment of the present invention
  • FIG. 2 is another system structure diagram of the touch control device in the first embodiment of the present invention.
  • FIG. 3 schematically shows how in the first embodiment of the present invention the corresponding coordinate position of an operating object on a screen is calculated according to a width ratio and a height ratio between the screen and a touchpad;
  • FIG. 4 schematically shows how in the first embodiment of the present invention the corresponding coordinate position of an operating object on a screen is calculated according to a width ratio and a height ratio between a control window frame and a touchpad;
  • FIG. 5 schematically shows how in the first embodiment of the present invention a touchpad is proportionally mapped onto a screen according to the condition of multiple operating objects on the touchpad;
  • FIG. 6 schematically shows how in the first embodiment of the present invention a virtual frame corresponding to a touchpad is mapped onto a screen according to the condition of multiple operating objects on the touchpad;
  • FIG. 7 is a flowchart of the control method in the first embodiment of the present invention.
  • FIG. 8 is a system structure diagram of the touch control device in the second embodiment of the present invention.
  • FIG. 9 schematically shows how in the second embodiment of the present invention an instruction window is opened on a screen by moving an operating object
  • FIG. 10 schematically shows how in the second embodiment of the present invention an operating object operates an instruction window through a virtual instruction area
  • FIG. 11 is a flowchart of the control method in the second embodiment of the present invention.
  • FIG. 12 schematically shows how in the third embodiment of the present invention the corresponding coordinates of an operating object on a screen are calculated according to a width ratio and a height ratio between a control window frame and a touchpad;
  • FIG. 13 schematically shows how in the third embodiment of the present invention a graphical item is dragged on a screen by moving an operating object
  • FIG. 14 is a flowchart of the control method in the third embodiment of the present invention.
  • the present invention mainly provides a touch control device for use with an operating system that supports touch control operation, such as Windows 7, Windows 8, and iPhone OS, so as to enable intuitive operation similar to what is achievable by touching a touch screen with a finger.
  • the touch control device of the present invention can be a built-in or external touch control device.
  • In the former case, the built-in touch control device is applicable to the touchpad of a laptop computer or a transformable tablet computer (e.g., the Transformer-series tablet computers of ASUSTeK Computer Inc.); in the latter case, the external touch control device can be designed as one connectable to a computer device via a wired or wireless transmission interface (e.g., USB, PS2, infrared, or Bluetooth), such as an external touchpad, a mouse with a touchpad, a controller with a touchpad, a keyboard with a touchpad, or a touch keyboard with a touchpad.
  • FIG. 1 shows the touch control device in the first embodiment, wherein the touch control device includes an input element 1 and a computation unit 2 .
  • the input element 1 has a touchpad 11 and a control unit 12 .
  • the touchpad 11 is provided with a plurality of sensing elements (not shown) for detecting whether the touchpad 11 is in contact with an operating object 9 .
  • the sensing elements generate a detection signal T 1 according to the position of the operating object 9 .
  • the touchpad 11 serves as a dynamic information input end of the input element 1 so that a user can give instructions and control a cursor by moving the operating object 9 on the touchpad 11 .
  • the touchpad 11 at least has one first area 111 and one second area 112 defined thereon.
  • the first area 111 and the second area 112 can be defined anywhere on the touchpad 11 as needed.
  • the first area 111 is defined in a peripheral area of the touchpad 11
  • the second area 112 is defined in a central area of the touchpad 11 and surrounded by the first area 111 .
  • the control unit 12 is electrically connected to the sensing elements and the computation unit 2 and is configured for receiving the detection signal T 1 and converting the detection signal T 1 into information T 2 related to the coordinates of the operating object 9 on the touchpad 11 (hereinafter referred to as the coordinate information T 2 ).
  • the computation unit 2 is installed under the operating system 3 of a computer device and is configured for converting the coordinate information T 2 into information T 3 related to the corresponding coordinates of the operating object 9 on a screen 4 (hereinafter referred to as the coordinate information T 3 ) and delivering the coordinate information T 3 to the operating system 3 .
  • FIG. 2 schematically shows the process flow of the aforesaid touch control device during use.
  • the control unit 12 and the computation unit 2 are included in a firmware process 7 and are in charge of coordinate conversion, mode determination, and control.
  • Upon completing the firmware process, the touch control device sends a message to the operating system 3 so that output information is transmitted to the screen 4 through a driver algorithm 8 in the operating system 3.
  • FIG. 3 schematically shows how the corresponding coordinates of the operating object 9 on the screen 4 are calculated according to a width ratio and a height ratio between the screen 4 and the touchpad 11 .
  • the computation unit 2 calculates the corresponding coordinates (Xc1, Yc1) of the operating object 9 on the screen 4 and generates a virtual touch control element 9 ′ on the screen 4 accordingly, wherein the virtual touch control element 9 ′ changes its coordinate position on the screen 4 in response to the movement of the operating object 9 on the touchpad 11 .
  • the user can exercise control or execute instructions via the position and displacement of the operating object 9 on the touchpad 11 .
  • FIG. 4 schematically shows how the corresponding coordinates of the operating object 9 on the screen 4 are calculated according to a width ratio and a height ratio between a control window frame 5 and the touchpad 11 .
  • the computation unit 2 uses the last position of a cursor 9 ′′ on the screen 4 as the reference coordinate position and defines the control window frame 5 on the screen 4 according to the reference coordinate position. Consequently, the coordinates (Xc2, Yc2) in the control window frame 5 correspond to the reference coordinate position, and the cursor 9 ′′ is displayed at the coordinates (Xc2, Yc2) in the control window frame 5 .
  • the user can change the coordinate position of the cursor 9 ′′ on the screen 4 and thus control the cursor 9 ′′.
  • the computation unit 2 in FIG. 1 further includes an application software detection tool 21 for determining the number of instruction items of an application software that is currently running.
  • FIG. 5 schematically shows how the touchpad 11 is proportionally mapped onto the screen 4 according to the condition of multiple operating objects on the touchpad 11 .
  • the application software detection tool 21 of the computation unit 2 determines whether the number of instruction items 23 of the currently running application software is greater than one, wherein the instruction items 23 can be graphical items, folders, and so on.
  • the computation unit 2 calculates the corresponding coordinates (Xc3, Yc3) and (Xc3′, Yc3′) of the operating objects 9 a and 9 b on the screen 4 according to the coordinates (Xf3, Yf3) and (Xf3′, Yf3′) of the operating objects 9 a and 9 b on the touchpad 11 and the first set of ratios and generates virtual touch control elements 9 a ′ and 9 b ′ on the screen 4 accordingly.
  • the computation unit 2 defines a virtual frame 6 on the screen 4 , wherein the virtual frame 6 corresponds in position to one of the instruction items 23 .
  • the computation unit 2 calculates the corresponding coordinates (Xc4, Yc4) and (Xc4′, Yc4′) of the operating objects 9 a and 9 b in the virtual frame 6 and generates the virtual touch control elements 9 a ′ and 9 b ′ in the virtual frame 6 accordingly.
  • the user can operate the instruction item 23 by finger actions on the touchpad 11 .
  • FIG. 7 is a flowchart of the control method in the embodiment shown in FIGS. 3 to 6 .
  • In the first step S12, it is detected whether the touchpad 11 is touched by an operating object 9. If yes, the sensing elements generate the analog detection signal T1 according to the position of the operating object 9 on the touchpad 11.
  • the control unit 12 converts the detection signal T 1 into the coordinate information T 2 , which is related to the coordinates of the operating object 9 on the touchpad 11 and is sent by the control unit 12 to the computation unit 2 .
  • the computation unit 2 determines according to the coordinate information T 2 whether the number of the operating object(s) 9 is greater than one.
  • In step S16, the computation unit 2 determines according to the coordinate information T2 whether the touch on the touchpad 11 by the operating object 9 starts in the first area 111 or the second area 112. If the touch by the operating object 9 starts in the first area 111, step S18 is executed. When necessary, the determination process in step S16 may be carried out by the control unit 12 instead, before step S18 is executed.
  • step S18 involves computation by the computation unit 2 according to the coordinates (Xf1, Yf1) of the operating object 9 on the touchpad 11 and the first set of ratios, wherein the first set of ratios are the width ratio Hscreen/Hdevice and the height ratio Vscreen/Vdevice between the screen 4 and the touchpad 11
  • the corresponding coordinates (Xc1, Yc1) of the operating object 9 on the screen 4 are calculated from the coordinates (Xf1, Yf1) of the operating object 9 on the touchpad 11 by the computation unit 2 as Xc1 = (Hscreen/Hdevice) × Xf1 and Yc1 = (Vscreen/Vdevice) × Yf1 (Eq-1)
  • the computation unit 2 sends the coordinate information T 3 to the operating system 3 and generates the virtual touch control element 9 ′ on the screen 4 accordingly.
  • the virtual touch control element 9 ′ can be displayed on the screen 4 or hidden from view as needed.
  • the virtual touch control element 9 ′ changes its coordinate position on the screen 4 in response to the movement of the operating object 9 (see FIG. 3 ). This allows the user to control instructions by touching the touchpad 11 with the operating object 9 and by moving the operating object 9 on the touchpad 11 .
  • step S 20 is executed if it is determined in step S 16 that the touch by the operating object 9 starts in the second area 112 .
  • the control window frame 5 is defined on the screen 4 .
  • the shape and area of the control window frame 5 can be the same as or be scaled up or down from those of the touchpad 11 respectively.
  • the computation unit 2 performs computation based on the coordinates (Xf2, Yf2) of the operating object 9 on the touchpad 11 and the second set of ratios.
  • the width-height ratio of the control window frame 5 is defined in advance, and the second set of ratios are the width ratio Hframe1/Hdevice and the height ratio Vframe1/Vdevice between the control window frame 5 and the touchpad 11
  • the corresponding coordinates (Xc2, Yc2) of the operating object 9 in the control window frame 5 are calculated from the coordinates (Xf2, Yf2) of the operating object 9 on the touchpad 11 by multiplying them by the width ratio and the height ratio, respectively
  • Upon completing the calculation of the foregoing coordinate information T3, the computation unit 2 transmits the coordinate information T3 to the operating system 3 and, using the last position of the virtual touch control element 9′ or the cursor 9″ on the screen 4 as the reference coordinate position, maps the coordinates (Xc2, Yc2) in the control window frame 5 to the reference coordinate position, thereby defining the position of the control window frame 5 on the screen 4. As a result, the cursor 9″ is displayed at the coordinates (Xc2, Yc2) in the control window frame 5.
  • the user can change the coordinate position of the cursor 9 ′′ by moving the operating object 9 .
  • the control window frame 5 can be displayed on the screen 4 or hidden from view as desired. Displaying the control window frame 5 on the screen 4 allows the user to know the current position of the control window frame 5 ; however, as the user need not know the position of the control window frame 5 during operation, the user may choose to hide the control window frame 5 or display it in a flashing manner.
  • step S 22 is carried out if it is determined in step S 14 that there are multiple operating objects 9 a and 9 b on the touchpad 11 .
  • the application software detection tool 21 determines the number of instruction items 23 of the currently running application software.
  • the instruction items 23 in this embodiment are the graphical items shown in FIGS. 5 and 6 . If there is only one instruction item 23 , the process moves on to step S 24 .
  • In step S24, the computation unit 2 calculates the corresponding coordinates (Xc3, Yc3) and (Xc3′, Yc3′) of the operating objects 9 a and 9 b on the screen 4 according to the coordinates (Xf3, Yf3) and (Xf3′, Yf3′) of the operating objects 9 a and 9 b on the touchpad 11 and the first set of ratios (see Eq-1).
  • the virtual touch control elements 9 a ′ and 9 b ′ are generated on the screen 4 , allowing the user to operate the single instruction item 23 of the application software intuitively via the multiple operating objects on the touchpad 11 .
  • step S 26 is executed if it is determined in step S 22 that there are multiple instruction items 23 .
  • the virtual frame 6 is defined on the screen 4 , using the positions of the virtual touch control elements 9 a ′ and 9 b ′ on the screen 4 as the reference points. More particularly, the center point or one of the end points of the virtual frame 6 is mapped to the reference point such that the virtual frame 6 corresponds in position to one of the instruction items 23 of the application program.
  • the shape and area of the virtual frame 6 are the same as those of the touchpad 11 respectively, or the area of the virtual frame 6 is scaled up or down from the area of the touchpad 11 according to the area of the instruction item 23 .
  • the computation unit 2 performs computation based on the coordinates (Xf4, Yf4) and (Xf4′, Yf4′) of the operating objects 9 a and 9 b on the touchpad 11 and the third set of ratios, wherein the third set of ratios are the width ratio and the height ratio between the virtual frame 6 and the touchpad 11
  • the corresponding coordinates (Xc4, Yc4) of the operating object 9 a in the virtual frame 6 are calculated from the coordinates (Xf4, Yf4) of the operating object 9 a on the touchpad 11 by multiplying them by the width ratio and the height ratio, respectively
  • the computation unit 2 sends the coordinate information T 3 to the operating system 3 .
  • the virtual touch control elements 9 a ′ and 9 b ′ are generated in the virtual frame 6 , allowing the user to operate the one instruction item 23 of the application program intuitively by means of the multiple operating objects on the touchpad 11 .
  • the virtual frame 6 can be displayed on the screen 4 or hidden from view as needed. Displaying the virtual frame 6 on the screen 4 allows the user to know the current position of the virtual frame 6 , and yet it is not necessary for the user to know such information during operation. Hence, the user may choose to hide the virtual frame 6 or display it in a flashing manner.
  • If the operating object 9 leaves the touchpad 11 upon completion of step S18, S20, S24, or S26, the process returns to step S12 to detect whether the touchpad 11 is touched by an operating object 9.
  • the touch control device in the second embodiment of the present invention has substantially the same construction as its counterpart in the first embodiment except that the first area 111 of the touchpad 11 corresponds to an instruction window W (e.g., a toolbar in the Windows system) at the right edge of the screen 4 and is defined at the right edge of the touchpad 11 .
  • the instruction window W can disappear into the right edge of the screen 4 .
  • the touchpad 11 defines a virtual instruction area 113 .
  • the virtual instruction area 113 is defined in an area of the touchpad 11 that corresponds in position to the instruction window W, and at least two borders of the virtual instruction area 113 coincide with borders of the touchpad 11 respectively.
  • the first area 111 coincides with the virtual instruction area 113 in range.
  • the computation unit 2 includes an event receiver 22 for receiving an event signal T 4 from the operating system 3 .
  • FIG. 9 schematically shows how the instruction window W is opened by moving an operating object.
  • When the touchpad 11 is touched by the user's finger (hereinafter referred to as the operating object 9) in such a way that the touch starts in the first area 111 of the touchpad 11 and the operating object 9 moves from the first area 111 toward the second area 112, the virtual touch control element 9′ drags the instruction window W, which has disappeared into the right edge of the screen 4, toward the center of the screen 4 in response to the movement of the operating object 9. Consequently, the instruction window W is opened and displayed on the screen 4.
  • the touchpad 11 at this moment defines the virtual instruction area 113 (see FIG. 10 ) according to the position of the instruction window W, so as for the user to operate the instruction window W intuitively through the virtual instruction area 113 .
  • FIG. 11 is a flowchart of the control method in the embodiment shown in FIGS. 9 and 10 .
  • Steps S 12 , S 14 , S 16 , and S 18 in FIG. 11 are the same as those in the previous embodiment and involve determining whether the touch by the operating object 9 starts in the first area 111 and generating the virtual touch control element 9 ′ on the screen 4 .
  • the computation unit 2 determines whether the operating object 9 has moved from the first area 111 toward the second area 112 ; if yes, step S 30 is performed.
  • the first area 111 and the second area 112 defined on the touchpad 11 are temporarily canceled to facilitate the execution of subsequent steps.
  • In step S30, the instruction window W is opened according to the movement of the operating object 9.
  • the operating system 3 generates the event signal T 4 according to the event instruction being executed and sends the event signal T 4 to the event receiver 22 .
  • the computation unit 2 defines the virtual instruction area 113 on the touchpad 11 according to the event signal T4 and performs computation based on the coordinates (Xf5, Yf5) of the operating object 9 on the touchpad 11 and a fourth set of ratios, wherein the fourth set of ratios are the width ratio and the height ratio between the instruction window W and the virtual instruction area 113
  • the corresponding coordinates (Xc5, Yc5) of the operating object 9 in the instruction window W are calculated from the coordinates (Xf5, Yf5) of the operating object 9 in the virtual instruction area 113 by multiplying them by the width ratio and the height ratio, respectively
  • After the calculation of the foregoing coordinate information T3 is completed, the computation unit 2 sends the coordinate information T3 to the operating system 3. Hence, by operating the operating object 9 in the virtual instruction area 113, the user can execute instructions in the instruction window W or perform other actions.
  • the user's finger may leave the touchpad 11 and then touch the second area 112 of the touchpad 11 .
  • the computation unit 2 will then cancel the virtual instruction area 113 on the touchpad 11 and generate the cursor 9″ on the screen 4 through the foregoing steps S12, S14, S16, and S20.
  • the user may move the cursor 9 ′′ to the instruction window W and click any instruction key in the instruction window W to execute the desired instruction or application program.
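  • The edge-swipe behavior of FIG. 9 and the virtual-instruction-area mapping of FIG. 10 can be sketched as follows; the right-edge geometry, the edge_width parameter, and the assumption that the fourth set of ratios are simply the window dimensions divided by the area dimensions are illustrative choices, not details taken from the patent:

```python
def detect_edge_swipe(start, current, pad_w, edge_width):
    """FIG. 9 (sketch): True when a touch started in the first area 111 at the
    right edge of the touchpad (within edge_width of the right border) and has
    since moved inward toward the second area 112."""
    (sx, _), (cx, _) = start, current
    return sx >= pad_w - edge_width and cx < pad_w - edge_width

def map_into_instruction_window(xf5, yf5, area, window):
    """FIG. 10 (sketch): map a touch at (Xf5, Yf5) inside the virtual
    instruction area 113 into the instruction window W using the fourth set of
    ratios. Both area and window are (x, y, width, height) tuples."""
    ax, ay, aw, ah = area
    wx, wy, ww, wh = window
    xc5 = wx + ww / aw * (xf5 - ax)
    yc5 = wy + wh / ah * (yf5 - ay)
    return xc5, yc5
```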
  • FIGS. 12 and 13 illustrate the control method in the third embodiment of the present invention.
  • the coordinate information T 2 generated by the control unit 12 contains the position and time information of each touched point on the touchpad 11 that is touched by the operating object 9
  • the coordinate information T 2 can be used to determine the various actions of the operating object 9 on the touchpad 11 .
  • the touch control actions include double-clicking the touchpad 11 .
  • By double-clicking, it is meant that the operating object 9 touches the touchpad 11, leaves the touchpad 11 within a first predetermined time Ta, and touches the touchpad 11 again within a second predetermined time Tb, wherein the distance Δd between the two touched points is less than a threshold value D.
  • the computation unit 2 calculates the corresponding coordinates (Xc5, Yc5) of the operating object 9 in the control window frame 5 according to the coordinates (Xf5, Yf5) of the operating object 9 on the touchpad 11 and the second set of ratios. Also, using the last position of the previous cursor 9 ′′ on the screen 4 as the reference coordinate position, the computation unit 2 defines the control window frame 5 on the screen 4 so as to generate the virtual touch control element 9 ′ in the control window frame 5 accordingly. After double-clicking, the operating object 9 can be moved on the touchpad 11 so that the virtual touch control element 9 ′ generated by double-clicking drags an instruction item 23 on the screen 4 (see FIG. 13 ), switches window pages, or executes other instructions.
  • FIG. 14 is a flowchart of the control method in the embodiment shown in FIGS. 12 and 13 .
  • In step S12, it is detected whether the touchpad 11 is touched by an operating object 9. If yes, it is determined in step S14 whether the number of the operating object 9 is greater than one. If there is only one operating object 9, step S151 is performed, in which the time Δt1 for which the operating object 9 touches the touchpad 11 is counted. It is also determined whether the operating object 9 has left the touchpad 11. In step S152, it is determined by comparison whether Δt1 is less than the first predetermined time Ta.
  • In step S153, the time Δt2 for which the operating object 9 has left the touchpad 11 is counted. It is also determined whether the operating object 9 has touched the touchpad 11 again. If the operating object 9 has touched the touchpad 11 again, it is determined in step S154 by comparison whether Δt2 is less than the second predetermined time Tb. If Δt2 is less than the second predetermined time Tb, it is determined in step S155 whether the distance Δd between the aforesaid two touched points is less than the threshold value D. If Δd < D, it is determined that the touch control action performed by the operating object 9 is double-clicking, and the process goes on to step S156.
  • In step S156, the computation unit 2 calculates the corresponding coordinates (Xc5, Yc5) of the operating object 9 in the control window frame 5 according to the second set of ratios as well as the coordinates (Xf5, Yf5) of the operating object 9 that correspond to the operating object 9's second touch on the touchpad 11.
  • the second set of ratios have been described in the previous embodiments and therefore are not repeated here.
  • Upon completing the calculation of the foregoing coordinate information T3, the computation unit 2 sends the coordinate information T3 to the operating system 3 and, using as the reference coordinate position the position of the virtual touch control element 9′ or cursor 9″ last appearing on the screen 4, maps the coordinates (Xc5, Yc5) in the control window frame 5 onto the reference coordinate position, thereby defining the position of the control window frame 5 on the screen 4. Then, the virtual touch control element 9′ is generated at the coordinates (Xc5, Yc5) in the control window frame 5.
  • In step S158, it is detected whether the operating object 9 has displaced on the touchpad 11. If the operating object 9 has displaced on the touchpad 11, it is also determined whether the displacement Δm of the operating object 9 is greater than a preset value M. If Δm > M, the virtual touch control element 9′ executes the instruction of window page switching according to the displacement direction of the operating object 9.
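  • A compact sketch of the double-click test of steps S151-S155 and the page-switch test of step S158, with Ta, Tb, D, and M exposed as parameters; the default values shown are placeholders for illustration, not values specified by the patent:

```python
import math

def is_double_click(t_down1, t_up1, t_down2, p1, p2, Ta=0.3, Tb=0.4, D=5.0):
    """Steps S151-S155 (sketch): the first touch must last less than Ta, the
    second touch must follow within Tb of lift-off, and the two touched points
    must be closer together than the threshold D."""
    dt1 = t_up1 - t_down1           # duration of the first touch
    dt2 = t_down2 - t_up1           # gap before the second touch
    dd = math.dist(p1, p2)          # distance between the two touched points
    return dt1 < Ta and dt2 < Tb and dd < D

def page_switch(displacement, M=20.0):
    """Step S158 (sketch): after a double-click, a displacement larger than M
    triggers window-page switching in the direction of the movement."""
    dx, dy = displacement
    if math.hypot(dx, dy) <= M:
        return None
    return "next_page" if dx > 0 else "previous_page"
```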

Abstract

A method for controlling a touch control device includes defining a first area and a second area on a touchpad, detecting whether a touched position on the touchpad that is touched by an operating object falls in the first area or the second area, and calculating the corresponding on-screen coordinates of the operating object with different sets of ratios depending on the touched position.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. provisional application filed on Nov. 11, 2011 and having application Ser. No. 61/558,457, the entire contents of which are hereby incorporated herein by reference.
  • This application is based upon and claims priority under 35 U.S.C. 119 from Taiwan Patent Application No. 101102526 filed Jan. 20, 2012, which is hereby specifically incorporated herein by this reference thereto.
  • This application is a divisional application of U.S. patent application filed on Jul. 5, 2012 and having application Ser. No. 13/542,592, the entire contents of which are hereby incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention is related generally to a touch control device and, more particularly, to a touch control device and a control method thereof so that, when a touchpad is touched by an operating object, the corresponding coordinates of the operating object on a screen can be obtained by calculating with different sets of ratios depending on the touched position on the touchpad.
  • 2. Description of the Prior Art
  • With the continuous improvement of touch control technology, touch control operation is applicable not only to the small touch screens of the conventional touch-screen mobile devices such as mobile phones and satellite-based navigation devices, but also to operating systems that provide multi-touch functions, such as Microsoft's Windows 7 and Windows 8 and Apple Inc.'s iPhone OS. Thus, the touch control operation environment has extended from portable devices to desktop devices, allowing users to perform various operations directly on large touch screens.
  • Nowadays, the development of operating systems supporting touch control operation has gradually matured, and yet large touch screens are disadvantaged by high costs and by the limitation that users must be within a very short distance from the screens in order to exercise touch control. Therefore, touch control devices other than touch screens (e.g., touchpads) have been devised for touch control operation. These touch control devices, however, are typically designed only for controlling the cursor on a screen and are intended mainly as a substitute for the existing cursor controllers such as external mice or trackballs. In contrast to touch screens, which can be used to give actuation instructions directly by a finger touch on the screens, the aforesaid touch control devices provide no such function when touched by a user's finger. Hence, a touch control device capable of simulating the effect of a finger touch on a touch screen is desirable.
  • SUMMARY OF THE INVENTION
  • It is an object of the present invention to provide a touch control device and a control method thereof so that, when a touchpad is touched by an operating object, the corresponding coordinates of the operating object on a screen can be obtained by calculating with different sets of ratios depending on the touched position on the touchpad.
  • It is another object of the present invention to provide a touch control device and a control method thereof so that instruction windows can be opened according to the movement of an operating object.
  • Still another object of the present invention is to provide a touch control device and a control method thereof so that different instructions can be executed according to the touch control action of an operating object on a touchpad.
  • To achieve the above and other objects, the present invention provides a touch control device including an input element and a computation unit. The input element has a touchpad and a control unit. The touchpad includes a first area and a second area. The control unit is connected to the touchpad and is configured for detecting the coordinates of an operating object on the touchpad. The computation unit is connected to the control unit. If the touch by the operating object starts in the first area, the computation unit calculates the corresponding coordinates of the operating object on a screen according to the coordinates of the operating object on the touchpad and a first set of ratios. If the touch by the operating object starts in the second area, the computation unit calculates the corresponding coordinates of the operating object on the screen according to the coordinates of the operating object on the touchpad and a second set of ratios.
  • The present invention also provides a method for controlling a touch control device, and the method is carried out as follows. To begin with, a first area and a second area are defined on a touchpad. Then, it is determined whether a touch by an operating object starts in the first area or the second area. If the touch by the operating object starts in the first area, the corresponding coordinates of the operating object on a screen are calculated according to the coordinates of the operating object on the touchpad and a first set of ratios. If the touch by the operating object starts in the second area, the corresponding coordinates of the operating object on the screen are calculated according to the coordinates of the operating object on the touchpad and a second set of ratios.
  • The present invention also provides a touch control device including an input element and a computation unit. The input element has a touchpad and a control unit. The touchpad includes a first area and a second area, wherein the first area is a peripheral area of the touchpad. The control unit is connected to the touchpad and is configured for detecting the movement of an operating object on the touchpad. The computation unit is connected to the control unit. If the computation unit determines that the operating object has moved on the touchpad from the first area toward the second area, an instruction window is opened on a screen according to the movement of the operating object.
  • The present invention also provides a method for controlling a touch control device, and the method is carried out as follows. To begin with, a first area and a second area are defined on a touchpad, wherein the first area is a peripheral area of the touchpad. Then, the movement of an operating object on the touchpad is detected, and it is determined whether the operating object has moved on the touchpad from the first area toward the second area. If the operating object has moved on the touchpad from the first area toward the second area, an instruction window is opened on a screen according to the movement of the operating object.
  • The present invention also provides a method for controlling a touch control device, and the method includes determining the number of operating objects on a touchpad and determining the number of instruction items of an application program that is currently running. If the number of operating objects on the touchpad is greater than one and if the number of instruction items of the application program is greater than one, a virtual frame is defined on a screen. The corresponding coordinates of the operating objects in the virtual frame are calculated according to the coordinates of the operating objects on the touchpad and a set of ratios.
  • The present invention also provides a method for controlling a touch control device, and the method includes determining the number of operating objects on a touchpad. If there is only one operating object, it is then determined whether the operating object has double-clicked the touchpad. If the operating object has double-clicked the touchpad, a virtual touch control element is generated on a screen, and it is detected whether the operating object has displaced on the touchpad. If the operating object has displaced on the touchpad, a window page switching instruction is executed.
  • The present invention also provides a method for controlling a touch control device, and the method includes determining the number of operating objects on a touchpad. If there is only one operating object, it is then determined whether the operating object has performed a touch control action on the touchpad. If the operating object has performed a touch control action on the touchpad, a control window frame is defined on a screen, and the corresponding coordinates of the operating object in the control window frame are calculated according to the coordinates of the operating object on the touchpad and a set of ratios.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and other objects, features and advantages of the present invention will become apparent to those skilled in the art upon consideration of the following description of the preferred embodiments of the present invention taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a system structure diagram of the touch control device in the first embodiment of the present invention;
  • FIG. 2 is another system structure diagram of the touch control device in the first embodiment of the present invention;
  • FIG. 3 schematically shows how in the first embodiment of the present invention the corresponding coordinate position of an operating object on a screen is calculated according to a width ratio and a height ratio between the screen and a touchpad;
  • FIG. 4 schematically shows how in the first embodiment of the present invention the corresponding coordinate position of an operating object on a screen is calculated according to a width ratio and a height ratio between a control window frame and a touchpad;
  • FIG. 5 schematically shows how in the first embodiment of the present invention a touchpad is proportionally mapped onto a screen according to the condition of multiple operating objects on the touchpad;
  • FIG. 6 schematically shows how in the first embodiment of the present invention a virtual frame corresponding to a touchpad is mapped onto a screen according to the condition of multiple operating objects on the touchpad;
  • FIG. 7 is a flowchart of the control method in the first embodiment of the present invention;
  • FIG. 8 is a system structure diagram of the touch control device in the second embodiment of the present invention;
  • FIG. 9 schematically shows how in the second embodiment of the present invention an instruction window is opened on a screen by moving an operating object;
  • FIG. 10 schematically shows how in the second embodiment of the present invention an operating object operates an instruction window through a virtual instruction area;
  • FIG. 11 is a flowchart of the control method in the second embodiment of the present invention;
  • FIG. 12 schematically shows how in the third embodiment of the present invention the corresponding coordinates of an operating object on a screen are calculated according to a width ratio and a height ratio between a control window frame and a touchpad;
  • FIG. 13 schematically shows how in the third embodiment of the present invention a graphical item is dragged on a screen by moving an operating object;
  • FIG. 14 is a flowchart of the control method in the third embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present invention mainly provides a touch control device for use with an operating system that supports touch control operation, such as Windows 7, Windows 8, and iPhone OS, so as to enable intuitive operation similar to what is achievable by touching a touch screen with a finger. The touch control device of the present invention can be a built-in or external touch control device. In the former case, the built-in touch control device is applicable to the touchpad of a laptop computer or a transformable tablet computer (e.g., the Transformer-series tablet computers of ASUSTeK Computer Inc.); in the latter case, the external touch control device can be designed as one connectable to a computer device via a wired or wireless transmission interface (e.g., USB, PS2, infrared, or Bluetooth), such as an external touchpad, a mouse with a touchpad, a controller with a touchpad, a keyboard with a touchpad, or a touch keyboard with a touchpad.
  • FIG. 1 shows the touch control device in the first embodiment, wherein the touch control device includes an input element 1 and a computation unit 2. The input element 1 has a touchpad 11 and a control unit 12. The touchpad 11 is provided with a plurality of sensing elements (not shown) for detecting whether the touchpad 11 is in contact with an operating object 9. The sensing elements generate a detection signal T1 according to the position of the operating object 9. The touchpad 11 serves as a dynamic information input end of the input element 1 so that a user can give instructions and control a cursor by moving the operating object 9 on the touchpad 11. The touchpad 11 at least has one first area 111 and one second area 112 defined thereon. The first area 111 and the second area 112 can be defined anywhere on the touchpad 11 as needed. In FIG. 1, the first area 111 is defined in a peripheral area of the touchpad 11, and the second area 112 is defined in a central area of the touchpad 11 and surrounded by the first area 111. The control unit 12 is electrically connected to the sensing elements and the computation unit 2 and is configured for receiving the detection signal T1 and converting the detection signal T1 into information T2 related to the coordinates of the operating object 9 on the touchpad 11 (hereinafter referred to as the coordinate information T2). The computation unit 2 is installed under the operating system 3 of a computer device and is configured for converting the coordinate information T2 into information T3 related to the corresponding coordinates of the operating object 9 on a screen 4 (hereinafter referred to as the coordinate information T3) and delivering the coordinate information T3 to the operating system 3. FIG. 2 schematically shows the process flow of the aforesaid touch control device during use. The control unit 12 and the computation unit 2 are included in a firmware process 7 and are in charge of coordinate conversion, mode determination, and control. Upon completing the firmware process, the touch control device sends a message to the operating system 3 so that output information is transmitted to the screen 4 through a driver algorithm 8 in the operating system 3.
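  • As an illustrative sketch of this signal flow (the class and function names below are assumptions, not terminology from the patent), the control unit 12 can be thought of as converting the detection signal T1 into touchpad coordinates T2, and the computation unit 2 as converting T2 into screen coordinates T3 with a set of width and height ratios:

```python
from dataclasses import dataclass

# Hypothetical carriers for the three stages of information described above.
@dataclass
class DetectionSignal:      # T1: raw output of the sensing elements
    raw_x: int
    raw_y: int

@dataclass
class PadCoordinates:       # T2: coordinates of the operating object on the touchpad
    xf: float
    yf: float

@dataclass
class ScreenCoordinates:    # T3: corresponding coordinates on the screen
    xc: float
    yc: float

def control_unit(signal: DetectionSignal, pad_w: float, pad_h: float) -> PadCoordinates:
    """Control unit 12 (sketch): convert the detection signal T1 into coordinate
    information T2. A 0..4095 raw range is assumed purely for illustration."""
    return PadCoordinates(xf=signal.raw_x / 4095 * pad_w,
                          yf=signal.raw_y / 4095 * pad_h)

def computation_unit(t2: PadCoordinates, ratio_w: float, ratio_h: float) -> ScreenCoordinates:
    """Computation unit 2 (sketch): convert T2 into T3 with a width ratio and a height ratio."""
    return ScreenCoordinates(xc=t2.xf * ratio_w, yc=t2.yf * ratio_h)
```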
  • FIG. 3 schematically shows how the corresponding coordinates of the operating object 9 on the screen 4 are calculated according to a width ratio and a height ratio between the screen 4 and the touchpad 11. If a user's finger (hereinafter referred to as the operating object 9) touches the touchpad 11 and the touch starts in the first area 111 of the touchpad 11, the control unit 12 sends the coordinate information T2 of the operating object 9 to the computation unit 2. Based on the coordinates (Xf1, Yf1) of the operating object 9 on the touchpad 11 and a first set of ratios, the computation unit 2 calculates the corresponding coordinates (Xc1, Yc1) of the operating object 9 on the screen 4 and generates a virtual touch control element 9′ on the screen 4 accordingly, wherein the virtual touch control element 9′ changes its coordinate position on the screen 4 in response to the movement of the operating object 9 on the touchpad 11. The user can exercise control or execute instructions via the position and displacement of the operating object 9 on the touchpad 11.
  • FIG. 4 schematically shows how the corresponding coordinates of the operating object 9 on the screen 4 are calculated according to a width ratio and a height ratio between a control window frame 5 and the touchpad 11. If the touch on the touchpad 11 by the operating object 9 starts in the second area 112, the control unit 12 sends the coordinate information T2 of the operating object 9 to the computation unit 2. Based on the coordinates (Xf2, Yf2) of the operating object 9 on the touchpad 11 and a second set of ratios, the computation unit 2 calculates the corresponding coordinates (Xc2, Yc2) of the operating object 9 in the control window frame 5. In addition, the computation unit 2 uses the last position of a cursor 9″ on the screen 4 as the reference coordinate position and defines the control window frame 5 on the screen 4 according to the reference coordinate position. Consequently, the coordinates (Xc2, Yc2) in the control window frame 5 correspond to the reference coordinate position, and the cursor 9″ is displayed at the coordinates (Xc2, Yc2) in the control window frame 5. By moving the operating object 9 on the touchpad 11, the user can change the coordinate position of the cursor 9″ on the screen 4 and thus control the cursor 9″.
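  • A minimal sketch of this behavior, assuming the second set of ratios are simply the frame dimensions divided by the touchpad dimensions and that the frame is anchored so that the mapped point coincides with the last cursor position (the function and parameter names are illustrative):

```python
def map_into_control_frame(xf2, yf2, pad_w, pad_h, frame_w, frame_h,
                           last_cursor_x, last_cursor_y):
    """Map touchpad coordinates (Xf2, Yf2) into the control window frame 5 (sketch).

    The second set of ratios is taken to be frame_w/pad_w and frame_h/pad_h,
    and the frame is placed so that the mapped point (Xc2, Yc2) falls on the
    last cursor position, which serves as the reference coordinate position.
    """
    xc2 = frame_w / pad_w * xf2
    yc2 = frame_h / pad_h * yf2
    # Top-left corner of the control window frame on the screen.
    frame_origin = (last_cursor_x - xc2, last_cursor_y - yc2)
    return frame_origin, (xc2, yc2)
```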
  • The computation unit 2 in FIG. 1 further includes an application software detection tool 21 for determining the number of instruction items of an application software that is currently running. FIG. 5 schematically shows how the touchpad 11 is proportionally mapped onto the screen 4 according to the condition of multiple operating objects on the touchpad 11. When the touchpad 11 is touched by multiple operating objects 9 a and 9 b, the application software detection tool 21 of the computation unit 2 determines whether the number of instruction items 23 of the currently running application software is greater than one, wherein the instruction items 23 can be graphical items, folders, and so on. If there is only one instruction item 23, the computation unit 2 calculates the corresponding coordinates (Xc3, Yc3) and (Xc3′, Yc3′) of the operating objects 9 a and 9 b on the screen 4 according to the coordinates (Xf3, Yf3) and (Xf3′, Yf3′) of the operating objects 9 a and 9 b on the touchpad 11 and the first set of ratios and generates virtual touch control elements 9 a′ and 9 b′ on the screen 4 accordingly. Referring to FIG. 6, if there are plural instruction items 23, the computation unit 2 defines a virtual frame 6 on the screen 4, wherein the virtual frame 6 corresponds in position to one of the instruction items 23. Then, based on the coordinates (Xf4, Yf4) and (Xf4′, Yf4′) of the operating objects 9 a and 9 b on the touchpad 11 and a third set of ratios, the computation unit 2 calculates the corresponding coordinates (Xc4, Yc4) and (Xc4′, Yc4′) of the operating objects 9 a and 9 b in the virtual frame 6 and generates the virtual touch control elements 9 a′ and 9 b′ in the virtual frame 6 accordingly. Thus, the user can operate the instruction item 23 by finger actions on the touchpad 11.
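  • The two multi-touch cases of FIGS. 5 and 6 can be sketched as a single decision, under the assumption that an instruction item is described by its on-screen rectangle and that the virtual frame 6 simply covers that rectangle (all names are illustrative):

```python
def map_multi_touch(points, pad_w, pad_h, screen_w, screen_h, instruction_items):
    """Sketch: map several touchpad contacts either to the whole screen (one
    instruction item, first set of ratios) or into a virtual frame placed over
    one instruction item (several items, third set of ratios)."""
    if len(instruction_items) <= 1:
        # FIG. 5: proportional mapping of the touchpad onto the whole screen.
        return [(screen_w / pad_w * x, screen_h / pad_h * y) for x, y in points]

    # FIG. 6: define a virtual frame over one instruction item; here the first
    # item and its rectangle (x, y, width, height) are used for illustration.
    item_x, item_y, item_w, item_h = instruction_items[0]
    return [(item_x + item_w / pad_w * x, item_y + item_h / pad_h * y)
            for x, y in points]
```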
  • FIG. 7 is a flowchart of the control method in the embodiment shown in FIGS. 3 to 6. In the first step S12, it is detected whether the touchpad 11 is touched by an operating object 9. If yes, the sensing elements generate the analog detection signal T1 according to the position of the operating object 9 on the touchpad 11. The control unit 12 converts the detection signal T1 into the coordinate information T2, which is related to the coordinates of the operating object 9 on the touchpad 11 and is sent by the control unit 12 to the computation unit 2. In step S14, the computation unit 2 determines according to the coordinate information T2 whether the number of the operating object(s) 9 is greater than one. If no, the computation unit 2 determines that there is only one operating object 9 on the touchpad 11, and the process goes on to step S16. In step S16, the computation unit 2 determines according to the coordinate information T2 whether the touch on the touchpad 11 by the operating object 9 starts in the first area 111 or the second area 112. If the touch by the operating object 9 starts in the first area 111, step S18 is executed. When necessary, the determination process in step S16 may be carried out by the control unit 12 instead, before step S18 is executed.
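The determination in step S16 amounts to checking whether the starting touch point falls in the peripheral first area 111 or the central second area 112. The following Python sketch illustrates one way such a bounds check could be written; the margin value and the function name are assumptions made for illustration and are not specified by the embodiment.

```python
# Minimal sketch of a step-S16 style determination: classify where a touch
# starts on the touchpad. The margin value and names are illustrative
# assumptions, not taken from the patent.

def classify_start_area(xf, yf, h_device, v_device, margin=0.1):
    """Return 'first' if (xf, yf) lies in the peripheral first area,
    'second' if it lies in the central second area.

    margin is the fraction of each dimension reserved for the peripheral
    area on every side (an assumed value)."""
    x_lo, x_hi = margin * h_device, (1 - margin) * h_device
    y_lo, y_hi = margin * v_device, (1 - margin) * v_device
    if x_lo <= xf <= x_hi and y_lo <= yf <= y_hi:
        return "second"   # central area 112
    return "first"        # peripheral area 111


if __name__ == "__main__":
    # A touch near the right edge of a 100 x 60 touchpad starts in the first area.
    print(classify_start_area(98, 30, 100, 60))   # -> 'first'
    print(classify_start_area(50, 30, 100, 60))   # -> 'second'
```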
  • Referring to FIG. 7 in conjunction with FIG. 3, step S18 involves computation by the computation unit 2 according to the coordinates (Xf1, Yf1) of the operating object 9 on the touchpad 11 and the first set of ratios, wherein the first set of ratios are the width ratio $H_{screen}/H_{device}$ and the height ratio $V_{screen}/V_{device}$ between the screen 4 and the touchpad 11. The corresponding coordinates (Xc1, Yc1) of the operating object 9 on the screen 4 are calculated from the coordinates (Xf1, Yf1) of the operating object 9 on the touchpad 11 by the computation unit 2 as

$$X_{c1} = \frac{H_{screen}}{H_{device}} \times X_{f1}, \qquad Y_{c1} = \frac{V_{screen}}{V_{device}} \times Y_{f1}. \tag{Eq-1}$$
  • Once the calculation of the above coordinate information T3 is completed, the computation unit 2 sends the coordinate information T3 to the operating system 3 and generates the virtual touch control element 9′ on the screen 4 accordingly. The virtual touch control element 9′ can be displayed on the screen 4 or hidden from view as needed. When the operating object 9 slides on the touchpad 11, the virtual touch control element 9′ changes its coordinate position on the screen 4 in response to the movement of the operating object 9 (see FIG. 3). This allows the user to control instructions by touching the touchpad 11 with the operating object 9 and by moving the operating object 9 on the touchpad 11.
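A minimal Python sketch of the Eq-1 mapping is given below. The touchpad and screen dimensions are placeholder values, and the function name is an assumption; the sketch only restates the width-ratio and height-ratio scaling described above.

```python
# Sketch of the absolute mapping of Eq-1: touchpad coordinates are scaled to
# screen coordinates by the width ratio H_screen/H_device and the height
# ratio V_screen/V_device. All names and dimensions are illustrative.

def map_touch_to_screen(xf1, yf1, h_device, v_device, h_screen, v_screen):
    xc1 = (h_screen / h_device) * xf1
    yc1 = (v_screen / v_device) * yf1
    return xc1, yc1


if __name__ == "__main__":
    # Example: a 100 x 60 touchpad mapped onto a 1920 x 1080 screen.
    print(map_touch_to_screen(50, 30, 100, 60, 1920, 1080))  # -> (960.0, 540.0)
```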
  • Referring to FIG. 7 in conjunction with FIG. 4, step S20 is executed if it is determined in step S16 that the touch by the operating object 9 starts in the second area 112. In step S20, the control window frame 5 is defined on the screen 4. The shape and area of the control window frame 5 can be the same as those of the touchpad 11, or can be scaled up or down from them. More specifically, the computation unit 2 performs computation based on the coordinates (Xf2, Yf2) of the operating object 9 on the touchpad 11 and the second set of ratios. The width-height ratio of the control window frame 5 is defined in advance, and the second set of ratios are the width ratio $H_{frame1}/H_{device}$ and the height ratio $V_{frame1}/V_{device}$ between the control window frame 5 and the touchpad 11. The corresponding coordinates (Xc2, Yc2) of the operating object 9 in the control window frame 5 are calculated from the coordinates (Xf2, Yf2) of the operating object 9 on the touchpad 11 by the computation unit 2 as

$$X_{c2} = \frac{H_{frame1}}{H_{device}} \times X_{f2}, \qquad Y_{c2} = \frac{V_{frame1}}{V_{device}} \times Y_{f2}. \tag{Eq-2}$$
  • Upon completing the calculation of the foregoing coordinate information T3, the computation unit 2 transmits the coordinate information T3 to the operating system 3 and, using the last position of the virtual touch control element 9′ or the cursor 9″ on the screen 4 as the reference coordinate position, maps the coordinates (Xc2, Yc2) in the control window frame 5 to the reference coordinate position, thereby defining the position of the control window frame 5 on the screen 4. As a result, the cursor 9″ is displayed at the coordinates (Xc2, Yc2) in the control window frame 5.
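The following sketch illustrates the Eq-2 mapping together with the anchoring of the control window frame 5 at the reference coordinate position (the last position of the virtual touch control element 9′ or cursor 9″). The frame dimensions and the choice to report the frame origin are illustrative assumptions.

```python
# Sketch of the Eq-2 style relative mapping: the touch coordinates are scaled
# into a control window frame, and the frame is positioned on the screen so
# that the mapped point coincides with the last cursor position (the
# reference coordinate position). Frame dimensions are illustrative.

def map_into_control_frame(xf2, yf2, h_device, v_device, h_frame, v_frame,
                           ref_x, ref_y):
    # Eq-2: scale touchpad coordinates into the control window frame.
    xc2 = (h_frame / h_device) * xf2
    yc2 = (v_frame / v_device) * yf2
    # Place the frame so that (xc2, yc2) inside it lands on the reference
    # position; the cursor therefore stays where it last appeared.
    frame_origin = (ref_x - xc2, ref_y - yc2)
    cursor_pos = (ref_x, ref_y)
    return frame_origin, cursor_pos


if __name__ == "__main__":
    print(map_into_control_frame(40, 20, 100, 60, 400, 240, ref_x=800, ref_y=500))
```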
  • The user can change the coordinate position of the cursor 9″ by moving the operating object 9. In addition, the control window frame 5 can be displayed on the screen 4 or hidden from view as desired. Displaying the control window frame 5 on the screen 4 allows the user to know the current position of the control window frame 5; however, as the user need not know the position of the control window frame 5 during operation, the user may choose to hide the control window frame 5 or display it in a flashing manner.
  • Referring to FIG. 7 in conjunction with FIG. 5, step S22 is carried out if it is determined in step S14 that there are multiple operating objects 9 a and 9 b on the touchpad 11. In step S22, the application software detection tool 21 determines the number of instruction items 23 of the currently running application software. The instruction items 23 in this embodiment are the graphical items shown in FIGS. 5 and 6. If there is only one instruction item 23, the process moves on to step S24. In step S24, the computation unit 2 calculates the corresponding coordinates (Xc3, Yc3) and (Xc3′, Yc3′) of the operating objects 9 a and 9 b on the screen 4 according to the coordinates (Xf3, Yf3) and (Xf3′, Yf3′) of the operating objects 9 a and 9 b on the touchpad 11 and the first set of ratios (see Eq-1). Thus, the virtual touch control elements 9 a′ and 9 b′ are generated on the screen 4, allowing the user to operate the single instruction item 23 of the application software intuitively via the multiple operating objects on the touchpad 11.
  • Referring to FIG. 7 in conjunction with FIG. 6, step S26 is executed if it is determined in step S22 that there are multiple instruction items 23. In step S26, the virtual frame 6 is defined on the screen 4, using the positions of the virtual touch control elements 9 a′ and 9 b′ on the screen 4 as the reference points. More particularly, the center point or one of the end points of the virtual frame 6 is mapped to the reference point such that the virtual frame 6 corresponds in position to one of the instruction items 23 of the application program. Preferably, the shape and area of the virtual frame 6 are the same as those of the touchpad 11, or the area of the virtual frame 6 is scaled up or down from the area of the touchpad 11 according to the area of the instruction item 23. Following that, the computation unit 2 performs computation based on the coordinates (Xf4, Yf4) and (Xf4′, Yf4′) of the operating objects 9 a and 9 b on the touchpad 11 and the third set of ratios, wherein the third set of ratios are the width ratio $H_{frame2}/H_{device}$ and the height ratio $V_{frame2}/V_{device}$ between the virtual frame 6 and the touchpad 11. The corresponding coordinates (Xc4, Yc4) of the operating object 9 a in the virtual frame 6 are calculated from the coordinates (Xf4, Yf4) of the operating object 9 a on the touchpad 11 by the computation unit 2 as

$$X_{c4} = \frac{H_{frame2}}{H_{device}} \times X_{f4}, \qquad Y_{c4} = \frac{V_{frame2}}{V_{device}} \times Y_{f4}. \tag{Eq-3}$$
  • A similar calculation is performed for the operating object 9 b. Upon completing the calculation of the above coordinate information T3, the computation unit 2 sends the coordinate information T3 to the operating system 3. Accordingly, the virtual touch control elements 9 a′ and 9 b′ are generated in the virtual frame 6, allowing the user to operate the one instruction item 23 of the application program intuitively by means of the multiple operating objects on the touchpad 11.
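Below is a sketch of the Eq-3 mapping applied to several operating objects at once. Anchoring the virtual frame 6 by its top-left corner is an assumed convention chosen for simplicity; as noted above, the embodiment maps the center point or one of the end points of the frame to the reference point.

```python
# Sketch of the Eq-3 style mapping for multiple operating objects: each touch
# point is scaled into a virtual frame anchored at one of the instruction
# items. The top-left-corner anchoring and all dimensions are assumptions.

def map_into_virtual_frame(touches, h_device, v_device, h_frame, v_frame,
                           frame_origin):
    ox, oy = frame_origin
    mapped = []
    for xf, yf in touches:
        xc = ox + (h_frame / h_device) * xf
        yc = oy + (v_frame / v_device) * yf
        mapped.append((xc, yc))
    return mapped


if __name__ == "__main__":
    # Two fingers on a 100 x 60 touchpad, virtual frame 200 x 120 at (600, 300).
    print(map_into_virtual_frame([(20, 10), (80, 50)], 100, 60, 200, 120, (600, 300)))
```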
  • Moreover, the virtual frame 6 can be displayed on the screen 4 or hidden from view as needed. Displaying the virtual frame 6 on the screen 4 allows the user to know the current position of the virtual frame 6, and yet it is not necessary for the user to know such information during operation. Hence, the user may choose to hide the virtual frame 6 or display it in a flashing manner.
  • In addition, if the operating object 9 leaves the touchpad 11 upon completion of step S18, S20, S24, or S26, the process returns to step S12 to detect whether the touchpad 11 is touched by an operating object 9.
  • Referring to FIGS. 8 to 10, the touch control device in the second embodiment of the present invention has substantially the same construction as its counterpart in the first embodiment except that the first area 111 of the touchpad 11 corresponds to an instruction window W (e.g., a toolbar in the Windows system) at the right edge of the screen 4 and is defined at the right edge of the touchpad 11. The instruction window W can disappear into the right edge of the screen 4. The touchpad 11 also defines a virtual instruction area 113. Preferably, the virtual instruction area 113 is defined in an area of the touchpad 11 that corresponds in position to the instruction window W, and at least two borders of the virtual instruction area 113 coincide with borders of the touchpad 11 respectively. In this embodiment, the first area 111 coincides with the virtual instruction area 113 in range. Furthermore, the computation unit 2 includes an event receiver 22 for receiving an event signal T4 from the operating system 3.
  • FIG. 9 schematically shows how the instruction window W is opened by moving an operating object. When the touchpad 11 is touched by the user's finger (hereinafter referred to as the operating object 9) in such a way that the touch starts in the first area 111 of the touchpad 11 and the operating object 9 moves from the first area 111 toward the second area 112, the virtual touch control element 9′ drags the instruction window W, which has disappeared into the right edge of the screen 4, toward the center of the screen 4 in response to the movement of the operating object 9. Consequently, the instruction window W is opened and displayed on the screen 4. The touchpad 11 at this moment defines the virtual instruction area 113 (see FIG. 10) according to the position of the instruction window W, so that the user can operate the instruction window W intuitively through the virtual instruction area 113.
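As an illustration only, the gesture that opens the instruction window W can be pictured as a swipe that starts in the first area 111 at the right edge of the touchpad 11 and moves inward. The sketch below makes that check explicit; the edge-strip width and the sampling format are assumptions, not details taken from the embodiment.

```python
# Simplified sketch of the edge-swipe that opens the instruction window W:
# the touch must start in the first area 111 (here assumed to be a strip at
# the right edge) and then move toward the second area 112. The strip width
# is an assumed value.

def is_edge_swipe_inward(track, h_device, edge_width=8):
    """track is a list of (x, y) samples ordered in time."""
    if len(track) < 2:
        return False
    x0, _ = track[0]
    started_in_first_area = x0 >= h_device - edge_width
    moved_inward = track[-1][0] < x0          # moving toward the center
    return started_in_first_area and moved_inward


if __name__ == "__main__":
    samples = [(98, 30), (90, 30), (75, 31)]   # right edge toward center
    print(is_edge_swipe_inward(samples, h_device=100))  # -> True
```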
  • FIG. 11 is a flowchart of the control method in the embodiment shown in FIGS. 9 and 10. Steps S12, S14, S16, and S18 in FIG. 11 are the same as those in the previous embodiment and involve determining whether the touch by the operating object 9 starts in the first area 111 and generating the virtual touch control element 9′ on the screen 4. Then, in step S28, the computation unit 2 determines whether the operating object 9 has moved from the first area 111 toward the second area 112; if yes, step S30 is performed. In addition, upon completing the determination process of step S28, the first area 111 and the second area 112 defined on the touchpad 11 are temporarily canceled to facilitate the execution of subsequent steps. In step S30, the instruction window W is opened according to the movement of the operating object 9. Meanwhile, the operating system 3 generates the event signal T4 according to the event instruction being executed and sends the event signal T4 to the event receiver 22. In the following step S32, the computation unit 2 defines the virtual instruction area 113 on the touchpad 11 according to the event signal T4 and performs computation based on the coordinates (Xf5, Yf5) of the operating object 9 on the touchpad 11 and a fourth set of ratios, wherein the fourth set of ratios are the width ratio $H_{window}/H_{area}$ and the height ratio $V_{window}/V_{area}$ between the instruction window W and the virtual instruction area 113. The corresponding coordinates (Xc5, Yc5) of the operating object 9 in the instruction window W are calculated from the coordinates (Xf5, Yf5) of the operating object 9 in the virtual instruction area 113 by the computation unit 2 as

$$X_{c5} = \frac{H_{window}}{H_{area}} \times X_{f5}, \qquad Y_{c5} = \frac{V_{window}}{V_{area}} \times Y_{f5}. \tag{Eq-4}$$
  • After the calculation of the foregoing coordinate information T3 is completed, the computation unit 2 sends the coordinate information T3 to the operating system 3. Hence, by operating the operating object 9 in the virtual instruction area 113, the user can execute instructions in the instruction window W or perform other actions.
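A short sketch of the Eq-4 mapping follows; the virtual instruction area and instruction window dimensions are placeholder values chosen for illustration.

```python
# Sketch of the Eq-4 style mapping: coordinates measured inside the virtual
# instruction area 113 are scaled into the instruction window W by the width
# ratio H_window/H_area and the height ratio V_window/V_area. Dimensions are
# illustrative placeholders.

def map_area_to_window(xf5, yf5, h_area, v_area, h_window, v_window):
    xc5 = (h_window / h_area) * xf5
    yc5 = (v_window / v_area) * yf5
    return xc5, yc5


if __name__ == "__main__":
    # A 20 x 60 virtual instruction area mapped onto a 300 x 900 instruction window.
    print(map_area_to_window(10, 30, 20, 60, 300, 900))  # -> (150.0, 450.0)
```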
  • In a different aspect, once the instruction window W is opened in step S30, the user's finger may leave the touchpad 11 and then touch the second area 112 of the touchpad 11. In that case, the computation unit 2 will cancel the virtual instruction area 113 on the touchpad 11 and generate the cursor 9″ on the screen 4 through the foregoing steps S12, S14, S16, and S20. The user may move the cursor 9″ to the instruction window W and click any instruction key in the instruction window W to execute the desired instruction or application program.
  • FIGS. 12 and 13 illustrate the control method in the third embodiment of the present invention. As the coordinate information T2 generated by the control unit 12 contains the position and time information of each point on the touchpad 11 touched by the operating object 9, the coordinate information T2 can be used to determine the various actions of the operating object 9 on the touchpad 11. In this embodiment, the touch control actions include double-clicking the touchpad 11. By "double-clicking", it is meant that the operating object 9 touches the touchpad 11, leaves the touchpad 11 within a first predetermined time Ta, and touches the touchpad 11 again within a second predetermined time Tb, wherein the distance Δd between the two touched points is less than a threshold value D. When it is determined that the touch control action performed by the operating object 9 is double-clicking the touchpad 11, the computation unit 2 calculates the corresponding coordinates (Xc5, Yc5) of the operating object 9 in the control window frame 5 according to the coordinates (Xf5, Yf5) of the operating object 9 on the touchpad 11 and the second set of ratios. Also, using the last position of the cursor 9″ on the screen 4 as the reference coordinate position, the computation unit 2 defines the control window frame 5 on the screen 4 so as to generate the virtual touch control element 9′ in the control window frame 5 accordingly. After double-clicking, the operating object 9 can be moved on the touchpad 11 so that the virtual touch control element 9′ generated by double-clicking drags an instruction item 23 on the screen 4 (see FIG. 13), switches window pages, or executes other instructions.
  • FIG. 14 is a flowchart of the control method in the embodiment shown in FIGS. 12 and 13. To begin with, it is detected in step S12 whether the touchpad 11 is touched by an operating object 9. If yes, it is determined in step S14 whether the number of operating objects 9 is greater than one. If there is only one operating object 9, step S151 is performed in which the time Δt1 for which the operating object 9 touches the touchpad 11 is counted. It is also determined whether the operating object 9 has left the touchpad 11. In step S152, it is determined by comparison whether Δt1 is less than the first predetermined time Ta. If Δt1 is less than the first predetermined time Ta, the time Δt2 for which the operating object 9 has left the touchpad 11 is counted in step S153. It is also determined whether the operating object 9 has touched the touchpad 11 again. If the operating object 9 has touched the touchpad 11 again, it is determined in step S154 by comparison whether Δt2 is less than the second predetermined time Tb. If Δt2 is less than the second predetermined time Tb, it is determined in step S155 whether the distance Δd between the aforesaid two touched points is less than the threshold value D. If Δd < D, it is determined that the touch control action performed by the operating object 9 is double-clicking, and the process goes on to step S156.
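The double-click test of steps S151 to S155 can be summarized as three threshold comparisons, as in the sketch below. The concrete values of Ta, Tb, and D are assumptions for illustration; the embodiment only requires that Δt1 < Ta, Δt2 < Tb, and Δd < D.

```python
# Sketch of the double-click test of steps S151-S155: the first touch must be
# released within Ta, the second touch must arrive within Tb of the release,
# and the two touch points must be closer than D. The threshold values used
# here are assumed for illustration only.

import math

def is_double_click(t_down1, t_up1, t_down2, p1, p2,
                    ta=0.25, tb=0.30, d=5.0):
    dt1 = t_up1 - t_down1            # duration of the first touch (S151/S152)
    dt2 = t_down2 - t_up1            # gap before the second touch (S153/S154)
    dd = math.dist(p1, p2)           # distance between the touch points (S155)
    return dt1 < ta and dt2 < tb and dd < d


if __name__ == "__main__":
    print(is_double_click(0.00, 0.10, 0.25, (40, 20), (41, 21)))  # -> True
    print(is_double_click(0.00, 0.40, 0.60, (40, 20), (41, 21)))  # -> False (held too long)
```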
  • In step S156, the computation unit 2 calculates the corresponding coordinates (Xc5, Yc5) of the operating object 9 in the control window frame 5 according to the second set of ratios as well as the coordinates (Xf5, Yf5) of the operating object 9's second touch on the touchpad 11. The second set of ratios have been described in the previous embodiments and therefore are not repeated here. Upon completing the calculation of the foregoing coordinate information T3, the computation unit 2 sends the coordinate information T3 to the operating system 3 and, using as the reference coordinate position the position of the virtual touch control element 9′ or cursor 9″ last appearing on the screen 4, maps the coordinates (Xc5, Yc5) in the control window frame 5 onto the reference coordinate position, thereby defining the position of the control window frame 5 on the screen 4. Then, the virtual touch control element 9′ is generated at the coordinates (Xc5, Yc5) in the control window frame 5.
  • After the virtual touch control element 9′ is generated, it is detected in step S158 whether the operating object 9 has displaced on the touchpad 11. If the operating object 9 has displaced on the touchpad 11, it is also determined whether the displacement Δm of the operating object 9 is greater than a preset value M. If Δm>M, the virtual touch control element 9′ executes the instruction of window page switching according to the displacement direction of the operating object 9.
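As a final illustration, the step-S158 check can be sketched as follows; the preset value M and the mapping of displacement direction to "next" or "previous" pages are assumptions made for the example.

```python
# Sketch of the step-S158 check: after the double-click, a displacement of the
# operating object larger than a preset value M triggers window page
# switching in the direction of the movement. M and the direction convention
# are assumptions for illustration.

def page_switch_direction(start, end, m=15.0):
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if (dx * dx + dy * dy) ** 0.5 <= m:
        return None                       # displacement too small, no switch
    return "next" if dx > 0 else "previous"


if __name__ == "__main__":
    print(page_switch_direction((40, 20), (80, 22)))  # -> 'next'
    print(page_switch_direction((40, 20), (42, 20)))  # -> None
```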

Claims (28)

What is claimed is:
1. A touch control device, comprising:
an input element having a touchpad and a control unit, the touchpad comprising a first area and a second area, the first area being a peripheral area of the touchpad, the control unit being connected to the touchpad and configured for detecting a touch on the touchpad by an operating object and a movement of the operating object on the touchpad; and
a computation unit connected to the control unit, wherein if the computation unit determines that the operating object has moved from the first area toward the second area, an instruction window is opened on a screen.
2. The touch control device of claim 1, wherein the touchpad generates a detection signal in response to the touch by the operating object and sends the detection signal to the control unit, and the control unit generates coordinates of the operating object on the touchpad according to the detection signal.
3. The touch control device of claim 2, wherein the control unit determines according to the detection signal whether the touch on the touchpad by the operating object starts in the first area or the second area.
4. The touch control device of claim 2, wherein the computation unit determines according to coordinates output from the control unit whether the touch on the touchpad by the operating object starts in the first area or the second area.
5. The touch control device of claim 1, wherein the second area is defined in a central area of the touchpad and surrounded by the first area.
6. The touch control device of claim 1, wherein the computation unit is connected to an operating system and is provided with an event receiver for receiving an event signal from the operating system.
7. A method for controlling a touch control device including a touchpad having a first area and a second area, the first area being a peripheral area of the touchpad, the method comprising steps of:
A.) detecting a touch on the touchpad by an operating object and a movement of the operating object on the touchpad;
B.) determining whether the operating object has moved from the first area toward the second area; and
C.) opening an instruction window on a screen according to the movement of the operating object, if the operating object has moved from the first area toward the second area.
8. The method of claim 7, wherein the step A comprises steps of:
generating a detection signal in response to the touch on the touchpad by the operating object; and
generating first coordinates according to the detection signal.
9. The method of claim 8, wherein the step A comprises a step of determining according to the detection signal whether the touch on the touchpad by the operating object starts in the first area or the second area.
10. The method of claim 8, wherein the step A comprises a step of determining whether the touch by the operating object starts in the first area or the second area according to coordinates provided by a control unit.
11. The method of claim 7, further comprising a step of calculating corresponding coordinates of the operating object on the screen according to coordinates of the operating object on the touchpad and a width ratio and a height ratio between the screen and the touchpad if the touch by the operating object starts in the first area.
12. The method of claim 7, wherein the second area is defined in a central area of the touchpad and surrounded by the first area.
13. The method of claim 7, further comprising a step of defining a virtual instruction area on the touchpad according to a position of the instruction window on the screen after the instruction window is opened.
14. The method of claim 13, wherein the virtual instruction area is defined in the peripheral area of the touchpad and has at least two borders coinciding respectively with borders of the touchpad.
15. The method of claim 13, further comprising a step of calculating corresponding coordinates of the operating object in the instruction window according to coordinates of the operating object in the virtual instruction area and a width ratio and a height ratio between the instruction window and the virtual instruction area.
16. The method of claim 7, further comprising a step of defining a third area corresponding to the instruction window on the touchpad after the instruction window is opened.
17. The method of claim 16, wherein the third area is defined in a peripheral area of the touchpad and has at least two borders coinciding respectively with borders of the touchpad.
18. The method of claim 16, further comprising a step of calculating corresponding coordinates of the operating object in the instruction window according to coordinates of the operating object in the third area and a width ratio and a height ratio between the instruction window and the third area.
19. A method for controlling a touch control device, comprising steps of:
A) determining the number of operating objects on a touchpad;
B) determining, if there is one and only one said operating object, whether the operating object has double-clicked the touchpad;
C) detecting, if the operating object has double-clicked the touchpad, whether the operating object has displaced on the touchpad; and
D) executing a window page switching instruction if the operating object has displaced on the touchpad.
20. The method of claim 19, further comprising a step of defining a control window frame on a screen and calculating corresponding first coordinates of the operating object in the control window frame according to coordinates of the operating object on the touchpad and a second set of ratios if the operating object has double-clicked the touchpad.
21. The method of claim 20, further comprising a step of generating a virtual touch control element at the first coordinates if the operating object has double-clicked the touchpad.
22. The method of claim 20, wherein the second set of ratios are a width ratio and a height ratio between the control window frame and the touchpad.
23. The method of claim 20, further comprising a step of using a position of a virtual touch control element or cursor last appearing on the screen as a reference coordinate position and mapping the first coordinates in the control window frame to the reference coordinate position so as to define a position of the control window frame on the screen.
24. A method for controlling a touch control device, comprising steps of:
determining the number of operating objects on a touchpad;
determining, if there is one and only one said operating object, whether the operating object has performed a touch control action on the touchpad; and
defining a control window frame on a screen and calculating, according to coordinates of the operating object on the touchpad and a second set of ratios, corresponding first coordinates of the operating object in the control window frame, if the operating object has performed the touch control action on the touchpad.
25. The method of claim 24, further comprising a step of generating a virtual touch control element at the first coordinates if the touch control action is double clicking.
26. The method of claim 24, wherein the second set of ratios are a width ratio and a height ratio between the control window frame and the touchpad.
27. The method of claim 25, further comprising a step of using a position of a said virtual touch control element or cursor last appearing on the screen as a reference coordinate position and mapping the first coordinates in the control window frame to the reference coordinate position so as to define a position of the control window frame on the screen.
28. The method of claim 25, further comprising a step of detecting whether the operating object has displaced on the touchpad after the virtual touch control element is generated, and executing a window page switching instruction according to the displacement by the virtual touch control element if a displacement of the operating object on the touchpad is detected.
US14/936,376 2011-11-11 2015-11-09 Touch control device and method Abandoned US20160062543A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/936,376 US20160062543A1 (en) 2011-11-11 2015-11-09 Touch control device and method

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201161558457P 2011-11-11 2011-11-11
TW101102526 2012-01-20
TW101102526A TWI451309B (en) 2011-11-11 2012-01-20 Touch device and its control method
US13/542,592 US9213482B2 (en) 2011-11-11 2012-07-05 Touch control device and method
US14/936,376 US20160062543A1 (en) 2011-11-11 2015-11-09 Touch control device and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/542,592 Division US9213482B2 (en) 2011-11-11 2012-07-05 Touch control device and method

Publications (1)

Publication Number Publication Date
US20160062543A1 true US20160062543A1 (en) 2016-03-03

Family

ID=48280112

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/542,592 Expired - Fee Related US9213482B2 (en) 2011-11-11 2012-07-05 Touch control device and method
US14/936,376 Abandoned US20160062543A1 (en) 2011-11-11 2015-11-09 Touch control device and method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/542,592 Expired - Fee Related US9213482B2 (en) 2011-11-11 2012-07-05 Touch control device and method

Country Status (1)

Country Link
US (2) US9213482B2 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI467467B (en) * 2012-10-29 2015-01-01 Pixart Imaging Inc Method and apparatus for controlling object movement on screen
KR102052960B1 (en) * 2012-11-23 2019-12-06 삼성전자주식회사 Input apparatus, display apparatus and control method thereof
EP2752758A3 (en) * 2013-01-07 2016-10-26 LG Electronics Inc. Image display device and controlling method thereof
US10002589B2 (en) * 2015-03-04 2018-06-19 Qualcomm Incorporated Retaining user selected screen area on user equipment
TWI588734B (en) 2015-05-26 2017-06-21 仁寶電腦工業股份有限公司 Electronic apparatus and method for operating electronic apparatus
US9781468B2 (en) 2015-08-25 2017-10-03 Echostar Technologies L.L.C. Dynamic scaling of touchpad/UI grid size relationship within a user interface
US9826187B2 (en) * 2015-08-25 2017-11-21 Echostar Technologies L.L.C. Combined absolute/relative touchpad navigation
FR3072803B1 (en) * 2017-10-19 2021-05-07 Immersion SYSTEM AND METHOD FOR THE SIMULTANEOUS MANAGEMENT OF A PLURALITY OF DESIGNATION DEVICES
US10969899B2 (en) 2019-07-19 2021-04-06 Samsung Electronics Co., Ltd. Dynamically adaptive sensing for remote hover touch

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5880411A (en) * 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US6898315B2 (en) * 1998-03-23 2005-05-24 Microsoft Corporation Feature extraction for real-time pattern recognition using single curve per pattern analysis
US6429846B2 (en) 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US6333753B1 (en) * 1998-09-14 2001-12-25 Microsoft Corporation Technique for implementing an on-demand display widget through controlled fading initiated by user contact with a touch sensitive input device
ATE443946T1 (en) * 1999-05-27 2009-10-15 Tegic Communications Inc KEYBOARD SYSTEM WITH AUTOMATIC CORRECTION
US6421067B1 (en) * 2000-01-16 2002-07-16 Isurftv Electronic programming guide
US7873972B2 (en) * 2001-06-01 2011-01-18 Jlb Ventures Llc Method and apparatus for generating a mosaic style electronic program guide
JP2003344086A (en) * 2002-05-28 2003-12-03 Pioneer Electronic Corp Touch panel device and display input device for car
US7358963B2 (en) * 2002-09-09 2008-04-15 Apple Inc. Mouse having an optically-based scrolling feature
JP2005049994A (en) * 2003-07-30 2005-02-24 Canon Inc Method for controlling cursor
US8381135B2 (en) * 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US8407347B2 (en) * 2004-11-19 2013-03-26 Xiao Qian Zhang Method of operating multiple input and output devices through a single computer
US9007299B2 (en) * 2006-07-14 2015-04-14 Ailive Inc. Motion control used as controlling device
US8169404B1 (en) * 2006-08-15 2012-05-01 Navisense Method and device for planary sensory detection
US20080106523A1 (en) * 2006-11-07 2008-05-08 Conrad Richard H Ergonomic lift-clicking method and apparatus for actuating home switches on computer input devices
US9858712B2 (en) * 2007-04-09 2018-01-02 Sam Stathis System and method capable of navigating and/or mapping any multi-dimensional space
US8031175B2 (en) * 2008-04-21 2011-10-04 Panasonic Corporation Touch sensitive remote control system that detects hand size characteristics of user and adapts mapping to screen display
US8386918B2 (en) * 2007-12-06 2013-02-26 International Business Machines Corporation Rendering of real world objects and interactions into a virtual universe
TW200928905A (en) * 2007-12-26 2009-07-01 E Lead Electronic Co Ltd A method for controlling touch pad cursor
US20100039404A1 (en) * 2008-08-18 2010-02-18 Sentelic Corporation Integrated input system
US20100094496A1 (en) * 2008-09-19 2010-04-15 Barak Hershkovitz System and Method for Operating an Electric Vehicle
US8286095B2 (en) * 2009-01-15 2012-10-09 Research In Motion Limited Multidimensional volume and vibration controls for a handheld electronic device
EP2228711A3 (en) * 2009-03-12 2014-06-04 Lg Electronics Inc. Mobile terminal and method for providing user interface thereof
EP2237140B1 (en) * 2009-03-31 2018-12-26 Lg Electronics Inc. Mobile terminal and controlling method thereof
US20110009813A1 (en) * 2009-07-09 2011-01-13 Medtronic Minimed, Inc. Panning a display of a portable medical device
TW201104529A (en) * 2009-07-22 2011-02-01 Elan Microelectronics Corp Touch device, control method and control unit for multi-touch environment
US9232167B2 (en) 2009-08-04 2016-01-05 Echostar Technologies L.L.C. Video system and remote control with touch interface for supplemental content display
US8819591B2 (en) * 2009-10-30 2014-08-26 Accuray Incorporated Treatment planning in a virtual environment
ES2370067B1 (en) * 2009-12-01 2012-10-30 Linguaversal, S.L SYSTEM TO CONTROL DISTANCE COMPUTERIZED SYSTEMS
WO2011096166A1 (en) * 2010-02-03 2011-08-11 パナソニック株式会社 Display control device, display control method, and touchpad input system
US9922622B2 (en) * 2010-02-26 2018-03-20 Synaptics Incorporated Shifting carrier frequency to avoid interference
US9262073B2 (en) * 2010-05-20 2016-02-16 John W. Howard Touch screen with virtual joystick and methods for use therewith
US9001208B2 (en) * 2011-06-17 2015-04-07 Primax Electronics Ltd. Imaging sensor based multi-dimensional remote controller with multiple input mode
US9594504B2 (en) * 2011-11-08 2017-03-14 Microsoft Technology Licensing, Llc User interface indirect interaction
US20130127738A1 (en) * 2011-11-23 2013-05-23 Microsoft Corporation Dynamic scaling of touch sensor
JP5413448B2 (en) * 2011-12-23 2014-02-12 株式会社デンソー Display system, display device, and operation device
US9423895B2 (en) * 2012-05-31 2016-08-23 Intel Corporation Dual touch surface multiple function input device

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5644737A (en) * 1995-06-06 1997-07-01 Microsoft Corporation Method and system for stacking toolbars in a computer display
US20060026535A1 (en) * 2004-07-30 2006-02-02 Apple Computer Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20100223564A1 (en) * 2005-01-13 2010-09-02 Ray Hsu Automatically Merging Graphical Programs
US20080108456A1 (en) * 2006-11-02 2008-05-08 Bonito Anthony P Golf scoring, marketing and reporting system and method of operation
US20120233690A1 (en) * 2007-05-11 2012-09-13 Rpo Pty Limited User-Defined Enablement Protocol
US20080301046A1 (en) * 2007-08-10 2008-12-04 Christian John Martinez Methods and systems for making a payment and/or a donation via a network, such as the Internet, using a drag and drop user interface
US20130328775A1 (en) * 2008-10-24 2013-12-12 Microsoft Corporation User Interface Elements Positioned for Display
US20100295789A1 (en) * 2009-05-19 2010-11-25 Samsung Electronics Co., Ltd. Mobile device and method for editing pages used for a home screen
US20140129990A1 (en) * 2010-10-01 2014-05-08 Smart Technologies Ulc Interactive input system having a 3d input space
US20120169623A1 (en) * 2011-01-05 2012-07-05 Tovi Grossman Multi-Touch Integrated Desktop Environment
US20120206471A1 (en) * 2011-02-11 2012-08-16 Apple Inc. Systems, methods, and computer-readable media for managing layers of graphical object data

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10037091B2 (en) 2014-11-19 2018-07-31 Honda Motor Co., Ltd. System and method for providing absolute coordinate and zone mapping between a touchpad and a display screen
US10496194B2 (en) 2014-11-19 2019-12-03 Honda Motor Co., Ltd. System and method for providing absolute coordinate and zone mapping between a touchpad and a display screen
US11307756B2 (en) 2014-11-19 2022-04-19 Honda Motor Co., Ltd. System and method for presenting moving graphic animations in inactive and active states
CN108062199A (en) * 2017-12-15 2018-05-22 广东欧珀移动通信有限公司 Touch processing method, device, storage medium and the terminal of information

Also Published As

Publication number Publication date
US20130120286A1 (en) 2013-05-16
US9213482B2 (en) 2015-12-15

Similar Documents

Publication Publication Date Title
US20160062543A1 (en) Touch control device and method
TWI451309B (en) Touch device and its control method
US10402042B2 (en) Force vector cursor control
TWI413922B (en) Control method for touchpad and touch device using the same
US9104308B2 (en) Multi-touch finger registration and its applications
US20110018828A1 (en) Touch device, control method and control unit for multi-touch environment
US7924271B2 (en) Detecting gestures on multi-event sensitive devices
US8614664B2 (en) Multi-touch multi-dimensional mouse
US20110060986A1 (en) Method for Controlling the Display of a Touch Screen, User Interface of the Touch Screen, and an Electronic Device using The Same
US20140002398A1 (en) Controlling a cursor on a touch screen
JP2011028524A (en) Information processing apparatus, program and pointing method
WO2010144726A1 (en) User interface methods providing continuous zoom functionality
JP7233109B2 (en) Touch-sensitive surface-display input method, electronic device, input control method and system with tactile-visual technology
TW201512940A (en) Multi-region touchpad
US20120297336A1 (en) Computer system with touch screen and associated window resizing method
JP3850570B2 (en) Touchpad and scroll control method using touchpad
JP5388246B1 (en) INPUT DISPLAY CONTROL DEVICE, THIN CLIENT SYSTEM, INPUT DISPLAY CONTROL METHOD, AND PROGRAM
JP5275429B2 (en) Information processing apparatus, program, and pointing method
TWI497357B (en) Multi-touch pad control method
JP5882973B2 (en) Information processing apparatus, method, and program
KR20140067861A (en) Method and apparatus for sliding objects across touch-screen display
TWI439922B (en) Handheld electronic apparatus and control method thereof
KR20150111651A (en) Control method of favorites mode and device including touch screen performing the same
CN104484117B (en) Man-machine interaction method and device
US20140085197A1 (en) Control and visualization for multi touch connected devices

Legal Events

Date Code Title Description
AS Assignment

Owner name: ELAN MICROELECTRONICS CORPORATION, TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MAI, WEI-KUO;HUANG, SHIH PENG;LIOU, CHUNG-JUNG;REEL/FRAME:036996/0201

Effective date: 20151107

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION