US20040036680A1 - User-interface features for computers with contact-sensitive displays


Info

Publication number
US20040036680A1
US20040036680A1 (application US10/452,233)
Authority
US
United States
Prior art keywords
display
input area
active input
user
portable computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/452,233
Inventor
Mark Davis
Carlo Bernoulli
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Palm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Palm Inc filed Critical Palm Inc
Priority to US10/452,233 (US20040036680A1)
Assigned to PALM, INC. reassignment PALM, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BERNOULLI, CARLO, DAVIS, MARK
Priority to CA002496774A (CA2496774A1)
Priority to AU2003262921A (AU2003262921A1)
Priority to EP03793432A (EP1558985A2)
Priority to PCT/US2003/026869 (WO2004019200A2)
Publication of US20040036680A1
Assigned to JP MORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT reassignment JP MORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT SECURITY AGREEMENT Assignors: PALM, INC.
Assigned to PALM, INC. reassignment PALM, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT
Assigned to HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. reassignment HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PALM, INC.
Assigned to PALM, INC. reassignment PALM, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.
Assigned to HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. reassignment HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PALM, INC.
Assigned to PALM, INC. reassignment PALM, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.
Assigned to HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. reassignment HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: PALM, INC.
Assigned to QUALCOMM INCORPORATED reassignment QUALCOMM INCORPORATED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HEWLETT-PACKARD COMPANY, HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., PALM, INC.

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/161 Indexing scheme relating to constructional details of the monitor
    • G06F2200/1614 Image rotation following screen orientation, e.g. switching from landscape to portrait mode

Definitions

  • the present invention relates to user-interfaces for computers.
  • the present invention relates to user-interface features for computers with contact-sensitive displays.
  • Personal digital assistants (PDAs) are one such class of computer.
  • a PDA is small in size, usually suited to be held by a user in one hand and operated with the other hand.
  • the display of the PDA is used to provide additional input functionality in lieu of a large keyboard, a mouse or other input mechanism that is incompatible with the size and portability of the PDA.
  • PDAs often provide an active input area on the display, which is a designated region on the display where most of the user-contact and input is entered.
  • One type of active input area used in PALM OS and POCKET PC devices provides for a handwriting recognition area to appear on the display. The user can form strokes on the region of the display where the handwriting recognition area is provided, and technology such as that provided by GRAFFITI or JOT is used to recognize the strokes as characters.
  • the handwriting recognition area is often the focus of the user's attention
  • other input functionality is usually provided in conjunction with or next to the handwriting recognition area.
  • This other input functionality is often in the form of icons and task bars that can be selected in order to cause the PDA to perform some function.
  • electronic keyboards can be substituted on the display in place of the handwriting recognition area.
  • TABLET PCs have become popular. Such devices also utilize an immediate handwriting recognition square for recognizing contact strokes provided on a display as characters.
  • Embodiments of the invention provide for a configurable user-interface for a computer.
  • Embodiments of the invention may apply to a handheld computer, such as a PDA, having an active input area, where handwriting recognition or digital keyboards may be displayed.
  • input features such as icons provided with the active input area may be substituted in exchange for other input features.
  • a display of the handheld computer may be provided in a portrait mode, with a left or right handed orientation.
  • the placement and orientation of the active input area in relation to other portions of the display is considered in order to facilitate users who are either left or right handed.
  • Other embodiments provide a feedback feature that echoes back to the user a particular character that was just entered through a handwriting recognition scheme.
  • the particular character that is echoed back may be a glyph (e.g. a character before it is displayed as an alphabet or Roman numeral character) that the handheld computer determines matches a handwriting stroke of the user.
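The echo-back feedback described above can be sketched as a small callback-driven loop. This is an illustrative sketch only: the stroke-to-glyph mapping, the function names, and the callback style are assumptions, not details taken from the patent.

```python
# Hypothetical sketch of the echo-back feature: once a stroke is matched
# to a glyph, the glyph is shown (echoed) to the user before or alongside
# the final rendered character. The recognizer table is a stand-in.
STROKE_TO_GLYPH = {"vertical_bar": "i", "circle": "o", "cross": "t"}

def recognize_and_echo(stroke, echo):
    """Recognize a stroke and echo the matched glyph back via a callback."""
    glyph = STROKE_TO_GLYPH.get(stroke)
    if glyph is not None:
        echo(glyph)  # feedback: tell the user what was matched
    return glyph

shown = []
recognize_and_echo("circle", shown.append)
```

In a real device the `echo` callback would draw the glyph on the display; here a list stands in for the display so the behavior can be observed.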
  • another embodiment provides for a configurable handwriting recognition area for an active input area.
  • the handwriting recognition area portion of the active input area may be configurable in terms of the number of cells provided, the shape of each cell, the functionality provided by each cell (e.g. what kind of characters are to be recognized in a particular cell) and the dimensions of each cell in both the lengthwise and widthwise directions.
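The configurable cell scheme above (number of cells, per-cell functionality, and per-cell dimensions) can be modeled as simple data. The class and field names below are illustrative assumptions rather than anything specified by the patent.

```python
from dataclasses import dataclass

@dataclass
class RecognitionCell:
    """One cell of a handwriting recognition area."""
    kind: str      # what the cell recognizes, e.g. "alpha", "caps", "numeric"
    width: int     # lengthwise (X) extent, in pixels (assumed units)
    height: int    # widthwise (Y) extent, in pixels

def total_width(cells):
    """Overall width of a recognition area laid out left to right."""
    return sum(c.width for c in cells)

# A hypothetical "triple-cell" layout: lowercase letters, capitals, numbers,
# with the character cell sized larger than the others.
triple_cell = [
    RecognitionCell("alpha", 60, 40),
    RecognitionCell("caps", 40, 40),
    RecognitionCell("numeric", 40, 40),
]
```

Reconfiguring the area then amounts to replacing or resizing entries in the list, which matches the patent's notion of adjusting the number, shape, functionality and dimensions of the cells.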
  • FIG. 1 is a simplified frontal view of a handheld computer with a configurable active input area, under an embodiment of the invention.
  • FIGS. 2A-2D illustrate screen shots of a configurable active input area, under one or more embodiments of the invention.
  • FIG. 3 describes a method for replacing elements of an active input area with other elements.
  • FIGS. 4A-4C illustrate screen shots of an icon in an active input area being replaced by another icon.
  • FIGS. 5A-5B illustrate screen shots of a handwriting recognition aid, under an embodiment of the invention.
  • FIGS. 6A-6C are simplified frontal views of a handheld computer that has user-interface features which can be positioned to facilitate landscape modes with handedness orientation.
  • FIGS. 7A-7D illustrate screen shots of a display of a handheld computer where different active input areas are displayed in left and right handedness orientations.
  • FIG. 8 is a block diagram that illustrates a portable computer upon which an embodiment of the invention may be implemented.
  • Embodiments of the invention provide a set of configurable user-interface features for computers that have contact-sensitive displays.
  • numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
  • Embodiments described herein provide for a portable computer with a contact-sensitive display having a user-interface that is configurable through user-contact with a display surface.
  • an active input area is provided that is configurable in appearance and functionality.
  • the configurable nature of the active input area allows for a flexible user-interface that can accommodate, amongst other considerations, left and right handedness, special business applications, and user-preferences.
  • embodiments of the invention are described in the context of handheld computers, such as PDAs and smart cell phones, which use contact-sensitive displays.
  • Handheld computers in particular, illustrate the problem of maximizing user-interface functionality and preferences on a device with a relatively small profile.
  • Embodiments of the invention may also be employed with other types of computers that have contact-sensitive displays, such as on tablet computers, laptops and other portable computers.
  • a user-interface can be configured on a computer with a contact-sensitive display.
  • a set of features that are selectable through contact with the display of the computer may be provided on a designated region of the computer's display. When selected, the features cause the computer to perform some function associated with that feature.
  • a tap event, corresponding to an object making a specific form of contact with the display, may be entered by the user to initiate a substitution of one feature for another feature in the designated region.
  • a list of alternative features is provided to the user. A selection of one of the alternative features is detected when the user once again makes contact with the display. Then the selected alternative feature is provided on the display instead of the feature that was associated with the tap event.
  • a portable computer includes a housing, a contact-sensitive display and a processor.
  • the processor is configured to provide an active input area on the display.
  • the active input area includes functionality where the processor recognizes strokes entered on the display as characters.
  • the portable computer may be oriented in a portrait mode, where the active input area extends primarily in a left-right direction from a perspective of a user that operates the portable computer.
  • the portable computer may also be oriented in a landscape mode, where the active input area extends primarily in a top-bottom direction from the perspective of the user.
  • the processor is configured to provide a handedness orientation for the active input area with respect to the display and other features of the handheld computer 100 .
  • an active input area refers to a graphic, contact-sensitive input mechanism provided on a display surface of a computer.
  • the active input area provides functionality that is oriented for making the active input area the primary focus of the user when the user is interacting with the computer.
  • the active input area may provide a handwriting recognition area, keypad, and/or a keyboard that enables a large number of possible user contacts to be entered and uniquely interpreted from one designated region of the display.
  • an active input area may include a display region designated for recognizing certain user-contacts as character input, including alphabet and numeric characters.
  • the active input area may also be used to receive commands from the user for performing functions such as launching applications.
  • an active input area may differ from other user-interfaces of a computer (such as mechanical features like keyboard and buttons) in that it is provided on a contact-sensitive display, and it can be used to receive a large number of unique user-inputs that can subsequently be interpreted.
  • FIG. 1 illustrates a handheld computer 100 with a configurable active input area 110 , under an embodiment of the invention.
  • handheld computer 100 includes a housing 120 having a front panel 122 .
  • a display surface 124 is exposed on the front panel 122 .
  • the display surface 124 may be part of a display assembly having a digitizer or other construction in which contact between an object and the display surface is detected and recorded.
  • the housing 120 may also provide a plurality of buttons 130 , or other actuatable mechanisms. The buttons 130 can be individually actuated to cause handheld computer 100 to perform some function such as launch a program.
  • An active input area 110 is provided on display surface 124 .
  • active input area 110 is purely digital, and can be selected to appear on display surface 124 , rather than be a permanent aspect of the display surface 124 .
  • the active input area 110 includes a handwriting recognition area 112 .
  • a user may initiate contact with an object in the form of a gesture or stroke on handwriting recognition area 112 , and the processing resources of handheld computer 100 interpret that stroke as a character or function.
  • the handwriting recognition area 112 may be immediate in that a single stroke may be recognized as a character after that stroke is completed.
  • a recognized character of an immediately recognized stroke may be outputted on display surface 124 prior to another stroke being entered.
  • the handwriting recognition area 112 itself may be separated into two or more cells.
  • a first cell 112 A recognizes strokes as alphabetical characters
  • a second cell 112 B recognizes strokes as numbers. Additional cells may be provided as needed.
  • embodiments described below provide for a “triple-cell” configuration, where one cell of handwriting recognition area 112 is for recognizing strokes as capital letters.
  • a third or additional cell may be for recognizing strokes as functions.
  • the active input area 110 also includes a plurality of active icons 115 , which are placed adjacent to the handwriting recognition area 112 .
  • active icon means an icon that has some functionality associated with it. An active icon can be selected to perform its associated function. Accordingly, active icons 115 are each individually selectable to cause the handheld computer 100 to perform a function that corresponds to that icon. Unless stated otherwise, reference to icons in this application is intended to mean “active icons”. In one embodiment, a set of four icons 115 is provided around handwriting recognition area 112 , although more or fewer icons may be provided as part of active input area 110 as needed or desired.
  • one characteristic of the active input area is that it contains multiple user-interface features of different types.
  • Another characteristic of an active input area is that even though it is formed from multiple elements with different functionality, the active input area appears as a unit.
  • when active input area 110 is selected to appear, all of the elements designated to be part of the active input area at that particular moment appear with it. With respect to FIG. 1, this would mean that all of the active icons 115 and the handwriting recognition area 112 appear as the components of the active input area 110.
  • these elements appear in the same configuration each time the active input area 110 is displayed. For example, each active icon 115 may occupy the same position relative to handwriting recognition area 112 each time active input area 110 is called on the display surface 124 .
  • active input area 110 may be minimized into a task bar or other graphic feature that appears on the display.
  • the active input area 110 may be made to appear on display surface 124 at any time through one or more taps with the display surface 124 .
  • an area of display surface 124 can be maximized for providing content by minimizing active input area 110 , thus facilitating use of handheld computer 100 as, for example, an electronic book reader.
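The minimize/restore behavior can be sketched as a toggle over the display's height budget. The class name, pixel values and the flat height arithmetic are assumptions made for illustration.

```python
DISPLAY_HEIGHT = 320  # hypothetical display height, in pixels
AREA_HEIGHT = 60      # hypothetical active input area height, in pixels

class ActiveInputArea:
    """Toggle the active input area between displayed and minimized states."""

    def __init__(self):
        self.minimized = False

    def toggle(self):
        """Minimize into the task bar, or restore onto the display."""
        self.minimized = not self.minimized

    def content_height(self):
        # When minimized, the full display is available for content,
        # e.g. for use as an electronic book reader.
        if self.minimized:
            return DISPLAY_HEIGHT
        return DISPLAY_HEIGHT - AREA_HEIGHT

area = ActiveInputArea()
area.toggle()  # user minimizes the active input area
```

A tap on the task-bar icon would call `toggle` again to restore the area, per the behavior described for FIG. 4A.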
  • FIGS. 2A-2D provide screen shots of display surface 124 to illustrate how the appearance of active input area 110 may be altered or otherwise changed as needed or selected by a user of handheld computer 100.
  • FIG. 2A illustrates an embodiment where a user selects to provide active input area 110 with a triple-cell configuration.
  • the active input area 110 may also include active icons 115 .
  • a first cell 112 A (usually on the far left) interprets gestures made on that part of the display surface 124 as small cap characters.
  • a second cell 112 B (usually on the far right) interprets gestures made on that part of the display surface 124 as numbers.
  • a third cell 112 C which may appear in the middle, interprets gestures made on that part of the display surface 124 as capitalized letters.
  • Such a configuration may be set as a preference of the user.
  • FIG. 2B illustrates another screen shot of how active input area 110 can be made to appear on the display surface 124 .
  • active icons 115 are removed from active input area 110 . Rather, all of active input area 110 is made into handwriting recognition area 112 .
  • cells 112 A (for interpreting strokes as characters) and 112 B (for interpreting strokes as numbers) are re-sized to be larger widthwise (along the axis X) than the configuration illustrated in FIG. 2A.
  • the dimensions of the two cells 112 A, 112 B are non-symmetrical, in that cell 112 A for characters is larger than cell 112 B for numbers.
  • a configuration such as shown in FIG. 2B may be designated as a user-preference because the user is more likely to use the character entry cell than the numeric entry cell.
  • FIG. 2C illustrates a configuration where active input area 110 is formed entirely of handwriting recognition area 112 , and further that handwriting recognition area 112 has an enlarged height (along the axis Y).
  • a triple cell configuration is also shown, in that a third cell 112 C is also provided for recognizing capital letters.
  • FIG. 2D illustrates a reverse configuration for active input area 110 , where handwriting recognition area 112 is made smaller in height (along axis Y), but not minimized.
  • Such an embodiment provides more room on display surface 124 for providing content, while providing some space for a user to enter strokes onto handwriting recognition area 112 .
  • active input area 110 is adjustable between various configurations, including configurations shown by FIGS. 2A-2D, through user-input with the display surface 124.
  • boundary lines 212 and 214 may be provided to delineate the active input area 110 from the remaining portion of the display surface 124 .
  • the boundary line 212 may correspond to a height of the active input area 110 from an edge 222 of the display surface.
  • the boundary line 214 may correspond to a marker delineating the cells 112 A, 112 B of the handwriting recognition area 112 .
  • one embodiment enables the user to select boundary line 212 to move it either upward or downward relative to bottom edge 222 , to yield configurations shown by FIGS.
  • the boundary 214 may be selected and moved to the left or right, such as shown by FIG. 2B.
  • the selection of boundary lines 212 , 214 may be done through contact with the display surface 124 , or through some other means such as menu selection.
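Moving boundary line 212 up or down (or boundary 214 left or right) amounts to a clamped resize of the corresponding dimension. The following sketch assumes pixel units and particular clamp limits; none of these values come from the patent.

```python
def drag_boundary(current, delta, lo, hi):
    """Move a boundary line by `delta` pixels, clamped to [lo, hi].

    `current` might be the height of the active input area above the
    bottom edge 222 (boundary line 212), or the X position of the
    divider between cells 112A and 112B (boundary line 214).
    """
    return max(lo, min(hi, current + delta))

# Enlarge the handwriting area's height, as in the FIG. 2C configuration,
# with an assumed maximum of 120 px and minimum of 20 px.
new_height = drag_boundary(60, +40, lo=20, hi=120)
```

Clamping keeps the area from collapsing to nothing or covering the whole display; a real implementation would redraw the cells after each drag.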
  • FIGS. 2A-2D illustrate preferences that may be selected by the user.
  • the user's selection may be based on factors such as whether display surface 124 is to be used primarily for displaying content, or whether character recognition is to be enhanced.
  • Embodiments of the invention provide for elements of active input area 110 to be selected and replaced by other elements as the need arises. As described by FIGS. 3 and 4A-4C, the selection and replacement of elements of active input area 110 may be done at the user level.
  • a manufacturer may provide the handheld computer 100 with a particular (or default) configuration for active input area 110 . Subsequently, vendors or original equipment manufacturers may alter the configuration of the handheld computer 100 from its original manufacturing in order to suit a particular need.
  • active input area 110 may be configured to include elements (such as icons) for a particular internal business application of a company. In one use, an entity such as the company may alter the configurations of the active input area 110 one time, and disable the ability of the end user to subsequently reconfigure the active input area.
  • a more general application for an embodiment of the invention is to enable the end user to configure and reconfigure active input area 110 as the user desires.
  • the active icons 115 that form part of active input area 110 can be selected and configured by a user of handheld computer 100 .
  • the user may, for example, switch the icons that appear in the active input area 110 , alter the relative positions of such icons, and/or reduce, eliminate or increase the number of icons that appear as part of active input area 110 .
  • an embodiment provides that the active input area 110 appears only with the designated selection of icons, at least until that selection is altered or replaced once again.
  • FIG. 3 illustrates a method for substituting out one of the active icons 115 that appear in active input area 110 for another icon that is selected by the user.
  • Step 310 provides that the active input area is displayed with a designated set of active icons 115 .
  • the active icons 115 of active input area 110 may be displayed with a specific orientation, position, appearance and functionality.
  • a tap event is detected that is associated with one of the icons that appears in the active input area 110 .
  • the location of where the tap event occurs is what associates the tap event with a particular icon of active input area 110 .
  • in step 325, a determination is made as to whether the detected tap event qualifies as a tap event for substituting out one of the active icons 115 (or some other feature of active input area 110) for an alternative icon.
  • the determination may be based on whether the tap event qualifies based on some pre-determined criteria. This determination may distinguish such a tap event from other taps and tap events which are not for substituting out icons from active input area 110 .
  • the tap event is a “tap and hold” where an object such as a stylus is tapped to the display surface 124 and held in position for a designated duration.
  • the duration in which the object making contact with the display is continuously held in contact with the display may form the criteria as to whether the tap event qualifies.
  • the position where the tap and hold occurs may also be part of the criteria for qualifying the tap event. For example, in order to select a particular icon for replacement, the tap event may be required to occur over a particular active icon 115 , and last a designated duration so that it is identified as a tap event to substitute out the particular icon. Should the tap occur elsewhere, or not for the designated duration, then the tap event would not be recognized as a tap event to substitute out that particular icon.
  • tap events include a “tap and drag” event, where the object is tapped to one place on display surface 124 , then dragged continuously to another place on the display surface.
  • the criteria for qualifying the tap event may be that the first icon is tapped, then the object is continuously dragged across the display to another designated location.
  • a tap event is a double-tap or even a triple-tap.
  • a series of three taps within a relatively small duration of time that occurs over one of the icons 115 may be designated to qualify as a request to substitute out the selected icon.
  • Other examples and scenarios are possible.
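The qualifying criteria described above (tap-and-hold duration, tap-and-drag movement, double or triple taps) can be summarized as a small classifier over the properties of a contact sequence. The thresholds and labels are illustrative assumptions only.

```python
def classify_tap_event(duration_ms, moved_px, tap_count,
                       hold_ms=800, drag_px=20, multi=2):
    """Classify a contact sequence into a tap-event type.

    All thresholds (hold_ms, drag_px, multi) are assumed values for
    illustration; an actual device would tune these by design choice.
    """
    if tap_count >= multi:
        return "multi-tap"       # e.g. double- or triple-tap on an icon
    if moved_px >= drag_px:
        return "tap-and-drag"    # tapped, then dragged to another location
    if duration_ms >= hold_ms:
        return "tap-and-hold"    # held in position for a designated duration
    return "tap"                 # ordinary tap: perform the icon's function

event = classify_tap_event(duration_ms=900, moved_px=2, tap_count=1)
```

Only events other than an ordinary `"tap"` would qualify, in step 325, as requests to substitute out an icon; an ordinary tap still performs the icon's assigned function.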
  • if the determination in step 325 is that the tap event was not a request to reconfigure the selection of any of the icons 115 in the active input area 110, then step 330 provides that the tap event is ignored.
  • Step 340 provides that a list of the alternative icons is displayed in response to a determination that the tap event was to substitute out one of the active icons.
  • the alternative icons may correspond to icons that are not presented in the active input area 110 , but that are available in that they are each associated with a distinct functionality by the handheld computer 100 . Thus, the selection of any icon provided in the displayed list would cause handheld computer 100 to perform some function associated with that icon.
  • the list may display representations of the available alternative active icons. These representations may correspond to iconic expressions, such as insignias, trademarks, and other graphic associations to the underlying application or functionality.
  • in step 345, a determination is made as to whether the user made another selection for another icon to replace the first icon. In one embodiment, this selection may be made by the user tapping a representation of the second icon from the list provided in step 340. If the determination is that no selection was made from the list, then step 350 provides that the list is displayed until the user taps somewhere else on the display surface 124, or otherwise initiates some action to indicate that the list should be closed. For example, the user may launch another application with one of the buttons 130, or shut handheld computer 100 off.
  • step 360 provides that the icon selected for substitution is replaced with the icon selected from the list. Until further alterations, this new icon will appear as part of the active input area 110 each time the active input area is selected to appear. In addition, the next time the list is displayed, a representation of the icon that was substituted out may be provided in the list, so that this icon may be re-selected at a later time as one of the elements of the active input area 110 .
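The substitution of step 360, together with the book-keeping that returns the replaced icon to the alternatives list, can be sketched as a swap between two lists. The function name and the icon labels are hypothetical.

```python
def substitute_icon(active_icons, alternatives, old, new):
    """Replace `old` in the active input area with `new` from the list.

    The replaced icon is appended to the alternatives so that it can be
    re-selected later (as step 360 describes). If the request does not
    match the current state, it is ignored (analogous to step 330).
    """
    if old not in active_icons or new not in alternatives:
        return active_icons, alternatives
    i = active_icons.index(old)
    active_icons = active_icons[:i] + [new] + active_icons[i + 1:]
    alternatives = [a for a in alternatives if a != new] + [old]
    return active_icons, alternatives

# Hypothetical example, mirroring FIGS. 4A-4C: the "menu" icon is
# exchanged for the "dialer" icon.
icons, alts = substitute_icon(
    ["menu", "find", "calc", "contrast"], ["dialer", "clock"],
    old="menu", new="dialer")
```

Note that the new icon takes the old icon's position, so the active input area keeps the same layout each time it is displayed.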
  • FIGS. 4A-4C provide screen shots to illustrate a method such as described in FIG. 3.
  • FIG. 4A shows active input area 110 provided over a task bar 426 .
  • the active input area 110 can be minimized or substituted out of the display.
  • An icon or other item representing the active input area 110 may be provided on the task bar 426 . This icon can be selected by a user through contact with the display, or other means, to cause the active input area to re-appear on the display surface 124 .
  • the task bar 426 may be persistent, in that it is either always present, or present automatically depending on certain applications or functions performed by the handheld computer 100 .
  • FIG. 4A shows active input area 110 with four active icons 115 when in a displayed state.
  • Each of the active icons 115 is assigned a particular function.
  • the function associated with that icon is performed. Examples of functions that can be assigned to active icons 115 include launching a particular application, performing a utility function (such as displaying a search tool or adjusting the contrast of the computer), or opening a particular record.
  • Rather than change the function associated with a particular icon, embodiments of the invention permit the particular icon displayed in the active input area 110 to be replaced by a new icon. With the changing of a particular icon, the functionality offered by that icon is replaced by the functionality provided by the new icon.
  • the association between an icon in the active input area 110 and a function or application may be static.
  • FIGS. 4 A- 4 C illustrate how a first active icon 115 A associated with a “display menu” function can be replaced by a second active icon 115 B associated with a communication port application (referred to as “dialer”).
  • the first active icon 115 A is assumed to be selected for exchange with another icon by a tap event.
  • the tap event that selects the first active icon 115 A for exchange is different from a tap (or other tap event) that would select that icon and cause the handheld computer 100 to perform the function of the display menu icon.
  • the act of selecting the first active icon 115 A in order to cause the handheld computer 100 to perform the function associated with that icon may be performed simply by tapping the icon one time.
  • the tap event that selects the first active icon 115 A for exchange with another icon may correspond to a stylus tapping on display surface 124 where first active icon 115 A is provided, and holding the tap for a designated duration.
  • the tap event for exchanging the first active icon 115 A may correspond to the stylus dragging in contact with the display from a location where the first icon 115 A is provided to some other location.
  • the tap event for selecting the first active icon 115 A for exchange may correspond to a double-tap or triple-tap on the location of the display surface where the first active icon 115 A is provided. In each case, the tap event for selecting the icon for exchange with another icon is differentiable from the tap or tap event for performing the function of that icon, but the particular act required for the tap event may be one of design choice.
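  • The differentiation of tap events described above can be sketched as a small classifier. This is an illustrative sketch only: the description fixes no thresholds or algorithm, so the duration and distance cutoffs and the event names below are assumptions chosen for the example.

```python
# Hypothetical tap-event classifier; thresholds are illustrative assumptions.
HOLD_THRESHOLD = 0.5   # seconds a contact must persist to count as a "hold"
DRAG_THRESHOLD = 10    # pixels of travel that turn a tap into a "drag"

def classify_tap_event(duration, distance, tap_count):
    """Classify a contact with the display surface.

    duration  -- seconds between pen-down and pen-up
    distance  -- pixels moved while in contact with the display
    tap_count -- number of taps registered in quick succession
    """
    if distance >= DRAG_THRESHOLD:
        return "drag"          # e.g. dragging the icon to another location
    if tap_count >= 2:
        return "multi-tap"     # double- or triple-tap selects the icon for exchange
    if duration >= HOLD_THRESHOLD:
        return "tap-and-hold"  # alternative gesture for selecting the icon for exchange
    return "tap"               # a plain tap performs the icon's associated function
```

Any of the non-"tap" outcomes could be bound to the exchange gesture; which one is used is, as noted above, a matter of design choice.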
  • FIG. 4B illustrates a list 410 that is opened in response to first active icon 115 A being selected for exchange with another icon.
  • the list 410 includes a plurality of representations 412 .
  • Each representation 412 corresponds to an alternative active icon that is available to be displayed as part of active input area 110 .
  • When one of the representations 412 is selected, an icon corresponding to that representation is generated to replace the first active icon 115 A. In one embodiment, this would mean that the replacement icon would appear instead of the first active icon 115 A, in the first active icon's position within the active input area 110 .
  • the selection of one of the representations 412 in list 410 may be accomplished by a stylus making contact with a point on display surface 124 where that representation is displayed.
  • Embodiments of the invention provide a feedback function where the selected representation 412 is indicated to the user to afford the user an opportunity to change the selection before the selection is made final.
  • the selection of one of the representations (the one corresponding to “dialer”) is also visually indicated with some feedback.
  • the feedback may correspond to highlighting the selected representation when it is selected from the list.
  • the feedback may correspond to changing the appearance of the selected representation, such as changing its color, size, or shading.
  • a distinctive audible cue may be provided to indicate to the user which representation 412 from the list 410 was selected.
  • the list 410 may visually indicate information about the alternative icons, or about the functionality associated with those alternative icons. For example, the list 410 may indicate if certain applications are not available by graying out representations 412 that correspond to those applications.
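  • The exchange described with FIG. 3 and FIGS. 4 A- 4 C can be modeled with a minimal sketch: the chosen representation replaces the selected active icon in its position, and the displaced icon returns to the list for later re-selection. The data structures are assumptions; only the names “menu” and “dialer” echo the figures.

```python
# Hypothetical model of exchanging an icon in the active input area.
def exchange_icon(active_icons, available_list, position, chosen):
    """Swap the icon at `position` in the active input area for `chosen`."""
    displaced = active_icons[position]
    available_list.remove(chosen)     # the chosen icon leaves the list...
    available_list.append(displaced)  # ...and the displaced icon joins it
    active_icons[position] = chosen   # the same relative position is kept
    return active_icons, available_list

active = ["menu", "find", "calc", "home"]   # icons shown in the active input area
alternatives = ["dialer", "mail", "notes"]  # representations in the list
active, alternatives = exchange_icon(active, alternatives, 0, "dialer")
```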
  • FIG. 4B illustrates a second active icon 115 B.
  • FIG. 4C illustrates when second active icon 115 B is displayed in active input area 110 in place of first active icon 115 A.
  • the second active icon 115 B takes the place of first active icon 115 A in active input area 110 .
  • second active icon 115 B occupies the relative position previously occupied by the first active icon 115 A in active input area 110 .
  • the first active icon 115 A is no longer present in active input area 110 , but it is available for reselection and exchange with any other icon that is part of the active input area 110 .
  • When active input area 110 is subsequently called or used, it appears with the second active icon 115 B, at least until the active input area is re-configured.
  • FIGS. 4 A- 4 C enable the user to create static associations between icons that can appear in the active input area 110 and their respective functionalities. If the user wants a new functionality to be provided by an icon in the active input area 110 , the user selects a new icon for the active input area which already has that functionality assigned to it. The user is not required to assign a new function to a fixed icon that cannot be substituted out of the active input area 110 .
  • An embodiment such as described with FIGS. 4 A- 4 C enables active input area 110 to carry icons created by third-party developers for particular applications.
  • Application developers often create the icons that are associated with their programs. The icons are provided in order to let the user launch an application by selecting the icon associated with that application.
  • the icons designed by the developers include graphics such as insignias and trademarks, which uniquely identify their application to the user. These icons are often listed in the menu of the handheld computer 100 . With conventional handheld computers, the icon corresponding to the menu function is usually presented in the active input area 110 , but the various icons that represent different applications, including third-party developer applications, are not part of the active input area.
  • some conventional computers require the user to select a new function for a wildcard icon that always appears on the display, or switch the functionality of one icon (such as the menu icon) in order to assign that icon a new functionality.
  • the icons designed and provided by the developers can be imported by the user (or a vendor) into the active input area 110 .
  • the handheld computer 100 is configured to display the icons that form the active input area 110 using monochromatic display resources. All of the active input area 110 , including handwriting recognition area 112 , may be provided using monochromatic resources, even if handheld computer 100 has color display resources. Monochromatic resources offer the advantage of being able to display content designed for both color and monochrome. There are many applications which are designed for monochrome environments. By providing for the handheld computer 100 to display the icons of active input area 110 in monochrome, no special consideration needs to be made to distinguish icons made for color from icons made for monochrome, as both types of icons would be displayed in the active input area 110 in monochrome.
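  • Displaying icons with monochromatic resources can be sketched as a simple luminance threshold over an icon's pixels. The BT.601 luminance weights and the mid-scale threshold are standard choices made for this example; the description specifies no particular conversion.

```python
# Reduce a color icon to monochrome; weights and threshold are assumptions.
def to_monochrome(pixels, threshold=128):
    """Map each (R, G, B) pixel to 0 (black) or 1 (white) by perceived luminance."""
    mono = []
    for r, g, b in pixels:
        luminance = 0.299 * r + 0.587 * g + 0.114 * b  # BT.601 luma weights
        mono.append(1 if luminance >= threshold else 0)
    return mono
```

With such a reduction, icons authored for color displays and icons authored for monochrome displays can both be rendered in the active input area without special handling.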
  • handwriting recognition area 112 may be switched out of the active input area 110 in the same manner as the active icons.
  • the handwriting recognition area 112 may be switched out and replaced by a digital keyboard, or a set of icons.
  • the specific type of handwriting recognition area 112 that forms part of the active input area 110 may be selected in a manner such as described with FIGS. 4 A- 4 C.
  • a two-cell version of handwriting recognition area 112 may be substituted for a triple-cell version (see FIG. 2B) in a manner described above.
  • It is possible for handheld computer 100 , or other computer with a contact-sensitive display, to accept character entry on any location of display surface 124 .
  • the acceptance of the character entry may be through display contact mechanisms, such as electronic keyboards and handwriting recognition areas.
  • the handheld computer 100 is configured to recognize strokes entered anywhere on display surface 124 , where each stroke is immediately recognized as a corresponding character.
  • handheld computer 100 may be configured to recognize certain strokes, such as provided in GRAFFITI and JOT, as characters or commands when those strokes are entered on locations of display surface 124 other than active input area 110 .
  • the electronic keyboard itself may be provided anywhere on the display surface 124 . Any taps entered on regions corresponding to keys of the electronic keyboard are recognized as corresponding characters.
  • active input area 110 has functionality other than that of receiving input.
  • active input area 110 can be used as a visual guide for assisting the user to enter correctly shaped strokes on a remaining portion of display surface 124 .
  • a glyph is a recognized form of a stroke
  • a stroke is what is traced by a user employing an object to make continuous contact (e.g. between a pen-down and a pen-up) with the display surface 124 .
  • immediate handwriting recognition can be performed by matching a stroke to a glyph, and then displaying a character associated with the glyph.
  • U.S. Pat. No. 6,493,464 (hereby incorporated for all purposes in its entirety by this application) describes an immediate handwriting recognition technique using strokes and glyphs.
  • active input area 110 displays a set of glyphs 552 .
  • the region 526 of display surface 124 , which excludes active input area 110 , is shown as displaying a stroke 554 recently formed by the user.
  • the stroke 554 may have been formed by, for example, the user tracing a shape on the region 526 . Since the stroke 554 needs to match a shape of a desired glyph in the set of glyphs 552 in order to be properly recognized, displaying the set of glyphs in the active input area 110 is useful for providing a visual cue for the user. Such an embodiment may be particularly useful in the case where the user is unfamiliar with the particular stroke recognition technique used by the handheld computer 100 (such as GRAFFITI or JOT). Thus, active input area 110 may also serve as a feedback mechanism for providing visual feedback of a user's input operations.
  • active input area 110 provides a visual feedback as to the character that was identified from the stroke 554 that the user entered on the region 526 .
  • active input area 110 may display or somehow indicate simultaneously which character was recognized from that stroke.
  • In FIG. 5B, an indication is shown as to which glyph in the set of glyphs 552 corresponded to the stroke that the user entered. The indication may be in the form of highlighting or shading one glyph that the handheld computer 100 determines to have matched the stroke 554 entered by the user onto the region 526 .
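  • The echo-back behavior of FIGS. 5 A- 5 B can be sketched by matching the entered stroke against the displayed glyph set and returning the best match for highlighting. Real recognizers, such as the technique of U.S. Pat. No. 6,493,464, are far more involved; here strokes and glyphs are simply equal-length lists of (x, y) points, an assumption made purely for illustration.

```python
# Toy stroke-to-glyph matcher; the point-list representation is an assumption.
def stroke_distance(stroke, glyph):
    """Sum of point-to-point distances between two equal-length point lists."""
    return sum(((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(stroke, glyph))

def recognize_and_highlight(stroke, glyph_set):
    """Return the name of the best-matching glyph, to be highlighted as feedback."""
    return min(glyph_set, key=lambda name: stroke_distance(stroke, glyph_set[name]))

glyphs = {
    "I": [(0, 0), (0, 5), (0, 10)],   # straight vertical stroke
    "L": [(0, 0), (0, 10), (5, 10)],  # vertical stroke with a foot
}
matched = recognize_and_highlight([(0, 1), (0, 6), (1, 10)], glyphs)  # close to "I"
```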
  • the manner in which active input area 110 and other user-interface features are provided on handheld computer 100 may accommodate landscape modes, with particular handedness configurations.
  • the active input area 110 and other input features can be provided on display surface 124 in a landscape mode, with a particular left handed or right handed orientation.
  • active input area 110 enables flexibility as to how it can be shaped and positioned. Specifically, when active input area 110 is electronically generated, the particular portion of display surface 124 upon which the active input area is displayed can be selected. Simultaneously, resources for detecting contact to display surface 124 may be oriented to recognize the particular forms of contact that correspond to the numerous entries that can be made through the active input area 110 . Thus, active input area 110 can be created and recreated with physical characteristics that suit a particular configuration, such as a handedness orientation. In particular, the position, dimension, shape, orientation and even components of active input area 110 are selectable based on orienting all of the features according to a particular handedness.
  • FIGS. 6 A- 6 C show how the flexibility in the manner active input area 110 is provided can be used to accommodate various preferences of the user, including left or right handedness of the user in the landscape mode.
  • the handheld computer 100 is shown in a portrait mode, which may be the default configuration of the handheld computer.
  • the display surface 124 is assumed to be rectangular in shape, and the portrait mode corresponds to when the length of the display surface extends in an up-down direction from the perspective of the user.
  • the perspective of the user is shown by the axes X and Y, with the X axis corresponding to what the user views as being the up and down direction.
  • the perspective offered with the axes X and Y is that of the user staring into the paper.
  • active input area 110 extends a height from a bottom surface 612 of display surface 124 .
  • the buttons 130 are provided between the bottom surface 612 of display surface 124 and a bottom edge 616 of the housing 120 . Based on convention, active input area 110 may be provided at the bottom portion of display surface 124 .
  • the active input area 110 may include active icons 115 .
  • FIG. 6B illustrates handheld computer 100 positioned in a landscape mode, with a left handed orientation.
  • the left handed orientation means that most, if not all, of the user-interface features that require the user to make manual contact with handheld computer 100 are provided on the left-hand side of the handheld computer.
  • the active input area 110 is positioned so that when used by a left-handed person, the person's hand will not block the user's view of the display surface 124 .
  • the left-hand orientation may be created by rotating display surface 124 clockwise 90 degrees in the direction of A.
  • housing 120 provides the buttons in the top-down configuration, to the left of display surface 124 .
  • rather than extend in the same manner as in the portrait mode, the active input area 110 may be re-generated.
  • active input area 110 extends in a top-bottom direction, as defined by axis X, but adjacent to a left boundary 621 (when viewed in the configuration of FIG. 6B) of the display surface 124 .
  • FIG. 6C illustrates handheld computer 100 positioned in a landscape mode, with a right handed orientation.
  • the active input area 110 is positioned so that when used by a right-handed person, the person's hand will not block the user's view of the display surface 124 .
  • the right-hand orientation may be created by rotating display surface 124 counter-clockwise 90 degrees in the direction of B.
  • housing 120 provides the buttons in the top-down configuration, to the right of display surface 124 .
  • rather than extend in the same manner as in the portrait mode, the active input area 110 may be re-generated.
  • active input area 110 extends in a top-bottom direction, as defined by axis X, but adjacent to a right boundary 623 (when viewed in the configuration of FIG. 6C) of the display surface 124 .
  • handheld computer 100 can be configured to enable its contact-sensitive display to be viewed and used in a landscape mode with particular attention to the handedness of the user.
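  • One way to realize the re-generation described above is to compute the active input area's rectangle from the mode and handedness, as in the following sketch. The strip thickness and the (x, y, width, height) convention are assumptions for the example; only the placements follow FIGS. 6 A- 6 C.

```python
# Hypothetical layout computation for the active input area.
def active_input_area_rect(display_w, display_h, mode, handedness=None, thickness=60):
    """Return (x, y, w, h) of the active input area for the given orientation."""
    if mode == "portrait":
        # Strip along the bottom of the display, extending left-right (FIG. 6A).
        return (0, display_h - thickness, display_w, thickness)
    if handedness == "left":
        # Strip along the left boundary, extending top-bottom (FIG. 6B).
        return (0, 0, thickness, display_h)
    # Strip along the right boundary, extending top-bottom (FIG. 6C).
    return (display_w - thickness, 0, thickness, display_h)
```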
  • FIGS. 7 A- 7 D show some specific examples of display surface 124 accommodating different modes and handedness.
  • FIG. 7A illustrates the portrait mode for display surface 124 , with the length of the display surface 124 extending in the top-bottom direction, along the axis Y.
  • active input area 110 is displaying a set of keys corresponding to special character and number keys.
  • In FIG. 7B, the active input area 110 is rotated into the right-handed landscape orientation.
  • the same set of keys provided in the active input area 110 with FIG. 7A now are stacked vertically, so that the length of the active input area 110 extends in the direction of the axis Y.
  • FIGS. 7C and 7D illustrate the active input area 110 with cells that comprise the handwriting recognition area 112 .
  • an embodiment provides that the left cell 112 A, the right cell 112 B and the center cell 112 C of the handwriting recognition area 112 are provided to receive strokes as input.
  • In FIG. 7C, the left-handed landscape orientation is shown, with the cell 112 A being in the top position within active input area 110 , and the cell 112 C being in the bottommost position.
  • the active input area 110 appears to the left of the display surface 124 .
  • In FIG. 7D, the right-handed landscape orientation is shown.
  • the right-handed orientation of FIG. 7D mirrors the orientation of active input area 110 in FIG. 7C, except that the active input area appears to the right of the display surface 124 .
  • FIG. 8 illustrates the components of a portable computer 800 , under an embodiment of the invention.
  • the portable computer 800 may, for example, correspond to handheld computer 100 .
  • portable computer 800 includes a processor 810 , an analog-digital (A/D) converter 820 , a set of mechanical buttons 830 , a volatile memory 840 , a non-volatile memory 845 and a contact-sensitive display assembly 850 .
  • a power source 825 may be used to power the various components of the portable computer 800 .
  • One typical component of the portable computer 800 is an expansion port 842 . Typically, multiple such expansion ports are provided on such portable computers.
  • the contact sensitive display assembly 850 may include a display 852 and a digitizer 854 .
  • a display driver 856 may also form part of the display assembly 850 .
  • the digitizer 854 may be connected to the A/D converter 820 .
  • the digitizer 854 uses analog signals to detect contact with the display 852 , and to track the object making the contact as it moves over the display.
  • the A/D converter converts the signals into a digital form for processor 810 , which interprets what input is entered by the contact with the display 852 .
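  • The digitizer path just described can be sketched as an analog-to-digital quantization of the contact position. The 10-bit resolution, reference voltage, and display dimensions below are assumptions; FIG. 8 does not specify them.

```python
# Hypothetical A/D path from analog position signals to a pixel coordinate.
def adc_sample(voltage, v_ref=3.3, bits=10):
    """Quantize an analog voltage in [0, v_ref] to an integer ADC code."""
    code = int(voltage / v_ref * ((1 << bits) - 1))
    return max(0, min(code, (1 << bits) - 1))

def contact_position(vx, vy, display_w=320, display_h=480):
    """Map two analog channel voltages to a pixel coordinate on the display."""
    max_code = (1 << 10) - 1
    x = adc_sample(vx) * (display_w - 1) // max_code
    y = adc_sample(vy) * (display_h - 1) // max_code
    return x, y
```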
  • the driver 856 may be coupled to the processor 810 in order to receive signals that are translated into output on the display 852 .
  • the output may correspond to content that appears on the display surface 124 in previous embodiments, as well as to the digitally-created active input area 110 .
  • the display driver 856 may provide some or all of the monochromatic resources that are used to display icons, representations of the icons, and/or the active input area 110 .
  • the monochromatic resources enable the developer to make just one set of icons that work for all applications and all devices, since all such applications and devices can use monochrome, but not all such devices use color.
  • While an embodiment such as described with FIG. 8 provides for a display assembly that is integrated and formed as part of the housing of the portable computer 800 , other embodiments may provide for a portable computer where the contact-sensitive display is remote from the housing of the portable computer, or at least from the housing where the processor 810 is provided. Such an embodiment may provide, for example, a projector that displays the content being provided by the processor 810 onto a surface such as a table. The portable computer 800 may sense the user's interaction with the surface where the projection is provided. Thus, the display surface may be external to the portable computer or its primary housing.

Abstract

Embodiments described herein provide for a portable computer with a contact-sensitive display having a user-interface that is configurable through user-contact with the display. An active input area may be provided that is configurable in appearance and functionality. The contents of the active input area, its functionality, and the manner in which it is oriented, particularly with respect to a left or right handedness, are described herein.

Description

    RELATED APPLICATION
  • This application claims the benefit of priority to U.S. Provisional Application No. 60/406,264, filed Aug. 26, 2002, entitled “User interface features for a handheld computer,” and naming Mark Davis and Carlo Bernoulli as inventors, the aforementioned priority application being hereby incorporated by reference for all purposes in its entirety. [0001]
  • FIELD OF THE INVENTION
  • The present invention relates to user-interfaces for computers. In particular, the present invention relates to user-interface features for computers with contact-sensitive displays. [0002]
  • BACKGROUND OF THE INVENTION
  • Personal digital assistants (PDAs) are typical of computers that utilize contact-sensitive displays. A PDA is small in size, usually suited to be held in one hand of the user and operated with the other. The display of the PDA is used to provide additional input functionality in lieu of a large keyboard, a mouse or other input mechanism that is incompatible with the size and portability of the PDA. [0003]
  • PDAs often provide an active input area on the display, which is a designated region on the display where most of the user-contact and input is entered. One type of active input area used in PALM OS and POCKET PC devices provides for a handwriting recognition area to appear on the display. The user can form strokes on the region of the display where the handwriting recognition area is provided, and technology such as that provided by GRAFFITI or JOT is used to recognize the strokes as characters. [0004]
  • Because the handwriting recognition area is often a frequent location of the user's attention, other input functionality is usually provided in conjunction with or next to the handwriting recognition area. This other input functionality is often in the form of icons and task bars that can be selected in order to cause the PDA to perform some function. In addition, electronic keyboards can be substituted on the display in place of the handwriting recognition area. [0005]
  • Recently, devices such as TABLET PCs have become popular. Such devices also utilize an immediate handwriting recognition square for recognizing contact strokes provided on a display as characters. [0006]
  • SUMMARY OF THE INVENTION
  • Embodiments of the invention provide for a configurable user-interface for a computer. Embodiments of the invention may apply to a handheld computer, such as a PDA, having an active input area, where handwriting recognition or digital keyboards may be displayed. [0007]
  • According to one embodiment, input features such as icons provided with the active input area may be substituted in exchange for other input features. [0008]
  • According to another embodiment, a display of the handheld computer may be provided in a landscape mode, with a left or right handed orientation. In providing the handedness orientation, the placement and orientation of the active input area in relation to other portions of the display is considered in order to facilitate users who are either left or right handed. [0009]
  • Other embodiments provide a feedback feature that echoes back to the user a particular character that was just entered through a handwriting recognition scheme. The particular character that is echoed back may be a glyph (e.g. a character before it is displayed as an alphabet or Roman numeral character) that the handheld computer determines to match a handwriting stroke of the user. [0010]
  • Still further, another embodiment provides for a configurable handwriting recognition area for an active input area. In particular, the handwriting recognition area portion of the active input area may be configurable in terms of the number of cells provided, the shape of each cell, the functionality provided by each cell (e.g. what kind of characters are to be recognized in a particular cell) and the dimensions of each cell in both the lengthwise and widthwise directions. [0011]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments of the invention are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings. Like reference numerals are intended to refer to similar elements among different figures. [0012]
  • FIG. 1 is a simplified frontal view of a handheld computer with a configurable active input area, under an embodiment of the invention. [0013]
  • FIGS. 2A-2D illustrate screen shots of a configurable active input area, under one or more embodiments of the invention. [0014]
  • FIG. 3 describes a method for replacing elements of an active input area with other elements. [0015]
  • FIGS. 4A-4C illustrate screen shots of an icon in an active input area being replaced by another icon. [0016]
  • FIGS. 5A-5B illustrate screen shots of a handwriting recognition aid, under an embodiment of the invention. [0017]
  • FIGS. 6A-6C are simplified frontal views of a handheld computer that has user-interface features which can be positioned to facilitate landscape modes with handedness orientation. [0018]
  • FIGS. 7A-7D illustrate screen shots of a display of a handheld computer where different active input areas are displayed in left and right handedness orientations. [0019]
  • FIG. 8 is a block diagram that illustrates a portable computer upon which an embodiment of the invention may be implemented. [0020]
  • DETAILED DESCRIPTION OF THE INVENTION
  • Embodiments of the invention provide a set of configurable user-interface features for computers that have contact-sensitive displays. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention. [0021]
  • A. OVERVIEW
  • Embodiments described herein provide for a portable computer with a contact-sensitive display having a user-interface that is configurable through user-contact with a display surface. In some embodiments, an active input area is provided that is configurable in appearance and functionality. As will be described, the configurable nature of the active input area allows for a flexible user-interface that can accommodate, amongst other considerations, left and right handedness, special business applications, and user-preferences. [0022]
  • For purpose of description, embodiments of the invention are described in the context of handheld computers, such as PDAs and smart cell phones, which use contact-sensitive displays. Handheld computers, in particular, illustrate the problem of maximizing user-interface functionality and preferences on a device with a relatively small profile. Embodiments of the invention may also be employed with other types of computers that have contact-sensitive displays, such as on tablet computers, laptops and other portable computers. [0023]
  • In one embodiment, a user-interface can be configured on a computer with a contact-sensitive display. A set of features that are selectable through contact with the display of the computer may be provided on a designated region of the computer's display. When selected, the features cause the computer to perform some function associated with that feature. A tap event, corresponding to an object making a specific form of contact with the display, may be entered by the user to initiate a substitution of one feature for another feature in the designated region. In response to the tap event, a list of alternative features is provided to the user. A selection of one of the alternative features is detected when the user once again makes contact with the display. Then the selected alternative feature is provided on the display instead of the feature that was associated with the tap event. [0024]
  • According to another embodiment, a portable computer is provided that includes a housing, a contact-sensitive display and a processor. The processor is configured to provide an active input area on the display. The active input area includes functionality where the processor recognizes strokes entered on the display as characters. The portable computer may be oriented in a portrait mode, where the active input area extends primarily in a left-right direction from a perspective of a user that operates the portable computer. The portable computer may also be oriented in a landscape mode, where the active input area extends primarily in a top-bottom direction from the perspective of the user. When in the landscape mode, the processor is configured to provide a handedness orientation for the active input area with respect to the display and other features of the handheld computer 100. [0025]
  • B. ACTIVE INPUT AREA
  • With respect to embodiments such as described below, an active input area refers to a graphic, contact-sensitive input mechanism provided on a display surface of a computer. The active input area provides functionality that is oriented for making the active input area the primary focus of the user when the user is interacting with the computer. Accordingly, the active input area may provide a handwriting recognition area, keypad, and/or a keyboard that enables a large number of possible user contacts to be entered and uniquely interpreted from one designated region of the display. To provide an example, in one embodiment, an active input area may include a display region designated for recognizing certain user-contacts as character input, including alphabet and numeric characters. The active input area may also be used to receive commands from the user for performing functions such as launching applications. In this way, an active input area may differ from other user-interfaces of a computer (such as mechanical features like keyboard and buttons) in that it is provided on a contact-sensitive display, and it can be used to receive a large number of unique user-inputs that can subsequently be interpreted. [0026]
  • FIG. 1 illustrates a handheld computer 100 with a configurable active input area 110, under an embodiment of the invention. In FIG. 1, handheld computer 100 includes a housing 120 having a front panel 122. A display surface 124 is exposed on the front panel 122. The display surface 124 may be part of a display assembly having a digitizer or other construction in which contact between an object and the display surface is detected and recorded. The housing 120 may also provide a plurality of buttons 130, or other actuatable mechanisms. The buttons 130 can be individually actuated to cause handheld computer 100 to perform some function such as launch a program. [0027]
  • An active input area 110 is provided on display surface 124. In an embodiment, active input area 110 is purely digital, and can be selected to appear on display surface 124, rather than be a permanent aspect of the display surface 124. The active input area 110 includes a handwriting recognition area 112. A user may initiate contact with an object in the form of a gesture or stroke on handwriting recognition area 112, and the processing resources of handheld computer 100 interpret that stroke as a character or function. The handwriting recognition area 112 may be immediate in that a single stroke may be recognized as a character after that stroke is completed. A recognized character of an immediately recognized stroke may be outputted on display surface 124 prior to another stroke being entered. [0028]
  • The handwriting recognition area 112 itself may be separated into two or more cells. In one embodiment, a first cell 112A recognizes strokes as alphabetical characters, and a second cell 112B recognizes strokes as numbers. Additional cells may be provided as needed. For example, embodiments described below provide for a “triple-cell” configuration, where one cell of handwriting recognition area 112 is for recognizing strokes as capital letters. Alternatively, a third or additional cell may be for recognizing strokes as functions. [0029]
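  • The cell behavior described above can be sketched as routing a contact to an interpretation mode based on which cell it falls in. The cell boundaries below are hypothetical; only the triple-cell division into letters, capital letters, and numbers follows the description.

```python
# Route a contact's x-coordinate to a handwriting-recognition mode.
def cell_mode(x, cells):
    """cells: list of (x_min, x_max, mode) tuples, ordered left to right."""
    for x_min, x_max, mode in cells:
        if x_min <= x < x_max:
            return mode
    return None  # contact falls outside the handwriting recognition area

triple_cell = [
    (0, 60, "lowercase"),   # first cell: strokes recognized as letters
    (60, 120, "capital"),   # middle cell: strokes recognized as capital letters
    (120, 180, "number"),   # second cell: strokes recognized as numbers
]
```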
  • The active input area 110 also includes a plurality of active icons 115, which are placed adjacent to the handwriting recognition area 112. As used herein, the term “active icon” means an icon that has some functionality associated with it. An active icon can be selected to perform its associated function. Accordingly, active icons 115 are each individually selectable to cause the handheld computer 100 to perform a function that corresponds to that icon. Unless stated otherwise, reference to icons in this application is intended to mean “active icons”. In one embodiment, a set of four icons 115 is provided around handwriting recognition area 112, although more or fewer icons may be provided as part of active input area 110 as needed or desired. [0030]
  • In one embodiment, one characteristic of the active input area is that it contains multiple user-interface features of different types. Another characteristic of an active input area is that even though it is formed from multiple elements with different functionality, the active input area appears as a unit. Thus, when [0031] active input area 110 is selected to appear, all of the elements designated to be part of the active input area at that particular moment appear with it. With respect to FIG. 1, this would mean that all of the active icons 115 and the handwriting recognition area 112 appear as the components of the active input area 110. Furthermore, these elements appear in the same configuration each time the active input area 110 is displayed. For example, each active icon 115 may occupy the same position relative to handwriting recognition area 112 each time active input area 110 is called on the display surface 124.
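One way the unit behavior just described might be modeled in software is sketched below. The class and field names are hypothetical; the point illustrated is only that the designated elements are stored together and therefore always reappear in the same configuration:

```python
# Illustrative model of the active input area as a single unit: every
# time it is shown, the same designated elements appear together, in
# the same relative configuration. Names are assumptions.
from dataclasses import dataclass

@dataclass
class ActiveInputArea:
    icons: tuple    # designated active icons, in fixed relative order
    cells: tuple    # handwriting recognition cells, e.g. ("alpha", "numeric")
    visible: bool = False

    def show(self):
        """Display the area; all designated elements appear with it."""
        self.visible = True
        return (self.icons, self.cells)

area = ActiveInputArea(icons=("menu", "find", "calc", "home"),
                       cells=("alpha", "numeric"))
first = area.show()
area.visible = False
second = area.show()
assert first == second  # identical configuration on every appearance
```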
  • When not in use, an embodiment provides that [0032] active input area 110 may be minimized into a task bar or other graphic feature that appears on the display. One embodiment provides that the active input area 110 may be made to appear on display surface 124 at any time through one or more taps on the display surface 124. An area of display surface 124 can therefore be maximized for providing content by minimizing active input area 110, facilitating use of handheld computer 100 as, for example, an electronic book reader.
  • FIGS. [0033] 2A-2D provide screen shots of display surface 124 to illustrate where the appearance of active input area 110 may be altered or otherwise changed as needed or selected by a user of handheld computer 100. FIG. 2A illustrates an embodiment where a user selects to provide active input area 110 with a triple-cell configuration. The active input area 110 may also include active icons 115. In the triple-cell configuration, a first cell 112A (usually on the far left) interprets gestures made on that part of the display surface 124 as lowercase characters. A second cell 112B (usually on the far right) interprets gestures made on that part of the display surface 124 as numbers. A third cell 112C, which may appear in the middle, interprets gestures made on that part of the display surface 124 as capitalized letters. Such a configuration may be set as a preference of the user.
  • FIG. 2B illustrates another screen shot of how [0034] active input area 110 can be made to appear on the display surface 124. In an embodiment such as shown, active icons 115 are removed from active input area 110. Rather, all of active input area 110 is made into handwriting recognition area 112. Furthermore, cells 112A (for interpreting strokes as characters) and 112B (for interpreting strokes as numbers) are re-sized to be larger widthwise (along the axis X) than the configuration illustrated in FIG. 2A. Furthermore, the dimensions of the two cells 112A, 112B are non-symmetrical, in that cell 112A for characters is larger than cell 112B for numbers. As an example, a configuration such as shown in FIG. 2B may be designated as a user-preference because the user is more likely to use the character entry cell than the numeric entry cell.
  • FIG. 2C illustrates a configuration where [0035] active input area 110 is formed entirely of handwriting recognition area 112, and further that handwriting recognition area 112 has an enlarged height (along the axis Y). For purpose of illustrating variation, a triple cell configuration is also shown, in that a third cell 112C is also provided for recognizing capital letters.
  • FIG. 2D illustrates a reverse configuration for [0036] active input area 110, where handwriting recognition area 112 is made smaller in height (along axis Y), but not minimized. Such an embodiment provides more room on display surface 124 for providing content, while providing some space for a user to enter strokes onto handwriting recognition area 112.
  • In an embodiment, [0037] active input area 110 is adjustable between various configurations, including configurations shown by FIGS. 2A-2D, through user-input with the display surface 124. In one embodiment, boundary lines 212 and 214 may be provided to delineate the active input area 110 from the remaining portion of the display surface 124. The boundary line 212 may correspond to a height of the active input area 110 from an edge 222 of the display surface. The boundary line 214 may correspond to a marker delineating the cells 112A, 112B of the handwriting recognition area 112. In order to adjust the height of the active input area 110, one embodiment enables the user to select boundary line 212 and move it either upward or downward relative to bottom edge 222, to yield configurations shown by FIGS. 2C and 2D respectively. In order to adjust the dimensions of the cells 112A, 112B, the boundary 214 may be selected and moved to the left or right, such as shown by FIG. 2B. The selection of boundary lines 212, 214 may be done through contact with the display surface 124, or through some other means such as menu selection.
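The boundary-line adjustment might be sketched as follows; the minimum cell width and coordinate values are assumptions chosen only to illustrate the clamping behavior, not values specified in this disclosure:

```python
# Hedged sketch of dragging boundary line 214: moving the divider left
# or right resizes the two cells, with a minimum width enforced so that
# neither cell disappears entirely. Thresholds are assumptions.

def move_cell_divider(divider_x, delta, total_width, min_cell=20):
    """Return the new divider position, clamped so each cell keeps at
    least min_cell of width."""
    return max(min_cell, min(total_width - min_cell, divider_x + delta))

# Dragging right enlarges the character cell at the expense of the number cell.
print(move_cell_divider(80, 30, 160))    # 110
print(move_cell_divider(80, 200, 160))   # clamped to 140
print(move_cell_divider(80, -200, 160))  # clamped to 20
```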
  • According to one embodiment, specific screen shots shown in FIGS. [0038] 2A-2D illustrate preferences that may be selected by the user. The user's selection may be based on factors such as whether display surface 124 is to be used primarily for displaying content, or whether character recognition is to be enhanced.
  • C. RECONFIGURING THE ACTIVE INPUT AREA
  • Embodiments of the invention provide for elements of [0039] active input area 110 to be selected and replaced by other elements as the need arises. As described by FIGS. 3 and 4A-4C, the selection and replacement of elements of active input area 110 may be done at the user level.
  • Alternatively, a manufacturer may provide the [0040] handheld computer 100 with a particular (or default) configuration for active input area 110. Subsequently, vendors or original equipment manufacturers may alter the configuration of the handheld computer 100 from its original manufacturing in order to suit a particular need. For example, active input area 110 may be configured to include elements (such as icons) for a particular internal business application of a company. In one use, an entity such as the company may alter the configurations of the active input area 110 one time, and disable the ability of the end user to subsequently reconfigure the active input area.
  • A more general application for an embodiment of the invention is to enable the end user to configure and reconfigure [0041] active input area 110 as the user desires. According to one embodiment, the active icons 115 that form part of active input area 110 can be selected and configured by a user of handheld computer 100. The user may, for example, switch the icons that appear in the active input area 110, alter the relative positions of such icons, and/or reduce, eliminate or increase the number of icons that appear as part of active input area 110. Once the selection of icons for the active input area 110 is designated by user-input or other means, an embodiment provides that the active input area 110 appears only with the designated selection of icons, at least until that selection is altered or replaced once again.
  • FIG. 3 illustrates a method for substituting out one of the [0042] active icons 115 that appear in active input area 110 for another icon that is selected by the user. Step 310 provides that the active input area is displayed with a designated set of active icons 115. Thus, the active icons 115 of active input area 110 may be displayed with a specific orientation, position, appearance and functionality.
  • In [0043] step 320, a tap event is detected that is associated with one of the icons that appears in the active input area 110. In one embodiment, the location of where the tap event occurs is what associates the tap event with a particular icon of active input area 110.
  • In [0044] step 325, a determination is made as to whether the detected tap event qualifies as a tap event for substituting out one of the active icons 115 (or some other feature of active input area 110) for an alternative icon. The determination may be based on pre-determined criteria, and may distinguish such a tap event from other taps and tap events which are not for substituting out icons from active input area 110.
  • In one embodiment, the tap event is a “tap and hold” where an object such as a stylus is tapped to the [0045] display surface 124 and held in position for a designated duration. In such an embodiment, the duration in which the object making contact with the display is continuously held in contact with the display may form the criteria as to whether the tap event qualifies. The position where the tap and hold occurs may also be part of the criteria for qualifying the tap event. For example, in order to select a particular icon for replacement, the tap event may be required to occur over a particular active icon 115, and last a designated duration so that it is identified as a tap event to substitute out the particular icon. Should the tap occur elsewhere, or not for the designated duration, then the tap event would not be recognized as a tap event to substitute out that particular icon.
  • Rather than a tap and hold event, other embodiments may provide for other types of tap events. Examples of other such tap events include a “tap and drag” event, where the object is tapped to one place on [0046] display surface 124, then dragged continuously to another place on the display surface. For an embodiment where the tap event is a tap and drag, the criteria for qualifying the tap event may be that the first icon is tapped, then the object is continuously dragged across the display to another designated location.
  • Still further, another alternative form for a tap event is a double-tap or even a triple-tap. For example, a series of three taps within a relatively small duration of time that occurs over one of the [0047] icons 115 may be designated to qualify as a request to substitute out the selected icon. Other examples and scenarios are possible.
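The several tap-event variants described above (tap-and-hold, tap-and-drag, multi-tap) might be distinguished along the lines of the following sketch. The thresholds and event fields are illustrative assumptions, not values specified in this disclosure:

```python
# A minimal, illustrative classifier for the tap events described
# above. Duration is in seconds; positions are (x, y) contact points.
# The hold and drag thresholds are hypothetical design choices.

def classify_tap_event(down, up, duration, tap_count,
                       hold_secs=0.8, drag_px=15):
    """Classify a completed contact into one of the tap-event types."""
    dx, dy = up[0] - down[0], up[1] - down[1]
    if (dx * dx + dy * dy) ** 0.5 >= drag_px:
        return "tap-and-drag"     # dragged continuously to another place
    if duration >= hold_secs:
        return "tap-and-hold"     # held in position for a designated duration
    if tap_count >= 2:
        return "multi-tap"        # double-tap or triple-tap in quick succession
    return "tap"                  # an ordinary tap; performs the icon's function
```

In a real device the criteria would also include whether the contact occurred over a particular active icon, as described in step 325.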
  • If the determination in [0048] step 325 is that the tap event was not a request to reconfigure the selection of any of the icons 115 in the active input area 110, then step 330 provides that the tap event is ignored.
  • [0049] Step 340 provides that a list of the alternative icons is displayed in response to a determination that the tap event was to substitute out one of the active icons. The alternative icons may correspond to icons that are not presented in the active input area 110, but that are available in that they are each associated with a distinct functionality by the handheld computer 100. Thus, the selection of any icon provided in the displayed list would cause handheld computer 100 to perform some function associated with that icon. The list may display representations of the available alternative active icons. These representations may correspond to iconic expressions, such as insignias, trademarks, and other graphic associations to the underlying application or functionality.
  • Once the list is displayed, the user is given an opportunity to select a new icon to replace the icon that has been selected for substitution. In [0050] step 345, a determination is made as to whether the user made another selection for another icon to replace the first icon. In one embodiment, this selection may be made by the user tapping a representation of the second icon from the list provided in step 340. If the determination is that no selection was made from the list, then step 350 provides that the list is displayed until the user taps somewhere else on the display surface 124, or somehow initiates or causes some action to indicate that the list should be closed. For example, the user may launch another application with one of the buttons 130, or shut handheld computer 100 off.
  • If the determination is that a selection of the second icon is made from the list, then step [0051] 360 provides that the icon selected for substitution is replaced with the icon selected from the list. Until further alterations, this new icon will appear as part of the active input area 110 each time the active input area is selected to appear. In addition, the next time the list is displayed, a representation of the icon that was substituted out may be provided in the list, so that this icon may be re-selected at a later time as one of the elements of the active input area 110.
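The substitution of step 360 can be sketched as a simple exchange between the set of displayed icons and the pool of available alternatives. The function and icon names below are hypothetical:

```python
# Illustrative sketch of step 360: the icon chosen from the list
# replaces the selected icon in place, and the displaced icon returns
# to the pool of alternatives so it may be re-selected later.

def substitute_icon(active_icons, available, slot, choice):
    """Swap active_icons[slot] with `choice` drawn from `available`."""
    if choice not in available:
        raise ValueError("selection is not an available alternative icon")
    displaced = active_icons[slot]
    active_icons[slot] = choice
    available.remove(choice)
    available.append(displaced)   # remains available for re-selection
    return active_icons, available

icons = ["menu", "find", "calc", "home"]
pool = ["dialer", "clock"]
substitute_icon(icons, pool, 0, "dialer")
print(icons)  # ['dialer', 'find', 'calc', 'home']
print(pool)   # ['clock', 'menu']
```

Note that the replacement occupies the same slot as the displaced icon, consistent with the fixed relative positions of elements in the active input area.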
  • FIGS. [0052] 4A-4C provide screen shots to illustrate a method such as described in FIG. 3. FIG. 4A shows active input area 110 provided over a task bar 426. In one embodiment, the active input area 110 can be minimized or substituted out of the display. An icon or other item representing the active input area 110 may be provided on the task bar 426. This icon can be selected by a user through contact with the display, or other means, to cause the active input area to re-appear on the display surface 124. The task bar 426 may be persistent, in that it is either always present, or present automatically depending on certain applications or functions performed by the handheld computer 100.
  • FIG. 4A shows [0053] active input area 110 with four active icons 115 when in a displayed state. Each of the active icons 115 is assigned a particular function. When the user taps one of the active icons 115, the function associated with that icon is performed. Examples of functions that can be assigned to active icons 115 include launching a particular application, performing a utility function (such as displaying a search tool or adjusting the contrast of the computer), or opening a particular record. Rather than change the function associated with a particular icon, embodiments of the invention permit the particular icon displayed in the active input area 110 to be replaced by a new icon. When a particular icon is changed, the functionality it offered is replaced by the functionality provided by the new icon. Thus, the association between an icon in the active input area 110 and a function or application may be static. This allows the user to retain the same visual association between a particular icon and the function associated with that icon.
  • FIGS. [0054] 4A-4C illustrate how a first active icon 115A associated with a “display menu” function can be replaced by a second active icon 115B associated with a communication port application (referred to as “dialer”). The first active icon 115A is assumed to be selected for exchange for another icon by a tap event. The tap event that selects the first active icon 115A for exchange is different than a tap (or other tap event) that would select that icon and cause the handheld computer 100 to perform the function of the display menu icon. The act of selecting the first active icon 115A in order to cause the handheld computer 100 to perform the function associated with that icon may be performed simply by tapping the icon one time. In contrast, the tap event that selects the first active icon 115A for exchange with another icon may correspond to a stylus tapping on display surface 124 where first active icon 115A is provided, and holding the tap for a designated duration. Alternatively, the tap event for exchanging the first active icon 115A may correspond to the stylus dragging in contact with the display from a location where the first icon 115A is provided to some other location. Still further, the tap event for selecting the first active icon 115A for exchange may correspond to a double-tap or triple-tap on the location of the display surface where the first active icon 115A is provided. In any case, the tap event for selecting the icon for exchange with another icon is differentiable from the tap or tap event for performing the function of that icon, but the particular act required for the tap event may be one of design choice.
  • FIG. 4B illustrates a [0055] list 410 that is opened in response to first active icon 115A being selected for exchange with another icon. The list 410 includes a plurality of representations 412. Each representation 412 corresponds to an alternative active icon that is available to be displayed as part of active input area 110. Once the list 410 is opened, if one of the representations 412 is selected, an icon of that representation would be generated to replace the first active icon 115A. In one embodiment, this would mean that the replacement icon would appear instead of the first active icon 115A, in first active icon's position within the active input area 110. The selection of one of the representations 412 in list 410 may be accomplished by a stylus making contact with a point on display surface 124 where that representation is displayed.
  • Since the [0056] representations 412 are fairly small, there is the possibility that what the user wishes to select and what the user actually selects are not the same. For example, the user may miss the desired representation when tapping the display surface 124. Embodiments of the invention provide a feedback function where the selected representation 412 is indicated to the user to afford the user an opportunity to change the selection before the selection is made final. In FIG. 4B, the selection of one of the representations (the one corresponding to “dialer”) is visually indicated with some feedback. The feedback may correspond to highlighting the selected representation when it is selected from the list. Alternatively, the feedback may correspond to changing the appearance of the selected representation, such as changing its color, size, or shading. As another example, a distinctive audible tone may be provided to indicate to the user which representation 412 was selected from the list 410.
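The hit-test behind such a selection might be sketched as follows; the rectangle layout of the representations is an assumption for illustration:

```python
# Illustrative hit-test for list 410: determine which representation a
# tap landed on, so that it can be highlighted as feedback before the
# selection is finalized. The row geometry below is hypothetical.

def hit_representation(tap, rects):
    """rects: list of (x, y, w, h). Return the index hit, or None on a miss."""
    tx, ty = tap
    for i, (x, y, w, h) in enumerate(rects):
        if x <= tx < x + w and y <= ty < y + h:
            return i
    return None

rows = [(0, i * 20, 100, 20) for i in range(4)]  # four stacked 20-px rows
print(hit_representation((50, 45), rows))   # 2
print(hit_representation((150, 45), rows))  # None (tap missed the list)
```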
  • In addition to providing feedback, the [0057] list 410 may visually indicate information about the alternative icons, or about the functionality associated with those alternative icons. For example, the list 410 may indicate if certain applications are not available by graying out representations 412 that correspond to those applications.
  • For purpose of explanation, the particular representation selected in FIG. 4B is assumed to correspond to a second [0058] active icon 115B. FIG. 4C illustrates when second active icon 115B is displayed in active input area 110 in place of first active icon 115A. The second active icon 115B takes the place of first active icon 115A in active input area 110. Thus, second active icon 115B occupies the relative position previously occupied by the first active icon 115A in active input area 110. The first active icon 115A is no longer present in active input area 110, but it is available for reselection and exchange with any other icon that is part of the active input area 110. When active input area 110 is subsequently called or used, active input area appears with second icon 115B, at least until the active input area is re-configured.
  • In the past, when the user of the [0059] handheld computer 100 wished to associate new iconic functionality within active input area 110, the user had to associate that new functionality with an icon that always appeared within the active input area. This required the user to learn a new visual association between that icon of the active input area 110 and the newly selected functionality that was to be provided with the active input area. In contrast, embodiments such as described with FIGS. 4A-4C enable the user to create static associations between icons that can appear in the active input area 110 and their respective functionalities. If the user wants a new functionality to be provided by an icon in the active input area 110, the user selects a new icon for the active input area which already has that functionality assigned to it. The user does not need to select a new function for an icon that cannot be substituted out of the active input area 110.
  • Furthermore, embodiments such as described in FIGS. [0060] 4A-4C enable active input area 110 to carry icons created by third-party developers for particular applications. Application developers often create the icons that are associated with their programs. The icons are provided in order to let the user launch an application by selecting the icon associated with that application. Typically, the icons designed by the developers include graphics such as insignias and trademarks, which uniquely identify their application to the user. These icons are often listed in the menu of the handheld computer 100. With conventional handheld computers, the icon corresponding to the menu function is usually presented in the active input area 110, but the various icons that represent different applications, including third-party developer applications, are not part of the active input area. Alternatively, some conventional computers require the user to select a new function for a wildcard icon that always appears on the display, or switch the functionality of one icon (such as the menu icon) in order to assign that icon a new functionality. With embodiments such as described, however, the icons designed and provided by the developers can be imported by the user (or a vendor) into the active input area 110.
  • In an embodiment, the [0061] handheld computer 100 is configured to display the icons that form the active input area 110 using monochromatic display resources. All of the active input area 110, including handwriting recognition area 112, may be provided using monochromatic resources, even if handheld computer 100 has color display resources. Monochromatic resources offer the advantage of being able to display content designed for both color and monochrome. There are many applications which are designed for monochrome environments. By providing for the handheld computer 100 to display the icons of active input area 110 in monochrome, no special consideration needs to be made to distinguish icons made for color from icons made for monochrome, as both types of icons would be displayed in the active input area 110 in monochrome.
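A common way such monochrome rendering is achieved is a luminance threshold over each pixel; the following sketch uses the standard BT.601 luma weights, with the threshold value itself an assumption:

```python
# Hedged sketch of displaying icons with monochromatic resources: each
# RGB pixel is reduced to black or white by a luminance threshold, so
# icons made for color and icons made for monochrome display uniformly.

def to_monochrome(pixels, threshold=128):
    """pixels: list of (r, g, b) tuples. Returns 1 for light pixels,
    0 for dark, using ITU-R BT.601 luma weights."""
    out = []
    for r, g, b in pixels:
        luma = 0.299 * r + 0.587 * g + 0.114 * b
        out.append(1 if luma >= threshold else 0)
    return out

print(to_monochrome([(255, 255, 255), (0, 0, 0), (200, 30, 30)]))
# [1, 0, 0] -- white maps to light; black and dark red both map to dark
```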
  • While embodiments described with FIGS. [0062] 4A-4C contemplate the use of icons as a type of feature that can be switched from and into the active input area 110, embodiments of the invention may apply to other types of features. For example, handwriting recognition area 112 may be switched out of the active input area 110 in the same manner as the active icons. The handwriting recognition area 112 may be replaced with a digital keyboard, or with a set of icons. Alternatively, the specific type of handwriting recognition area 112 that forms part of the active input area 110 may be selected in a manner such as described with FIGS. 4A-4C. For example, a two-cell version of handwriting recognition area 112 (see FIG. 2B) may be substituted for a triple-cell version (see FIG. 2A) in a manner described above.
  • D. STROKE RECOGNITION ASSISTANCE
  • It is possible for [0063] handheld computer 100, or other computer with a contact-sensitive display, to accept character entry on any location of display surface 124. The acceptance of the character entry may be through display contact mechanisms, such as electronic keyboards and handwriting recognition area. In the case where handwriting recognition is employed, the handheld computer 100 is configured to recognize strokes entered anywhere on display surface 124, where each stroke is immediately recognized as a corresponding character. For example, handheld computer 100 may be configured to recognize certain strokes, such as provided in GRAFFITI and JOT, as characters or commands when those strokes are entered on locations of display surface 124 other than active input area 110. In the case where an electronic keyboard is provided, the electronic keyboard itself may be provided anywhere on the display surface 124. Any taps entered on regions corresponding to keys of the electronic keyboard are recognized as corresponding characters.
  • With either stroke recognition or electronic keyboard entry, some degree of error exists between what is entered by the user and what is interpreted by the [0064] handheld computer 100. The display surfaces 124 are often small, causing the user to miss a key, or not enter a stroke correctly. In the case of handwriting recognition, a user is required to draw the stroke to match one of a set of known strokes. If the user's stroke is off, the handheld computer 100 may recognize the wrong character or command.
  • In an embodiment, [0065] active input area 110 has functionality other than that of receiving input. One embodiment provides that active input area 110 can be used as a visual guide for assisting the user to enter correctly shaped strokes on a remaining portion of display surface 124. For purpose of explanation, the following terminology is used in this application: a glyph is a recognized form of a stroke; and a stroke is what is traced by a user employing an object to make continuous contact (e.g. between a pen-up and a pen-down) with the display surface 124. In one embodiment, immediate handwriting recognition can be performed by matching a stroke to a glyph, and then displaying a character associated with the glyph. U.S. Pat. No. 6,493,464 (hereby incorporated for all purposes in its entirety by this application) describes an immediate handwriting recognition technique using strokes and glyphs.
  • With reference to FIG. 5A, [0066] active input area 110 displays a set of glyphs 552. The region 526 of display surface 124, which excludes active input area 110, is shown as displaying a stroke 554 recently formed by the user. The stroke 554 may have been formed by, for example, the user tracing a shape on the region 526. Since the stroke 554 needs to match a shape of a desired glyph in the set of glyphs 552 in order to be properly recognized, displaying the set of glyphs in the active input area 110 is useful for providing a visual cue for the user. Such an embodiment may be particularly useful in the case where the user is unfamiliar with the particular stroke recognition technique used by the handheld computer 100 (such as GRAFFITI or JOT). Thus, active input area 110 may also serve as a feedback mechanism for providing visual feedback of a user's input operations.
  • According to another embodiment, [0067] active input area 110 provides a visual feedback as to the character that was identified from the stroke 554 that the user entered on the region 526. For example, for stroke 554, active input area 110 may display or somehow indicate simultaneously which character was recognized from that stroke. In FIG. 5B, an indication is shown as to which glyph in the set of glyphs 552 corresponded to the stroke that the user entered. The indication may be in the form of highlighting or shading one glyph that the handheld computer 100 determines to have matched the stroke 554 entered by the user onto the region 526.
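A toy illustration of matching a stroke against the displayed glyph set and identifying which glyph to highlight is sketched below. The direction-sequence matching used here is a stand-in for exposition only; it is not the recognition technique of U.S. Pat. No. 6,493,464, and all names and templates are hypothetical:

```python
# Toy sketch: reduce a stroke to coarse direction codes, then find the
# glyph template with the same direction sequence. In a UI, the matched
# glyph in the active input area would be highlighted as feedback.

def directions(points):
    """Reduce a traced stroke to a sequence of coarse direction codes."""
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        code = ("E" if x1 > x0 else "W" if x1 < x0 else "") + \
               ("S" if y1 > y0 else "N" if y1 < y0 else "")
        if code and (not codes or codes[-1] != code):
            codes.append(code)
    return codes

def match_glyph(stroke, glyph_templates):
    """Return the name of the glyph whose direction sequence matches."""
    seq = directions(stroke)
    for name, template in glyph_templates.items():
        if directions(template) == seq:
            return name
    return None

templates = {
    "L": [(0, 0), (0, 10), (10, 10)],   # down, then right
    "V": [(0, 0), (5, 10), (10, 0)],    # down-right, then up-right
}
print(match_glyph([(2, 1), (2, 8), (9, 8)], templates))  # L
```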
  • E. HANDEDNESS ORIENTATION
  • The manner in which [0068] active input area 110 and other user-interface features are provided on handheld computer 100 may accommodate landscape modes with particular handedness configurations. Specifically, the active input area 110 and other input features can be provided on display surface 124 in a landscape mode, with a particular left handed or right handed orientation.
  • Different handedness configurations can be provided because the construction of [0069] active input area 110 enables flexibility as to how it can be shaped and positioned. Specifically, when active input area 110 is electronically generated, the particular portion of display surface 124 upon which the active input area is displayed can be selected. Simultaneously, resources for detecting contact to display surface 124 may be oriented to recognize the particular forms of contact that correspond to the numerous entries that can be made through the active input area 110. Thus, active input area 110 can be created and recreated with physical characteristics that suit a particular configuration, such as a handedness orientation. In particular, the position, dimension, shape, orientation and even components of active input area 110 are selectable based on orienting all of the features according to a particular handedness.
  • FIGS. [0070] 6A-6C shows how the flexibility in the manner active input area 110 is provided can be used to accommodate various preferences of the user, including left or right handedness of the user in the landscape mode. In FIG. 6A, the handheld computer 100 is shown in a portrait mode, which may be the default configuration of the handheld computer. The display surface 124 is assumed to be rectangular in shape, and the portrait mode corresponds to when the length of the display surface extends in an up-down direction from the perspective of the user. The perspective of the user is shown by the axes X and Y, with the X axis corresponding to what the user views as being the up and down direction. The perspective offered with the axes X and Y is that of the user staring into the paper.
  • With reference to FIG. 6A, [0071] active input area 110 extends a height from a bottom surface 612 of display surface 124. The buttons 130 are provided between the bottom surface 612 of display surface 124 and a bottom edge 616 of the housing 120. Based on convention, active input area 110 may be provided at the bottom portion of display surface 124. The active input area 110 may include active icons 115.
  • FIG. 6B illustrates [0072] handheld computer 100 positioned in a landscape mode, with a left handed orientation. The left handed orientation means that most, if not all, of the user-interface features that require the user to make manual contact with handheld computer 100 are provided on the left-hand side of the handheld computer. The active input area 110 is positioned so that when used by a left-handed person, the person's hand will not block the user's view of the display surface 124. The left-hand orientation may be created by rotating display surface 124 clockwise 90 degrees in the direction of A. When rotated, housing 120 provides the buttons in the top-down configuration, to the left of display surface 124. The active input area 110 may be re-generated to extend in the same manner as in the portrait mode. Thus, active input area 110 extends in a top-bottom direction, as defined by axis X, but adjacent to a left boundary 621 (when viewed in the configuration of FIG. 6B) of the display surface 124.
  • FIG. 6C illustrates [0073] handheld computer 100 positioned in a landscape mode, with a right handed orientation. As with the left handed orientation, most or all of the user-interface features that require the user to make manual contact with handheld computer 100 are provided on the right-hand side of the handheld computer. The active input area 110 is positioned so that when used by a right-handed person, the person's hand will not block the user's view of the display surface 124. The right-hand orientation may be created by rotating display surface 124 counter-clockwise 90 degrees in the direction of B. When rotated, housing 120 provides the buttons in the top-down configuration, to the right of display surface 124. The active input area 110 may be re-generated to extend in the same manner as in the portrait mode. Thus, active input area 110 extends in a top-bottom direction, as defined by axis X, but adjacent to a right boundary 623 (when viewed in the configuration of FIG. 6C) of the display surface 124.
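The re-generation of the active input area for each orientation might be sketched as a simple rectangle computation; the (x, y, w, h) convention and the screen dimensions used in the example are assumptions:

```python
# Illustrative sketch: where the active input area is regenerated for
# each orientation -- along the bottom edge in portrait, along the left
# edge for left-handed landscape, and along the right edge for
# right-handed landscape. Rectangle convention (x, y, w, h) is assumed.

def input_area_rect(orientation, screen_w, screen_h, thickness):
    if orientation == "portrait":
        return (0, screen_h - thickness, screen_w, thickness)
    if orientation == "landscape-left":
        return (0, 0, thickness, screen_h)
    if orientation == "landscape-right":
        return (screen_w - thickness, 0, thickness, screen_h)
    raise ValueError("unknown orientation")

print(input_area_rect("portrait", 160, 240, 40))         # (0, 200, 160, 40)
print(input_area_rect("landscape-left", 240, 160, 40))   # (0, 0, 40, 160)
print(input_area_rect("landscape-right", 240, 160, 40))  # (200, 0, 40, 160)
```

Because the area is electronically generated, the contact-detection resources would be re-mapped to the same rectangle so that strokes and taps are interpreted in the rotated frame.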
  • Among other advantages, [0074] handheld computer 100 can be configured to enable its contact-sensitive display to be viewed and used in a landscape mode with particular attention to the handedness of the user.
  • FIGS. [0075] 7A-7D show some specific examples of display surface 124 accommodating different modes and handedness. FIG. 7A illustrates the portrait mode for display surface 124, with the length of the display surface 124 extending in the top-bottom direction, along the axis Y. In the example provided, active input area 110 displays a set of keys corresponding to special-character and number keys. In FIG. 7B, the active input area 110 is rotated into the right-handed landscape orientation. The same set of keys provided in the active input area 110 in FIG. 7A are now stacked vertically, so that the length of the active input area 110 extends in the direction of the axis Y.
  • FIGS. 7C and 7D illustrate the [0076] active input area 110 with cells that comprise the handwriting recognition area 112. When in the portrait mode, an embodiment provides that the left cell 112A, the right cell 112B and the center cell 112C of the handwriting recognition area 112 are provided to receive strokes as input. In FIG. 7C, the left-handed landscape orientation is shown, with the cell 112A being in the top position within active input area 110, and the cell 112C being in the bottom most position. In the left-handed orientation, the active input area 110 appears to the left of the display surface 124. In FIG. 7D, the right-handed landscape orientation is shown. The right-handed orientation of FIG. 7D mirrors the orientation of active input area 110 in FIG. 7C, except that the active input area appears to the right of the display surface 124.
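  • The cell remapping between portrait and landscape amounts to indexing the same ordered cells along a different axis of the active input area. The sketch below is illustrative only: the function name, the equal-width cell division, and the cell labels are assumptions, not details from the patent.

```python
def cell_for_contact(x: int, y: int,
                     area: tuple[int, int, int, int],
                     cells: list[str],
                     landscape: bool) -> str:
    """Return which handwriting-recognition cell a contact point falls in.

    `area` is (left, top, width, height) of the active input area.
    In portrait mode the cells divide the area left-to-right; in
    landscape mode the same ordered cells are stacked top-to-bottom,
    so the portrait leftmost cell becomes the top cell and the
    portrait rightmost cell becomes the bottom cell.
    """
    left, top, width, height = area
    n = len(cells)
    if landscape:
        index = (y - top) * n // height
    else:
        index = (x - left) * n // width
    # Clamp so contacts on the boundary still resolve to a cell.
    return cells[min(max(index, 0), n - 1)]
```

A stroke routed through this lookup would then be matched against the character set assigned to the returned cell (for example, alphabetic strokes in one cell and numeric strokes in another).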
  • F. COMPONENTS OF A PORTABLE COMPUTER
  • FIG. 8 illustrates the components of a [0077] portable computer 800, under an embodiment of the invention. The portable computer 800 may, for example, correspond to handheld computer 100. In an embodiment, portable computer 800 includes a processor 810, an analog-digital (A/D) converter 820, a set of mechanical buttons 830, a volatile memory 840, a non-volatile memory 845 and a contact-sensitive display assembly 850. A power source 825 may be used to power the various components of the portable computer 800. Another typical component of the portable computer 800 is an expansion port 842; multiple such expansion ports are often provided on such portable computers.
  • The contact-sensitive display assembly [0078] 850 may include a display 852 and a digitizer 854. A display driver 856 may also form part of the display assembly 850. The digitizer 854 may be connected to the A/D converter 820. The digitizer 854 uses analog signals to detect contact with the display 852, and to track the object making the contact as it moves over the display. The A/D converter converts the signals into a digital form for processor 810, which interprets what input is entered by the contact with the display 852. The driver 856 may be coupled to the processor 810 in order to receive signals that are translated into output on the display 852. The output may correspond to content that appears on the display surface 124 in previous embodiments, as well as to the digitally-created active input area 110.
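  • The digitizer-to-processor path can be illustrated with a simple scaling step that maps a raw converted reading to display coordinates. The 10-bit range, the function name, and the clamping behavior below are assumptions made for illustration; the patent does not specify the converter's resolution or interface.

```python
def digitize(raw_x: float, raw_y: float, adc_max: float,
             screen_w: int, screen_h: int) -> tuple[int, int]:
    """Convert raw digitizer readings (0..adc_max, as produced by an
    A/D converter) into integer display coordinates for the processor."""
    px = int(raw_x / adc_max * (screen_w - 1))
    py = int(raw_y / adc_max * (screen_h - 1))
    # Clamp in case of slight over-range readings near the bezel.
    return (min(max(px, 0), screen_w - 1), min(max(py, 0), screen_h - 1))
```

Successive samples passed through such a mapping let the processor track the contacting object as it moves over the display, which is the basis for both stroke recognition and taps on displayed icons.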
  • The [0079] display driver 856 may provide some or all of the monochromatic resources that are used to display icons, representations of the icons, and/or the active input area 110. As mentioned, the monochromatic resources enable the developer to make just one set of icons that work for all applications and all devices, since all such applications and devices can use monochrome, but not all such devices use color.
  • While an embodiment such as described with FIG. 8 provides for a display assembly that is integrated and formed as part of the housing of the [0080] portable computer 800, other embodiments may provide for a portable computer where the contact-sensitive display is remote to the housing of the portable computer, or at least to the housing where the processor 810 is provided. Such an embodiment may provide, for example, a projector that displays the content being provided by the processor 810 onto a surface such as a table. The portable computer 800 may sense the user's interaction with the surface where the projection is provided. Thus, the display surface may be external to the portable computer or its primary housing.
  • G. CONCLUSION
  • In the foregoing specification, the invention has been described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. [0081]

Claims (25)

What is claimed is:
1. A portable computer comprising:
a housing;
a contact-sensitive display supported by the housing, wherein a surface of the display is exposed and viewable on a front of the housing;
a processor coupled to the display and configured to:
provide an active input area on the display comprising a handwriting recognition area, wherein the processor recognizes strokes entered on the handwriting recognition area as characters;
wherein in a portrait orientation, the processor is configured to display the active input area to extend primarily in a left-right direction from a perspective of a user operating the portable computer;
wherein in a landscape orientation, the processor is configured to display the active input area to extend primarily in a top-bottom direction from the perspective of the user;
wherein the processor is configured to detect a user-input to select a handedness for the landscape orientation and, in response to the selection corresponding to a right-handedness orientation, to provide the active input area adjacent to a right boundary of the surface from the perspective of the user, and, in response to the selection corresponding to a left-handedness orientation, to provide the active input area adjacent to a left boundary of the surface from the perspective of the user.
2. The portable computer of claim 1, further comprising a set of one or more buttons, wherein with respect to a portrait orientation, the buttons are provided between a bottom boundary of the surface and a bottom edge of the housing.
3. The portable computer of claim 2, wherein the contact-sensitive display includes a digitizer that is coupled to the processor to detect contact with the display.
4. The portable computer of claim 1, wherein the processor is configured to display one or more icons within the active input area, wherein each of the one or more icons is selectable to cause the processor to perform a function associated with that icon.
5. The portable computer of claim 1, wherein the processor is configured to display the active input area in response to a user-selection to make the active input area appear.
6. The portable computer of claim 1, wherein the processor is configured to display the active input area to cover only a minority portion of the surface.
7. The portable computer of claim 5, wherein the processor is configured to display a persistent task bar on the display, and to provide an item on the task bar that can be selected through contact with the display in order to cause the processor to display the active input area.
8. The portable computer of claim 1, wherein the handwriting recognition area of the active input area is, in the portrait orientation, segmented into at least a left cell and a right cell from the perspective of the user, and wherein in a landscape orientation, the left cell is provided as a top cell, and the right cell is provided as a bottom cell.
9. The portable computer of claim 8, wherein the left cell and the top cell are for recognizing a first set of characters, and the right cell and the bottom cell are for recognizing a second set of characters.
10. The portable computer of claim 8, wherein the left cell and top cell are for recognizing one of alphabet characters or numeric characters, and the right cell and the bottom cell are for recognizing the other of the alphabet characters or numeric characters.
11. The portable computer of claim 1, wherein the active input area is provided adjacent to an application portion of the surface where content from the portable computer executing an application is displayed, so that in the right-handedness orientation, the application portion of the surface appears to the left of the active input area, and so that in the left-handedness orientation, the application portion of the surface appears to the right of the active input area.
12. The portable computer of claim 1, wherein the processor recognizes strokes entered on a first portion of the active input area as alphabetical characters, and the processor recognizes strokes entered on a second portion of the active input area as numeric characters.
13. The portable computer of claim 1, wherein the user-input for selecting the handedness of the portable computer corresponds to a user making contact with the surface.
14. The portable computer of claim 9, wherein the processor immediately recognizes strokes entered on at least a portion of the active input area as characters, and for each stroke, immediately displays a character recognized from the stroke on the application portion of the surface.
15. The portable computer of claim 1, wherein the processor is configured to adjust one or more dimensions of at least a portion of the active input area in response to user-input.
16. The portable computer of claim 8, wherein the handwriting recognition area may be segmented into three or more cells, wherein each cell is designated for receiving strokes from a corresponding set of strokes that is assigned to that particular cell.
17. A method for configuring a portable computer, wherein the portable computer includes a housing, and a contact-sensitive display supported by the housing, wherein a surface of the display is viewable from the housing, wherein the method comprises:
providing an active input area on the surface of the display that includes at least a handwriting recognition area upon which strokes may be entered and recognized as characters;
providing a portrait orientation for the display in which the active input area is displayed to extend primarily widthwise but not lengthwise with respect to a perspective of the user viewing the display;
providing a landscape orientation for the display in which the active input area extends primarily lengthwise but not widthwise with respect to a perspective of the user viewing the display;
detecting a selection of a handedness for the landscape orientation;
in response to the selection corresponding to a right-handedness orientation, providing the active input area adjacent to a right boundary of the surface from a perspective of a user looking at the display; and
in response to the selection corresponding to a left-handedness orientation, providing the active input area adjacent to a left boundary of the surface from a perspective of the user looking at the display.
18. The method of claim 17, further comprising providing one or more active icons as part of the active input area.
19. A portable computer comprising:
a contact-sensitive display including a display surface;
a processor coupled to the display and configured to:
immediately recognize strokes entered through contact onto a handwriting recognition area of the display surface as characters; and
receive a user-input to alter one or more dimensions of the handwriting recognition area.
20. The portable computer of claim 19, wherein the processor is configured to display the handwriting recognition area to the user so that the handwriting recognition area is a delineated portion of what appears on the display surface.
21. The portable computer of claim 20, wherein the user-input includes the user directing an object to make contact with a location of the display surface approximately where a boundary of the handwriting recognition area is provided.
22. The portable computer of claim 21, wherein the user-input further includes the user dragging the object on the display surface a certain distance to indicate a new value for the dimension of the handwriting recognition area being altered.
23. The portable computer of claim 19, wherein the processor is configured to segment the handwriting recognition area into three or more cells, wherein a first cell is designated for recognizing strokes entered onto that cell as a character from a first set of characters, wherein a second cell is designated for recognizing strokes entered onto that cell as a character from a second set of characters, and wherein a third cell is designated for recognizing strokes entered onto that cell as a character from a third set of characters.
24. A portable computer comprising:
a contact-sensitive display including a display surface;
a processor coupled to the display and configured to:
recognize a stroke entered through contact onto any portion of the display surface as a character from a set of characters; and
simultaneously with recognizing a stroke entered through contact, display a glyph that matches the stroke.
25. The portable computer of claim 24, wherein the processor is further configured to immediately display the recognized character along with the glyph that matched the recognized stroke.
US10/452,233 2002-08-26 2003-05-30 User-interface features for computers with contact-sensitive displays Abandoned US20040036680A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US10/452,233 US20040036680A1 (en) 2002-08-26 2003-05-30 User-interface features for computers with contact-sensitive displays
CA002496774A CA2496774A1 (en) 2002-08-26 2003-08-26 User-interface features for computers with contact sensitive displays
AU2003262921A AU2003262921A1 (en) 2002-08-26 2003-08-26 User-interface features for computers with contact sensitive displays
EP03793432A EP1558985A2 (en) 2002-08-26 2003-08-26 User-interface features for computers with contact sensitive displays
PCT/US2003/026869 WO2004019200A2 (en) 2002-08-26 2003-08-26 User-interface features for computers with contact sensitive displays

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US40626402P 2002-08-26 2002-08-26
US10/452,233 US20040036680A1 (en) 2002-08-26 2003-05-30 User-interface features for computers with contact-sensitive displays

Publications (1)

Publication Number Publication Date
US20040036680A1 true US20040036680A1 (en) 2004-02-26

Family

ID=31997669

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/452,233 Abandoned US20040036680A1 (en) 2002-08-26 2003-05-30 User-interface features for computers with contact-sensitive displays
US10/452,232 Active 2025-04-07 US7406666B2 (en) 2002-08-26 2003-05-30 User-interface features for computers with contact-sensitive displays
US12/144,545 Expired - Lifetime US7831934B2 (en) 2002-08-26 2008-06-23 User-interface features for computers with contact-sensitive displays

Family Applications After (2)

Application Number Title Priority Date Filing Date
US10/452,232 Active 2025-04-07 US7406666B2 (en) 2002-08-26 2003-05-30 User-interface features for computers with contact-sensitive displays
US12/144,545 Expired - Lifetime US7831934B2 (en) 2002-08-26 2008-06-23 User-interface features for computers with contact-sensitive displays

Country Status (5)

Country Link
US (3) US20040036680A1 (en)
EP (1) EP1558985A2 (en)
AU (1) AU2003262921A1 (en)
CA (1) CA2496774A1 (en)
WO (1) WO2004019200A2 (en)

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040046791A1 (en) * 2002-08-26 2004-03-11 Mark Davis User-interface features for computers with contact-sensitive displays
US20050219226A1 (en) * 2004-04-02 2005-10-06 Ying Liu Apparatus and method for handwriting recognition
US20070094417A1 (en) * 2005-05-16 2007-04-26 Hur Yong S Mobile terminal having scrolling device and method implementing functions using the same
US20070216643A1 (en) * 2004-06-16 2007-09-20 Morris Robert P Multipurpose Navigation Keys For An Electronic Device
WO2007118019A2 (en) * 2006-04-06 2007-10-18 Motorola, Inc. Method and apparatus for user interface adaptation
US20080001932A1 (en) * 2006-06-30 2008-01-03 Inventec Corporation Mobile communication device
EP1923778A2 (en) * 2006-11-16 2008-05-21 LG Electronics, Inc. Mobile terminal and screen display method thereof
US20080166049A1 (en) * 2004-04-02 2008-07-10 Nokia Corporation Apparatus and Method for Handwriting Recognition
US20080266244A1 (en) * 2007-04-30 2008-10-30 Xiaoping Bai Dual Sided Electrophoretic Display
US20080305837A1 (en) * 2007-06-08 2008-12-11 Inventec Corporation Mobile communication apparatus
US20080316397A1 (en) * 2007-06-22 2008-12-25 Polak Robert D Colored Morphing Apparatus for an Electronic Device
US20090015597A1 (en) * 2000-05-18 2009-01-15 Palm, Inc. Reorienting display on portable computing device
US20090046072A1 (en) * 2007-08-13 2009-02-19 Emig David M Electrically Non-interfering Printing for Electronic Devices Having Capacitive Touch Sensors
US20090161059A1 (en) * 2007-12-19 2009-06-25 Emig David M Field Effect Mode Electro-Optical Device Having a Quasi-Random Photospacer Arrangement
US20090198132A1 (en) * 2007-08-10 2009-08-06 Laurent Pelissier Hand-held ultrasound imaging device having reconfigurable user interface
EP2115555A1 (en) * 2007-02-27 2009-11-11 Motorola, Inc. Adaptable user interface and mechanism for a portable electronic device
US20090300537A1 (en) * 2008-05-27 2009-12-03 Park Kenneth J Method and system for changing format for displaying information on handheld device
US20100110020A1 (en) * 2008-10-31 2010-05-06 Sprint Communications Company L.P. Virtual press number pad
US20100171693A1 (en) * 2009-01-06 2010-07-08 Kenichi Tamura Display control device, display control method, and program
US7859518B1 (en) 2001-06-04 2010-12-28 Palm, Inc. Interface for interaction with display visible from both sides
US20110012926A1 (en) * 2009-07-17 2011-01-20 Apple Inc. Selective rotation of a user interface
US20110029904A1 (en) * 2009-07-30 2011-02-03 Adam Miles Smith Behavior and Appearance of Touch-Optimized User Interface Elements for Controlling Computer Function
US20110105186A1 (en) * 2009-10-29 2011-05-05 Research In Motion Limited Systems and methods for providing direct and indirect navigation modes for touchscreen devices
US8059232B2 (en) 2008-02-08 2011-11-15 Motorola Mobility, Inc. Electronic device and LC shutter for polarization-sensitive switching between transparent and diffusive states
US20130215062A1 (en) * 2008-01-29 2013-08-22 Kyocera Corporation Terminal device with display function
US20130301272A1 (en) * 2012-05-09 2013-11-14 Chan Hee Wang Display device and method for fabricating the same
US20150253891A1 (en) * 2008-01-04 2015-09-10 Apple Inc. Selective rejection of touch contacts in an edge region of a touch surface
US20150309712A1 (en) * 2010-11-26 2015-10-29 Hologic, Inc. User interface for medical image review workstation
US9632608B2 (en) 2008-12-08 2017-04-25 Apple Inc. Selective input signal rejection and modification
US20180121071A1 (en) * 2016-11-03 2018-05-03 Ford Global Technologies, Llc Vehicle display based on vehicle speed
US9971496B2 (en) 2014-08-04 2018-05-15 Google Technology Holdings LLC Method and apparatus for adjusting a graphical user interface on an electronic device
US10282155B2 (en) 2012-01-26 2019-05-07 Google Technology Holdings LLC Portable electronic device and method for controlling operation thereof taking into account which limb possesses the electronic device
US10459608B2 (en) * 2014-12-01 2019-10-29 Ebay Inc. Mobile optimized shopping comparison
US11307743B2 (en) * 2017-11-07 2022-04-19 Samsung Electronics Co., Ltd. Method, electronic device and storage medium for providing mode switching
US11314391B2 (en) * 2017-09-08 2022-04-26 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Navigation bar controlling method and terminal
US11379060B2 (en) 2004-08-25 2022-07-05 Apple Inc. Wide touchpad on a portable computer
US11403483B2 (en) 2017-06-20 2022-08-02 Hologic, Inc. Dynamic self-learning medical image method and system
US11406332B2 (en) 2011-03-08 2022-08-09 Hologic, Inc. System and method for dual energy and/or contrast enhanced breast imaging for screening, diagnosis and biopsy
US11419565B2 2022-08-23 Hologic, Inc. System and method for generating and displaying tomosynthesis image slabs
US11445993B2 (en) 2017-03-30 2022-09-20 Hologic, Inc. System and method for targeted object enhancement to generate synthetic breast tissue images
US11455754B2 (en) 2017-03-30 2022-09-27 Hologic, Inc. System and method for synthesizing low-dimensional image data from high-dimensional image data using an object grid enhancement
US11452486B2 (en) 2006-02-15 2022-09-27 Hologic, Inc. Breast biopsy and needle localization using tomosynthesis systems
US11508340B2 (en) 2011-11-27 2022-11-22 Hologic, Inc. System and method for generating a 2D image using mammography and/or tomosynthesis image data
US11550466B2 (en) 2012-08-27 2023-01-10 Samsung Electronics Co., Ltd. Method of controlling a list scroll bar and an electronic device using the same
US11589944B2 (en) 2013-03-15 2023-02-28 Hologic, Inc. Tomosynthesis-guided biopsy apparatus and method
US11663780B2 (en) 2012-02-13 2023-05-30 Hologic Inc. System and method for navigating a tomosynthesis stack using synthesized image data
US11701199B2 (en) 2009-10-08 2023-07-18 Hologic, Inc. Needle breast biopsy system and method of use

Families Citing this family (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8095879B2 (en) * 2002-12-10 2012-01-10 Neonode Inc. User interface for mobile handheld computer unit
US7886236B2 (en) * 2003-03-28 2011-02-08 Microsoft Corporation Dynamic feedback for gestures
US7496385B2 (en) * 2003-12-29 2009-02-24 International Business Machines Corporation Method for viewing information underlying lists and other contexts
US7895537B2 (en) * 2003-12-29 2011-02-22 International Business Machines Corporation Method and apparatus for setting attributes and initiating actions through gestures
DE102004013415B4 (en) * 2004-03-18 2011-12-08 Disetronic Licensing Ag Rotatable display of a medical, pharmaceutical or cosmetic device
US7454174B2 (en) * 2004-08-03 2008-11-18 Qualcomm, Incorporated Estimation of received signal strength
US20060227100A1 (en) * 2005-03-30 2006-10-12 Yu Kun Mobile communication terminal and method
US7509588B2 (en) 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
KR20070113022A (en) * 2006-05-24 2007-11-28 엘지전자 주식회사 Apparatus and operating method of touch screen responds to user input
KR101327581B1 (en) * 2006-05-24 2013-11-12 엘지전자 주식회사 Apparatus and Operating method of touch screen
TWI328185B (en) * 2006-04-19 2010-08-01 Lg Electronics Inc Touch screen device for potable terminal and method of displaying and selecting menus thereon
KR20070113025A (en) * 2006-05-24 2007-11-28 엘지전자 주식회사 Apparatus and operating method of touch screen
KR101269375B1 (en) * 2006-05-24 2013-05-29 엘지전자 주식회사 Touch screen apparatus and Imige displaying method of touch screen
KR20070113018A (en) * 2006-05-24 2007-11-28 엘지전자 주식회사 Apparatus and operating method of touch screen
US20090213086A1 (en) * 2006-04-19 2009-08-27 Ji Suk Chae Touch screen device and operating method thereof
TW200805131A (en) * 2006-05-24 2008-01-16 Lg Electronics Inc Touch screen device and method of selecting files thereon
TW200744352A (en) * 2006-05-26 2007-12-01 Benq Corp Mobile communication devices and methods for displaying menu thereof
US20070295540A1 (en) * 2006-06-23 2007-12-27 Nurmi Mikko A Device feature activation
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US8161395B2 (en) * 2006-11-13 2012-04-17 Cisco Technology, Inc. Method for secure data entry in an application
US8120584B2 (en) * 2006-12-21 2012-02-21 Cypress Semiconductor Corporation Feedback mechanism for user detection of reference location on a sensing device
US8519964B2 (en) 2007-01-07 2013-08-27 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US7999789B2 (en) * 2007-03-14 2011-08-16 Computime, Ltd. Electrical device with a selected orientation for operation
US20090019188A1 (en) * 2007-07-11 2009-01-15 Igt Processing input for computing systems based on the state of execution
KR101365595B1 (en) * 2007-08-16 2014-02-21 삼성전자주식회사 Method for inputting of device containing display unit based on GUI and apparatus thereof
US8619038B2 (en) 2007-09-04 2013-12-31 Apple Inc. Editing interface
KR20090029138A (en) * 2007-09-17 2009-03-20 삼성전자주식회사 The method of inputting user command by gesture and the multimedia apparatus thereof
KR101499546B1 (en) * 2008-01-17 2015-03-09 삼성전자주식회사 Method and apparatus for controlling display area in touch screen device, and computer readable medium thereof
US8769427B2 (en) 2008-09-19 2014-07-01 Google Inc. Quick gesture input
US8508475B2 (en) * 2008-10-24 2013-08-13 Microsoft Corporation User interface elements positioned for display
US8423916B2 (en) * 2008-11-20 2013-04-16 Canon Kabushiki Kaisha Information processing apparatus, processing method thereof, and computer-readable storage medium
US20100138781A1 (en) * 2008-11-30 2010-06-03 Nokia Corporation Phonebook arrangement
KR20100067381A (en) * 2008-12-11 2010-06-21 삼성전자주식회사 Method for providing a physical user interface and a portable terminal therefor
JP5353345B2 (en) * 2009-03-18 2013-11-27 株式会社リコー Information processing apparatus, display processing method, and program
US8587532B2 (en) 2009-12-18 2013-11-19 Intel Corporation Multi-feature interactive touch user interface
US8423911B2 (en) 2010-04-07 2013-04-16 Apple Inc. Device, method, and graphical user interface for managing folders
US10788976B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US8531417B2 (en) 2010-09-02 2013-09-10 Blackberry Limited Location of a touch-sensitive control method and apparatus
JP5075975B2 (en) * 2010-12-27 2012-11-21 株式会社東芝 Information processing apparatus, information processing method, and program
US8610682B1 (en) * 2011-02-17 2013-12-17 Google Inc. Restricted carousel with built-in gesture customization
KR101898202B1 (en) * 2012-02-09 2018-09-12 삼성전자주식회사 Apparatus and method for guiding writing input for recognation of writing
EP2631738B1 (en) * 2012-02-24 2016-04-13 BlackBerry Limited Method and apparatus for adjusting a user interface to reduce obscuration
EP2631747B1 (en) 2012-02-24 2016-03-30 BlackBerry Limited Method and apparatus for providing a user interface on a device that indicates content operators
EP2631760A1 (en) 2012-02-24 2013-08-28 Research In Motion Limited Method and apparatus for providing a user interface on a device enabling selection of operations to be performed in relation to content
CN103376972A (en) * 2012-04-12 2013-10-30 环达电脑(上海)有限公司 Electronic device and control method of touch control screen of electronic device
US20140184519A1 (en) * 2012-12-28 2014-07-03 Hayat Benchenaa Adapting user interface based on handedness of use of mobile computing device
US20140331154A1 (en) * 2013-05-05 2014-11-06 Carrier Corporation User defined interface system and a method for using the same
WO2014192125A1 (en) * 2013-05-30 2014-12-04 株式会社 東芝 Electronic device and processing method
EP3063608B1 (en) 2013-10-30 2020-02-12 Apple Inc. Displaying relevant user interface objects
US10066959B2 (en) 2014-09-02 2018-09-04 Apple Inc. User interactions for a mapping application
EP3254452B1 (en) 2015-02-02 2018-12-26 Apple Inc. Device, method, and graphical user interface for establishing a relationship and connection between two devices
DK201770423A1 (en) 2016-06-11 2018-01-15 Apple Inc Activity and workout updates
DK201670595A1 (en) 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
US11816325B2 (en) * 2016-06-12 2023-11-14 Apple Inc. Application shortcuts for carplay
US11675476B2 (en) 2019-05-05 2023-06-13 Apple Inc. User interfaces for widgets
US11863700B2 (en) * 2019-05-06 2024-01-02 Apple Inc. Providing user interfaces based on use contexts and managing playback of media

Citations (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5252951A (en) * 1989-04-28 1993-10-12 International Business Machines Corporation Graphical user interface with gesture recognition in a multiapplication environment
US5276794A (en) * 1990-09-25 1994-01-04 Grid Systems Corporation Pop-up keyboard system for entering handwritten data into computer generated forms
US5347295A (en) * 1990-10-31 1994-09-13 Go Corporation Control of a computer through a position-sensed stylus
US5432720A (en) * 1992-11-13 1995-07-11 International Business Machines Corporation Rotatable pen-based computer
US5483261A (en) * 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
US5502461A (en) * 1993-05-11 1996-03-26 Sanyo Electric Co., Ltd. Hand written character input system/allowing change of size of character writing frames
US5596697A (en) * 1993-09-30 1997-01-21 Apple Computer, Inc. Method for routing items within a computer system
US5621438A (en) * 1992-10-12 1997-04-15 Hitachi, Ltd. Pointing information processing apparatus with pointing function
US5644737A (en) * 1995-06-06 1997-07-01 Microsoft Corporation Method and system for stacking toolbars in a computer display
US5731801A (en) * 1994-03-31 1998-03-24 Wacom Co., Ltd. Two-handed method of displaying information on a computer display
Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5889888A (en) * 1996-12-05 1999-03-30 3Com Corporation Method and apparatus for immediate response handwriting recognition system that handles multiple character sets
CN1217255C (en) 1999-12-28 2005-08-31 索尼株式会社 Electronic device with display function

Patent Citations (75)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5252951A (en) * 1989-04-28 1993-10-12 International Business Machines Corporation Graphical user interface with gesture recognition in a multiapplication environment
US5276794A (en) * 1990-09-25 1994-01-04 Grid Systems Corporation Pop-up keyboard system for entering handwritten data into computer generated forms
US5347295A (en) * 1990-10-31 1994-09-13 Go Corporation Control of a computer through a position-sensed stylus
US6597347B1 (en) * 1991-11-26 2003-07-22 Itu Research Inc. Methods and apparatus for providing touch-sensitive input in multiple degrees of freedom
US5483261A (en) * 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
US6097392A (en) * 1992-09-10 2000-08-01 Microsoft Corporation Method and system of altering an attribute of a graphic object in a pen environment
US5936619A (en) * 1992-09-11 1999-08-10 Canon Kabushiki Kaisha Information processor
US5621438A (en) * 1992-10-12 1997-04-15 Hitachi, Ltd. Pointing information processing apparatus with pointing function
US5566098A (en) * 1992-11-13 1996-10-15 International Business Machines Corporation Rotatable pen-based computer with automatically reorienting display
US5432720A (en) * 1992-11-13 1995-07-11 International Business Machines Corporation Rotatable pen-based computer
US5502461A (en) * 1993-05-11 1996-03-26 Sanyo Electric Co., Ltd. Hand written character input system/allowing change of size of character writing frames
US5596697A (en) * 1993-09-30 1997-01-21 Apple Computer, Inc. Method for routing items within a computer system
US6525749B1 (en) * 1993-12-30 2003-02-25 Xerox Corporation Apparatus and method for supporting the implicit structure of freeform lists, outlines, text, tables and diagrams in a gesture-based input system and editing system
US5731801A (en) * 1994-03-31 1998-03-24 Wacom Co., Ltd. Two-handed method of displaying information on a computer display
US6493464B1 (en) * 1994-07-01 2002-12-10 Palm, Inc. Multiple pen stroke character set and handwriting recognition system with immediate response
US5805167A (en) * 1994-09-22 1998-09-08 Van Cruyningen; Izak Popup menus with directional gestures
US5757371A (en) * 1994-12-13 1998-05-26 Microsoft Corporation Taskbar with start menu
US5760773A (en) * 1995-01-06 1998-06-02 Microsoft Corporation Methods and apparatus for interacting with data objects using action handles
US5736974A (en) * 1995-02-17 1998-04-07 International Business Machines Corporation Method and apparatus for improving visibility and selectability of icons
US5644737A (en) * 1995-06-06 1997-07-01 Microsoft Corporation Method and system for stacking toolbars in a computer display
US5745718A (en) * 1995-07-31 1998-04-28 International Business Machines Corporation Folder bar widget
US20010002126A1 (en) * 1995-12-01 2001-05-31 Immersion Corporation Providing force feedback to a user of an interface device based on interactions of a user-controlled cursor in a graphical user interface
US5801699A (en) * 1996-01-26 1998-09-01 International Business Machines Corporation Icon aggregation on a graphical user interface
US5859623A (en) * 1996-05-14 1999-01-12 Proxima Corporation Intelligent display system presentation projection arrangement and method of using same
US6067584A (en) * 1996-09-09 2000-05-23 National Instruments Corporation Attribute-based system and method for configuring and controlling a data acquisition task
US5828376A (en) * 1996-09-23 1998-10-27 J. D. Edwards World Source Company Menu control in a graphical user interface
US5940488A (en) * 1996-11-15 1999-08-17 Active Voice Corporation Telecommunication management system and user interface
US6300946B1 (en) * 1997-01-29 2001-10-09 Palm, Inc. Method and apparatus for interacting with a portable computer
US6057836A (en) * 1997-04-01 2000-05-02 Microsoft Corporation System and method for resizing and rearranging a composite toolbar by direct manipulation
US6069623A (en) * 1997-09-19 2000-05-30 International Business Machines Corporation Method and system for the dynamic customization of graphical user interface elements
US6096094A (en) * 1997-10-03 2000-08-01 National Instruments Corporation Configuration manager for configuring a data acquisition system
US6037937A (en) * 1997-12-04 2000-03-14 Nortel Networks Corporation Navigation tool for graphical user interface
US6310610B1 (en) * 1997-12-04 2001-10-30 Nortel Networks Limited Intelligent touch display
US6018346A (en) * 1998-01-12 2000-01-25 Xerox Corporation Freeform graphics system having meeting objects for supporting meeting objectives
US5973664A (en) * 1998-03-19 1999-10-26 Portrait Displays, Inc. Parameterized image orientation for computer displays
US6181344B1 (en) * 1998-03-20 2001-01-30 Nuvomedia, Inc. Drag-and-release method for configuring user-definable function key of hand-held computing device
US6133915A (en) * 1998-06-17 2000-10-17 Microsoft Corporation System and method for customizing controls on a toolbar
US20030050906A1 (en) * 1998-08-26 2003-03-13 Gervase Clifton-Bligh Methods and devices for mapping data files
US7030888B1 (en) * 1999-03-01 2006-04-18 Eastman Kodak Company Color processing
US6545669B1 (en) * 1999-03-26 2003-04-08 Husam Kinawi Object-drag continuity between discontinuous touch-screens
US6346972B1 (en) * 1999-05-26 2002-02-12 Samsung Electronics Co., Ltd. Video display apparatus with on-screen display pivoting function
US20020173721A1 (en) * 1999-08-20 2002-11-21 Novasonics, Inc. User interface for handheld imaging devices
US20030013959A1 (en) * 1999-08-20 2003-01-16 Sorin Grunwald User interface for handheld imaging devices
US7185274B1 (en) * 1999-12-07 2007-02-27 Microsoft Corporation Computer user interface architecture wherein users interact with both content and user interface by activating links
US20020091700A1 (en) * 2000-01-21 2002-07-11 Steele Robert A. Unique architecture for handheld computers
US6683600B1 (en) * 2000-04-19 2004-01-27 Microsoft Corporation Adaptive input pen mode selection
US20010038394A1 (en) * 2000-05-08 2001-11-08 Tadao Tsuchimura Information display system having graphical user interface, and medium
US20020033836A1 (en) * 2000-06-06 2002-03-21 Smith Scott R. Device and method for changing the orientation and configuration of a display of an electronic device
US20020021278A1 (en) * 2000-07-17 2002-02-21 Hinckley Kenneth P. Method and apparatus using multiple sensors in a device with a display
US20020015064A1 (en) * 2000-08-07 2002-02-07 Robotham John S. Gesture-based user interface to multi-level and multi-modal sets of bit-maps
US6683623B1 (en) * 2000-08-30 2004-01-27 New Forum Publishers System and method for providing and accessing educational information over a computer network
US20020078037A1 (en) * 2000-10-12 2002-06-20 Mitsuyuki Hatanaka Information processing apparatus and method, and program storing medium
US6624831B1 (en) * 2000-10-17 2003-09-23 Microsoft Corporation System and process for generating a dynamically adjustable toolbar
US20020059350A1 (en) * 2000-11-10 2002-05-16 Marieke Iwema Insertion point bungee space tool
US20020113784A1 (en) * 2000-12-29 2002-08-22 Feilmeier Michael Leon Portable computer aided design apparatus and method
US7076738B2 (en) * 2001-03-02 2006-07-11 Semantic Compaction Systems Computer device, method and article of manufacture for utilizing sequenced symbols to enable programmed application and commands
US20020163544A1 (en) * 2001-03-02 2002-11-07 Baker Bruce R. Computer device, method and article of manufacture for utilizing sequenced symbols to enable programmed application and commands
US20030221167A1 (en) * 2001-04-25 2003-11-27 Eric Goldstein System, method and apparatus for selecting, displaying, managing, tracking and transferring access to content of web pages and other sources
US20020188636A1 (en) * 2001-05-02 2002-12-12 Peck David K. System and method for in-line editing of web-based documents
US20050024341A1 (en) * 2001-05-16 2005-02-03 Synaptics, Inc. Touch screen with user interface enhancement
US20060048058A1 (en) * 2001-05-25 2006-03-02 Learning Tree International System and method for electronic presentations
US20040113935A1 (en) * 2001-05-25 2004-06-17 O'neal David System and method for electronic presentations
US20030016850A1 (en) * 2001-07-17 2003-01-23 Leon Kaufman Systems and graphical user interface for analyzing body images
US20030018245A1 (en) * 2001-07-17 2003-01-23 Accuimage Diagnostics Corp. Methods for generating a lung report
US7015894B2 (en) * 2001-09-28 2006-03-21 Ricoh Company, Ltd. Information input and output system, method, storage medium, and carrier wave
US7231208B2 (en) * 2001-10-17 2007-06-12 Palm, Inc. User interface-technique for managing an active call
US20030073430A1 (en) * 2001-10-17 2003-04-17 Palm, Inc. User interface-technique for managing an active call
US6952203B2 (en) * 2002-01-08 2005-10-04 International Business Machines Corporation Touchscreen user interface: Bluetooth™ stylus for performing right mouse clicks
US7190351B1 (en) * 2002-05-10 2007-03-13 Michael Goren System and method for data input
US20040032413A1 (en) * 2002-08-13 2004-02-19 Fuller David W. Multiple views for a measurement system diagram
US7831934B2 (en) * 2002-08-26 2010-11-09 Palm, Inc. User-interface features for computers with contact-sensitive displays
US20070128899A1 (en) * 2003-01-12 2007-06-07 Yaron Mayer System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows
US20080177994A1 (en) * 2003-01-12 2008-07-24 Yaron Mayer System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows
US20040174398A1 (en) * 2003-03-04 2004-09-09 Microsoft Corporation System and method for navigating a graphical user interface on a smaller display
US20070203906A1 (en) * 2003-09-22 2007-08-30 Cone Julian M Enhanced Search Engine

Cited By (88)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090015597A1 (en) * 2000-05-18 2009-01-15 Palm, Inc. Reorienting display on portable computing device
US8031212B2 (en) 2000-05-18 2011-10-04 Hewlett-Packard Development Company, L.P. Reorienting display on portable computing device
US7859518B1 (en) 2001-06-04 2010-12-28 Palm, Inc. Interface for interaction with display visible from both sides
US7831934B2 (en) * 2002-08-26 2010-11-09 Palm, Inc. User-interface features for computers with contact-sensitive displays
US7406666B2 (en) 2002-08-26 2008-07-29 Palm, Inc. User-interface features for computers with contact-sensitive displays
US20040046791A1 (en) * 2002-08-26 2004-03-11 Mark Davis User-interface features for computers with contact-sensitive displays
US20090007025A1 (en) * 2002-08-26 2009-01-01 Mark Davis User-interface features for computers with contact-sensitive displays
US20080166049A1 (en) * 2004-04-02 2008-07-10 Nokia Corporation Apparatus and Method for Handwriting Recognition
US8094938B2 (en) * 2004-04-02 2012-01-10 Nokia Corporation Apparatus and method for handwriting recognition
US7580029B2 (en) * 2004-04-02 2009-08-25 Nokia Corporation Apparatus and method for handwriting recognition
US20050219226A1 (en) * 2004-04-02 2005-10-06 Ying Liu Apparatus and method for handwriting recognition
US20070216643A1 (en) * 2004-06-16 2007-09-20 Morris Robert P Multipurpose Navigation Keys For An Electronic Device
US11379060B2 (en) 2004-08-25 2022-07-05 Apple Inc. Wide touchpad on a portable computer
US20070094417A1 (en) * 2005-05-16 2007-04-26 Hur Yong S Mobile terminal having scrolling device and method implementing functions using the same
US11918389B2 (en) 2006-02-15 2024-03-05 Hologic, Inc. Breast biopsy and needle localization using tomosynthesis systems
US11452486B2 (en) 2006-02-15 2022-09-27 Hologic, Inc. Breast biopsy and needle localization using tomosynthesis systems
WO2007118019A2 (en) * 2006-04-06 2007-10-18 Motorola, Inc. Method and apparatus for user interface adaptation
US10048860B2 (en) 2006-04-06 2018-08-14 Google Technology Holdings LLC Method and apparatus for user interface adaptation
WO2007118019A3 (en) * 2006-04-06 2008-04-24 Motorola Inc Method and apparatus for user interface adaptation
US20080001932A1 (en) * 2006-06-30 2008-01-03 Inventec Corporation Mobile communication device
EP1923778A3 (en) * 2006-11-16 2010-04-07 LG Electronics, Inc. Mobile terminal and screen display method thereof
US8217904B2 (en) 2006-11-16 2012-07-10 Lg Electronics Inc. Mobile terminal and screen display method thereof
EP1923778A2 (en) * 2006-11-16 2008-05-21 LG Electronics, Inc. Mobile terminal and screen display method thereof
US20080119237A1 (en) * 2006-11-16 2008-05-22 Lg Electronics Inc. Mobile terminal and screen display method thereof
EP2163970A2 (en) * 2007-02-27 2010-03-17 Motorola, Inc. Adaptable user interface and mechanism for a portable electronic device
EP2115555A1 (en) * 2007-02-27 2009-11-11 Motorola, Inc. Adaptable user interface and mechanism for a portable electronic device
EP2163970A3 (en) * 2007-02-27 2010-06-09 Motorola, Inc. Adaptable user interface and mechanism for a portable electronic device
EP2115555A4 (en) * 2007-02-27 2010-06-09 Motorola Inc Adaptable user interface and mechanism for a portable electronic device
US8902152B2 (en) 2007-04-30 2014-12-02 Motorola Mobility Llc Dual sided electrophoretic display
US20080266244A1 (en) * 2007-04-30 2008-10-30 Xiaoping Bai Dual Sided Electrophoretic Display
US20080305837A1 (en) * 2007-06-08 2008-12-11 Inventec Corporation Mobile communication apparatus
US7969414B2 (en) * 2007-06-08 2011-06-28 Inventec Corporation Mobile communication apparatus
US20090231283A1 (en) * 2007-06-22 2009-09-17 Polak Robert D Colored Morphing Apparatus for an Electronic Device
US9122092B2 (en) 2007-06-22 2015-09-01 Google Technology Holdings LLC Colored morphing apparatus for an electronic device
US8957863B2 (en) 2007-06-22 2015-02-17 Google Technology Holdings LLC Colored morphing apparatus for an electronic device
US20080316397A1 (en) * 2007-06-22 2008-12-25 Polak Robert D Colored Morphing Apparatus for an Electronic Device
US20090198132A1 (en) * 2007-08-10 2009-08-06 Laurent Pelissier Hand-held ultrasound imaging device having reconfigurable user interface
US20090046072A1 (en) * 2007-08-13 2009-02-19 Emig David M Electrically Non-interfering Printing for Electronic Devices Having Capacitive Touch Sensors
US8077154B2 (en) 2007-08-13 2011-12-13 Motorola Mobility, Inc. Electrically non-interfering printing for electronic devices having capacitive touch sensors
US20090161059A1 (en) * 2007-12-19 2009-06-25 Emig David M Field Effect Mode Electro-Optical Device Having a Quasi-Random Photospacer Arrangement
US8139195B2 (en) 2007-12-19 2012-03-20 Motorola Mobility, Inc. Field effect mode electro-optical device having a quasi-random photospacer arrangement
US10747428B2 (en) * 2008-01-04 2020-08-18 Apple Inc. Selective rejection of touch contacts in an edge region of a touch surface
US9891732B2 (en) * 2008-01-04 2018-02-13 Apple Inc. Selective rejection of touch contacts in an edge region of a touch surface
US11449224B2 (en) 2008-01-04 2022-09-20 Apple Inc. Selective rejection of touch contacts in an edge region of a touch surface
US20150253891A1 (en) * 2008-01-04 2015-09-10 Apple Inc. Selective rejection of touch contacts in an edge region of a touch surface
US20220391086A1 (en) * 2008-01-04 2022-12-08 Apple Inc. Selective rejection of touch contacts in an edge region of a touch surface
US11886699B2 (en) * 2008-01-04 2024-01-30 Apple Inc. Selective rejection of touch contacts in an edge region of a touch surface
US8797270B2 (en) * 2008-01-29 2014-08-05 Kyocera Corporation Terminal device with display function
US9013439B2 (en) 2008-01-29 2015-04-21 Kyocera Corporation Terminal device with display function
US20130215062A1 (en) * 2008-01-29 2013-08-22 Kyocera Corporation Terminal device with display function
US9477338B2 (en) 2008-01-29 2016-10-25 Kyocera Corporation Terminal device with display function
US8059232B2 (en) 2008-02-08 2011-11-15 Motorola Mobility, Inc. Electronic device and LC shutter for polarization-sensitive switching between transparent and diffusive states
US20090300537A1 (en) * 2008-05-27 2009-12-03 Park Kenneth J Method and system for changing format for displaying information on handheld device
US20100110020A1 (en) * 2008-10-31 2010-05-06 Sprint Communications Company L.P. Virtual press number pad
US9632608B2 (en) 2008-12-08 2017-04-25 Apple Inc. Selective input signal rejection and modification
US10452174B2 (en) 2008-12-08 2019-10-22 Apple Inc. Selective input signal rejection and modification
US20100171693A1 (en) * 2009-01-06 2010-07-08 Kenichi Tamura Display control device, display control method, and program
US9053652B2 (en) * 2009-01-06 2015-06-09 Sony Corporation Display control device, display control method, and program
US9766788B2 (en) 2009-07-17 2017-09-19 Apple Inc. Selective rotation of a user interface
US8817048B2 (en) * 2009-07-17 2014-08-26 Apple Inc. Selective rotation of a user interface
US20110012926A1 (en) * 2009-07-17 2011-01-20 Apple Inc. Selective rotation of a user interface
US20110029904A1 (en) * 2009-07-30 2011-02-03 Adam Miles Smith Behavior and Appearance of Touch-Optimized User Interface Elements for Controlling Computer Function
US11701199B2 (en) 2009-10-08 2023-07-18 Hologic, Inc. Needle breast biopsy system and method of use
US20110105186A1 (en) * 2009-10-29 2011-05-05 Research In Motion Limited Systems and methods for providing direct and indirect navigation modes for touchscreen devices
US10444960B2 (en) * 2010-11-26 2019-10-15 Hologic, Inc. User interface for medical image review workstation
US11775156B2 (en) 2010-11-26 2023-10-03 Hologic, Inc. User interface for medical image review workstation
US20150309712A1 (en) * 2010-11-26 2015-10-29 Hologic, Inc. User interface for medical image review workstation
US11406332B2 (en) 2011-03-08 2022-08-09 Hologic, Inc. System and method for dual energy and/or contrast enhanced breast imaging for screening, diagnosis and biopsy
US11837197B2 (en) 2011-11-27 2023-12-05 Hologic, Inc. System and method for generating a 2D image using mammography and/or tomosynthesis image data
US11508340B2 (en) 2011-11-27 2022-11-22 Hologic, Inc. System and method for generating a 2D image using mammography and/or tomosynthesis image data
US10282155B2 (en) 2012-01-26 2019-05-07 Google Technology Holdings LLC Portable electronic device and method for controlling operation thereof taking into account which limb possesses the electronic device
US11663780B2 (en) 2012-02-13 2023-05-30 Hologic Inc. System and method for navigating a tomosynthesis stack using synthesized image data
US9022611B2 (en) * 2012-05-09 2015-05-05 Samsung Display Co., Ltd. Display device and method for fabricating the same
US20130301272A1 (en) * 2012-05-09 2013-11-14 Chan Hee Wang Display device and method for fabricating the same
US11550466B2 (en) 2012-08-27 2023-01-10 Samsung Electronics Co., Ltd. Method of controlling a list scroll bar and an electronic device using the same
US11589944B2 (en) 2013-03-15 2023-02-28 Hologic, Inc. Tomosynthesis-guided biopsy apparatus and method
US11419565B2 (en) 2014-02-28 2022-08-23 Hologic, Inc. System and method for generating and displaying tomosynthesis image slabs
US11801025B2 (en) 2014-02-28 2023-10-31 Hologic, Inc. System and method for generating and displaying tomosynthesis image slabs
US9971496B2 (en) 2014-08-04 2018-05-15 Google Technology Holdings LLC Method and apparatus for adjusting a graphical user interface on an electronic device
US10459608B2 (en) * 2014-12-01 2019-10-29 Ebay Inc. Mobile optimized shopping comparison
US11366572B2 (en) 2014-12-01 2022-06-21 Ebay Inc. Mobile optimized shopping comparison
US20180121071A1 (en) * 2016-11-03 2018-05-03 Ford Global Technologies, Llc Vehicle display based on vehicle speed
US11445993B2 (en) 2017-03-30 2022-09-20 Hologic, Inc. System and method for targeted object enhancement to generate synthetic breast tissue images
US11455754B2 (en) 2017-03-30 2022-09-27 Hologic, Inc. System and method for synthesizing low-dimensional image data from high-dimensional image data using an object grid enhancement
US11403483B2 (en) 2017-06-20 2022-08-02 Hologic, Inc. Dynamic self-learning medical image method and system
US11850021B2 (en) 2017-06-20 2023-12-26 Hologic, Inc. Dynamic self-learning medical image method and system
US11314391B2 (en) * 2017-09-08 2022-04-26 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Navigation bar controlling method and terminal
US11307743B2 (en) * 2017-11-07 2022-04-19 Samsung Electronics Co., Ltd. Method, electronic device and storage medium for providing mode switching

Also Published As

Publication number Publication date
US7406666B2 (en) 2008-07-29
US20040046791A1 (en) 2004-03-11
AU2003262921A8 (en) 2004-03-11
US7831934B2 (en) 2010-11-09
CA2496774A1 (en) 2004-03-04
WO2004019200A2 (en) 2004-03-04
WO2004019200A3 (en) 2005-02-17
AU2003262921A1 (en) 2004-03-11
EP1558985A2 (en) 2005-08-03
US20090007025A1 (en) 2009-01-01

Similar Documents

Publication Publication Date Title
US7406666B2 (en) User-interface features for computers with contact-sensitive displays
US7966573B2 (en) Method and system for improving interaction with a user interface
US9239673B2 (en) Gesturing with a multipoint sensing device
EP1979804B1 (en) Gesturing with a multipoint sensing device
US9292111B2 (en) Gesturing with a multipoint sensing device
US7898529B2 (en) User interface having a placement and layout suitable for pen-based computers
US7802202B2 (en) Computer interaction based upon a currently active input device
US7644372B2 (en) Area frequency radial menus
US20140123049A1 (en) Keyboard with gesture-redundant keys removed
EP2003538A1 (en) Method for operating user interface and recording medium for storing program applying the same
US20050162402A1 (en) Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback
JP2010521022A (en) Virtual keyboard input system using a pointing device used in digital equipment
US7479943B1 (en) Variable template input area for a data input device of a handheld electronic system
US7571384B1 (en) Method and system for handwriting recognition with scrolling input history and in-place editing
WO2014043275A1 (en) Gesturing with a multipoint sensing device
AU2016238971B2 (en) Gesturing with a multipoint sensing device
AU2014201419B2 (en) Gesturing with a multipoint sensing device

Legal Events

Date Code Title Description
AS Assignment

Owner name: PALM, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DAVIS, MARK;BERNOULLI, CARLO;REEL/FRAME:014143/0898

Effective date: 20030530

AS Assignment

Owner name: JP MORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT

Free format text: SECURITY AGREEMENT;ASSIGNOR:PALM, INC.;REEL/FRAME:020113/0788

Effective date: 20071024

AS Assignment

Owner name: PALM, INC., CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:JPMORGAN CHASE BANK, N.A., AS ADMINISTRATIVE AGENT;REEL/FRAME:024630/0474

Effective date: 20100701

AS Assignment

Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALM, INC.;REEL/FRAME:025204/0809

Effective date: 20101027

AS Assignment

Owner name: PALM, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;REEL/FRAME:030341/0459

Effective date: 20130430

AS Assignment

Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALM, INC.;REEL/FRAME:031837/0659

Effective date: 20131218

Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PALM, INC.;REEL/FRAME:031837/0239

Effective date: 20131218

Owner name: PALM, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;REEL/FRAME:031837/0544

Effective date: 20131218

AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEWLETT-PACKARD COMPANY;HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;PALM, INC.;REEL/FRAME:032177/0210

Effective date: 20140123

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION