WO2008086305A2 - Swapping user-interface objects by drag-and-drop finger gestures on a touch screen display - Google Patents


Info

Publication number
WO2008086305A2
Authority
WO
WIPO (PCT)
Prior art keywords
user interface
interface object
touch screen
finger
screen display
Prior art date
Application number
PCT/US2008/050430
Other languages
French (fr)
Other versions
WO2008086305A3 (en)
Inventor
Andrew Emilio Platzer
Charles J. Pisula
Imran Chaudhri
Steven P. Jobs
Greg Christie
Scott Forstall
Stephen O. Lemay
Michael Matas
Gregory Novick
Marcel Van Os
Original Assignee
Apple Inc.
Priority date
Filing date
Publication date
Application filed by Apple Inc.
Publication of WO2008086305A2
Publication of WO2008086305A3


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1656 Details related to functional adaptations of the enclosure, e.g. to provide protection against EMI, shock, water, or to host detachable peripherals like a mouse or removable expansions units like PCMCIA cards, or to provide access to internal components for maintenance or to removable storage supports like CDs or DVDs, or to mechanically mount accessories
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72436 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for text messaging, e.g. SMS or e-mail
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72466 User interfaces specially adapted for cordless or mobile telephones with selection means, e.g. keys, having functions defined by the mode or the status of the device
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/161 Indexing scheme relating to constructional details of the monitor
    • G06F2200/1614 Image rotation following screen orientation, e.g. switching from landscape to portrait mode
    • H04M1/66 Substation equipment, e.g. for use by subscribers with means for preventing unauthorised or fraudulent calling
    • H04M1/667 Preventing unauthorised calls from a telephone set
    • H04M1/67 Preventing unauthorised calls from a telephone set by electronic means
    • H04M1/72442 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for playing music files
    • H04M1/72469 User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H04M2250/22 Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Definitions

  • the disclosed embodiments relate generally to portable electronic devices, and more particularly, to portable devices that support user navigation of graphical objects on a touch screen display.
  • the device has a touch-sensitive display (also known as a "touch screen") with a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions.
  • the user interacts with the GUI primarily through finger contacts and gestures on the touch-sensitive display.
  • the functions may include telephoning, video conferencing, e-mailing, instant messaging, blogging, digital photographing, digital videoing, web browsing, digital music playing, and/or digital video playing. Instructions for performing these functions may be included in a computer program product configured for execution by one or more processors.
  • One aspect of the invention involves a computer-implemented method performed by a portable multifunction device with a touch screen display.
  • in response to a finger-down event on the touch screen display, the device identifies a first user interface object at which the finger-down event occurs.
  • in response to one or more finger-dragging events on the touch screen display, the device then moves the first user interface object on the touch screen display in accordance with the finger-dragging events.
  • in response to a finger-up event on the touch screen display, the device identifies a second user interface object at which the finger-up event occurs and visually replaces the second user interface object with the first user interface object.
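A minimal sketch of this finger-down / finger-dragging / finger-up sequence is given below. The Swift code is illustrative only: the type names (SwapController, UIObject), the hit-testing rule, and the swap-by-frame behaviour are assumptions made for exposition, not details taken from the patent or from any Apple API.

```swift
import Foundation

// Hypothetical sketch of the finger-event handling described above.
struct Point { var x: Double; var y: Double }

struct UIObject {
    let id: String
    var frame: (origin: Point, width: Double, height: Double)

    func contains(_ p: Point) -> Bool {
        p.x >= frame.origin.x && p.x <= frame.origin.x + frame.width &&
        p.y >= frame.origin.y && p.y <= frame.origin.y + frame.height
    }
}

final class SwapController {
    var objects: [UIObject]
    private var draggedIndex: Int?
    private var homeFrame: (origin: Point, width: Double, height: Double)?

    init(objects: [UIObject]) { self.objects = objects }

    // Finger-down: identify the first user interface object under the finger.
    func fingerDown(at p: Point) {
        draggedIndex = objects.firstIndex { $0.contains(p) }
        homeFrame = draggedIndex.map { objects[$0].frame }
    }

    // Finger-dragging: move the first object in accordance with the dragging events.
    func fingerDragged(to p: Point) {
        guard let i = draggedIndex else { return }
        objects[i].frame.origin = Point(x: p.x - objects[i].frame.width / 2,
                                        y: p.y - objects[i].frame.height / 2)
    }

    // Finger-up: identify the second object under the finger and visually replace
    // it with the first object (here, the two objects simply exchange positions).
    func fingerUp(at p: Point) {
        guard let i = draggedIndex, let home = homeFrame else { return }
        draggedIndex = nil
        homeFrame = nil
        if let j = objects.indices.first(where: { $0 != i && objects[$0].contains(p) }) {
            objects[i].frame = objects[j].frame   // first object takes the second object's slot
            objects[j].frame = home               // second object moves into the vacated slot
        } else {
            objects[i].frame = home               // no target under the finger: snap back
        }
    }
}
```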
  • Another aspect of the invention involves a computer-implemented method performed by a portable multifunction device with a touch screen display.
  • the device displays a first user interface object and a second user interface object on the touch screen display.
  • the device moves the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object.
  • the device visually replaces the second user interface object with the first user interface object.
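This variant keys the replacement on partial overlap of the two objects rather than on the finger-up position alone. A small sketch of such an overlap test follows; the Rect type and the replacementTarget helper are hypothetical names used only for illustration.

```swift
import Foundation

// Illustrative overlap test: the first object is dragged until its rectangle
// intersects, at least in part, the rectangle of a second object.
struct Rect {
    var x: Double, y: Double, width: Double, height: Double

    func overlaps(_ other: Rect) -> Bool {
        x < other.x + other.width  && other.x < x + width &&
        y < other.y + other.height && other.y < y + height
    }
}

/// Returns the index of the first object whose rectangle the dragged rectangle
/// overlaps, i.e. the candidate to be visually replaced on the finger-up event.
func replacementTarget(dragged: Rect, draggedIndex: Int, in frames: [Rect]) -> Int? {
    frames.indices.first { $0 != draggedIndex && dragged.overlaps(frames[$0]) }
}
```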
  • Another aspect of the invention involves a portable electronic device with a touch screen display for displaying a plurality of user interface objects.
  • the device includes one or more processors, memory, and a program stored in the memory and configured to be executed by the one or more processors.
  • the program includes: instructions for displaying a first user interface object and a second user interface object on the touch screen display; instructions for detecting a finger-down event at the first user interface object; instructions for detecting one or more finger-dragging events on the touch screen display; instructions for moving the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object; instructions for detecting a finger-up event at the second user interface object; and instructions for visually replacing the second user interface object with the first user interface object.
  • Another aspect of the invention involves a computer readable storage medium that stores one or more programs.
  • the one or more programs include instructions, which when executed by the device, cause the device to: display a first user interface object and a second user interface object on the touch screen display; detect a finger-down event at the first user interface object; detect one or more finger-dragging events on the touch screen display; move the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object; detect a finger-up event at the second user interface object; and visually replace the second user interface object with the first user interface object.
  • Another aspect of the invention involves a portable electronic device with a touch screen display with a plurality of user interface objects.
  • the device includes: means for displaying a first user interface object and a second user interface object on the touch screen display; means for detecting a finger-down event at the first user interface object; means for detecting one or more finger-dragging events on the touch screen display; means for moving the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object; means for detecting a finger-up event at the second user interface object; and means for visually replacing the second user interface object with the first user interface object.
  • Another aspect of the invention involves a computer-implemented method performed by a portable multifunction device with a touch screen display.
  • the device displays a series of ratings indicia on the touch screen display.
  • the series of ratings indicia include a lowest rating indicia and one or more progressively higher rating indicia.
  • the device determines a last rating indicia contacted by the finger gesture immediately prior to the finger breaking contact with the touch screen display. A rating corresponding to the last rating indicia contacted by the finger gesture is used as input to a function or application in the device.
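One way to realise this rating gesture is sketched below. The RatingsBar type, the horizontal star layout, and the coordinate values are assumptions for illustration, not details taken from the patent.

```swift
import Foundation

// A minimal sketch of the ratings gesture: the rating is taken from the last
// indicia the finger touches before breaking contact with the display.
struct RatingsBar {
    let count: Int          // number of indicia, e.g. 5 stars
    let originX: Double     // x coordinate where the first indicia begins
    let indiciaWidth: Double

    /// Maps a finger position to the indicia under it (1-based), or nil if the
    /// finger is outside the bar.
    func indicia(atX x: Double) -> Int? {
        guard x >= originX else { return nil }
        let index = Int((x - originX) / indiciaWidth) + 1
        return index <= count ? index : nil
    }
}

let bar = RatingsBar(count: 5, originX: 20, indiciaWidth: 40)
var lastContacted: Int?                       // updated on every finger-dragging event

for fingerX in [25.0, 70.0, 135.0] {          // simulated swipe across the bar
    if let hit = bar.indicia(atX: fingerX) { lastContacted = hit }
}
// On the finger-up event, the last contacted indicia (here 3) is used as the
// rating input to a function or application in the device.
print("rating:", lastContacted ?? 0)
```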
  • Another aspect of the invention involves a graphical user interface on a portable electronic device with a touch screen display.
  • the graphical user interface includes a series of ratings indicia on the touch screen display.
  • the ratings indicia include a lowest rating indicia and one or more progressively higher rating indicia.
  • the graphical user interface displays on the touch screen a graphical object using as input the last rating indicia contacted by the finger gesture before the finger gesture breaks contact with the touch screen display.
  • the device includes one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the processors.
  • the programs include: instructions for displaying a series of ratings indicia on the touch screen display, wherein the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; instructions for detecting a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display; and instructions for using a rating corresponding to the last rating indicia contacted by the finger gesture as input to a function or application in the device.
  • Another aspect of the invention involves a computer readable storage medium that stores one or more programs.
  • the one or more programs include instructions, which when executed by a portable multifunction device with a touch screen display, cause the device to: display a series of ratings indicia on the touch screen display, wherein the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; detect a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display; and use a rating corresponding to the last rating indicia contacted by the finger gesture as input to a function or application in the device.
  • Another aspect of the invention involves a portable electronic device with a touch screen display.
  • the device includes: means for displaying a series of ratings indicia on the touch screen display, wherein the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; means for detecting a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display; and means for using a rating corresponding to the last rating indicia contacted by the finger gesture as input to a function or application in the device.
  • FIGS. 1A and 1B are block diagrams illustrating portable multifunction devices with touch-sensitive displays in accordance with some embodiments.
  • Figure 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments.
  • Figure 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments.
  • Figures 4A and 4B illustrate exemplary user interfaces for a menu of applications on a portable multifunction device in accordance with some embodiments.
  • Figure 5 is a flow diagram illustrating a first process for swapping first and second user interface objects in accordance with some embodiments.
  • Figure 6 is a flow diagram illustrating a second process for swapping first and second user interface objects in accordance with some embodiments.
  • Figure 7 is a flow diagram illustrating a third process for displaying a ratings icon using as input a finger swipe gesture on the touch screen display in accordance with some embodiments.
  • Figures 8A through 8I illustrate exemplary user interfaces for a music and video player in accordance with some embodiments.
  • Figures 9A-9P illustrate exemplary user interfaces for an online video application for a portable multifunction device in accordance with some embodiments.
  • the terms first, second, etc. may be used herein to describe various elements, but these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first gesture could be termed a second gesture, and, similarly, a second gesture could be termed a first gesture, without departing from the scope of the present invention.
  • the device is a portable communications device such as a mobile telephone that also contains other functions, such as PDA and/or music player functions.
  • the user interface may include a physical click wheel in addition to a touch screen or a virtual click wheel displayed on the touch screen.
  • a click wheel is a user-interface device that may provide navigation commands based on an angular displacement of the wheel or a point of contact with the wheel by a user of the device.
  • a click wheel may also be used to provide a user command corresponding to selection of one or more items, for example, when the user of the device presses down on at least a portion of the wheel or the center of the wheel.
  • breaking contact with a click wheel image on a touch screen surface may indicate a user command corresponding to selection.
  • a portable multifunction device that includes a touch screen is used as an exemplary embodiment.
  • the device supports a variety of applications, such as a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a blogging application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
  • the various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch screen.
  • One or more functions of the touch screen as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application.
  • a common physical architecture (such as the touch screen) of the device may support the variety of applications with user interfaces that are intuitive and transparent.
  • the user interfaces may include one or more soft keyboard embodiments.
  • the soft keyboard embodiments may include standard (QWERTY) and/or non-standard configurations of symbols on the displayed icons of the keyboard, such as those described in U.S. Patent Applications 11/459,606, "Keyboards For Portable Electronic Devices," filed July 24, 2006, and 11/459,615, “Touch Screen Keyboards For Portable Electronic Devices," filed July 24, 2006, the contents of which are hereby incorporated by reference in their entirety.
  • the keyboard embodiments may include a reduced number of icons (or soft keys) relative to the number of keys in existing physical keyboards, such as that for a typewriter. This may make it easier for users to select one or more icons in the keyboard, and thus, one or more corresponding symbols.
  • the keyboard embodiments may be adaptive. For example, displayed icons may be modified in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols.
  • One or more applications on the portable device may utilize common and/or different keyboard embodiments. Thus, the keyboard embodiment used may be tailored to at least some of the applications. In some embodiments, one or more keyboard embodiments may be tailored to a respective user.
  • one or more keyboard embodiments may be tailored to a respective user based on a word usage history (lexicography, slang, individual usage) of the respective user. Some of the keyboard embodiments may be adjusted to reduce a probability of a user error when selecting one or more icons, and thus one or more symbols, when using the soft keyboard embodiments.
  • [0035] Attention is now directed towards embodiments of the device. Figures 1A and 1B are block diagrams illustrating portable multifunction devices 100 with touch-sensitive displays 112 in accordance with some embodiments.
  • the touch-sensitive display 112 is sometimes called a "touch screen" for convenience, and may also be known as or called a touch-sensitive display system.
  • the device 100 may include a memory 102 (which may include one or more computer readable storage mediums), a memory controller 122, one or more processing units (CPU's) 120, a peripherals interface 118, RF circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (I/O) subsystem 106, other input or control devices 116, and an external port 124.
  • the device 100 may include one or more optical sensors 164. These components may communicate over one or more communication buses or signal lines 103.
  • the device 100 is only one example of a portable multifunction device 100, and the device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components.
  • the various components shown in Figures 1A and 1B may be implemented in hardware, software or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • Memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of the device 100, such as the CPU 120 and the peripherals interface 118, may be controlled by the memory controller 122.
  • the peripherals interface 118 couples the input and output peripherals of the device to the CPU 120 and memory 102.
  • the one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for the device 100 and to process data.
  • the peripherals interface 118, the CPU 120, and the memory controller 122 may be implemented on a single chip, such as a chip 104. In some other embodiments, they may be implemented on separate chips.
  • the RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals.
  • the RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals.
  • the RF circuitry 108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth.
  • the RF circuitry 108 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication.
  • the wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
  • the audio circuitry 110, the speaker 111, and the microphone 113 provide an audio interface between a user and the device 100.
  • the audio circuitry 110 receives audio data from the peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 111.
  • the speaker 111 converts the electrical signal to human-audible sound waves.
  • the audio circuitry 110 also receives electrical signals converted by the microphone 113 from sound waves.
  • the audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 118 for processing. Audio data may be retrieved from and/or transmitted to memory 102 and/or the RF circuitry 108 by the peripherals interface 118.
  • the audio circuitry 110 also includes a headset jack (e.g. 212, Figure 2).
  • the headset jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
  • the I/O subsystem 106 couples input/output peripherals on the device 100, such as the touch screen 112 and other input/control devices 116, to the peripherals interface 118.
  • the I/O subsystem 106 may include a display controller 156 and one or more input controllers 160 for other input or control devices.
  • the one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116.
  • the other input/control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth.
  • input controller(s) 160 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse.
  • the one or more buttons (e.g., 208, Figure 2) may include an up/down button for volume control of the speaker 111 and/or the microphone 113.
  • the one or more buttons may include a push button (e.g., 206, Figure 2). A quick press of the push button may disengage a lock of the touch screen 112 or begin a process that uses gestures on the touch screen to unlock the device, as described in U.S. Patent Application 11/322,549, "Unlocking a Device by Performing Gestures on an Unlock Image," which is hereby incorporated by reference in its entirety.
  • a longer press of the push button (e.g., 206) may turn power to the device 100 on or off.
  • the user may be able to customize a functionality of one or more of the buttons.
  • the touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
  • the touch-sensitive touch screen 112 provides an input interface and an output interface between the device and a user.
  • the display controller 156 receives and/or sends electrical signals from/to the touch screen 112.
  • the touch screen 112 displays visual output to the user.
  • the visual output may include graphics, text, icons, video, and any combination thereof (collectively termed "graphics"). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.
  • a touch screen 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact.
  • the touch screen 112 and the display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on the touch screen 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen.
  • a point of contact between a touch screen 112 and the user corresponds to a finger of the user.
  • the touch screen 112 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments.
  • the touch screen 112 and the display controller 156 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen 112.
  • a touch-sensitive display in some embodiments of the touch screen 112 may be analogous to the multi-touch sensitive tablets described in the following U.S. Patents: 6,323,846 (Westerman et al.), 6,570,557 (Westerman et al.), and/or 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety.
  • a touch screen 112 displays visual output from the portable device 100, whereas touch sensitive tablets do not provide visual output.
  • a touch-sensitive display in some embodiments of the touch screen 112 may be as described in the following applications: (1) U.S. Patent Application No. 11/381,313, “Multipoint Touch Surface Controller,” filed May 2, 2006; (2) U.S. Patent Application No. 10/840,862, “Multipoint Touchscreen,” filed May 6, 2004; (3) U.S. Patent Application No.
  • the touch screen 112 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen has a resolution of approximately 160 dpi.
  • the user may make contact with the touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth.
  • the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen.
  • the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
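The patent does not spell out how this translation is performed; one plausible scheme, sketched here purely for illustration, is to collapse the contact patch into a pressure-weighted centroid. The SensorSample type and the weighting rule are assumptions.

```swift
import Foundation

// Illustrative only: a rough finger contact patch is reduced to a single
// precise point by a pressure-weighted centroid of the touched sensor cells.
struct SensorSample {
    let x: Double, y: Double   // cell centre in screen coordinates
    let pressure: Double       // relative signal strength of the cell
}

func precisePoint(from patch: [SensorSample]) -> (x: Double, y: Double)? {
    let total = patch.reduce(0.0) { $0 + $1.pressure }
    guard total > 0 else { return nil }
    return (x: patch.reduce(0.0) { $0 + $1.x * $1.pressure } / total,
            y: patch.reduce(0.0) { $0 + $1.y * $1.pressure } / total)
}

// A finger covering several cells collapses to one cursor position.
let patch = [SensorSample(x: 100, y: 200, pressure: 0.2),
             SensorSample(x: 104, y: 201, pressure: 0.6),
             SensorSample(x: 108, y: 203, pressure: 0.2)]
if let p = precisePoint(from: patch) { print(p) }   // roughly (104, 201)
```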
  • the device 100 may include a touchpad (not shown) for activating or deactivating particular functions.
  • the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output.
  • the touchpad may be a touch-sensitive surface that is separate from the touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
  • the device 100 may include a physical or virtual click wheel as an input control device 116.
  • a user may navigate among and interact with one or more graphical objects (henceforth referred to as icons) displayed in the touch screen 112 by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel).
  • the click wheel may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel or an associated button.
  • User commands and navigation commands provided by the user via the click wheel may be processed by an input controller 160 as well as one or more of the modules and/or sets of instructions in memory 102.
  • the click wheel and click wheel controller may be part of the touch screen 112 and the display controller 156, respectively.
  • the click wheel may be either an opaque or semitransparent object that appears and disappears on the touch screen display in response to user interaction with the device.
  • a virtual click wheel is displayed on the touch screen of a portable multifunction device and operated by user contact with the touch screen.
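A rough sketch of turning successive points of contact on a (virtual) click wheel into an angular displacement is shown below. The ClickWheel type, the coordinates, and the 8-degrees-per-step figure are illustrative assumptions only, not behaviour specified by the patent.

```swift
import Foundation

// Angular displacement of a contact point around the click wheel centre.
struct ClickWheel {
    let center: (x: Double, y: Double)

    /// Angle of a contact point around the wheel centre, in degrees.
    func angle(of p: (x: Double, y: Double)) -> Double {
        atan2(p.y - center.y, p.x - center.x) * 180 / Double.pi
    }

    /// Signed angular displacement between two contact points, normalised to
    /// (-180, 180] so that crossing the +/-180 degree boundary does not jump.
    func displacement(from a: (x: Double, y: Double), to b: (x: Double, y: Double)) -> Double {
        var d = angle(of: b) - angle(of: a)
        if d > 180 { d -= 360 } else if d <= -180 { d += 360 }
        return d
    }
}

let wheel = ClickWheel(center: (x: 160, y: 320))
let delta = wheel.displacement(from: (x: 200, y: 320), to: (x: 160, y: 360))
print("navigation steps:", Int(delta / 8))   // e.g. one navigation command per 8 degrees
```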
  • the device 100 also includes a power system 162 for powering the various components.
  • the power system 162 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
  • the device 100 may also include one or more optical sensors 164.
  • Figures IA and IB show an optical sensor coupled to an optical sensor controller 158 in I/O subsystem 106.
  • the optical sensor 164 may include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors.
  • the optical sensor 164 receives light from the environment, projected through one or more lens, and converts the light to data representing an image.
  • in conjunction with an imaging module 143 (also called a camera module), the optical sensor 164 may capture still images or video.
  • an optical sensor is located on the back of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display may be used as a viewfinder for either still and/or video image acquisition.
  • an optical sensor is located on the front of the device so that the user's image may be obtained for videoconferencing while the user views the other video conference participants on the touch screen display.
  • the position of the optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 may be used along with the touch screen display for both video conferencing and still and/or video image acquisition.
  • the device 100 may also include one or more proximity sensors 166.
  • Figures 1A and 1B show a proximity sensor 166 coupled to the peripherals interface 118.
  • the proximity sensor 166 may be coupled to an input controller 160 in the I/O subsystem 106.
  • the proximity sensor 166 may perform as described in U.S. Patent Application Nos.
  • the proximity sensor turns off and disables the touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call). In some embodiments, the proximity sensor keeps the screen off when the device is in the user's pocket, purse, or other dark area to prevent unnecessary battery drainage when the device is in a locked state.
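A compact sketch of such a screen-off policy is shown below. Combining the proximity reading with an ambient-light reading, the 0.05 threshold, and the ProximityPolicy name are assumptions made for illustration only.

```swift
import Foundation

// Illustrative policy: blank the display at the ear during a call, or when the
// locked device is covered in a dark place such as a pocket or purse.
struct ProximityPolicy {
    let isOnCall: Bool
    let objectIsNear: Bool        // from the proximity sensor
    let ambientLight: Double      // normalised 0 (dark) ... 1 (bright)
    let isLocked: Bool

    var shouldDisableTouchScreen: Bool {
        (isOnCall && objectIsNear) ||                       // held to the ear during a call
        (isLocked && objectIsNear && ambientLight < 0.05)   // covered in a pocket or purse
    }
}

print(ProximityPolicy(isOnCall: true,  objectIsNear: true,  ambientLight: 0.3, isLocked: false).shouldDisableTouchScreen) // true
print(ProximityPolicy(isOnCall: false, objectIsNear: true,  ambientLight: 0.0, isLocked: true).shouldDisableTouchScreen)  // true
print(ProximityPolicy(isOnCall: false, objectIsNear: false, ambientLight: 0.8, isLocked: false).shouldDisableTouchScreen) // false
```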
  • the device 100 may also include one or more accelerometers 168.
  • Figures 1A and 1B show an accelerometer 168 coupled to the peripherals interface 118.
  • the accelerometer 168 may be coupled to an input controller 160 in the I/O subsystem 106.
  • the accelerometer 168 may perform as described in U.S. Patent Publication No. 20050190059, "Acceleration-based Theft Detection System for Portable Electronic Devices," and U.S. Patent Publication No. 20060017692, "Methods And Apparatuses For Operating A Portable Device Based On An Accelerometer," both of which are incorporated herein by reference in their entirety.
  • information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers.
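For example, the portrait/landscape decision might be derived from the gravity components reported by the accelerometer, as in the sketch below. The axis convention and the 0.5 g threshold are assumptions for illustration only.

```swift
import Foundation

// Sketch of deriving portrait vs. landscape from accelerometer data.
enum Orientation { case portrait, landscape }

/// x and y are the components of gravity along the screen axes, in g.
func orientation(x: Double, y: Double, current: Orientation) -> Orientation {
    if abs(y) > 0.5 && abs(y) > abs(x) { return .portrait }
    if abs(x) > 0.5 && abs(x) > abs(y) { return .landscape }
    return current   // device roughly flat: keep the current orientation
}

print(orientation(x: 0.05, y: -0.98, current: .portrait))   // portrait (held upright)
print(orientation(x: 0.97, y: -0.10, current: .portrait))   // landscape (turned on its side)
```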
  • the software components stored in memory 102 may include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, a text input module (or set of instructions) 134, a Global Positioning System (GPS) module (or set of instructions) 135, and applications (or set of instructions) 136.
  • the operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
  • the communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by the RF circuitry 108 and/or the external port 124.
  • the external port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.).
  • the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used on iPod (trademark of Apple Computer, Inc.) devices.
  • the contact/motion module 130 may detect contact with the touch screen 112
  • the contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen 112, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., "multitouch"/multiple finger contacts).
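The speed, velocity, and acceleration mentioned above can be estimated from successive timestamped contact positions, for example by finite differences as in this illustrative sketch; ContactSample and ContactMotion are hypothetical names, not part of the patent or of any API.

```swift
import Foundation

// Finite-difference estimates of the motion of the point of contact.
struct ContactSample { let t: Double; let x: Double; let y: Double }

struct ContactMotion {
    let velocity: (dx: Double, dy: Double)       // points per second
    let speed: Double                            // magnitude of the velocity
    let acceleration: (dx: Double, dy: Double)   // change in velocity per second
}

func motion(_ a: ContactSample, _ b: ContactSample, _ c: ContactSample) -> ContactMotion {
    let v1 = (dx: (b.x - a.x) / (b.t - a.t), dy: (b.y - a.y) / (b.t - a.t))
    let v2 = (dx: (c.x - b.x) / (c.t - b.t), dy: (c.y - b.y) / (c.t - b.t))
    let dt = (c.t - a.t) / 2                     // time between the two velocity estimates
    return ContactMotion(
        velocity: v2,
        speed: (v2.dx * v2.dx + v2.dy * v2.dy).squareRoot(),
        acceleration: (dx: (v2.dx - v1.dx) / dt, dy: (v2.dy - v1.dy) / dt))
}

// Three finger-dragging events 10 ms apart, speeding up to the right.
let m = motion(ContactSample(t: 0.00, x: 0, y: 0),
               ContactSample(t: 0.01, x: 2, y: 0),
               ContactSample(t: 0.02, x: 6, y: 0))
print(m.speed, m.acceleration)   // 400 points/s, accelerating along x
```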
  • the contact/motion module 130 and the display controller 156 also detect contact on a touchpad.
  • the contact/motion module 130 and the controller 160 detect contact on a click wheel.
  • the graphics module 132 includes various known software components for rendering and displaying graphics on the touch screen 112, including components for changing the intensity of graphics that are displayed.
  • graphics includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
  • the text input module 134, which may be a component of graphics module 132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, blogging 142, browser 147, and any other application that needs text input).
  • the GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 and/or blogger 142 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
  • the applications 136 may include the following modules (or sets of instructions), or a subset or superset thereof:
  • a contacts module 137 (sometimes called an address book or contact list);
  • a camera module 143 for still and/or video images
  • widget modules 149 which may include weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6;
  • widget creator module 150 for making user-created widgets 149-6;
  • search module 151;
  • map module 154;
  • Examples of other applications 136 that may be stored in memory 102 include other word processing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
  • the contacts module 137 may be used to manage an address book or contact list, including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141; and so forth.
  • the telephone module 138 may be used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in the address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the conversation is completed.
  • the wireless communication may use any of a plurality of communications standards, protocols and technologies.
  • the videoconferencing module 139 may be used to initiate, conduct, and terminate a video conference between a user and one or more other participants.
  • the e-mail client module 140 may be used to create, send, receive, and manage e-mail.
  • the e-mail module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
  • the instant messaging module 141 may be used to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages and to view received instant messages.
  • transmitted and/or received instant messages may include graphics, photos, audio files, video files and/or other attachments as are supported in a MMS and/or an Enhanced Messaging Service (EMS).
  • instant messaging refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS).
  • the blogging module 142 may be used to send text, still images, video, and/or other graphics to a blog (e.g., the user's blog).
  • the camera module 143 may be used to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102.
  • the image management module 144 may be used to arrange, modify or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
  • the video player module 145 may be used to display, present or otherwise play back videos (e.g., on the touch screen or on an external, connected display via external port 124).
  • the music player module 146 allows the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files.
  • the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.).
  • the browser module 147 may be used to browse the Internet, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
  • the calendar module 148 may be used to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.).
  • the widget modules 149 are mini-applications that may be downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user- created widget 149-6).
  • a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file.
  • a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets).
  • the widget creator module 150 may be used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).
  • search module 151 may be used to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms).
  • the notes module 153 may be used to create and manage notes, to do lists, and the like.
  • the map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data).
  • the online video module 155 allows the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264.
  • in some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video.
  • Each of the modules and applications identified above corresponds to a set of instructions for performing one or more functions described above.
  • These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various embodiments.
  • For example, video player module 145 may be combined with music player module 146 into a single module (e.g., video and music player module 152, Figure 1B).
  • memory 102 may store a subset of the modules and data structures identified above. Furthermore, memory 102 may store additional modules and data structures not described above.
  • the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen 112 and/or a touchpad.
  • By using a touch screen and/or a touchpad as the primary input/control device for operation of the device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced.
  • the predefined set of functions that may be performed exclusively through a touch screen and/or a touchpad include navigation between user interfaces.
  • the touchpad, when touched by the user, navigates the device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100.
  • the touchpad may be referred to as a "menu button.”
  • the menu button may be a physical push button or other physical input/control device instead of a touchpad.
  • Figure 2 illustrates a portable multifunction device 100 having a touch screen 112 in accordance with some embodiments.
  • the touch screen may display one or more graphics within user interface (UI) 200.
  • a user may select one or more of the graphics by making contact or touching the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure).
  • selection of one or more graphics occurs when the user breaks contact with the one or more graphics.
  • the contact may include a gesture, such as one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with the device 100.
  • inadvertent contact with a graphic may not select the graphic. For example, a swipe gesture that sweeps over an application icon may not select the corresponding application when the gesture corresponding to selection is a tap.
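To make the selection rule above concrete, here is a brief Swift sketch. It is not taken from the patent; the type names, thresholds, and coordinates are illustrative assumptions. A contact is classified as a tap or a swipe from its total movement and duration, so a swipe that merely sweeps over an application icon is not treated as a selecting tap.

    import Foundation

    struct Contact {
        let start: (x: Double, y: Double)
        let end: (x: Double, y: Double)
        let duration: TimeInterval
    }

    enum Gesture { case tap, swipe, other }

    // Thresholds are illustrative assumptions, not values taken from the patent.
    func classify(_ c: Contact,
                  maxTapMovement: Double = 10.0,
                  maxTapDuration: TimeInterval = 0.3) -> Gesture {
        let dx = c.end.x - c.start.x
        let dy = c.end.y - c.start.y
        let distance = (dx * dx + dy * dy).squareRoot()
        if distance <= maxTapMovement && c.duration <= maxTapDuration { return .tap }
        if distance > maxTapMovement { return .swipe }
        return .other
    }

    // A swipe that happens to pass over an application icon does not select it:
    func shouldSelectIcon(underContact c: Contact) -> Bool {
        return classify(c) == .tap
    }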
  • the device 100 may also include one or more physical buttons, such as a "home" or menu button 204.
  • menu button 204 may be used to navigate to any application 136 in a set of applications that may be executed on the device 100.
  • the menu button is implemented as a soft key in a GUI in touch screen 112.
  • the device 100 includes a touch screen 112, a menu button 204, a push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, a Subscriber Identity Module (SIM) card slot 210, a headset jack 212, and a docking/charging external port 124.
  • the push button 206 may be used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process.
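A minimal Swift sketch of that push-button behavior follows. The 0.8-second interval is a placeholder assumption; the patent does not specify the predefined time. The action taken depends only on how long the button is held before release.

    import Foundation

    enum ButtonAction { case lockOrStartUnlock, togglePower }

    // Map the duration of a physical button press to the behavior described above.
    func action(forPressDuration duration: TimeInterval,
                predefinedInterval: TimeInterval = 0.8) -> ButtonAction {
        return duration >= predefinedInterval ? .togglePower : .lockOrStartUnlock
    }

    print(action(forPressDuration: 0.2))  // lockOrStartUnlock: quick press
    print(action(forPressDuration: 2.0))  // togglePower: press and hold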
  • the device 100 also may accept verbal input for activation or deactivation of some functions through the microphone 113.
  • Figure 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments.
  • user interface 300 includes the following elements, or a subset or superset thereof:
  • Unlock image 302 that is moved with a finger gesture to unlock the device
  • the device detects contact with the touch-sensitive display (e.g., a user's finger making contact on or near the unlock image 302) while the device is in a user-interface lock state.
  • the device moves the unlock image 302 in accordance with the contact.
  • the device transitions to a user-interface unlock state if the detected contact corresponds to a predefined gesture, such as moving the unlock image across channel 306.
  • the device maintains the user-interface lock state if the detected contact does not correspond to the predefined gesture.
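The unlock behavior just described can be modeled in a few lines of Swift. This is a hedged sketch under assumed geometry (the channel width and the type name are placeholders, not values from the patent): the unlock image tracks the finger within the channel, and lifting the finger unlocks the device only if the image has been dragged across the full channel.

    struct SlideToUnlock {
        let channelMinX: Double = 0      // left end of the channel (assumed)
        let channelMaxX: Double = 280    // right end of the channel (assumed)
        var imageX: Double = 0           // current position of the unlock image

        // Move the unlock image in accordance with the contact, clamped to the channel.
        mutating func fingerMoved(toX x: Double) {
            imageX = min(max(x, channelMinX), channelMaxX)
        }

        // On finger-up: unlock only if the image was dragged across the channel;
        // otherwise keep the lock state and snap the image back.
        mutating func fingerLifted() -> Bool {
            if imageX >= channelMaxX { return true }  // transition to unlock state
            imageX = channelMinX                      // maintain lock state
            return false
        }
    }

    var slider = SlideToUnlock()
    slider.fingerMoved(toX: 150)
    print(slider.fingerLifted())  // false: the gesture did not cross the channel
    slider.fingerMoved(toX: 300)
    print(slider.fingerLifted())  // true: the device unlocks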
  • Figures 4A and 4B illustrate exemplary user interfaces for a menu of applications on a portable multifunction device in accordance with some embodiments.
  • user interface 400A includes the following elements, or a subset or superset thereof:
  • Signal strength indicator(s) 402 for wireless communication(s), such as cellular and Wi-Fi signals;
  • Tray 408 with icons for frequently used applications, such as one or more of the following:
    o Phone 138, which may include an indicator 414 of the number of missed calls or voicemail messages;
    o E-mail client 140, which may include an indicator 410 of the number of unread e-mails;
    o Browser 147; and
    o Music player 146; and
  • Icons for other applications, such as one or more of the following:
    o IM 141;
    o Image management 144;
    o Camera 143;
    o Video player 145;
    o Weather 149-1;
    o Stocks 149-2;
    o Blog 142;
    o Calendar 148;
    o Calculator 149-3;
    o Alarm clock 149-4;
    o Dictionary 149-5; and
    o User-created widget 149-6.
  • user interface 400B includes the following elements, or a subset or superset thereof:
  • Settings 412 which provides access to settings for the device 100 and its various applications 136;
  • Video and music player module 152 also referred to as iPod (trademark of Apple Computer, Inc.) module 152; and
  • Online video module 155 also referred to as YouTube (trademark of Google, Inc.) module 155.
  • UI 400A or 400B displays all of the available applications 136 on one screen so that there is no need to scroll through a list of applications (e.g., via a scroll bar).
  • as the number of applications increases, the icons corresponding to the applications may decrease in size so that all applications may be displayed on a single screen without scrolling.
  • having all applications on one screen and a menu button enables a user to access any desired application with at most two inputs, such as activating the menu button 204 and then activating the desired application (e.g., by a tap or other finger gesture on the icon corresponding to the application).
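As a back-of-the-envelope illustration of how icon size could shrink to keep every application on one screen (the screen dimensions and the search strategy below are assumptions, not values from the patent), the following Swift sketch picks the largest square cell whose grid holds the required number of icons.

    // Largest square cell (in points) such that a grid of such cells on the
    // screen can hold `iconCount` icons without scrolling.
    func iconCellSide(iconCount: Int,
                      screenWidth: Double = 320,
                      screenHeight: Double = 436) -> Double {
        var side = min(screenWidth, screenHeight)
        while side > 1 {
            let columns = Int(screenWidth / side)
            let rows = Int(screenHeight / side)
            if columns * rows >= iconCount { return side }
            side -= 1
        }
        return side
    }

    print(iconCellSide(iconCount: 16))  // 80.0: a 4-column grid is enough
    print(iconCellSide(iconCount: 36))  // 54.0: smaller icons once more apps are installed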
  • UI 400A or 400B provides integrated access to both widget-based applications and non-widget-based applications.
  • all of the widgets, whether user-created or not, are displayed in UI 400A or 400B.
  • activating the icon for user-created widget 149-6 may lead to another UI that contains the user-created widgets or icons corresponding to the user-created widgets.
  • a user may rearrange the icons in UI 400A or 400B, e.g., using processes described in U.S. Patent Application No. 11/459,602, "Portable Electronic Device With Interface Reconfiguration Mode," filed July 24, 2006, which is hereby incorporated by reference in its entirety.
  • a user may move application icons in and out of tray 408 using finger gestures.
  • UI 400A or 400B includes a gauge (not shown) that displays an updated account usage metric for an account associated with usage of the device (e.g., a cellular phone account), as described in U.S. Patent Application 11/322,552,
  • The term "user interface object" (which is interchangeable with "graphical object") generally refers to a graphical icon on the touch screen display, which may be associated with an entertainment item, an application configuration option, an e-mail message, a photo, a data file, or the like, depending on the specific application that employs the schemes.
  • Figures 8A through 8I illustrate exemplary user interfaces for a music and video player in accordance with some embodiments. Note that these user interfaces are only examples illustrating the processes mentioned above. One skilled in the art may apply these user navigation schemes to other applications that provide or require a similar user experience.
  • icons for major content categories are displayed in a first area of the display (e.g., 4340, Figure 8A).
  • the first area also includes an icon that, when activated, leads to additional content categories (e.g., the more list 4362, Figure 8B).
  • the major content categories that are displayed in the first area 4340 of the display can be rearranged by a user to correspond to the user's preferred (favorite) categories (e.g., as illustrated in Figures 8A-8G).
  • activation of add category icon 4344 (e.g., by a finger tap on the icon) initiates a process for adding a content category to the first area.
  • activation of edit icon 4342 in Figure 8A (e.g., by a finger tap on the icon) initiates display of a user interface for rearranging the content categories (e.g., Figure 8B).
  • moving affordance icons 4360 may be used as control icons that assist in rearranging categories or other UI objects.
  • Figure 5 is a flow diagram illustrating a first process for swapping first and second user interface objects in accordance with some embodiments.
  • a portable multifunction device with a touch screen display with a plurality of user interface objects displays a first user interface object (e.g., genres icon 4350, Figure 8B) and a second user interface object (e.g., artists icon 4310, Figure 8B) on the touch screen display (501).
  • the first user interface object is one of a group of candidate icons (e.g., icons in the more list 4362, Figure 8B, which are candidates for rearrangement) and the second user interface object is one of a group of user favorite icons (e.g., icons in area 4340).
  • a finger-down event is detected at the first user interface object (503) (e.g., contact 4346-1, Figure 8B).
  • the first user interface object includes a control icon (e.g., the horizontal bars comprising a moving affordance icon 4360 in genres icon 4350) and the finger-down event occurs at or near the control icon.
  • One or more finger-dragging events are detected on the touch screen display
  • the first user interface object is moved on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object (507).
  • While moving the first user interface object on the touch screen display, the first user interface object is displayed in a manner visually distinguishable from other user interface objects on the touch screen display (e.g., the shading around genres icon 4350 in Figure 8C).
  • a finger-up event is detected at the second user interface object (509) (e.g., ending contact at 4346-3, Figure 8C).
  • Upon detecting the finger-up event, the second user interface object (e.g., artists icon 4310, Figure 8C) is visually replaced with the first user interface object (e.g., genres icon 4350, Figure 8D).
  • the first user interface object is displayed at a location formerly occupied by the second user interface object, and a movement of the second user interface object to a location formerly occupied by the first user interface object is animated (e.g., in Figure 8D, artists 4310 is now part of the list that used to include genres 4350).
  • the first user interface object is displayed in a first form before the finger-up event and in a second form after the finger-up event, and the second form is visually different from the first form.
  • the first form is a row including characters and at least one control icon (e.g., 4350, Figure 8B) and the second form is an image or other graphic (e.g., 4350, Figure 8D).
  • the second user interface object is displayed in a first form before the finger-up event and in a second form after the finger-up event, and the second form is visually different from the first form.
  • the first form is an image or other graphic (e.g., 4310, Figure 8B) and the second form is a row (e.g., 4310, Figure 8D) including characters associated with at least one control icon (e.g., 4360-2, Figure 8D).
  • the second form is a row including characters located near, or within a predefined distance of, a hit region corresponding to the control icon.
  • the first user interface object is one of a group of candidate icons and the second user interface object is one of a group of user favorite icons.
  • the remaining group of candidate icons is rearranged after moving the first user interface object away from its original location.
  • the remaining group of candidate icons is the group of candidate icons excluding the first user interface object.
  • the first user interface object is displayed at a location formerly occupied by the second user interface object and a movement of the second user interface object to a location formerly occupied by one of the remaining group of candidate icons is animated.
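The first swap process outlined above can be summarized in code. The Swift sketch below is illustrative only (the rectangle math, type names, and snap-back rule are assumptions, not the patented implementation): a finger-down event records which object is grabbed, finger-dragging events move it along the finger's path, and a finger-up event over a second object puts the dragged object in that object's slot while the displaced object moves (in a real UI, with an animation) to the slot the dragged object formerly occupied.

    struct Rect {
        var x, y, width, height: Double
        func overlaps(_ other: Rect) -> Bool {
            return x < other.x + other.width && other.x < x + width &&
                   y < other.y + other.height && other.y < y + height
        }
        func contains(_ p: (x: Double, y: Double)) -> Bool {
            return p.x >= x && p.x <= x + width && p.y >= y && p.y <= y + height
        }
    }

    struct UIObjectModel {
        let name: String     // e.g. "Genres" or "Artists"
        var frame: Rect
    }

    final class SwapController {
        var objects: [UIObjectModel]
        private var draggedIndex: Int?
        private var originalFrame: Rect?

        init(objects: [UIObjectModel]) { self.objects = objects }

        // Finger-down event: remember which object is grabbed and where it came from.
        func fingerDown(at point: (x: Double, y: Double)) {
            draggedIndex = objects.firstIndex { $0.frame.contains(point) }
            if let i = draggedIndex { originalFrame = objects[i].frame }
        }

        // Finger-dragging events: keep the dragged object centered under the finger.
        func fingerDragged(to point: (x: Double, y: Double)) {
            guard let i = draggedIndex else { return }
            objects[i].frame.x = point.x - objects[i].frame.width / 2
            objects[i].frame.y = point.y - objects[i].frame.height / 2
        }

        // Finger-up event: if the dragged object overlaps another object, swap slots.
        func fingerUp() {
            defer { draggedIndex = nil; originalFrame = nil }
            guard let i = draggedIndex, let home = originalFrame else { return }
            let target = objects.indices.first {
                $0 != i && objects[$0].frame.overlaps(objects[i].frame)
            }
            guard let j = target else {
                objects[i].frame = home       // no target: snap back (assumed behavior)
                return
            }
            objects[i].frame = objects[j].frame  // dragged object takes the target's slot
            objects[j].frame = home              // displaced object moves to the vacated slot
        }
    }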
  • Figure 6 is a flow diagram illustrating a second process for swapping first and second user interface objects in accordance with some embodiments.
  • Figures 8E through 8G illustrate another way the major content categories that are displayed in the first area 4340 of the display can be rearranged by a user to correspond to the user's preferred (favorite) categories.
  • the categories that are included in area 4340 may also be listed in a first list area 4364 in the more list 4362 (e.g., above separator 4352 in the more list 4362), with the candidate categories listed in a second list area 4366 in the more list 4362 (e.g., below separator 4352 in the more list 4362).
  • Upon detecting a finger-down event (601) (e.g., 4346-5, Figure 8E), a first user interface object at which the finger-down event occurs is identified (603) (e.g., genres icon 4350).
  • Upon detecting one or more finger-dragging events on the touch screen display, the first user interface object is moved on the touch screen display in accordance with the finger-dragging events.
  • Upon detecting a finger-up event (609) (e.g., at 4346-7), the portable device identifies a second user interface object at which the finger-up event occurs (611) and then visually replaces the second user interface object with the first user interface object (613) (e.g., artists icon 4310) in both the first list area 4364 and in area 4340 (e.g., 4350-1 and 4350-2, Figure 8G), with the second user interface object moving to the second list area 4366 (e.g., 4310, Figure 8G).
  • a portable multifunction device displays a first group of user interface objects on the touch screen display (e.g., icons in the more list 4362, Figure 8B, which are candidates for rearrangement).
  • a second group of user interface objects is displayed on the touch screen display (e.g., icons in area 4340).
  • a finger-down event is detected on the touch screen display (e.g., contact 4346-1, Figure 8B).
  • A first user interface object (e.g., genres icon 4350, Figure 8B) in the first group at which the finger-down event occurs is identified.
  • One or more finger-dragging events are detected on the touch screen display (e.g., the finger drag from 4346-1 (Figure 8B) to 4346-2 (Figure 8C) to 4346-3 via 4365 ( Figure 8C)).
  • the first user interface object on the touch screen display is moved in accordance with the finger-dragging events.
  • a finger-up event is detected on the touch screen display (e.g., ending contact at 4346-3, Figure 8C).
  • A second user interface object (e.g., artists icon 4310, Figure 8B) in the second group at which the finger-up event occurs is identified.
  • the second user interface object is visually replaced with the first user interface object (e.g., artists icon 4310 in Figure 8C is visually replaced with genres icon 4350 in Figure 8D).
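Behind the drag shown in Figures 8E-8G is simple list bookkeeping. The Swift sketch below is an assumed model (the array names, example strings, and the insertion position of the displaced icon are illustrative): dropping a candidate icon onto a favorite promotes the candidate into the favorites and demotes the displaced favorite into the candidate list.

    // `favorites` stands in for the icons shown in area 4340 / the first list area;
    // `candidates` for the icons below the separator in the more list.
    func swapFavorite(favorites: inout [String],
                      candidates: inout [String],
                      dragged: String,   // object where the finger-down event occurred
                      target: String) {  // object where the finger-up event occurred
        guard let from = candidates.firstIndex(of: dragged),
              let to = favorites.firstIndex(of: target) else { return }
        candidates.remove(at: from)
        let displaced = favorites[to]
        favorites[to] = dragged                  // dragged icon replaces the target
        candidates.insert(displaced, at: from)   // displaced icon joins the candidates
    }

    var favorites = ["Playlists", "Artists", "Songs", "Videos"]
    var candidates = ["Albums", "Genres", "Composers", "Audiobooks"]
    swapFavorite(favorites: &favorites, candidates: &candidates,
                 dragged: "Genres", target: "Artists")
    print(favorites)   // ["Playlists", "Genres", "Songs", "Videos"]
    print(candidates)  // ["Albums", "Artists", "Composers", "Audiobooks"]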
  • the set of finger movements described above can be employed to represent a user's feedback on information or services provided by the portable device.
  • Figure 7 is a flow diagram illustrating a third process for displaying a ratings icon using as input a finger swipe gesture on the touch screen display in accordance with some embodiments.
  • a user rating may be applied to an item of content with a finger gesture.
  • a portable multifunction device displays a series of ratings indicia (e.g., 4382, Figures 8H and 8I) on a touch screen display (701).
  • the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia.
  • the ratings indicia comprise stars (e.g., 4382-2, Figure 8I). In some embodiments, the series of ratings indicia consists of five stars.
  • A finger gesture by a user (e.g., 4384, Figure 8I) on one or more of the ratings indicia is detected.
  • the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display (e.g., the third rating indicia in Figure 8I). In some embodiments, the finger gesture contacts the lowest rating indicia prior to contacting one or more of the progressively higher rating indicia. In some embodiments, the finger gesture is a swipe gesture.
  • a rating corresponding to the last rating indicia contacted by the finger gesture is used as input to a function or application in the device (705).
  • the three-star rating for the song "Come Together" in Figure 8I may be used to sort this content versus other content in the device and/or to determine how often this content is heard when content is played in a random order.
  • the rating corresponding to the last rating indicia contacted by the finger gesture is used to give a rating for an item of content that is playable with a content player application on the device.
  • the item of content is an item of music and the content player application is a music player application.
  • the item of content is a video and the content player application is a video player application.
  • the rating corresponding to the last rating indicia contacted by the finger gesture is used to give a rating for content on a web page that is viewable with a browser application on the device.
  • a graphical user interface on a portable multifunction device with a touch screen display comprises a series of ratings indicia 4382 on the touch screen display.
  • the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia.
  • a rating corresponding to the last rating indicia contacted by the finger gesture is used as input to a function or an application in the device.
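A compact Swift sketch of the rating gesture follows; the star geometry and hit-region width are assumed values, not from the patent. Every contact point of the gesture is mapped to the star beneath it, and the rating is the last star contacted immediately before the finger breaks contact.

    struct RatingsBar {
        let starCount = 5
        let originX: Double = 40     // x position of the first star (assumed)
        let starWidth: Double = 44   // width of one star's hit region (assumed)

        // Index (1-based) of the star under a given x coordinate, if any.
        func star(atX x: Double) -> Int? {
            guard x >= originX else { return nil }
            let index = Int((x - originX) / starWidth) + 1
            return index <= starCount ? index : nil
        }

        // Rating = the last rating indicia contacted before the finger lifts off.
        func rating(forGesturePoints xs: [Double]) -> Int? {
            return xs.reversed().compactMap { star(atX: $0) }.first
        }
    }

    let bar = RatingsBar()
    // A swipe that breaks contact over the third of five stars yields a rating of 3:
    print(bar.rating(forGesturePoints: [45, 90, 135]) ?? 0)  // 3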
  • Figures 9A-9P illustrate exemplary user interfaces for an online video application for a portable multifunction device in accordance with some embodiments.
  • a computer-implemented method is performed at a portable electronic device (e.g., 100) with a touch screen display 112.
  • the device displays a first list 2330-1 ( Figure 9A) of information about online video items in a plurality of lists 2330 of information about online video items.
  • the plurality of lists of information about online video items include at least two of: a list of information about featured content items (e.g., videos featured by the online video website), a list of information about most recently added content items (e.g., videos most recently added to the online video website), a list of information about most viewed content items (e.g., videos most viewed by other users of the online video website, 2330-1, Figure 9A), a list of information about top rated content items (e.g., videos rated by other users of the online video website), a list of information about content items bookmarked by a user of the computing device (e.g., bookmark list 2330-2, Figure 9O), and a list of information about content items viewed by a user of the computing device (e.g., a list with a history of the videos played by the user).
  • a respective list 2330 of information about online video items is displayed in a portrait orientation of the touch screen display.
  • a respective list may be chosen to correspond to a specific time period.
  • the device displays a plurality of icons (e.g., 2332-1, 2332-2, and 2332-3, Figure 9A) corresponding to at least some of the plurality of lists of information about online video items.
  • the plurality of icons are displayed at the same time as a list of information about online video items (e.g., list 2330-1, Figure 9A).
  • the device displays a search icon 2334 that when activated initiates the display of a user interface 2300R ( Figure 9N) for searching for online video items.
  • In response to detecting a moving finger gesture 2336 on the first list of information about content items, the device scrolls the first list of information about content items.
  • In response to detecting a stationary finger contact on a first portion 2338 of a row 2340 in the first list of information about online video items, wherein the row contains information about a particular online video item, the device: initiates a request for the particular online video item 2342 from a remote computer (e.g., an online video server for a web site such as www.youtube.com), receives the particular online video item 2342, and plays the particular online video item 2342 (Figure 9B).
  • the first portion 2338 of a row includes anywhere in the row except a second portion of the row, such as additional information icon 2344.
  • the row 2340 has a width, the touch screen 112 has a width, and the width of the row is substantially the same as the width of the touch screen display (e.g., at least 90% of the width of the touch screen display).
  • the touch screen display 112 has an area and the particular online video item 2342 uses substantially all (e.g., at least 90%) of the touch screen display area when the particular online video item is played.
  • the particular online video item 2342 is played in a landscape orientation of the touch screen display ( Figure 9B).
  • the device displays one or more playback controls.
  • the one or more playback controls comprise a play icon 2304, a pause icon (not shown, which may toggle with the play icon 2304), a sound volume icon 2324, and/or a playback progress bar icon 2310.
  • displaying one or more playback controls comprises displaying one or more playback controls on top of the particular online video item 2342 (e.g., a semi-transparent "heads-up display", as illustrated in Figure 9B).
  • the device ceases to display the one or more playback controls.
  • ceasing to display the one or more playback controls comprises fading out the one or more playback controls. In some embodiments, the display of the one or more playback controls is ceased after a predetermined time. In some embodiments, the display of the one or more playback controls is ceased after no contact is detected with the touch screen display for a predetermined time.
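The auto-hide behavior can be modeled in a few lines of Swift. This is only a sketch with an assumed three-second delay (the patent leaves the predetermined time unspecified); a real implementation would drive the fade from a timer, whereas here visibility is simply recomputed from the time of the last detected contact.

    import Foundation

    struct PlaybackControlsOverlay {
        let hideDelay: TimeInterval = 3.0    // predetermined time (assumed value)
        var lastContact: Date? = nil

        // A finger contact shows the controls (or keeps them showing).
        mutating func fingerContactDetected(at time: Date = Date()) {
            lastContact = time
        }

        // Controls remain visible until no contact has been detected for hideDelay.
        func controlsVisible(at time: Date = Date()) -> Bool {
            guard let last = lastContact else { return false }
            return time.timeIntervalSince(last) < hideDelay
        }
    }

    var overlay = PlaybackControlsOverlay()
    overlay.fingerContactDetected(at: Date())
    print(overlay.controlsVisible(at: Date().addingTimeInterval(1)))  // true
    print(overlay.controlsVisible(at: Date().addingTimeInterval(5)))  // false: controls fade out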
  • in response to detecting a finger contact 2346 on the touch screen display while the particular online video item is playing, the device displays a bookmark icon 2350 that, if activated by another finger contact 2348, bookmarks the particular online video item 2342 (or initiates a process for creating a bookmark for the item).
  • in response to detecting a finger contact 2346 on the touch screen display while the particular online video item is playing, the device displays a sharing icon 2352 that, if activated by another finger contact 2354, initiates creation of an electronic message to another user that includes a link to the particular online video item.
  • the electronic message is an email ( Figure 9E).
  • the electronic message is an instant message, such as an SMS message.
  • in response to detecting a finger contact on a respective icon in the plurality of icons 2332, the device displays a corresponding list (e.g., 2330-1, Figure 9A, or 2330-2, Figure 9O, respectively) of information about online video items.
  • in response to detecting a finger contact on a second portion of the row in the first list of information about online video items (e.g., a contact on icon 2344-3, Figure 9A), the device displays additional information about the particular online video item (e.g., UI 2300G, Figure 9C).
  • the second portion (e.g., icon 2344) of the row is different from the first portion 2338 of the row (e.g., anywhere else in the row 2340 besides icon 2344).
  • the additional information about the particular online video item includes information about related online video items 2356.
  • in response to detecting a finger contact on the second portion of the row, the device displays a bookmark icon 2358 that, if activated by another finger contact 2360, bookmarks the particular online video item (or initiates a process for creating a bookmark).
  • in response to detecting a finger contact on the second portion of the row, the device displays a sharing icon 2362 that, if activated by another finger contact 2364, initiates creation of an electronic message to another user that includes a link to (or an online address for) the particular online video item (Figure 9E).
  • in response to detecting a finger contact on the second portion of the row, the device: (a) displays a bookmark icon 2358 and a sharing icon 2362 if the particular online video item is not bookmarked (Figure 9C), and (b) displays an enlarged sharing icon 2366 (Figure 9D) without the bookmark icon if the particular online video item is already bookmarked (Figure 9D).
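The row's two hit regions can be expressed as a small Swift function; the widths below are assumed placeholders, not values from the patent. A tap anywhere in the row other than the additional-information icon plays the video, while a tap on the icon's region shows the additional-information screen.

    struct VideoRow {
        let width: Double = 320           // row spans substantially the screen width (assumed)
        let infoIconWidth: Double = 44    // right-hand hit region for the info icon (assumed)

        enum RowAction { case playVideo, showAdditionalInfo }

        func action(forTapAtX x: Double) -> RowAction {
            return x >= width - infoIconWidth ? .showAdditionalInfo : .playVideo
        }
    }

    let row = VideoRow()
    print(row.action(forTapAtX: 100))  // playVideo: first portion of the row
    print(row.action(forTapAtX: 300))  // showAdditionalInfo: tap on the info icon region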
  • the device displays an icon 2368 that when activated initiates the display of: (a) icons corresponding to at least some of the plurality of lists of information about online video items (e.g., 2332-4, 2332-5, 2332-6, Figure 9F), and (b) a configuration icon (e.g., Edit icon 2370, Figure 9F) that when activated initiates the display of a user interface 2300K (Figure 9G) for configuring which icons corresponding to at least some of the plurality of lists are displayed with the first list of information.
  • after detecting a gesture on the configuration icon 2370, the device: detects a finger-down event 2372 at a first icon in a plurality of icons; detects one or more finger-dragging events 2374 on the touch screen display; moves the first icon on the touch screen display along a path determined by the finger-dragging events until the first icon at least in part overlaps a second icon in the plurality of icons (e.g., in Figure 9I, the "Most Recent" icon partially overlaps the "Most Viewed" icon); detects a finger-up event 2376 at the second icon; and visually replaces the second icon with the first icon (e.g., in Figure 9J, the "Most Recent" icon visually replaces the "Most Viewed" icon in Figure 9I).
  • while moving the first icon on the touch screen display, the device displays the first icon in a manner visually distinguishable from other icons on the touch screen display (e.g., the "Most Recent" icon is enlarged in Figure 9I).
  • an analogous finger down, finger drag, and finger up process may be used to rearrange the icons 2332 (and 2334) that are displayed with the first list of information (e.g., exchanging the positions of the "Most Recent" icon and the "Bookmarks" icon).
  • in response to detecting a finger contact on a playback completion icon 2314 (Figure 9B), the device ceases to play the particular online video item.
  • the finger contact detected on the playback completion icon comprises a tap gesture.
  • a graphical user interface 2300E on a portable electronic device 100 with a touch screen display 112 includes: a first list 2330-1 of information about online video items in a plurality of lists of information about online video items; and a plurality of icons 2332 corresponding to at least some of the plurality of lists of information about online video items.
  • in response to detecting a finger contact on a first portion of a row that contains information about a particular online video item, a request is initiated for the particular online video item 2342 from a remote computer, the particular online video item 2342 is received, and the particular online video item 2342 is played.
  • in response to detecting a finger contact on a second portion of the row, additional information is displayed about the particular online video item (e.g., in UI 2300G, Figure 9C).
  • in response to detecting a finger contact on a respective icon in the plurality of icons 2332, a corresponding list 2330 of information about online video items is displayed.

Abstract

A portable multifunction device (100) displays a first user interface object (4350) and a second user interface object (4310) on a touch screen display (112). Upon detecting a finger-down event (4346-2) at the first user interface object (4350) and one or more finger-dragging events (4365) on the touch screen display (112), the device (100) moves the first user interface object (4350) on the touch screen display (112) along a path determined by the finger-dragging events (4365) until the first user interface object (4350) at least in part overlaps the second user interface object (4310). Upon detecting a finger-up event (4346-3) at the second user interface object (4310), the device (100) visually replaces the second user interface object (4310) with the first user interface object (4350).

Description

Portable Multifunction Device, Method, and Graphical User Interface Supporting User Navigations of Graphical Objects on a Touch Screen Display
TECHNICAL FIELD
[0001] The disclosed embodiments relate generally to portable electronic devices, and more particularly, to portable devices that support user navigations of graphical objects on a touch screen display.
BACKGROUND
[0002] As portable electronic devices become more compact, and the number of functions performed by a given device increases, it has become a significant challenge to design a user interface that allows users to easily interact with a multifunction device. This challenge is particularly significant for handheld portable devices, which have much smaller screens than desktop or laptop computers. This situation is unfortunate because the user interface is the gateway through which users receive not only content but also responses to user actions or behaviors, including user attempts to access a device's features, tools, and functions. Some portable communication devices (e.g., mobile telephones, sometimes called mobile phones, cell phones, cellular telephones, and the like) have resorted to adding more pushbuttons, increasing the density of push buttons, overloading the functions of pushbuttons, or using complex menu systems to allow a user to access, store and manipulate data. These conventional user interfaces often result in complicated key sequences and menu hierarchies that must be memorized by the user.
[0003] Many conventional user interfaces, such as those that include physical pushbuttons, are also inflexible. This is unfortunate because it may prevent a user interface from being configured and/or adapted by either an application running on the portable device or by users. When coupled with the time-consuming requirement to memorize multiple key sequences and menu hierarchies, and the difficulty in activating a desired pushbutton, such inflexibility is frustrating to most users. [0004] On the other hand, a touch-sensitive screen supports more user-friendly and intuitive means for a user to interact with graphical objects on the screen, such as dragging and dropping an object from one position to another using a fingertip.
[0005] Accordingly, there is a need for portable multifunction devices with more transparent and intuitive user interfaces supporting user navigation of graphical objects on a touch screen display, e.g., swapping two objects, which are easy to use, configure, and/or adapt. Such interfaces increase the effectiveness, efficiency and user satisfaction with portable multifunction devices.
SUMMARY
[0006] The above deficiencies and other problems associated with user interfaces for portable devices are reduced or eliminated by the disclosed portable multifunction device. In some embodiments, the device has a touch-sensitive display (also known as a "touch screen") with a graphical user interface (GUI), one or more processors, memory and one or more modules, programs or sets of instructions stored in the memory for performing multiple functions. In some embodiments, the user interacts with the GUI primarily through finger contacts and gestures on the touch-sensitive display. In some embodiments, the functions may include telephoning, video conferencing, e-mailing, instant messaging, blogging, digital photographing, digital videoing, web browsing, digital music playing, and/or digital video playing. Instructions for performing these functions may be included in a computer program product configured for execution by one or more processors.
[0007] One aspect of the invention involves a computer-implemented method performed by a portable multifunction device with a touch screen display. In response to a finger-down event on the touch screen display, the device identifies a first user interface object at which the finger-down event occurs. In response to one or more finger-dragging events on the touch screen display, the device then moves the first user interface object on the touch screen display in accordance with the finger-dragging events. In response to a finger- up event on the touch screen display, the device identifies a second user interface object at which the finger-up event occurs and visually replaces the second user interface object with the first user interface object. [0008] Another aspect of the invention involves a computer-implemented method performed by a portable multifunction device with a touch screen display. The device displays a first user interface object and a second user interface object on the touch screen display. In response to a finger-down event at the first user interface object and one or more finger-dragging events on the touch screen display, the device moves the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object. In response to a finger-up event at the second user interface object, the device visually replaces the second user interface object with the first user interface object.
[0009] Another aspect of the invention involves a portable electronic device with a touch screen display for displaying a plurality of user interface objects. The device includes one or more processors, memory, and a program stored in the memory and configured to be executed by the one or more processors. The program includes: instructions for displaying a first user interface object and a second user interface object on the touch screen display; instructions for detecting a finger-down event at the first user interface object; instructions for detecting one or more finger-dragging events on the touch screen display; instructions for moving the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object; instructions for detecting a finger-up event at the second user interface object; and instructions for visually replacing the second user interface object with the first user interface object. [0010] Another aspect of the invention involves a computer readable storage medium that stores one or more programs. The one or more programs include instructions, which when executed by the device, cause the device to: display a first user interface object and a second user interface object on the touch screen display; detect a finger-down event at the first user interface object; detect one or more finger-dragging events on the touch screen display; move the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object; detect a finger-up event at the second user interface object; and visually replace the second user interface object with the first user interface object. [0011] Another aspect of the invention involves a portable electronic device with a touch screen display with a plurality of user interface objects. The device includes: means for displaying a first user interface object and a second user interface object on the touch screen display; means for detecting a finger-down event at the first user interface object; means for detecting one or more finger-dragging events on the touch screen display; means for moving the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object; means for detecting a finger-up event at the second user interface object; and means for visually replacing the second user interface object with the first user interface object.
[0012] Another aspect of the invention involves a computer-implemented method performed by a portable multifunction device with a touch screen display. The device displays a series of ratings indicia on the touch screen display. The series of ratings indicia include a lowest rating indicia and one or more progressively higher rating indicia. In response to a finger gesture by a user on one or more of the ratings indicia, the device determines a last rating indicia contacted by the finger gesture immediately prior to the finger breaking contact with the touch screen display. A rating corresponding to the last rating indicia contacted by the finger gesture is used as input to a function or application in the device.
[0013] Another aspect of the invention involves a graphical user interface on a portable electronic device with a touch screen display. The graphical user interface includes a series of ratings indicia on the touch screen display. The ratings indicia include a lowest rating indicia and one or more progressively higher rating indicia. In response to a finger gesture on one or more of the ratings indicia, the graphical user interface displays on the touch screen a graphical object using as input the last rating indicia contacted by the finger gesture before the finger gesture breaks contact with the touch screen display.
[0014] Another aspect of the invention involves a portable electronic device with a touch screen display. The device includes one or more processors, memory, and one or more programs stored in the memory and configured to be executed by the processors. The programs include: instructions for displaying a series of ratings indicia on the touch screen display, wherein the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; instructions for detecting a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display; and instructions for using a rating corresponding to the last rating indicia contacted by the finger gesture as input to a function or application in the device.
[0015] Another aspect of the invention involves a computer readable storage medium that stores one or more programs. The one or more programs include instructions, which when executed by a portable multifunction device with a touch screen display, cause the device to: display a series of ratings indicia on the touch screen display, wherein the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; detect a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display; and use a rating corresponding to the last rating indicia contacted by the finger gesture as input to a function or application in the device.
[0016] Another aspect of the invention involves a portable electronic device with a touch screen display. The device includes: means for displaying a series of ratings indicia on the touch screen display, wherein the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; means for detecting a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display; and means for using a rating corresponding to the last rating indicia contacted by the finger gesture as input to a function or application in the device.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] For a better understanding of the aforementioned embodiments of the invention as well as additional embodiments thereof, reference should be made to the Description of Embodiments below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures. [0018] Figures 1A and 1B are block diagrams illustrating portable multifunction devices with touch-sensitive displays in accordance with some embodiments.
[0019] Figure 2 illustrates a portable multifunction device having a touch screen in accordance with some embodiments.
[0020] Figure 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments. [0021] Figures 4A and 4B illustrate exemplary user interfaces for a menu of applications on a portable multifunction device in accordance with some embodiments.
[0022] Figure 5 is a flow diagram illustrating a first process for swapping first and second user interface objects in accordance with some embodiments. [0023] Figure 6 is a flow diagram illustrating a second process for swapping first and second user interface objects in accordance with some embodiments.
[0024] Figure 7 is a flow diagram illustrating a third process for displaying a ratings icon using as input a finger swipe gesture on the touch screen display in accordance with some embodiments. [0025] Figures 8A through 8I illustrate exemplary user interfaces for a music and video player in accordance with some embodiments.
[0026] Figures 9A-9P illustrate exemplary user interfaces for an online video application for a portable multifunction device in accordance with some embodiments.
DESCRIPTION OF EMBODIMENTS
[0027] Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
[0028] It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first gesture could be termed a second gesture, and, similarly, a second gesture could be termed a first gesture, without departing from the scope of the present invention.
[0029] The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
[0030] Embodiments of a portable multifunction device, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device such as a mobile telephone that also contains other functions, such as PDA and/or music player functions.
[0031] The user interface may include a physical click wheel in addition to a touch screen or a virtual click wheel displayed on the touch screen. A click wheel is a user- interface device that may provide navigation commands based on an angular displacement of the wheel or a point of contact with the wheel by a user of the device. A click wheel may also be used to provide a user command corresponding to selection of one or more items, for example, when the user of the device presses down on at least a portion of the wheel or the center of the wheel. Alternatively, breaking contact with a click wheel image on a touch screen surface may indicate a user command corresponding to selection. For simplicity, in the discussion that follows, a portable multifunction device that includes a touch screen is used as an exemplary embodiment. It should be understood, however, that some of the user interfaces and associated processes may be applied to other devices, such as personal computers and laptop computers, which may include one or more other physical user- interface devices, such as a physical click wheel, a physical keyboard, a mouse and/or a joystick.
[0032] The device supports a variety of applications, such as a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a blogging application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
[0033] The various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch screen. One or more functions of the touch screen as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch screen) of the device may support the variety of applications with user interfaces that are intuitive and transparent. [0034] The user interfaces may include one or more soft keyboard embodiments. The soft keyboard embodiments may include standard (QWERTY) and/or non-standard configurations of symbols on the displayed icons of the keyboard, such as those described in U.S. Patent Applications 11/459,606, "Keyboards For Portable Electronic Devices," filed July 24, 2006, and 11/459,615, "Touch Screen Keyboards For Portable Electronic Devices," filed July 24, 2006, the contents of which are hereby incorporated by reference in their entirety.
The keyboard embodiments may include a reduced number of icons (or soft keys) relative to the number of keys in existing physical keyboards, such as that for a typewriter. This may make it easier for users to select one or more icons in the keyboard, and thus, one or more corresponding symbols. The keyboard embodiments may be adaptive. For example, displayed icons may be modified in accordance with user actions, such as selecting one or more icons and/or one or more corresponding symbols. One or more applications on the portable device may utilize common and/or different keyboard embodiments. Thus, the keyboard embodiment used may be tailored to at least some of the applications. In some embodiments, one or more keyboard embodiments may be tailored to a respective user. For example, one or more keyboard embodiments may be tailored to a respective user based on a word usage history (lexicography, slang, individual usage) of the respective user. Some of the keyboard embodiments may be adjusted to reduce a probability of a user error when selecting one or more icons, and thus one or more symbols, when using the soft keyboard embodiments. [0035] Attention is now directed towards embodiments of the device. Figures IA and
IB are block diagrams illustrating portable multifunction devices 100 with touch-sensitive displays 112 in accordance with some embodiments. The touch-sensitive display 112 is sometimes called a "touch screen" for convenience, and may also be known as or called a touch-sensitive display system. The device 100 may include a memory 102 (which may include one or more computer readable storage mediums), a memory controller 122, one or more processing units (CPU's) 120, a peripherals interface 118, RF circuitry 108, audio circuitry 110, a speaker 111, a microphone 113, an input/output (I/O) subsystem 106, other input or control devices 116, and an external port 124. The device 100 may include one or more optical sensors 164. These components may communicate over one or more communication buses or signal lines 103.
[0036] It should be appreciated that the device 100 is only one example of a portable multifunction device 100, and that the device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in Figures 1A and 1B may be implemented in hardware, software or a combination of both hardware and software, including one or more signal processing and/or application specific integrated circuits.
[0037] Memory 102 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 102 by other components of the device 100, such as the CPU 120 and the peripherals interface 118, may be controlled by the memory controller 122.
[0038] The peripherals interface 118 couples the input and output peripherals of the device to the CPU 120 and memory 102. The one or more processors 120 run or execute various software programs and/or sets of instructions stored in memory 102 to perform various functions for the device 100 and to process data.
[0039] In some embodiments, the peripherals interface 118, the CPU 120, and the memory controller 122 may be implemented on a single chip, such as a chip 104. In some other embodiments, they may be implemented on separate chips.
[0040] The RF (radio frequency) circuitry 108 receives and sends RF signals, also called electromagnetic signals. The RF circuitry 108 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. The RF circuitry 108 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The RF circuitry 108 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g, and/or IEEE 802.11n), voice over Internet Protocol (VoIP),
Wi-MAX, a protocol for email (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS)), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
[0041] The audio circuitry 110, the speaker 111, and the microphone 113 provide an audio interface between a user and the device 100. The audio circuitry 110 receives audio data from the peripherals interface 118, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 111. The speaker 111 converts the electrical signal to human-audible sound waves. The audio circuitry 110 also receives electrical signals converted by the microphone 113 from sound waves. The audio circuitry 110 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 118 for processing. Audio data may be retrieved from and/or transmitted to memory 102 and/or the RF circuitry 108 by the peripherals interface 118. In some embodiments, the audio circuitry 110 also includes a headset jack (e.g. 212, Figure 2). The headset jack provides an interface between the audio circuitry 110 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone). [0042] The I/O subsystem 106 couples input/output peripherals on the device 100, such as the touch screen 112 and other input/control devices 116, to the peripherals interface 118. The I/O subsystem 106 may include a display controller 156 and one or more input controllers 160 for other input or control devices. The one or more input controllers 160 receive/send electrical signals from/to other input or control devices 116. The other input/control devices 116 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 160 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 208, Figure 2) may include an up/down button for volume control of the speaker 111 and/or the microphone 113. The one or more buttons may include a push button (e.g., 206, Figure 2). A quick press of the push button may disengage a lock of the touch screen 112 or begin a process that uses gestures on the touch screen to unlock the device, as described in U.S. Patent Application 11/322,549, "Unlocking a Device by Performing
Gestures on an Unlock Image," filed December 23, 2005, which is hereby incorporated by reference in its entirety. A longer press of the push button (e.g., 206) may turn power to the device 100 on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen 112 is used to implement virtual or soft buttons and one or more soft keyboards.
[0043] The touch-sensitive touch screen 112 provides an input interface and an output interface between the device and a user. The display controller 156 receives and/or sends electrical signals from/to the touch screen 112. The touch screen 112 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed "graphics"). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.
[0044] A touch screen 112 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. The touch screen 112 and the display controller 156 (along with any associated modules and/or sets of instructions in memory 102) detect contact (and any movement or breaking of the contact) on the touch screen 112 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen. In an exemplary embodiment, a point of contact between a touch screen 112 and the user corresponds to a finger of the user. [0045] The touch screen 112 may use LCD (liquid crystal display) technology, or
LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 112 and the display controller 156 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen 112. [0046] A touch-sensitive display in some embodiments of the touch screen 112 may be analogous to the multi-touch sensitive tablets described in the following U.S. Patents: 6,323,846 (Westerman et al), 6,570,557 (Westerman et al), and/or 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in its entirety. However, a touch screen 112 displays visual output from the portable device 100, whereas touch sensitive tablets do not provide visual output.
[0047] A touch-sensitive display in some embodiments of the touch screen 112 may be as described in the following applications: (1) U.S. Patent Application No. 11/381,313, "Multipoint Touch Surface Controller," filed May 2, 2006; (2) U.S. Patent Application No. 10/840,862, "Multipoint Touchscreen," filed May 6, 2004; (3) U.S. Patent Application No.
10/903,964, "Gestures For Touch Sensitive Input Devices," filed July 30, 2004; (4) U.S. Patent Application No. 11/048,264, "Gestures For Touch Sensitive Input Devices," filed January 31, 2005; (5) U.S. Patent Application No. 11/038,590, "Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices," filed January 18, 2005; (6) U.S. Patent Application No. 11/228,758, "Virtual Input Device Placement On A Touch Screen User
Interface," filed September 16, 2005; (7) U.S. Patent Application No. 11/228,700, "Operation OfA Computer With A Touch Screen Interface," filed September 16, 2005; (8) U.S. Patent Application No. 11/228,737, "Activating Virtual Keys OfA Touch-Screen Virtual Keyboard," filed September 16, 2005; and (9) U.S. Patent Application No. 11/367,749, "Multi-Functional Hand-Held Device," filed March 3, 2006. All of these applications are incorporated by reference herein in their entirety.
[0048] The touch screen 112 may have a resolution in excess of 100 dpi. In an exemplary embodiment, the touch screen has a resolution of approximately 160 dpi. The user may make contact with the touch screen 112 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user. [0049] In some embodiments, in addition to the touch screen, the device 100 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen 112 or an extension of the touch-sensitive surface formed by the touch screen.
[0050] In some embodiments, the device 100 may include a physical or virtual click wheel as an input control device 116. A user may navigate among and interact with one or more graphical objects (henceforth referred to as icons) displayed in the touch screen 112 by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel). The click wheel may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel or an associated button. User commands and navigation commands provided by the user via the click wheel may be processed by an input controller 160 as well as one or more of the modules and/or sets of instructions in memory 102. For a virtual click wheel, the click wheel and click wheel controller may be part of the touch screen 112 and the display controller 156, respectively. For a virtual click wheel, the click wheel may be either an opaque or semitransparent object that appears and disappears on the touch screen display in response to user interaction with the device. In some embodiments, a virtual click wheel is displayed on the touch screen of a portable multifunction device and operated by user contact with the touch screen. [0051] The device 100 also includes a power system 162 for powering the various components. The power system 162 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
[0052] The device 100 may also include one or more optical sensors 164. Figures 1A and 1B show an optical sensor coupled to an optical sensor controller 158 in I/O subsystem 106. The optical sensor 164 may include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. The optical sensor 164 receives light from the environment, projected through one or more lenses, and converts the light to data representing an image. In conjunction with an imaging module 143 (also called a camera module), the optical sensor 164 may capture still images or video. In some embodiments, an optical sensor is located on the back of the device 100, opposite the touch screen display 112 on the front of the device, so that the touch screen display may be used as a viewfinder for still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that the user's image may be obtained for videoconferencing while the user views the other video conference participants on the touch screen display. In some embodiments, the position of the optical sensor 164 can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 164 may be used along with the touch screen display for both video conferencing and still and/or video image acquisition. [0053] The device 100 may also include one or more proximity sensors 166. Figures
1A and 1B show a proximity sensor 166 coupled to the peripherals interface 118. Alternately, the proximity sensor 166 may be coupled to an input controller 160 in the I/O subsystem 106. The proximity sensor 166 may perform as described in U.S. Patent Application Nos. 11/241,839, "Proximity Detector In Handheld Device," filed September 30, 2005; 11/240,788, "Proximity Detector In Handheld Device," filed September 30, 2005; number to be determined, "Using Ambient Light Sensor To Augment Proximity Sensor Output," filed January 7, 2007, attorney docket # 04860.P4851US1; number to be determined, "Automated Response To And Sensing Of User Activity In Portable Devices," filed October 24, 2006, attorney docket # 04860.P4293; and number to be determined, "Methods And Systems For Automatic Configuration Of Peripherals," filed December 12, 2006, attorney docket # 04860.P4634, which are hereby incorporated by reference in their entirety. In some embodiments, the proximity sensor turns off and disables the touch screen 112 when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call). In some embodiments, the proximity sensor keeps the screen off when the device is in the user's pocket, purse, or other dark area to prevent unnecessary battery drainage when the device is in a locked state.
[0054] The device 100 may also include one or more accelerometers 168. Figures 1A and 1B show an accelerometer 168 coupled to the peripherals interface 118. Alternately, the accelerometer 168 may be coupled to an input controller 160 in the I/O subsystem 106. The accelerometer 168 may perform as described in U.S. Patent Publication No. 20050190059, "Acceleration-based Theft Detection System for Portable Electronic Devices," and U.S. Patent Publication No. 20060017692, "Methods And Apparatuses For Operating A Portable
Device Based On An Accelerometer," both of which are incorporated herein by reference in their entirety. In some embodiments, information is displayed on the touch screen display in a portrait view or a landscape view based on an analysis of data received from the one or more accelerometers.
[0055] In some embodiments, the software components stored in memory 102 may include an operating system 126, a communication module (or set of instructions) 128, a contact/motion module (or set of instructions) 130, a graphics module (or set of instructions) 132, a text input module (or set of instructions) 134, a Global Positioning System (GPS) module (or set of instructions) 135, and applications (or set of instructions) 136.
[0056] The operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
[0057] The communication module 128 facilitates communication with other devices over one or more external ports 124 and also includes various software components for handling data received by the RF circuitry 108 and/or the external port 124. The external port 124 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with, the 30-pin connector used on iPod (trademark of Apple Computer, Inc.) devices.
[0058] The contact/motion module 130 may detect contact with the touch screen 112
(in conjunction with the display controller 156) and other touch sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 130 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen 112, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., "multitouch"/multiple finger contacts). In some embodiments, the contact/motion module 130 and the display controller 156 also detect contact on a touchpad. In some embodiments, the contact/motion module 130 and the controller 160 detect contact on a click wheel.
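The derivation of speed, velocity, and acceleration from successive points of contact can be sketched in a few lines. The Swift fragment below is only an illustration of the arithmetic described above; the type and function names are hypothetical and do not correspond to the device's actual software.

```swift
import Foundation

// Hypothetical illustration: deriving speed, velocity, and acceleration of a
// point of contact from successive touch samples, as a contact/motion module might.
struct TouchSample {
    let x: Double              // position in points
    let y: Double
    let timestamp: TimeInterval
}

struct Motion {
    let velocityX: Double      // points per second
    let velocityY: Double
    let speed: Double          // magnitude of the velocity
}

func motion(from previous: TouchSample, to current: TouchSample) -> Motion? {
    let dt = current.timestamp - previous.timestamp
    guard dt > 0 else { return nil }                 // ignore out-of-order samples
    let vx = (current.x - previous.x) / dt
    let vy = (current.y - previous.y) / dt
    return Motion(velocityX: vx, velocityY: vy, speed: (vx * vx + vy * vy).squareRoot())
}

// Acceleration is the change in velocity between two motion estimates over time dt.
func acceleration(from previous: Motion, to current: Motion,
                  over dt: TimeInterval) -> (ax: Double, ay: Double)? {
    guard dt > 0 else { return nil }
    return ((current.velocityX - previous.velocityX) / dt,
            (current.velocityY - previous.velocityY) / dt)
}
```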
[0059] The graphics module 132 includes various known software components for rendering and displaying graphics on the touch screen 112, including components for changing the intensity of graphics that are displayed. As used herein, the term "graphics" includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like. [0060] The text input module 134, which may be a component of graphics module
132, provides soft keyboards for entering text in various applications (e.g., contacts 137, e-mail 140, IM 141, blogging 142, browser 147, and any other application that needs text input).
[0061] The GPS module 135 determines the location of the device and provides this information for use in various applications (e.g., to telephone 138 for use in location-based dialing, to camera 143 and/or blogger 142 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
[0062] The applications 136 may include the following modules (or sets of instructions), or a subset or superset thereof:
• a contacts module 137 (sometimes called an address book or contact list);
• a telephone module 138;
• a video conferencing module 139;
• an e-mail client module 140;
• an instant messaging (IM) module 141;
• a blogging module 142;
• a camera module 143 for still and/or video images;
• an image management module 144;
• a video player module 145;
• a music player module 146;
• a browser module 147;
• a calendar module 148;
• widget modules 149, which may include weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, dictionary widget 149-5, and other widgets obtained by the user, as well as user-created widgets 149-6;
• widget creator module 150 for making user-created widgets 149-6;
• search module 151;
• video and music player module 152, which merges video player module 145 and music player module 146;
• notes module 153;
• map module 154; and/or
• online video module 155.
[0063] Examples of other applications 136 that may be stored in memory 102 include other word processing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
[0064] In conjunction with touch screen 112, display controller 156, contact module
130, graphics module 132, and text input module 134, the contacts module 137 may be used to manage an address book or contact list, including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone 138, video conference 139, e-mail 140, or IM 141; and so forth. [0065] In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the telephone module 138 may be used to enter a sequence of characters corresponding to a telephone number, access one or more telephone numbers in the address book 137, modify a telephone number that has been entered, dial a respective telephone number, conduct a conversation and disconnect or hang up when the conversation is completed. As noted above, the wireless communication may use any of a plurality of communications standards, protocols and technologies.
[0066] In conjunction with RF circuitry 108, audio circuitry 110, speaker 111, microphone 113, touch screen 112, display controller 156, optical sensor 164, optical sensor controller 158, contact module 130, graphics module 132, text input module 134, contact list
137, and telephone module 138, the videoconferencing module 139 may be used to initiate, conduct, and terminate a video conference between a user and one or more other participants.
[0067] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the e-mail client module 140 may be used to create, send, receive, and manage e-mail. In conjunction with image management module 144, the e-mail module 140 makes it very easy to create and send e-mails with still or video images taken with camera module 143.
[0068] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, and text input module 134, the instant messaging module 141 may be used to enter a sequence of characters corresponding to an instant message, to modify previously entered characters, to transmit a respective instant message (for example, using a Short Message Service (SMS) or Multimedia Message Service (MMS) protocol for telephony-based instant messages or using XMPP, SIMPLE, or IMPS for Internet-based instant messages), to receive instant messages and to view received instant messages. In some embodiments, transmitted and/or received instant messages may include graphics, photos, audio files, video files and/or other attachments as are supported in a MMS and/or an Enhanced Messaging Service (EMS). As used herein, "instant messaging" refers to both telephony-based messages (e.g., messages sent using SMS or MMS) and Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or IMPS). [0069] In conjunction with RF circuitry 108, touch screen 112, display controller 156, contact module 130, graphics module 132, text input module 134, image management module 144, and browsing module 147, the blogging module 142 may be used to send text, still images, video, and/or other graphics to a blog (e.g., the user's blog).
[0070] In conjunction with touch screen 112, display controller 156, optical sensor(s) 164, optical sensor controller 158, contact module 130, graphics module 132, and image management module 144, the camera module 143 may be used to capture still images or video (including a video stream) and store them into memory 102, modify characteristics of a still image or video, or delete a still image or video from memory 102.
[0071] In conjunction with touch screen 112, display controller 156, contact module
130, graphics module 132, text input module 134, and camera module 143, the image management module 144 may be used to arrange, modify or otherwise manipulate, label, delete, present (e.g., in a digital slide show or album), and store still and/or video images.
[0072] In conjunction with touch screen 112, display controller 156, contact module
130, graphics module 132, audio circuitry 110, and speaker 111, the video player module 145 may be used to display, present or otherwise play back videos (e.g., on the touch screen or on an external, connected display via external port 124).
[0073] In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, and browser module 147, the music player module 146 allows the user to download and play back recorded music and other sound files stored in one or more file formats, such as MP3 or AAC files. In some embodiments, the device 100 may include the functionality of an MP3 player, such as an iPod (trademark of Apple Computer, Inc.).
[0074] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, the browser module 147 may be used to browse the Internet, including searching, linking to, receiving, and displaying web pages or portions thereof, as well as attachments and other files linked to web pages.
[0075] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, e-mail module 140, and browser module 147, the calendar module 148 may be used to create, display, modify, and store calendars and data associated with calendars (e.g., calendar entries, to do lists, etc.).
[0076] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget modules 149 are mini-applications that may be downloaded and used by a user (e.g., weather widget 149-1, stocks widget 149-2, calculator widget 149-3, alarm clock widget 149-4, and dictionary widget 149-5) or created by the user (e.g., user- created widget 149-6). In some embodiments, a widget includes an HTML (Hypertext Markup Language) file, a CSS (Cascading Style Sheets) file, and a JavaScript file. In some embodiments, a widget includes an XML (Extensible Markup Language) file and a JavaScript file (e.g., Yahoo! Widgets). [0077] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, and browser module 147, the widget creator module 150 may be used by a user to create widgets (e.g., turning a user-specified portion of a web page into a widget).
[0078] In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, and text input module 134, the search module 151 may be used to search for text, music, sound, image, video, and/or other files in memory 102 that match one or more search criteria (e.g., one or more user-specified search terms).
[0079] In conjunction with touch screen 112, display controller 156, contact module
130, graphics module 132, and text input module 134, the notes module 153 may be used to create and manage notes, to do lists, and the like.
[0080] In conjunction with RF circuitry 108, touch screen 112, display system controller 156, contact module 130, graphics module 132, text input module 134, GPS module 135, and browser module 147, the map module 154 may be used to receive, display, modify, and store maps and data associated with maps (e.g., driving directions; data on stores and other points of interest at or near a particular location; and other location-based data).
[0081] In conjunction with touch screen 112, display system controller 156, contact module 130, graphics module 132, audio circuitry 110, speaker 111, RF circuitry 108, text input module 134, e-mail client module 140, and browser module 147, the online video module 155 allows the user to access, browse, receive (e.g., by streaming and/or download), play back (e.g., on the touch screen or on an external, connected display via external port 124), send an e-mail with a link to a particular online video, and otherwise manage online videos in one or more file formats, such as H.264. In some embodiments, instant messaging module 141, rather than e-mail client module 140, is used to send a link to a particular online video. Embodiments of user interfaces and associated processes using online video module 155 are described further below. [0082] Each of the above identified modules and applications corresponds to a set of instructions for performing one or more functions described above. These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. For example, video player module 145 may be combined with music player module 146 into a single module (e.g., video and music player module 152, Figure 1B). In some embodiments, memory 102 may store a subset of the modules and data structures identified above. Furthermore, memory 102 may store additional modules and data structures not described above. [0083] In some embodiments, the device 100 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen 112 and/or a touchpad. By using a touch screen and/or a touchpad as the primary input/control device for operation of the device 100, the number of physical input/control devices (such as push buttons, dials, and the like) on the device 100 may be reduced. [0084] The predefined set of functions that may be performed exclusively through a touch screen and/or a touchpad includes navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates the device 100 to a main, home, or root menu from any user interface that may be displayed on the device 100. In such embodiments, the touchpad may be referred to as a "menu button." In some other embodiments, the menu button may be a physical push button or other physical input/control device instead of a touchpad.
[0085] Figure 2 illustrates a portable multifunction device 100 having a touch screen
112 in accordance with some embodiments. The touch screen may display one or more graphics within user interface (UI) 200. In this embodiment, as well as others described below, a user may select one or more of the graphics by making contact or touching the graphics, for example, with one or more fingers 202 (not drawn to scale in the figure). In some embodiments, selection of one or more graphics occurs when the user breaks contact with the one or more graphics. In some embodiments, the contact may include a gesture, such as one or more taps, one or more swipes (from left to right, right to left, upward and/or downward) and/or a rolling of a finger (from right to left, left to right, upward and/or downward) that has made contact with the device 100. In some embodiments, inadvertent contact with a graphic may not select the graphic. For example, a swipe gesture that sweeps over an application icon may not select the corresponding application when the gesture corresponding to selection is a tap.
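The distinction between a selecting tap and a non-selecting swipe can be illustrated with a small sketch. The Swift fragment below is a hypothetical illustration only; the movement threshold and the names are assumptions, not values taken from the embodiments.

```swift
// Hypothetical sketch: deciding on finger-up whether a contact selects a graphic.
// A tap (little total movement) selects; a swipe that merely sweeps over the
// graphic does not. The 10-point threshold is illustrative.
struct Point { var x: Double; var y: Double }

enum Gesture { case tap, swipe }

func classify(down: Point, up: Point, movementThreshold: Double = 10.0) -> Gesture {
    let dx = up.x - down.x, dy = up.y - down.y
    return (dx * dx + dy * dy).squareRoot() <= movementThreshold ? .tap : .swipe
}

func shouldSelectGraphic(down: Point, up: Point, graphicContains: (Point) -> Bool) -> Bool {
    // Selection happens only when contact is broken (finger-up) over the graphic
    // and the overall gesture was a tap rather than a swipe.
    return classify(down: down, up: up) == .tap && graphicContains(up)
}
```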
[0086] The device 100 may also include one or more physical buttons, such as
"home" or menu button 204. As described previously, the menu button 204 may be used to navigate to any application 136 in a set of applications that may be executed on the device 100. Alternatively, in some embodiments, the menu button is implemented as a soft key in a GUI in touch screen 112.
[0087] In one embodiment, the device 100 includes a touch screen 112, a menu button 204, a push button 206 for powering the device on/off and locking the device, volume adjustment button(s) 208, a Subscriber Identity Module (SIM) card slot 210, a headset jack 212, and a docking/charging external port 124. The push button 206 may be used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, the device 100 also may accept verbal input for activation or deactivation of some functions through the microphone 113.
[0088] Attention is now directed towards embodiments of user interfaces ("UI") and associated processes that may be implemented on a portable multifunction device 100.
[0089] Figure 3 illustrates an exemplary user interface for unlocking a portable electronic device in accordance with some embodiments. In some embodiments, user interface 300 includes the following elements, or a subset or superset thereof:
• Unlock image 302 that is moved with a finger gesture to unlock the device;
• Arrow 304 that provides a visual cue to the unlock gesture;
• Channel 306 that provides additional cues to the unlock gesture;
• Time 308;
• Day 310;
• Date 312; and
• Wallpaper image 314.
[0090] In some embodiments, the device detects contact with the touch-sensitive display (e.g., a user's finger making contact on or near the unlock image 302) while the device is in a user-interface lock state. The device moves the unlock image 302 in accordance with the contact. The device transitions to a user-interface unlock state if the detected contact corresponds to a predefined gesture, such as moving the unlock image across channel 306. Conversely, the device maintains the user-interface lock state if the detected contact does not correspond to the predefined gesture. As noted above, processes that use gestures on the touch screen to unlock the device are described in U.S. Patent Applications 11/322,549, "Unlocking A Device By Performing Gestures On An Unlock Image," filed December 23, 2005, and 11/322,550, "Indication Of Progress Towards Satisfaction Of A User Input Condition," filed December 23, 2005, which are hereby incorporated by reference in their entirety.
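The unlock decision described above — tracking the unlock image along the channel and unlocking only when the gesture completes the predefined path — can be sketched as follows. This Swift fragment is illustrative only; the channel model and the 95% completion threshold are assumptions rather than values from the embodiments.

```swift
// Hypothetical sketch of the unlock decision: the unlock image follows the finger
// along the channel, and the device leaves the lock state only if the image has
// been dragged far enough across the channel when contact is broken.
struct UnlockChannel {
    let startX: Double
    let endX: Double
}

enum DeviceUIState { case locked, unlocked }

func positionOfUnlockImage(fingerX: Double, in channel: UnlockChannel) -> Double {
    // The image tracks the finger but is constrained to the channel.
    return min(max(fingerX, channel.startX), channel.endX)
}

func stateAfterRelease(imageX: Double, in channel: UnlockChannel) -> DeviceUIState {
    let progress = (imageX - channel.startX) / (channel.endX - channel.startX)
    return progress >= 0.95 ? .unlocked : .locked   // otherwise the image snaps back
}
```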
[0091] Figures 4A and 4B illustrate exemplary user interfaces for a menu of applications on a portable multifunction device in accordance with some embodiments. In some embodiments, user interface 400A includes the following elements, or a subset or superset thereof:
• Signal strength indicator(s) 402 for wireless communication(s), such as cellular and
Wi-Fi signals;
• Time 404;
• Battery status indicator 406;
• Tray 408 with icons for frequently used applications, such as one or more of the following:
o Phone 138, which may include an indicator 414 of the number of missed calls or voicemail messages;
o E-mail client 140, which may include an indicator 410 of the number of unread e-mails;
o Browser 147; and
o Music player 146; and
• Icons for other applications, such as one or more of the following:
o IM 141;
o Image management 144;
o Camera 143;
o Video player 145;
o Weather 149-1;
o Stocks 149-2;
o Blog 142;
o Calendar 148;
o Calculator 149-3;
o Alarm clock 149-4;
o Dictionary 149-5; and
o User-created widget 149-6.
[0092] In some embodiments, user interface 400B includes the following elements, or a subset or superset thereof:
• 402, 404, 406, 141, 148, 144, 143, 149-3, 149-2, 149-1, 149-4, 410, 414, 138, 140, and 147, as described above;
• Map 154;
• Notes 153;
• Settings 412, which provides access to settings for the device 100 and its various applications 136;
• Video and music player module 152, also referred to as iPod (trademark of Apple Computer, Inc.) module 152; and
• Online video module 155, also referred to as YouTube (trademark of Google, Inc.) module 155.
[0093] In some embodiments, UI 400A or 400B displays all of the available applications 136 on one screen so that there is no need to scroll through a list of applications (e.g., via a scroll bar). In some embodiments, as the number of applications increases, the icons corresponding to the applications may decrease in size so that all applications may be displayed on a single screen without scrolling. In some embodiments, having all applications on one screen and a menu button enables a user to access any desired application with at most two inputs, such as activating the menu button 204 and then activating the desired application (e.g., by a tap or other finger gesture on the icon corresponding to the application).
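The idea of shrinking the application icons as their number grows, so that every application still fits on a single screen without scrolling, can be illustrated with a simple sizing calculation. The Swift sketch below uses assumed screen dimensions, spacing, and size limits; none of these values come from the embodiments.

```swift
// Hypothetical sketch: pick the largest icon side length at which a grid of
// iconCount icons still fits on one screen without scrolling.
func iconSideLength(iconCount: Int,
                    screenWidth: Double = 320,
                    usableHeight: Double = 420,
                    spacing: Double = 10,
                    maxSide: Double = 57,
                    minSide: Double = 29) -> Double {
    guard iconCount > 0 else { return maxSide }
    var side = maxSide
    while side > minSide {
        let columns = Int((screenWidth + spacing) / (side + spacing))
        let rows = Int((usableHeight + spacing) / (side + spacing))
        if columns * rows >= iconCount { break }   // everything fits at this size
        side -= 1                                  // otherwise shrink and try again
    }
    return side
}
```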
[0094] In some embodiments, UI 400A or 400B provides integrated access to both widget-based applications and non-widget-based applications. In some embodiments, all of the widgets, whether user-created or not, are displayed in UI 400A or 400B. In other embodiments, activating the icon for user-created widget 149-6 may lead to another UI that contains the user-created widgets or icons corresponding to the user-created widgets.
[0095] In some embodiments, a user may rearrange the icons in UI 400A or 400B, e.g., using processes described in U.S. Patent Application No. 11/459,602, "Portable Electronic Device With Interface Reconfiguration Mode," filed July 24, 2006, which is hereby incorporated by reference in its entirety. For example, a user may move application icons in and out of tray 408 using finger gestures.
[0096] In some embodiments, UI 400A or 400B includes a gauge (not shown) that displays an updated account usage metric for an account associated with usage of the device (e.g., a cellular phone account), as described in U.S. Patent Application 11/322,552,
"Account Information Display For Portable Communication Device," filed December 23, 2005, which is hereby incorporated by reference in its entirety.
[0097] As noted in the background section, many user-friendly GUI features cannot be implemented because of the limitations of conventional graphical user interfaces. But these limitations can be overcome by a portable device with a touch screen display as described in the present application. For example, it is possible to use finger gestures to drag and drop a user interface object such as an icon from one position to another position on the touch screen display for swapping two objects. It is also possible for a user to rank information or services rendered by the portable device using such finger gestures. [0098] The term "user interface object" (which is interchangeable with "graphical object") generally refers to a graphical icon on the touch screen display, which may be associated with an entertainment item, an application configuration option, an email message, a photo, a data file, or the like depending on the specific application that employs the schemes. [0099] Figures 8A through 8I illustrate exemplary user interfaces for a music and video player in accordance with some embodiments. Note that these user interfaces are only examples illustrating the processes mentioned above. One skilled in the art may apply these user navigation schemes to other applications that provide or require a similar user experience.
[00100] In some embodiments, icons for major content categories (e.g., playlists 4308, artists 4310, songs 4312, and video 4314, Figure 8A) are displayed in a first area of the display (e.g., 4340, Figure 8A). In some embodiments, the first area also includes an icon
(e.g., more icon 4316) that when activated (e.g., by a finger tap on the icon) leads to additional content categories (e.g., albums, audiobooks, compilations, composers, genres, and podcasts in Figure 8A).
[00101] In some embodiments, the major content categories that are displayed in the first area 4340 of the display can be rearranged by a user to correspond to the user's preferred (favorite) categories (e.g., as illustrated in Figures 8A-8G). In some embodiments, activation of add category icon 4344 (e.g., by a finger tap on the icon) initiates display of a UI with a soft keyboard for adding user specified categories (not shown). In some embodiments, activation of edit icon 4342 in Figure 8A (e.g., by a finger tap on the icon) initiates display of UI 4300K (Figure 8B) with delete icons 4348 (which operate like delete icons 702, Figure 7, as described above) and moving affordance icons 4360. As described below, moving affordance icons 4360 may be used as control icons that assist in rearranging categories or other UI objects.
[00102] Figure 5 is a flow diagram illustrating a first process for swapping first and second user interface objects in accordance with some embodiments. In some embodiments, a portable multifunction device with a touch screen display with a plurality of user interface objects displays a first user interface object (e.g., genres icon 4350, Figure 8B) and a second user interface object (e.g., artists icon 4310, Figure 8B) on the touch screen display (501). In some embodiments, the first user interface object is one of a group of candidate icons (e.g., icons in the more list 4362, Figure 8B, which are candidates for rearrangement) and the second user interface object is one of a group of user favorite icons (e.g., icons in area 4340).
[00103] A finger-down event is detected at the first user interface object (503) (e.g., contact 4346-1, Figure 8B). In some embodiments, the first user interface object includes a control icon (e.g., the horizontal bars comprising a moving affordance icon 4360 in genres icon 4350) and the finger-down event occurs at or near the control icon. [00104] One or more finger-dragging events are detected on the touch screen display
(505) (e.g., the finger drag from 4346-1 (Figure 8B) to 4346-2 (Figure 8C) to 4346-3 via 4365 (Figure 8C)).
[00105] The first user interface object is moved on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object (507).
[00106] In some embodiments, while moving the first user interface object on the touch screen display, the first user interface object is displayed in a manner visually distinguishable from other user interface objects on the touch screen display (e.g., the shading around genres icon 4350 in Figure 8C).
[00107] A finger-up event is detected at the second user interface object (509) (e.g., ending contact at 4346-3, Figure 8C). The second user interface object (e.g., artists icon 4310, Figure 8C) is visually replaced with the first user interface object (511) (e.g., genres icon 4350, Figure 8D). [00108] In some embodiments, in response to the finger-up event, the first user interface object is displayed at a location formerly occupied by the second user interface object, and a movement of the second user interface object to a location formerly occupied by the first user interface object is animated (e.g., in Figure 8D, artists 4310 is now part of the list that used to include genres 4350). [00109] In some embodiments, the first user interface object is displayed in a first form before the finger-up event and in a second form after the finger-up event, and the second form is visually different from the first form. In some embodiments, the first form is a row including characters and at least one control icon (e.g., 4350, Figure 8B) and the second form is an image or other graphic (e.g., 4350, Figure 8D). [00110] In some embodiments, the second user interface object is displayed in a first form before the finger-up event and in a second form after the finger-up event, and the second form is visually different from the first form. In some embodiments, the first form is an image or other graphic (e.g., 4310, Figure 8B) and the second form is a row (e.g., 4310, Figure 8D) including characters associated with at least one control icon (e.g., 4360-2, Figure 8D). In some embodiments, the second form is a row including characters near, or within a predefined distance, corresponding to a hit region for the control icon. [00111] In some embodiments, the first user interface object is one of a group of candidate icons and the second user interface object is one of a group of user favorite icons. In some embodiments, the remaining group of candidate icons is rearranged after moving the first user interface object away from its original location. The remaining group of candidate icons is the group of candidate icons excluding the first user interface object. Upon detecting the finger-up event, the first user interface object is displayed at a location formerly occupied by the second user interface object and a movement of the second user interface object to a location formerly occupied by one of the remaining group of candidate icons is animated.
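The finger-down, finger-dragging, and finger-up sequence of Figure 5 (steps 501-511) can be summarized in a short sketch. The Swift fragment below is a hypothetical model of that flow, not the device's implementation; the snap-back behavior when the drop does not land on the second object is an assumption.

```swift
// Hypothetical sketch of the Figure 5 swap flow: finger-down picks up the first
// object, finger-dragging events move it along the finger's path, and a finger-up
// event over a second object exchanges the two objects' positions.
struct Rect {
    var x: Double, y: Double, width: Double, height: Double
    func overlaps(_ other: Rect) -> Bool {
        return x < other.x + other.width && other.x < x + width &&
               y < other.y + other.height && other.y < y + height
    }
}

final class InterfaceObject {
    var frame: Rect
    let name: String
    init(name: String, frame: Rect) { self.name = name; self.frame = frame }
}

final class SwapController {
    private var dragged: InterfaceObject?
    private var originalFrame: Rect?

    func fingerDown(on object: InterfaceObject) {        // step 503
        dragged = object
        originalFrame = object.frame
    }

    func fingerDragged(toX x: Double, y: Double) {        // steps 505-507
        dragged?.frame.x = x
        dragged?.frame.y = y
    }

    func fingerUp(over target: InterfaceObject?) {        // steps 509-511
        defer { dragged = nil; originalFrame = nil }
        guard let source = dragged, let home = originalFrame else { return }
        if let target = target, source !== target, source.frame.overlaps(target.frame) {
            // The dragged object takes the target's place; the target moves to the
            // location the dragged object formerly occupied (animated on the device).
            source.frame = target.frame
            target.frame = home
        } else {
            source.frame = home    // no valid drop: snap back (an assumption)
        }
    }
}
```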
[00112] Figure 6 is a flow diagram illustrating a second process for swapping first and second user interface objects in accordance with some embodiments. Figures 8E through 8G illustrate another way the major content categories that are displayed in the first area 4340 of the display can be rearranged by a user to correspond to the user's preferred (favorite) categories. The categories that are included in area 4340 may also be listed in a first list area 4364 in the more list 4362 (e.g., above separator 4352 in the more list 4362), with the candidate categories listed in a second list area 4366 in the more list 4362 (e.g., below separator 4352 in the more list 4362). Upon detection of a finger-down event (601) (e.g., 4346-5, Figure 8E), a first user interface object is identified at which the finger-down event occurs (603) (e.g., genres icon 4350). In response to one or more finger-dragging events (605) (e.g., from 4346-5 to 4346-6 (Figure 8F) to 4346-7 (Figure 8G)), the first user interface object is moved on the touch screen display in accordance with the finger-dragging events
(607). Upon detecting a finger-up event (609) (e.g., at 4346-7), the portable device identifies a second user interface object at which the finger-up event occurs (611) and then visually replaces the second user interface object with the first user interface object (613) (e.g., artists icon 4310) in both the first list area 4364 and in area 4340 (e.g., 4350-1 and 4350-2, Figure 8G), with the second user interface object moving to the second list area 4366 (e.g., 4310, Figure 8G).
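The list-based variant of Figure 6 amounts to promoting the dragged candidate into the favorites and demoting the replaced favorite into the candidate list. The Swift sketch below models this with two arrays; where the demoted item lands within the candidate list is an assumption, as are the category names.

```swift
// Hypothetical sketch of the Figure 6 list swap: dropping a candidate category
// onto a favorite category promotes the candidate (shown both in area 4340 and in
// the first list area) and demotes the former favorite into the candidate list.
struct CategoryLists {
    var favorites: [String]     // shown in area 4340 and in the first list area 4364
    var candidates: [String]    // shown in the second list area 4366

    mutating func swapOnDrop(dragged: String, droppedOn target: String) {
        guard let from = candidates.firstIndex(of: dragged),
              let to = favorites.firstIndex(of: target) else { return }
        candidates.remove(at: from)
        let demoted = favorites[to]
        favorites[to] = dragged          // the dragged object replaces the target
        candidates.append(demoted)       // the target moves to the candidate list
    }
}

// Usage: dropping "Genres" (a candidate) onto "Artists" (a favorite).
var lists = CategoryLists(favorites: ["Playlists", "Artists", "Songs", "Videos"],
                          candidates: ["Albums", "Genres", "Composers"])
lists.swapOnDrop(dragged: "Genres", droppedOn: "Artists")
// favorites is now ["Playlists", "Genres", "Songs", "Videos"]; "Artists" joins the candidates.
```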
[00113] In some embodiments, a portable multifunction device displays a first group of user interface objects on the touch screen display (e.g., icons in the more list 4362, Figure 8B, which are candidates for rearrangement). A second group of user interface objects is displayed on the touch screen display (e.g., icons in area 4340). A finger-down event is detected on the touch screen display (e.g., contact 4346-1, Figure 8B). A first user interface object (e.g., genres icon 4350, Figure 8B) in the first group at which the finger-down event occurs is identified. One or more finger-dragging events are detected on the touch screen display (e.g., the finger drag from 4346-1 (Figure 8B) to 4346-2 (Figure 8C) to 4346-3 via 4365 (Figure 8C)). The first user interface object on the touch screen display is moved in accordance with the finger-dragging events. A finger-up event is detected on the touch screen display (e.g., ending contact at 4346-3, Figure 8C). A second user interface object (e.g., artists icon 4310, Figure 8B) in the second group at which the finger-up event occurs is identified. The second user interface object is visually replaced with the first user interface object (e.g., artists icon 4310 in Figure 8C is visually replaced with genres icon 4350 in Figure 8D).
[00114] In some embodiments, the set of finger movements described above can be employed to represent a user's feedback on information or services provided by the portable device. Figure 7 is a flow diagram illustrating a third process for displaying a ratings icon using as input a finger swipe gesture on the touch screen display in accordance with some embodiments. As illustrated in Figure 8H and Figure 8I, a user rating may be applied to an item of content with a finger gesture. [00115] In some embodiments, a portable multifunction device displays a series of ratings indicia (e.g., 4382, Figure 8H and 8I) on a touch screen display (701). The ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia. In some embodiments, the ratings indicia comprise stars (e.g., 4382-2, Figure 8I). In some embodiments, the series of ratings indicia consists of five stars. [00116] A finger gesture (e.g., 4384, Figure 8I) by a user is detected on one or more of the ratings indicia (703). The finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display (e.g., the third rating indicia in Figure 8I). In some embodiments, the finger gesture contacts the lowest rating indicia prior to contacting one or more of the progressively higher rating indicia. In some embodiments, the finger gesture is a swipe gesture.
[00117] A rating corresponding to the last rating indicia contacted by the finger gesture is used as input to a function or application in the device (705). For example, the three-star rating for the song "Come Together" in Figure 8I may be used to sort this content versus other content in the device and/or to determine how often this content is heard when content is played in a random order.
[00118] In some embodiments, the rating corresponding to the last rating indicia contacted by the finger gesture is used to give a rating for an item of content that is playable with a content player application on the device. In some embodiments, the item of content is an item of music and the content player application is a music player application. In some embodiments, the item of content is a video and the content player application is a video player application. [00119] In some embodiments, the rating corresponding to the last rating indicia contacted by the finger gesture is used to give a rating for content on a web page that is viewable with a browser application on the device.
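Mapping the lift-off position of the swipe to a rating — the last rating indicia contacted — can be illustrated as follows. The Swift fragment assumes five star indicia laid out left to right with illustrative coordinates; the layout values are not taken from the embodiments.

```swift
// Hypothetical sketch of the Figure 7 rating gesture: the rating equals the index
// of the last rating indicia the finger was touching when contact broke.
func rating(forLiftOffX x: Double,
            firstStarX: Double = 100,
            starWidth: Double = 30,
            starCount: Int = 5) -> Int? {
    guard x >= firstStarX else { return nil }            // gesture ended before the stars
    let index = Int((x - firstStarX) / starWidth) + 1    // 1-based star index
    return min(index, starCount)
}

// A swipe that breaks contact over the third star yields a three-star rating,
// which the device can then use as input to the music player or another application.
let stars = rating(forLiftOffX: 172)    // -> 3
```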
[00120] A graphical user interface on a portable multifunction device with a touch screen display comprises a series of ratings indicia 4382 on the touch screen display. The ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia. In response to detecting a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display, a rating corresponding to the last rating indicia contacted by the finger gesture is used as input to a function or an application in the device. [00121] Figures 9A-9P illustrate exemplary user interfaces for an online video application for a portable multifunction device in accordance with some embodiments.
[00122] In some embodiments, a computer-implemented method is performed at a portable electronic device (e.g., 100) with a touch screen display 112.
[00123] The device displays a first list 2330-1 (Figure 9A) of information about online video items in a plurality of lists 2330 of information about online video items. In some embodiments, the plurality of lists of information about online video items include at least two of: a list of information about featured content items (e.g., videos featured by the online video website), a list of information about most recently added content items (e.g., videos most recently added to the online video website), a list of information about most viewed content items (e.g., videos most viewed by other users of the online video website, 2330-1,
Figure 9A), a list of information about top rated content items (e.g., videos rated by other users of the online video website), a list of information about content items bookmarked by a user of the computing device (e.g., bookmark list 2330-2, Figure 9O), and a list of information about content items viewed by a user of the computing device (e.g., a list with a history of the videos played by the user). In some embodiments, a respective list 2330 of information about online video items is displayed in a portrait orientation of the touch screen display. In some embodiments, in response to activation of a time window icon (e.g., all 2390, today 2392, or this week 2394 icons in Figure 9A) a respective list may be chosen to correspond to a specific time period.
[00124] The device displays a plurality of icons (e.g., 2332-1, 2332-2, and 2332-3,
Figure 9A) corresponding to at least some of the plurality of lists of information about online video items. The plurality of icons are displayed at the same time as a list of information about online video items (e.g., list 2330-1, Figure 9A).
[00125] In some embodiments, the device displays a search icon 2334 that when activated initiates the display of a user interface 2300R (Figure 9N) for searching for online video items. [00126] In response to detecting a moving finger gesture 2336 on the first list of information about content items, the device scrolls the first list of information about content items.
[00127] In response to detecting a stationary finger contact on a first portion 2338 of a row 2340 in the first list of information about online video items, wherein the row contains information about a particular online video item, the device: initiates a request for the particular online video item 2342 from a remote computer (e.g., an online video server for a web site such as www.youtube.com), receives the particular online video item 2342, and plays the particular online video item 2342 (Figure 9B). In some embodiments, the first portion 2338 of a row includes anywhere in the row except a second portion of the row, such as additional information icon 2344.
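The hit-test distinguishing the first portion of the row (request and play the video) from the second portion (the additional information icon 2344) can be sketched in a few lines. The Swift fragment below assumes the icon occupies the right edge of the row; the widths are illustrative assumptions.

```swift
// Hypothetical sketch of the row hit-test: a stationary contact anywhere in the
// row except the additional-information icon plays the video; a contact on the
// icon shows the details screen instead.
enum RowAction { case playVideo, showAdditionalInfo }

func actionForTap(atX x: Double,
                  rowWidth: Double = 320,
                  infoIconWidth: Double = 44) -> RowAction {
    // The second portion of the row is the additional-information icon at the
    // right edge; the first portion is everywhere else in the row.
    return x >= rowWidth - infoIconWidth ? .showAdditionalInfo : .playVideo
}
```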
[00128] In some embodiments, the row 2340 has a width, the touch screen 112 has a width and the width of the row is substantially the same as the width of the touch screen display (e.g., at least 90% of the width of the touch screen display). In some embodiments, the touch screen display 112 has an area and the particular online video item 2342 uses substantially all (e.g., at least 90%) of the touch screen display area when the particular online video item is played. In some embodiments, the particular online video item 2342 is played in a landscape orientation of the touch screen display (Figure 9B).
[00129] In some embodiments, in response to detecting a finger contact 2346 (Figure
9B) on the touch screen display while the particular online video item 2342 is playing, the device displays one or more playback controls. In some embodiments, the one or more playback controls comprise a play icon 2304, a pause icon (not shown, which may toggle with the play icon 2304), a sound volume icon 2324, and/or a playback progress bar icon 2310. In some embodiments, displaying one or more playback controls comprises displaying one or more playback controls on top of the particular online video item 2342 (e.g., a semi-transparent "heads-up display", as illustrated in Figure 9B). [00130] In some embodiments, while playing the particular online video item 2342, the device ceases to display the one or more playback controls. In some embodiments, ceasing to display the one or more playback controls comprises fading out the one or more playback controls. In some embodiments, the display of the one or more playback controls is ceased after a predetermined time. In some embodiments, the display of the one or more playback controls is ceased after no contact is detected with the touch screen display for a predetermined time.
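The auto-hiding behavior of the playback controls — shown on contact, removed after a period with no contact — can be modeled with a simple timer. The Swift sketch below is illustrative only; the three-second delay, the polling approach, and the names are assumptions.

```swift
import Foundation

// Hypothetical sketch of auto-hiding playback controls: controls are shown on a
// touch and hidden once no contact has been detected for a preset interval.
final class PlaybackControlsVisibility {
    private(set) var controlsVisible = false
    private var lastContact: Date?
    let hideDelay: TimeInterval

    init(hideDelay: TimeInterval = 3.0) { self.hideDelay = hideDelay }

    func touchDetected(at time: Date = Date()) {
        lastContact = time
        controlsVisible = true              // show the heads-up display
    }

    func tick(at time: Date = Date()) {
        // Called periodically while the video plays; hides the controls after
        // hideDelay seconds without any contact on the touch screen.
        if let last = lastContact, controlsVisible,
           time.timeIntervalSince(last) >= hideDelay {
            controlsVisible = false
        }
    }
}
```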
[00131] In some embodiments, in response to detecting a finger contact 2346 on the touch screen display while the particular online video item is playing, the device displays a bookmark icon 2350 that, if activated by another finger contact 2348, bookmarks the particular online video item 2342 (or initiates a process for creating a bookmark for the item).
[00132] In some embodiments, in response to detecting a finger contact 2346 on the touch screen display while the particular online video item is playing, the device displays a sharing icon 2352 that, if activated by another finger contact 2354, initiates creation of an electronic message to another user that includes a link to the particular online video item. In some embodiments, in response to detecting a finger contact 2346 on the touch screen display while the particular online video item is playing, the device displays a sharing icon 2352 that, if activated by another finger contact 2354, initiates creation of an electronic message to another user that includes an online address (e.g., a URL such as "http://www.youtube.com/watch?v=lxXNoB3t8vM" in Figure 9E) for the particular online video item. In some embodiments, the electronic message is an email (Figure 9E). In some embodiments, the electronic message is an instant message, such as an SMS message.
[00133] In response to detecting a finger contact on a respective icon (e.g., icon 2332-
1, Figure 9A or 2332-2, Figure 9A) in the plurality of icons, the device displays a corresponding list (e.g., 2330-1, Figure 9A or 2330-2, Figure 9O, respectively) of information about online video items.
[00134] In some embodiments, in response to detecting a finger contact on a second portion of the row in the first list of information about online video items (e.g., a contact on icon 2344-3, Figure 9A), the device displays additional information about the particular online video item (e.g., UI 2300G, Figure 9C). The second portion (e.g., icon 2344) of the row is different from the first portion 2338 of the row (e.g., anywhere else in the row 2340 besides icon 2344). In some embodiments, the additional information about the particular online video item includes information about related online video items 2356. In some embodiments, in response to detecting a finger contact on the second portion of the row, the device displays a bookmark icon 2358 that, if activated by another finger contact 2360, bookmarks the particular online video item (or initiates a process for creating a bookmark). In some embodiments, in response to detecting a finger contact on the second portion of the row, the device displays a sharing icon 2362 that, if activated by another finger contact 2364, initiates creation of an electronic message to another user that includes a link to (or an online address for) the particular online video item (Figure 9E). In some embodiments, in response to detecting a finger contact on the second portion of the row, the device: (a) displays a bookmark icon 2358 and a sharing icon 2362 if the particular online video item is not bookmarked (Figure 9C), and (b) displays an enlarged sharing icon 2366 (Figure 9D) without the bookmark icon if the particular online video item is already bookmarked (Figure 9D).
[00135] In some embodiments, the device displays an icon 2368 that when activated initiates the display of: (a) icons corresponding to at least some of the plurality of lists of information about online video items (e.g., 2332-4, 2332-5, 2332-6, Figure 9F), and (b) a configuration icon (e.g., Edit icon 2370, Figure 9F) that when activated initiates the display of a user interface 2300K (Figure 9G) for configuring which icons corresponding to at least some of the plurality of lists are displayed with the first list of information. In some embodiments, after detecting a gesture on the configuration icon 2370, the device: detects a finger-down event 2372 at a first icon in a plurality of icons; detects one or more finger-dragging events 2374 on the touch screen display; moves the first icon on the touch screen display along a path determined by the finger-dragging events until the first icon at least in part overlaps a second icon in the plurality of icons (e.g., in Figure 9I, "Most Recent" icon partially overlaps "Most Viewed" icon); detects a finger-up event 2376 at the second icon; and visually replaces the second icon with the first icon (e.g., in Figure 9J, "Most Recent" icon visually replaces the "Most Viewed" icon in Figure 9I). In some embodiments, while moving the first icon on the touch screen display, the device displays the first icon in a manner visually distinguishable from other icons on the touch screen display (e.g., the "Most Recent" icon is enlarged in Figure 9I). As shown in Figures 9K-9M, an analogous finger down, finger drag, and finger up process may be used to rearrange the icons 2332 (and 2334) that are displayed with the first list of information (e.g., exchanging the positions of the "Most Recent" icon and the "Bookmarks" icon).
[00136] In some embodiments, in response to detecting a finger contact on a playback completion icon 2314 (Figure 9B), the device ceases to play the particular online video item
2342, and displays again the first list of information 2330-1 (Figure 9A). In some embodiments, the finger contact detected on the playback completion icon comprises a tap gesture.
[00137] A graphical user interface 2300E on a portable electronic device 100 with a touch screen display 112 includes: a first list 2330-1 of information about online video items in a plurality of lists of information about online video items; and a plurality of icons 2332 corresponding to at least some of the plurality of lists of information about online video items. In response to detecting a finger contact on a first portion 2338 of a row 2340 in the first list 2330-1 of information about online video items, wherein the row contains information about a particular online video item: a request is initiated for the particular online video item 2342 from a remote computer, the particular online video item 2342 is received, and the particular online video item 2342 is played. In some embodiments, in response to detecting a finger contact on a second portion of the row (e.g., additional information icon 2344) in the first list of information about online video items, wherein the second portion of the row is different from the first portion of the row, additional information is displayed about the particular online video item (e.g., in UI 2300G, Figure 9C). In response to detecting a finger contact on a respective icon 2332 in the plurality of icons, a corresponding list 2330 of information about online video items is displayed.
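The user interface summarized in paragraph [00137] routes a finger contact differently depending on which portion of the row it lands on. The fragment below is a minimal, assumed sketch of that dispatch; RowRegion, OnlineVideoController, and the handler names are illustrative only and are not taken from the patent.

    enum RowRegion {
        case firstPortion        // e.g., portion 2338: play the item
        case additionalInfoIcon  // e.g., icon 2344: show more information
    }

    protocol OnlineVideoController {
        func requestAndPlayVideo(withID id: String)            // request from remote computer, receive, play
        func showAdditionalInformation(forVideoID id: String)
        func showList(named name: String)                      // tapping an icon 2332 switches lists
    }

    func handleRowTap(in region: RowRegion, videoID: String, on controller: OnlineVideoController) {
        switch region {
        case .firstPortion:
            controller.requestAndPlayVideo(withID: videoID)
        case .additionalInfoIcon:
            controller.showAdditionalInformation(forVideoID: videoID)
        }
    }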
[00138] The foregoing description, for purposes of explanation, has been presented with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated.

Claims

What is claimed is:
1. A computer-implemented method, comprising: at a portable multifunction device with a touch screen display, detecting a finger-down event on the touch screen display; identifying a first user interface object at which the finger-down event occurs; detecting one or more finger-dragging events on the touch screen display; moving the first user interface object on the touch screen display in accordance with the finger-dragging events; detecting a finger-up event on the touch screen display; identifying a second user interface object at which the finger-up event occurs; and visually replacing the second user interface object with the first user interface object.
2. A computer-implemented method, comprising: at a portable multifunction device with a touch screen display with a plurality of user interface objects, displaying a first user interface object and a second user interface object on the touch screen display; detecting a finger-down event at the first user interface object; detecting one or more finger-dragging events on the touch screen display; moving the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object; detecting a finger-up event at the second user interface object; and visually replacing the second user interface object with the first user interface object.
3. The computer-implemented method of claim 2, further comprising: while moving the first user interface object on the touch screen display, displaying the first user interface object in a manner visually distinguishable from other user interface objects on the touch screen display.
4. The computer-implemented method of claim 2, further comprising: upon detecting the finger-up event, displaying the first user interface object at a location formerly occupied by the second user interface object; and animating a movement of the second user interface object to a location formerly occupied by the first user interface object.
5. The computer-implemented method of claim 2, wherein the first user interface object is displayed in a first form before the finger-down event and in a second form after the finger-down event, and the second form is visually different from the first form.
6. The computer-implemented method of claim 5, wherein the first form is a row including characters and at least one control icon and the second form is an image.
7. The computer-implemented method of claim 2, wherein the second user interface object is displayed in a first form before the finger-up event and in a second form after the finger-up event, and the second form is visually different from the first form.
8. The computer-implemented method of claim 7, wherein the first form is an image and the second form is a row including characters associated with at least one control icon.
9. The computer-implemented method of claim 2, wherein the first user interface object includes a control icon and the finger-down event occurs at or near the control icon.
10. The computer-implemented method of claim 2, wherein the first user interface object is one of a group of candidate icons and the second user interface object is one of a group of user favorite icons.
11. The computer-implemented method of claim 10, further comprising: re-arranging the remaining group of candidate icons after moving the first user interface object away from its original location; upon detecting the finger-up event, displaying the first user interface object at a location formerly occupied by the second user interface object; and animating a movement of the second user interface object to a location formerly occupied by one of the remaining group of candidate icons.
12. A portable electronic device, comprising: a touch screen display with a plurality of user interface objects; one or more processors; memory; and a program, wherein the program is stored in the memory and configured to be executed by the one or more processors, the program including: instructions for displaying a first user interface object and a second user interface object on the touch screen display; instructions for detecting a finger-down event at the first user interface object; instructions for detecting one or more finger-dragging events on the touch screen display; instructions for moving the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object; instructions for detecting a finger-up event at the second user interface object; and instructions for visually replacing the second user interface object with the first user interface object.
13. A computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a portable electronic device with a touch screen display with a plurality of user interface objects, cause the device to: display a first user interface object and a second user interface object on the touch screen display; detect a finger-down event at the first user interface object; detect one or more finger-dragging events on the touch screen display; move the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object; detect a finger-up event at the second user interface object; and visually replace the second user interface object with the first user interface object.
14. A portable electronic device with a touch screen display with a plurality of user interface objects, comprising: means for displaying a first user interface object and a second user interface object on the touch screen display; means for detecting a finger-down event at the first user interface object; means for detecting one or more finger-dragging events on the touch screen display; means for moving the first user interface object on the touch screen display along a path determined by the finger-dragging events until the first user interface object at least in part overlaps the second user interface object; means for detecting a finger-up event at the second user interface object; and means for visually replacing the second user interface object with the first user interface object.
15. A computer-implemented method, comprising: at a portable multifunction device with a touch screen display, displaying a series of ratings indicia on the touch screen display, wherein the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; detecting a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display; and using a rating corresponding to the last rating indicia contacted by the finger gesture as input to a function or application in the device.
16. The computer-implemented method of claim 15, wherein the finger gesture contacts the lowest rating indicia prior to contacting one or more of the progressively higher rating indicia.
17. The computer-implemented method of claim 15, wherein the ratings indicia are stars.
18. The computer-implemented method of claim 15, wherein the series of ratings indicia consists of five stars.
19. The computer-implemented method of claim 15, wherein the finger gesture is a swipe gesture.
20. The computer-implemented method of claim 15, wherein the rating corresponding to the last rating indicia contacted by the finger gesture is used to give a rating for an item of content that is playable with a content player application on the device.
21. The computer-implemented method of claim 20, wherein the item of content is an item of music and the content player application is a music player application.
22. The computer-implemented method of claim 20, wherein the item of content is a video and the content player application is a video player application.
23. The computer-implemented method of claim 15, wherein the rating corresponding to the last rating indicia contacted by the finger gesture is used to give a rating for content on a web page that is viewable with a browser application on the device.
24. A graphical user interface on a portable multifunction device with a touch screen display, comprising: a series of ratings indicia on the touch screen display, wherein: the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; and in response to detecting a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display, a rating corresponding to the last rating indicia contacted by the finger gesture is used as input to a function or an application in the device.
25. A portable multifunction device, comprising: a touch screen display; one or more processors; memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs including: instructions for displaying a series of ratings indicia on the touch screen display, wherein the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; instructions for detecting a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display; and instructions for using a rating corresponding to the last rating indicia contacted by the finger gesture as input to a function or application in the device.
26. A computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a portable electronic device with a touch screen display, cause the device to: display a series of ratings indicia on the touch screen display, wherein the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; detect a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display; and use a rating corresponding to the last rating indicia contacted by the finger gesture as input to a function or application in the device.
27. A portable multifunction device with a touch screen display, comprising: means for displaying a series of ratings indicia on the touch screen display, wherein the ratings indicia comprise a lowest rating indicia and one or more progressively higher rating indicia; means for detecting a finger gesture by a user on one or more of the ratings indicia, wherein the finger gesture contacts a last rating indicia immediately prior to breaking contact with the touch screen display; and means for using a rating corresponding to the last rating indicia contacted by the finger gesture as input to a function or application in the device.
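Claims 15-27 above cover a rating gesture in which the finger sweeps across a series of ratings indicia and the last indicia contacted before the finger breaks contact supplies the rating. The fragment below is a speculative sketch of that tracking logic under the assumption of star-shaped indicia laid out from lowest to highest; none of the names come from the claims.

    import CoreGraphics

    final class StarRatingTracker {
        private let starFrames: [CGRect]   // one frame per star, lowest rating first
        private var lastContacted: Int?    // index of the last star the finger touched

        init(starFrames: [CGRect]) { self.starFrames = starFrames }

        // Call for the finger-down event and for every finger-dragging event.
        func fingerMoved(to point: CGPoint) {
            if let index = starFrames.firstIndex(where: { $0.contains(point) }) {
                lastContacted = index
            }
        }

        // Call when the finger breaks contact with the touch screen: the
        // resulting rating (1-based) would then be used as input to a
        // function or application on the device.
        func fingerLifted() -> Int? {
            defer { lastContacted = nil }
            return lastContacted.map { $0 + 1 }
        }
    }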
PCT/US2008/050430 2007-01-07 2008-01-07 Swapping user- interface objects by drag-and-drop finger gestures on a touch screen display WO2008086305A2 (en)

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
US87925307P 2007-01-07 2007-01-07
US60/879,253 2007-01-07
US87946907P 2007-01-08 2007-01-08
US60/879,469 2007-01-08
US93799307P 2007-06-29 2007-06-29
US93799007P 2007-06-29 2007-06-29
US60/937,990 2007-06-29
US60/937,993 2007-06-29
US11/969,809 US8519964B2 (en) 2007-01-07 2008-01-04 Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US11/969,809 2008-01-04

Publications (2)

Publication Number Publication Date
WO2008086305A2 true WO2008086305A2 (en) 2008-07-17
WO2008086305A3 WO2008086305A3 (en) 2008-10-09

Family

ID=39593862

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2008/050430 WO2008086305A2 (en) 2007-01-07 2008-01-07 Swapping user- interface objects by drag-and-drop finger gestures on a touch screen display

Country Status (2)

Country Link
US (6) US8519964B2 (en)
WO (1) WO2008086305A2 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010112089A1 (en) * 2009-03-30 2010-10-07 Sony Ericsson Mobile Communications Ab Navigation among media files represented by graphics in portable communication devices
DE102009017078A1 (en) * 2009-04-08 2010-11-25 Jurasoft Gmbh & Co. Kg Display device i.e. monitor, controlling method for use in computer system i.e. personal computer, involves representing transient graphic element in display area of additional symbol, and linking element with object based on user selection
WO2013045708A1 (en) * 2011-09-30 2013-04-04 Promethean Limited Transforming displayed objects on a gui
US9420108B1 (en) 2015-08-11 2016-08-16 International Business Machines Corporation Controlling conference calls

Families Citing this family (181)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8451221B2 (en) * 2008-08-11 2013-05-28 Imu Solutions, Inc. Instruction device and communicating method
US7948448B2 (en) 2004-04-01 2011-05-24 Polyvision Corporation Portable presentation system and methods for use therewith
US7509588B2 (en) 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US8296684B2 (en) 2008-05-23 2012-10-23 Hewlett-Packard Development Company, L.P. Navigating among activities in a computing device
US8683362B2 (en) 2008-05-23 2014-03-25 Qualcomm Incorporated Card metaphor for activities in a computing device
US7956849B2 (en) 2006-09-06 2011-06-07 Apple Inc. Video manager for portable multifunction device
US7864163B2 (en) 2006-09-06 2011-01-04 Apple Inc. Portable electronic device, method, and graphical user interface for displaying structured electronic documents
US10313505B2 (en) * 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US8842074B2 (en) 2006-09-06 2014-09-23 Apple Inc. Portable electronic device performing similar operations for different gestures
US8519964B2 (en) 2007-01-07 2013-08-27 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US8607167B2 (en) * 2007-01-07 2013-12-10 Apple Inc. Portable multifunction device, method, and graphical user interface for providing maps and directions
KR101382504B1 (en) * 2007-05-21 2014-04-07 삼성전자주식회사 Apparatus and method for making macro
US9933937B2 (en) 2007-06-20 2018-04-03 Apple Inc. Portable multifunction device, method, and graphical user interface for playing online videos
US8302033B2 (en) 2007-06-22 2012-10-30 Apple Inc. Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information
US9772751B2 (en) 2007-06-29 2017-09-26 Apple Inc. Using gestures to slide between user interfaces
US20090037827A1 (en) * 2007-07-31 2009-02-05 Christopher Lee Bennetts Video conferencing system and method
US11126321B2 (en) 2007-09-04 2021-09-21 Apple Inc. Application menu user interface
US9619143B2 (en) 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
US8619038B2 (en) 2007-09-04 2013-12-31 Apple Inc. Editing interface
US8327272B2 (en) 2008-01-06 2012-12-04 Apple Inc. Portable multifunction device, method, and graphical user interface for viewing and managing electronic calendars
US20090187842A1 (en) * 2008-01-22 2009-07-23 3Dlabs Inc., Ltd. Drag and Drop User Interface for Portable Electronic Devices with Touch Sensitive Screens
KR101320919B1 (en) * 2008-01-29 2013-10-21 삼성전자주식회사 Method for providing GUI by divided screen and multimedia device using the same
US20090199120A1 (en) * 2008-02-01 2009-08-06 Moaec, Inc. Customizable, reconfigurable graphical user interface
US8174503B2 (en) 2008-05-17 2012-05-08 David H. Cain Touch-based authentication of a mobile device through user generated pattern creation
KR101477743B1 (en) * 2008-06-16 2014-12-31 삼성전자 주식회사 Terminal and method for performing function thereof
US20090327278A1 (en) * 2008-06-26 2009-12-31 Baran-Sneh Alex System and method for ranking web content
KR101503493B1 (en) * 2008-07-16 2015-03-17 삼성전자주식회사 Method for controlling devices using widget contents and a remote controller thereof
TWI400630B (en) * 2008-08-11 2013-07-01 Imu Solutions Inc Selection device and method
TWI411940B (en) * 2009-07-27 2013-10-11 Imu Solutions Inc Instruction device and method
TWI461961B (en) * 2008-08-11 2014-11-21 Imu Solutions Inc Selection device and method for performing positioning operation to image area
KR101531504B1 (en) * 2008-08-26 2015-06-26 엘지전자 주식회사 Mobile terminal and method for controlling the same
US20100060549A1 (en) * 2008-09-11 2010-03-11 Ely Tsern Method and system for dynamically generating different user environments with secondary devices with displays of various form factors
KR101510738B1 (en) * 2008-10-20 2015-04-10 삼성전자주식회사 Apparatus and method for composing idle screen in a portable terminal
EP2184669A1 (en) 2008-10-30 2010-05-12 Research In Motion Limited Portable electronic device and method of controlling same
US20100110017A1 (en) * 2008-10-30 2010-05-06 Research In Motion Limited Portable electronic device and method of controlling same
US8150463B2 (en) * 2008-12-08 2012-04-03 At&T Intellectual Property I, L.P. Method and apparatus for presenting a user interface
US10228820B2 (en) * 2008-12-08 2019-03-12 Samsung Electronics Co., Ltd. Display apparatus and displaying method thereof
KR20100070733A (en) * 2008-12-18 2010-06-28 삼성전자주식회사 Method for displaying items and display apparatus applying the same
EP2207081A1 (en) * 2008-12-31 2010-07-14 Vodafone Holding GmbH Graphical user interface for mobile communication device
US9176747B2 (en) * 2009-02-17 2015-11-03 Sandisk Il Ltd. User-application interface
JP5734546B2 (en) 2009-02-25 2015-06-17 京セラ株式会社 Object display device
US20100229129A1 (en) * 2009-03-04 2010-09-09 Microsoft Corporation Creating organizational containers on a graphical user interface
US8689128B2 (en) 2009-03-16 2014-04-01 Apple Inc. Device, method, and graphical user interface for moving a current position in content at a variable scrubbing rate
US8464182B2 (en) * 2009-06-07 2013-06-11 Apple Inc. Device, method, and graphical user interface for providing maps, directions, and location-based information
GB0910545D0 (en) * 2009-06-18 2009-07-29 Therefore Ltd Picturesafe
CN102150104B (en) * 2009-07-21 2016-01-20 晶翔微系统股份有限公司 Selecting arrangement and method
DE102009036368A1 (en) * 2009-08-06 2011-02-10 Volkswagen Ag User interface providing method for use in motor vehicle, involves selecting one of widget-representatives of list, and displaying widget-object assigned to widget-representatives on display area of display device
CN104656889A (en) * 2009-08-10 2015-05-27 晶翔微系统股份有限公司 Instruction device
KR20110027117A (en) * 2009-09-09 2011-03-16 삼성전자주식회사 Electronic apparatus with touch panel and displaying method thereof
US10156979B2 (en) * 2009-12-02 2018-12-18 Samsung Electronics Co., Ltd. Method and apparatus for providing user interface of portable device
KR20110063297A (en) * 2009-12-02 2011-06-10 삼성전자주식회사 Mobile device and control method thereof
US20110138284A1 (en) * 2009-12-03 2011-06-09 Microsoft Corporation Three-state touch input system
US8862576B2 (en) 2010-01-06 2014-10-14 Apple Inc. Device, method, and graphical user interface for mapping directions between search results
US8736561B2 (en) * 2010-01-06 2014-05-27 Apple Inc. Device, method, and graphical user interface with content display modes and display rotation heuristics
US8456297B2 (en) * 2010-01-06 2013-06-04 Apple Inc. Device, method, and graphical user interface for tracking movement on a map
JP5526789B2 (en) * 2010-01-08 2014-06-18 ソニー株式会社 Information processing apparatus and program
US9405449B2 (en) 2010-01-14 2016-08-02 Microsoft Technology Licensing, Llc Layout constraint manipulation via user gesture recognition
WO2011093367A1 (en) * 2010-02-01 2011-08-04 株式会社 ニコン Information adding device, electronic camera, information adding program
FI20105105A0 (en) * 2010-02-04 2010-02-04 Axel Technologies User interface of a media device
US20110216095A1 (en) * 2010-03-04 2011-09-08 Tobias Rydenhag Methods, Devices, and Computer Program Products Providing Multi-Touch Drag and Drop Operations for Touch-Sensitive User Interfaces
US10788976B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US8458615B2 (en) 2010-04-07 2013-06-04 Apple Inc. Device, method, and graphical user interface for managing folders
KR101000063B1 (en) * 2010-04-27 2010-12-10 엘지전자 주식회사 Image display apparatus and method for operating the same
KR101690232B1 (en) * 2010-05-28 2016-12-27 엘지전자 주식회사 Electronic Device And Method Of Controlling The Same
US20120030595A1 (en) * 2010-07-29 2012-02-02 Seiko Epson Corporation Information storage medium, terminal apparatus, and image generation method
US9159298B2 (en) * 2010-09-08 2015-10-13 Lg Electronics Inc. Terminal and contents sharing method for terminal
US9262002B2 (en) 2010-11-03 2016-02-16 Qualcomm Incorporated Force sensing touch screen
KR101522345B1 (en) 2010-11-12 2015-05-21 주식회사 케이티 Method for displaying background pictures in mobile communication apparatus and apparatus the same
JP5833822B2 (en) * 2010-11-25 2015-12-16 パナソニックIpマネジメント株式会社 Electronics
US9135426B2 (en) 2010-12-16 2015-09-15 Blackberry Limited Password entry using moving images
US8769641B2 (en) 2010-12-16 2014-07-01 Blackberry Limited Multi-layer multi-point or pathway-based passwords
US8931083B2 (en) 2010-12-16 2015-01-06 Blackberry Limited Multi-layer multi-point or randomized passwords
US8863271B2 (en) 2010-12-16 2014-10-14 Blackberry Limited Password entry using 3D image with spatial alignment
US9258123B2 (en) 2010-12-16 2016-02-09 Blackberry Limited Multi-layered color-sensitive passwords
US8745694B2 (en) 2010-12-16 2014-06-03 Research In Motion Limited Adjusting the position of an endpoint reference for increasing security during device log-on
KR101772653B1 (en) * 2010-12-31 2017-08-29 삼성전자주식회사 Control device and method for control of broadcast reciever
US9043714B1 (en) 2011-01-07 2015-05-26 Google Inc. Adaptive user interface for widescreen devices
TW201239556A (en) * 2011-03-21 2012-10-01 Mitac Int Corp Electronic watch capable of adjusting information display angle
KR101199618B1 (en) 2011-05-11 2012-11-08 주식회사 케이티테크 Apparatus and Method for Screen Split Displaying
TWI425411B (en) * 2011-06-16 2014-02-01 Wistron Neweb Corp User-interface adjusting method and electronic device using the same
US8971572B1 (en) 2011-08-12 2015-03-03 The Research Foundation For The State University Of New York Hand pointing estimation for human computer interaction
US9223948B2 (en) 2011-11-01 2015-12-29 Blackberry Limited Combined passcode and activity launch modifier
US20130227445A1 (en) * 2012-02-24 2013-08-29 Maria Christina Nathalie Freyhult Method and apparatus for operation of a computing device
EP2631747B1 (en) 2012-02-24 2016-03-30 BlackBerry Limited Method and apparatus for providing a user interface on a device that indicates content operators
EP2631760A1 (en) 2012-02-24 2013-08-28 Research In Motion Limited Method and apparatus for providing a user interface on a device enabling selection of operations to be performed in relation to content
US9772700B2 (en) * 2012-04-30 2017-09-26 Blackberry Limited Device and method for processing user input
US10097496B2 (en) 2012-05-09 2018-10-09 Apple Inc. Electronic mail user interface
US10235014B2 (en) * 2012-05-09 2019-03-19 Apple Inc. Music user interface
US10649622B2 (en) 2012-05-09 2020-05-12 Apple Inc. Electronic message user interface
US20130318437A1 (en) * 2012-05-22 2013-11-28 Samsung Electronics Co., Ltd. Method for providing ui and portable apparatus applying the same
CN103455260B (en) * 2012-05-30 2016-03-09 腾讯科技(深圳)有限公司 A kind of implementation method and device region in list being moved to operation
US9805118B2 (en) * 2012-06-29 2017-10-31 Change Healthcare Llc Transcription method, apparatus and computer program product
JP5502943B2 (en) * 2012-06-29 2014-05-28 楽天株式会社 Information processing apparatus, authentication apparatus, information processing method, and information processing program
CN102768617B (en) * 2012-06-29 2016-12-28 惠州Tcl移动通信有限公司 Hand-held electronic equipment and the method for list items editor based on touch screen
US8698772B2 (en) 2012-08-24 2014-04-15 Google Inc. Visual object manipulation
US20140089815A1 (en) * 2012-09-21 2014-03-27 Google Inc. Sharing Content-Synchronized Ratings
US9229632B2 (en) 2012-10-29 2016-01-05 Facebook, Inc. Animation sequence associated with image
US9607289B2 (en) 2012-11-14 2017-03-28 Facebook, Inc. Content type filter
US9235321B2 (en) 2012-11-14 2016-01-12 Facebook, Inc. Animation sequence associated with content item
US9507757B2 (en) 2012-11-14 2016-11-29 Facebook, Inc. Generating multiple versions of a content item for multiple platforms
US9218188B2 (en) 2012-11-14 2015-12-22 Facebook, Inc. Animation sequence associated with feedback user-interface element
US9081410B2 (en) 2012-11-14 2015-07-14 Facebook, Inc. Loading content on electronic device
US9606695B2 (en) 2012-11-14 2017-03-28 Facebook, Inc. Event notification
US9547627B2 (en) 2012-11-14 2017-01-17 Facebook, Inc. Comment presentation
US9245312B2 (en) 2012-11-14 2016-01-26 Facebook, Inc. Image panning and zooming effect
US9684935B2 (en) 2012-11-14 2017-06-20 Facebook, Inc. Content composer for third-party applications
US9547416B2 (en) 2012-11-14 2017-01-17 Facebook, Inc. Image presentation
US9507483B2 (en) 2012-11-14 2016-11-29 Facebook, Inc. Photographs with location or time information
US9606717B2 (en) 2012-11-14 2017-03-28 Facebook, Inc. Content composer
US9696898B2 (en) 2012-11-14 2017-07-04 Facebook, Inc. Scrolling through a series of content items
US20170371492A1 (en) * 2013-03-14 2017-12-28 Rich IP Technology Inc. Software-defined sensing system capable of responding to cpu commands
US20140298258A1 (en) * 2013-03-28 2014-10-02 Microsoft Corporation Switch List Interactions
US10564836B2 (en) 2013-05-01 2020-02-18 Apple Inc. Dynamic moveable interface elements on a touch screen device
US9075612B2 (en) 2013-05-10 2015-07-07 Jinrong Yang System and method for managing display power consumption
US8593427B1 (en) 2013-05-10 2013-11-26 Jinrong Yang System and method for managing display power consumption
JP5762470B2 (en) * 2013-06-06 2015-08-12 シャープ株式会社 Display system and electronic device
CN103309618A (en) * 2013-07-02 2013-09-18 姜洪明 Mobile operating system
KR102234400B1 (en) * 2013-07-08 2021-03-31 삼성전자주식회사 Apparatas and method for changing the order or the position of list in an electronic device
US9568891B2 (en) 2013-08-15 2017-02-14 I.Am.Plus, Llc Multi-media wireless watch
KR20160051846A (en) 2013-09-03 2016-05-11 애플 인크. User interface for manipulating user interface objects with magnetic properties
US11068128B2 (en) 2013-09-03 2021-07-20 Apple Inc. User interface object manipulations in a user interface
US10545657B2 (en) 2013-09-03 2020-01-28 Apple Inc. User interface for manipulating user interface objects
USD740325S1 (en) * 2013-10-17 2015-10-06 Microsoft Corporation Display screen with icon
KR101952928B1 (en) 2013-10-30 2019-02-27 애플 인크. Displaying relevant user interface objects
JP5657771B1 (en) * 2013-12-10 2015-01-21 パナソニックIpマネジメント株式会社 Telephone device and mobile phone linkage method
US10318044B2 (en) * 2013-12-24 2019-06-11 Kyocera Corporation Electronic device having touch sensors on both front and back surfaces
DE102014202834A1 (en) * 2014-02-17 2015-09-03 Volkswagen Aktiengesellschaft User interface and method for contactless operation of a hardware-designed control element in a 3D gesture mode
US9811514B1 (en) * 2014-04-29 2017-11-07 Google Inc. Media object annotation with interactive elements
KR101631966B1 (en) * 2014-06-19 2016-06-20 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN116301544A (en) 2014-06-27 2023-06-23 苹果公司 Reduced size user interface
US9787812B2 (en) 2014-08-28 2017-10-10 Honda Motor Co., Ltd. Privacy management
US10073590B2 (en) 2014-09-02 2018-09-11 Apple Inc. Reduced size user interface
WO2016036510A1 (en) * 2014-09-02 2016-03-10 Apple Inc. Music user interface
TW201610758A (en) 2014-09-02 2016-03-16 蘋果公司 Button functionality
WO2016036509A1 (en) 2014-09-02 2016-03-10 Apple Inc. Electronic mail user interface
USD761816S1 (en) * 2015-01-02 2016-07-19 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD762662S1 (en) * 2015-01-02 2016-08-02 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
USD761815S1 (en) * 2015-01-02 2016-07-19 Samsung Electronics Co., Ltd. Display screen or portion thereof with animated graphical user interface
US10365807B2 (en) 2015-03-02 2019-07-30 Apple Inc. Control of system zoom magnification using a rotatable input mechanism
US11209972B2 (en) * 2015-09-02 2021-12-28 D&M Holdings, Inc. Combined tablet screen drag-and-drop interface
KR102356449B1 (en) * 2015-05-13 2022-01-27 삼성전자주식회사 Apparatus and method for providing additional information according to rotary input
USD789395S1 (en) * 2015-05-19 2017-06-13 Ustocktrade LLC Display screen or portion thereof with stock trading graphical user interface
USD761294S1 (en) * 2015-05-19 2016-07-12 Ustocktrade LLC Display screen or portion thereof with stock trading graphical user interface
JP6314914B2 (en) * 2015-06-04 2018-04-25 京セラドキュメントソリューションズ株式会社 Image forming apparatus and operation screen control method of image forming apparatus
US20180095653A1 (en) * 2015-08-14 2018-04-05 Martin Hasek Device, method and graphical user interface for handwritten interaction
AU2016101424A4 (en) * 2015-09-08 2016-09-15 Apple Inc. Device, method, and graphical user interface for providing audiovisual feedback
JP6569425B2 (en) * 2015-09-25 2019-09-04 富士ゼロックス株式会社 Display device and program
USD825523S1 (en) 2016-01-06 2018-08-14 I.Am.Plus, Llc Set of earbuds
US10372306B2 (en) 2016-04-16 2019-08-06 Apple Inc. Organized timeline
DK201670595A1 (en) 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
US11816325B2 (en) 2016-06-12 2023-11-14 Apple Inc. Application shortcuts for carplay
USD802013S1 (en) * 2016-08-30 2017-11-07 Google Inc. Display screen with graphical user interface
US10264213B1 (en) 2016-12-15 2019-04-16 Steelcase Inc. Content amplification system and method
USD885414S1 (en) * 2016-12-30 2020-05-26 Whirlpool Corporation Appliance display screen or portion thereof with graphic user interface
US10904211B2 (en) 2017-01-21 2021-01-26 Verisign, Inc. Systems, devices, and methods for generating a domain name using a user interface
US10992795B2 (en) 2017-05-16 2021-04-27 Apple Inc. Methods and interfaces for home media control
US11431836B2 (en) 2017-05-02 2022-08-30 Apple Inc. Methods and interfaces for initiating media playback
US10928980B2 (en) 2017-05-12 2021-02-23 Apple Inc. User interfaces for playing and managing audio items
CN111343060B (en) 2017-05-16 2022-02-11 苹果公司 Method and interface for home media control
US20220279063A1 (en) 2017-05-16 2022-09-01 Apple Inc. Methods and interfaces for home media control
USD889491S1 (en) * 2017-07-19 2020-07-07 Lenovo (Beijing) Co., Ltd. Display screen or a portion thereof with graphical user interface
USD882602S1 (en) * 2017-07-28 2020-04-28 Verisign, Inc. Display screen or portion thereof with a sequential graphical user interface of a mobile device
USD844649S1 (en) 2017-07-28 2019-04-02 Verisign, Inc. Display screen or portion thereof with a sequential graphical user interface
USD879811S1 (en) * 2018-03-16 2020-03-31 Magic Leap, Inc. Display panel or portion thereof with a transitional mixed reality graphical user interface
US11269500B2 (en) * 2018-05-21 2022-03-08 Samsung Electronics Co., Ltd. Method and system for modular widgets in smart devices
GB201813240D0 (en) * 2018-08-14 2018-09-26 Core Network Ltd Methods and systems for interactive platforms
US11435830B2 (en) 2018-09-11 2022-09-06 Apple Inc. Content-based tactile outputs
DK179888B1 (en) 2018-09-11 2019-08-27 Apple Inc. CONTENT-BASED TACTICAL OUTPUTS
CN109379493B (en) * 2018-10-19 2021-07-23 北京小米移动软件有限公司 Sliding cover event processing method and device, electronic equipment and storage medium
US11675476B2 (en) 2019-05-05 2023-06-13 Apple Inc. User interfaces for widgets
US10996917B2 (en) 2019-05-31 2021-05-04 Apple Inc. User interfaces for audio media control
KR20230039775A (en) 2019-05-31 2023-03-21 애플 인크. User interfaces for audio media control
US11392291B2 (en) 2020-09-25 2022-07-19 Apple Inc. Methods and interfaces for media control with dynamic feedback
US11112945B1 (en) * 2020-09-30 2021-09-07 Snap Inc. Content detection and transmission in response to receiving user interactions
USD991966S1 (en) * 2021-01-08 2023-07-11 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
US11762458B2 (en) * 2021-02-15 2023-09-19 Sony Group Corporation Media display device control based on eye gaze
US11875016B2 (en) 2021-05-17 2024-01-16 Apple Inc. Devices, methods, and graphical user interfaces for displaying media items shared from distinct applications
KR20240005099A (en) * 2021-05-17 2024-01-11 애플 인크. Devices, methods, and graphical user interfaces for automatically providing shared content to applications
USD1017634S1 (en) * 2021-11-02 2024-03-12 Abiomed, Inc. Display panel or portion thereof with graphical user interface
USD1014552S1 (en) * 2021-11-02 2024-02-13 Abiomed, Inc. Display panel or portion thereof with graphical user interface
US20230367458A1 (en) * 2022-05-10 2023-11-16 Apple Inc. Search operations in various user interfaces

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5760773A (en) * 1995-01-06 1998-06-02 Microsoft Corporation Methods and apparatus for interacting with data objects using action handles
US6590568B1 (en) * 2000-11-20 2003-07-08 Nokia Corporation Touch screen drag and drop input technique
WO2005041020A1 (en) * 2003-10-24 2005-05-06 Nokia Corporation Method for shifting a shortcut in an electronic device, a display unit of the device, and an electronic device

Family Cites Families (1123)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4714918A (en) 1984-04-30 1987-12-22 International Business Machines Corporation Window view control
US4899136A (en) 1986-04-28 1990-02-06 Xerox Corporation Data processor having a user interface display with metaphoric objects
JPH01172997A (en) 1987-12-23 1989-07-07 Internatl Business Mach Corp <Ibm> Graphic customization of memu display
US5146556A (en) * 1988-10-11 1992-09-08 Next Computer, Inc. System and method for managing graphic images
JPH02116783A (en) 1988-10-27 1990-05-01 Seikosha Co Ltd Time signalling timepiece
US5075673A (en) 1989-06-16 1991-12-24 International Business Machines Corp. Variable speed, image pan method and apparatus
US5051736A (en) 1989-06-28 1991-09-24 International Business Machines Corporation Optical stylus and passive digitizing tablet data input system
US5312478A (en) 1990-04-11 1994-05-17 Lotus Development Corporation System for managing information in a three dimensional workspace
FR2662009B1 (en) 1990-05-09 1996-03-08 Apple Computer MULTIPLE FACES MANOPULABLE ICON FOR DISPLAY ON COMPUTER.
US5237679A (en) 1990-05-24 1993-08-17 International Business Machines Corporation Method and system for automatic deletion of a folder having temporary document relationships within a data processing system
US5119079A (en) 1990-09-17 1992-06-02 Xerox Corporation Touch screen user interface with expanding touch locations for a reprographic machine
US5196838A (en) 1990-12-28 1993-03-23 Apple Computer, Inc. Intelligent scrolling
US5898434A (en) 1991-05-15 1999-04-27 Apple Computer, Inc. User interface system having programmable user interface elements
FR2693810B1 (en) 1991-06-03 1997-01-10 Apple Computer USER INTERFACE SYSTEMS WITH DIRECT ACCESS TO A SECONDARY DISPLAY AREA.
US5592675A (en) 1992-01-08 1997-01-07 Hitachi, Ltd. Computer controlled method and system capable of preserving information representing plural work states and recovering the work states
US5610653A (en) 1992-02-07 1997-03-11 Abecassis; Max Method and system for automatically tracking a zoomed video image
JPH05225302A (en) 1992-02-07 1993-09-03 Matsushita Electric Ind Co Ltd Graphic processing support device
US5544295A (en) 1992-05-27 1996-08-06 Apple Computer, Inc. Method and apparatus for indicating a change in status of an object and its disposition using animation
JP3248981B2 (en) 1992-06-02 2002-01-21 松下電器産業株式会社 calculator
US5414805A (en) 1992-11-06 1995-05-09 International Business Machines Corporation Visual display transition effects using sorted table of display cells
US5420976A (en) 1992-11-12 1995-05-30 International Business Machines Corp. Method for selecting position-dependent actions of computer applications programs
US5612719A (en) 1992-12-03 1997-03-18 Apple Computer, Inc. Gesture sensitive buttons for graphical user interfaces
US5598524A (en) 1993-03-03 1997-01-28 Apple Computer, Inc. Method and apparatus for improved manipulation of data between an application program and the files system on a computer-controlled display system
US5621878A (en) 1993-03-03 1997-04-15 Apple Computer, Inc. Method and apparatus or manipulating data from a suspended application program on a computer-controlled display system
US5528735A (en) 1993-03-23 1996-06-18 Silicon Graphics Inc. Method and apparatus for displaying data within a three-dimensional information landscape
US5812862A (en) 1993-05-10 1998-09-22 Apple Computer, Inc. Computer-human interface system for compound documents
US5745910A (en) 1993-05-10 1998-04-28 Apple Computer, Inc. Frame structure which provides an interface between parts of a compound document
DE69432199T2 (en) 1993-05-24 2004-01-08 Sun Microsystems, Inc., Mountain View Graphical user interface with methods for interfacing with remote control devices
US5956030A (en) 1993-06-11 1999-09-21 Apple Computer, Inc. Computer system with graphical user interface including windows having an identifier within a control region on the display
US6012072A (en) 1993-09-17 2000-01-04 Digital Equipment Corporation Display apparatus for the display of documents in a three-dimensional workspace
US6262732B1 (en) 1993-10-25 2001-07-17 Scansoft, Inc. Method and apparatus for managing and navigating within stacks of document pages
JP2602001B2 (en) 1993-11-01 1997-04-23 インターナショナル・ビジネス・マシーンズ・コーポレイション Personal communicator with shrinkable keyboard
US5825357A (en) * 1993-12-13 1998-10-20 Microsoft Corporation Continuously accessible computer system interface
JPH07225829A (en) 1994-02-15 1995-08-22 Hitachi Ltd Method and device for data display
US5642490A (en) 1994-06-24 1997-06-24 International Business Machines Corporation Providing icon placement alternatives for dynamically added container records
US5546529A (en) 1994-07-28 1996-08-13 Xerox Corporation Method and apparatus for visualization of database search results
JPH0863326A (en) 1994-08-22 1996-03-08 Hitachi Ltd Image processing device/method
EP0701220B1 (en) 1994-09-12 2001-07-04 Adobe Systems Inc. Method and apparatus for viewing electronic documents
US5625818A (en) 1994-09-30 1997-04-29 Apple Computer, Inc. System for managing local database updates published to different online information services in different formats from a central platform
DE19513308A1 (en) 1994-10-04 1996-04-11 Hewlett Packard Co Virtual node file system for computer data system
US5497454A (en) 1994-11-02 1996-03-05 International Business Machines Corporation System for presenting alternate views of a computer window environment
EP0713172B1 (en) 1994-11-15 2002-02-06 Microsoft Corporation Slide out interface bar
DE69523543T2 (en) * 1994-12-13 2002-04-04 Microsoft Corp Taskbar with start menu
US5515486A (en) 1994-12-16 1996-05-07 International Business Machines Corporation Method, apparatus and memory for directing a computer system to display a multi-axis rotatable, polyhedral-shape panel container having front panels for displaying objects
US5644739A (en) * 1995-01-27 1997-07-01 Microsoft Corporation Method and system for adding buttons to a toolbar
US5572238A (en) 1995-01-27 1996-11-05 Xerox Corporation Computer user interface for non-dominant hand assisted control
JP2743854B2 (en) 1995-02-14 1998-04-22 日本電気株式会社 Input device with input time judgment function
US5565888A (en) 1995-02-17 1996-10-15 International Business Machines Corporation Method and apparatus for improving visibility and selectability of icons
US5611060A (en) 1995-02-22 1997-03-11 Microsoft Corporation Auto-scrolling during a drag and drop operation
US5900876A (en) 1995-04-14 1999-05-04 Canon Kabushiki Kaisha Information processing apparatus and method with display book page turning
GB2301217B (en) 1995-05-26 1999-12-15 Nokia Mobile Phones Ltd Display driver
US5754179A (en) 1995-06-07 1998-05-19 International Business Machines Corporation Selection facilitation on a graphical interface
US6496182B1 (en) 1995-06-07 2002-12-17 Microsoft Corporation Method and system for providing touch-sensitive screens for the visually impaired
US6199082B1 (en) 1995-07-17 2001-03-06 Microsoft Corporation Method for delivering separate design and content in a multimedia publishing system
US5914717A (en) 1995-07-21 1999-06-22 Microsoft Methods and system for providing fly out menus
US5745718A (en) 1995-07-31 1998-04-28 International Business Machines Corporation Folder bar widget
US5678015A (en) 1995-09-01 1997-10-14 Silicon Graphics, Inc. Four-dimensional graphical user interface
JPH0973381A (en) 1995-09-04 1997-03-18 Hitachi Ltd Processor specifying method, computer system, and user computer
US6486895B1 (en) 1995-09-08 2002-11-26 Xerox Corporation Display system for displaying lists of linked documents
US5877765A (en) 1995-09-11 1999-03-02 Microsoft Corporation Method and system for displaying internet shortcut icons on the desktop
JPH0997162A (en) 1995-10-02 1997-04-08 Sony Corp Method and device for picture control
JP3688361B2 (en) 1995-10-06 2005-08-24 富士通株式会社 Display control device
US5754809A (en) 1995-12-12 1998-05-19 Dell U.S.A., L.P. Perspective windowing technique for computer graphical user interface
US5825352A (en) 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US5801699A (en) 1996-01-26 1998-09-01 International Business Machines Corporation Icon aggregation on a graphical user interface
JPH09297750A (en) 1996-03-08 1997-11-18 Nikon Corp Source file editing device
JPH09258971A (en) 1996-03-19 1997-10-03 Sharp Corp Icon programming device
US6044405A (en) 1996-04-12 2000-03-28 Wam!Net Inc. Service network incorporating geographically-remote hubs linked by high speed transmission paths
JPH09292262A (en) 1996-04-26 1997-11-11 Alpine Electron Inc Circumferential facility retrieval display method and destination setting method for guide route
KR20000064931A (en) 1996-04-30 2000-11-06 밀러 제리 에이 User interface for browsing, organizing, and running programs, files, and data within computer systems
US5880733A (en) 1996-04-30 1999-03-09 Microsoft Corporation Display system and method for displaying windows of an operating system to provide a three-dimensional workspace for a computer system
US6043818A (en) 1996-04-30 2000-03-28 Sony Corporation Background image with a continuously rotating and functional 3D icon
US5835079A (en) 1996-06-13 1998-11-10 International Business Machines Corporation Virtual pointing device for touchscreens
JPH1040067A (en) 1996-07-19 1998-02-13 Nec Corp Sound control system interlocking with operation of pointing device
JP3171145B2 (en) 1996-07-31 2001-05-28 アイシン・エィ・ダブリュ株式会社 Information display device provided with touch panel and storage medium
US5877775A (en) 1996-08-08 1999-03-02 Theisen; Karen E. Method of generating a 3-D representation of a hierarchical data structure
US5796401A (en) 1996-08-09 1998-08-18 Winer; Peter W. System for designing dynamic layouts adaptable to various display screen sizes and resolutions
US5774119A (en) 1996-08-14 1998-06-30 International Business Machines Corporation Graphical interface method, apparatus and application for selection of target object
US6407757B1 (en) 1997-12-18 2002-06-18 E-Book Systems Pte Ltd. Computer-based browsing method and computer program product for displaying information in an electronic book form
US6097431A (en) 1996-09-04 2000-08-01 Flashpoint Technology, Inc. Method and system for reviewing and navigating among images on an image capture unit
US5745116A (en) 1996-09-09 1998-04-28 Motorola, Inc. Intuitive gesture-based graphical user interface
US5870683A (en) 1996-09-18 1999-02-09 Nokia Mobile Phones Limited Mobile station having method and apparatus for displaying user-selectable animation sequence
US5963204A (en) 1996-09-20 1999-10-05 Nikon Corporation Electronic camera with reproduction and display of images at the same timing
US5838326A (en) 1996-09-26 1998-11-17 Xerox Corporation System for moving document objects in a 3-D workspace
US6088032A (en) 1996-10-04 2000-07-11 Xerox Corporation Computer controlled display system for displaying a three-dimensional document workspace having a means for prefetching linked documents
JP2008123553A (en) 1996-10-16 2008-05-29 Sharp Corp Information apparatus
US5943679A (en) 1996-10-30 1999-08-24 Xerox Corporation Multi-page document viewer having a focus image and recursively nested images of varying resolutions less than the resolution of the focus image
US6144863A (en) 1996-11-26 2000-11-07 U.S. Philips Corporation Electronic device with screen comprising a menu which can be customized by a user
US6710788B1 (en) 1996-12-03 2004-03-23 Texas Instruments Incorporated Graphical user interface
US6256008B1 (en) 1996-12-10 2001-07-03 Motorola Computer screen saver with wireless messaging capability and method therefor
US6253218B1 (en) 1996-12-26 2001-06-26 Atsushi Aoki Three dimensional data display method utilizing view point tracing and reduced document images
US5835094A (en) 1996-12-31 1998-11-10 Compaq Computer Corporation Three-dimensional computer environment
US6683628B1 (en) 1997-01-10 2004-01-27 Tokyo University Of Agriculture And Technology Human interactive type display system
US6583797B1 (en) 1997-01-21 2003-06-24 International Business Machines Corporation Menu management mechanism that displays menu items based on multiple heuristic factors
JP3780601B2 (en) 1997-01-29 2006-05-31 カシオ計算機株式会社 Image processing apparatus and program recording medium thereof
US6222547B1 (en) 1997-02-07 2001-04-24 California Institute Of Technology Monitoring and analysis of data in cyberspace
US6111573A (en) 1997-02-14 2000-08-29 Velocity.Com, Inc. Device independent window and view system
JP2957507B2 (en) 1997-02-24 1999-10-04 インターナショナル・ビジネス・マシーンズ・コーポレイション Small information processing equipment
US6069626A (en) 1997-02-27 2000-05-30 International Business Machines Corporation Method and apparatus for improved scrolling functionality in a graphical user interface utilizing a transparent scroll bar icon
US5874958A (en) 1997-03-31 1999-02-23 Sun Microsystems, Inc. Method and apparatus for accessing information and items across workspaces
US5923327A (en) 1997-04-23 1999-07-13 Bell-Northern Research Ltd. Scrolling with automatic compression and expansion
US6073036A (en) 1997-04-28 2000-06-06 Nokia Mobile Phones Limited Mobile station with touch input having automatic symbol magnification function
US6326970B1 (en) 1997-05-16 2001-12-04 Liberate Technologies TV centric layout
US5934707A (en) 1997-05-30 1999-08-10 Johnson; Joyce W. Message calendar
US5956025A (en) 1997-06-09 1999-09-21 Philips Electronics North America Corporation Remote with 3D organized GUI for a home entertainment system
CA2769736C (en) 1997-07-09 2013-05-14 Advanced Audio Devices, Llc Device for editing and non-volatile optical storage of digital audio
US6121969A (en) 1997-07-29 2000-09-19 The Regents Of The University Of California Visual navigation in perceptual databases
GB2329539B (en) 1997-09-17 2002-05-15 Sony Uk Ltd Security System
US6433801B1 (en) 1997-09-26 2002-08-13 Ericsson Inc. Method and apparatus for using a touch screen display on a portable intelligent communications device
US6211858B1 (en) 1997-09-26 2001-04-03 Ericsson Inc. Method and apparatus for displaying a rotating meter icon on a portable intelligent communications device
GB9722766D0 (en) 1997-10-28 1997-12-24 British Telecomm Portable computers
US5923908A (en) 1997-10-30 1999-07-13 Eastman Kodak Company Camera with touch sensitive control
US6025842A (en) 1997-11-04 2000-02-15 International Business Machines Corporation System and method for window queues and white space activation for toggling windows
JPH11143604A (en) 1997-11-05 1999-05-28 Nec Corp Portable terminal equipment
FI109733B (en) 1997-11-05 2002-09-30 Nokia Corp Utilizing the content of the message
EP0917080B1 (en) 1997-11-17 2001-07-18 DATALOGIC S.p.A. Method of locating highly variable brightness or colour regions in an image
US6613100B2 (en) 1997-11-26 2003-09-02 Intel Corporation Method and apparatus for displaying miniaturized graphical representations of documents for alternative viewing selection
US5940076A (en) 1997-12-01 1999-08-17 Motorola, Inc. Graphical user interface for an electronic device and method therefor
US6133914A (en) 1998-01-07 2000-10-17 Rogers; David W. Interactive graphical user interface
US6072486A (en) * 1998-01-13 2000-06-06 Microsoft Corporation System and method for creating and customizing a deskbar
JPH11203044A (en) 1998-01-16 1999-07-30 Sony Corp Information processing system
EP1717684A3 (en) 1998-01-26 2008-01-23 Fingerworks, Inc. Method and apparatus for integrating manual input
US7840912B2 (en) 2006-01-30 2010-11-23 Apple Inc. Multi-touch gesture dictionary
US20060033724A1 (en) 2004-07-30 2006-02-16 Apple Computer, Inc. Virtual input device placement on a touch screen user interface
US7844914B2 (en) 2004-07-30 2010-11-30 Apple Inc. Activating virtual keys of a touch-screen virtual keyboard
US7663607B2 (en) 2004-05-06 2010-02-16 Apple Inc. Multipoint touchscreen
US20070177804A1 (en) 2006-01-30 2007-08-02 Apple Computer, Inc. Multi-touch gesture dictionary
US7760187B2 (en) 2004-07-30 2010-07-20 Apple Inc. Visual expander
US7614008B2 (en) 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
US9239673B2 (en) 1998-01-26 2016-01-19 Apple Inc. Gesturing with a multipoint sensing device
JPH11242539A (en) 1998-02-25 1999-09-07 Sharp Corp Display
US6188407B1 (en) 1998-03-04 2001-02-13 Critikon Company, Llc Reconfigurable user interface for modular patient monitor
US6313853B1 (en) 1998-04-16 2001-11-06 Nortel Networks Limited Multi-service user interface
US6275935B1 (en) 1998-04-17 2001-08-14 Thingworld.Com, Llc Systems and methods for locking interactive objects
US6211856B1 (en) 1998-04-17 2001-04-03 Sung M. Choi Graphical user interface touch screen with an auto zoom feature
US6145083A (en) 1998-04-23 2000-11-07 Siemens Information And Communication Networks, Inc. Methods and system for providing data and telephony security
JPH11327433A (en) 1998-05-18 1999-11-26 Denso Corp Map display device
JP2000010702A (en) 1998-06-23 2000-01-14 Pioneer Electron Corp Method and device for picture display menu selection
US6496206B1 (en) 1998-06-29 2002-12-17 Scansoft, Inc. Displaying thumbnail images of document pages in an electronic folder
JP2968523B1 (en) 1998-07-07 1999-10-25 株式会社ジャストシステム INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM RECORDING PROGRAM FOR CAUSING COMPUTER TO EXECUTE THE METHOD
US6229542B1 (en) 1998-07-10 2001-05-08 Intel Corporation Method and apparatus for managing windows in three dimensions in a two dimensional windowing system
US6243080B1 (en) 1998-07-14 2001-06-05 Ericsson Inc. Touch-sensitive panel with selector
US6414700B1 (en) 1998-07-21 2002-07-02 Silicon Graphics, Inc. System for accessing a large number of menu items using a zoned menu bar
JP2000105772A (en) 1998-07-28 2000-04-11 Sharp Corp Information managing device
US20010015719A1 (en) 1998-08-04 2001-08-23 U.S. Philips Corporation Remote control has animated gui
US6049336A (en) 1998-08-12 2000-04-11 Sony Corporation Transition animation for menu structure
US6177936B1 (en) 1998-08-20 2001-01-23 International Business Machines Corporation Browser hierarchical contextual information for web pages
US6054989A (en) 1998-09-14 2000-04-25 Microsoft Corporation Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects and which provides spatialized audio
US6166738A (en) 1998-09-14 2000-12-26 Microsoft Corporation Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects
US20020018051A1 (en) 1998-09-15 2002-02-14 Mona Singh Apparatus and method for moving objects on a touchscreen display
US6278454B1 (en) 1998-09-24 2001-08-21 Ericsson Inc. Call progress graphical user interface
US6195094B1 (en) 1998-09-29 2001-02-27 Netscape Communications Corporation Window splitter bar system
EP1003098B1 (en) 1998-10-30 2005-09-07 Fujitsu Limited Method and system for displaying and sending information
JP4542637B2 (en) 1998-11-25 2010-09-15 セイコーエプソン株式会社 Portable information device and information storage medium
JP2000163193A (en) 1998-11-25 2000-06-16 Seiko Epson Corp Portable information equipment and information storage medium
JP2000163444A (en) 1998-11-25 2000-06-16 Seiko Epson Corp Portable information device and information storage medium
JP4264614B2 (en) 1998-11-30 2009-05-20 ソニー株式会社 Information providing apparatus and information providing method
US6571245B2 (en) 1998-12-07 2003-05-27 Magically, Inc. Virtual desktop in a computer network
US6222465B1 (en) 1998-12-09 2001-04-24 Lucent Technologies Inc. Gesture-based computer interface
US6489975B1 (en) 1998-12-14 2002-12-03 International Business Machines Corporation System and method for improved navigation between open windows in an application program using window tabs
JP2000181436A (en) 1998-12-14 2000-06-30 Sharp Corp Document display device
SG87065A1 (en) 1998-12-16 2002-03-19 Ibm Method and apparatus for protecting controls in graphic user interfaces of computer systems
US6353451B1 (en) 1998-12-16 2002-03-05 Intel Corporation Method of providing aerial perspective in a graphical user interface
US6816175B1 (en) 1998-12-19 2004-11-09 International Business Machines Corporation Orthogonal browsing in object hierarchies
US6621509B1 (en) 1999-01-08 2003-09-16 Ati International Srl Method and apparatus for providing a three dimensional graphical user interface
FR2788617B1 (en) 1999-01-15 2001-03-02 Za Production Method for selecting and displaying a digital file type element, still image or moving images, on a display screen
US6628309B1 (en) 1999-02-05 2003-09-30 International Business Machines Corporation Workspace drag and drop
JP2000242390A (en) 1999-02-18 2000-09-08 Sony Corp Display method for information and information display device
US6310633B1 (en) 1999-03-23 2001-10-30 Ricoh Company Limited Method and system for organizing document information
US6590594B2 (en) 1999-03-25 2003-07-08 International Business Machines Corporation Window scroll-bar
US6545669B1 (en) * 1999-03-26 2003-04-08 Husam Kinawi Object-drag continuity between discontinuous touch-screens
JP2000276272A (en) 1999-03-26 2000-10-06 Mitsubishi Electric Corp Device and method for displaying state with icon
US6549218B1 (en) 1999-03-31 2003-04-15 Microsoft Corporation Dynamic effects for computer display windows
US7119819B1 (en) 1999-04-06 2006-10-10 Microsoft Corporation Method and apparatus for supporting two-dimensional windows in a three-dimensional environment
US6262724B1 (en) 1999-04-15 2001-07-17 Apple Computer, Inc. User interface for presenting media information
US20050166232A1 (en) 1999-04-21 2005-07-28 Lamkin Allan B. Presentation of media content from multiple media sources
JP2000312360A (en) 1999-04-27 2000-11-07 Matsushita Electric Ind Co Ltd Information service system
US6822638B2 (en) 1999-05-10 2004-11-23 International Business Machines Corporation Pointing device for navigating a 3 dimensional GUI interface
US6359615B1 (en) 1999-05-11 2002-03-19 Ericsson Inc. Movable magnification icons for electronic device display screens
US6411283B1 (en) 1999-05-20 2002-06-25 Micron Technology, Inc. Computer touch screen adapted to facilitate selection of features at edge of screen
US7030863B2 (en) 2000-05-26 2006-04-18 America Online, Incorporated Virtual keyboard system with automatic correction
US7263667B1 (en) 1999-06-09 2007-08-28 Microsoft Corporation Methods, apparatus and data structures for providing a user interface which facilitates decision making
US7278115B1 (en) 1999-06-18 2007-10-02 Microsoft Corporation Methods, apparatus and data structures for providing a user interface to objects, the user interface exploiting spatial memory and visually indicating at least one object parameter
US6647534B1 (en) 1999-06-30 2003-11-11 Ricoh Company Limited Method and system for organizing document information in a non-directed arrangement of documents
US6639584B1 (en) 1999-07-06 2003-10-28 Chuang Li Methods and apparatus for controlling a portable electronic device using a touchpad
JP4545884B2 (en) 1999-07-22 2010-09-15 キヤノン株式会社 Information processing apparatus, control method therefor, and computer-readable memory
US6771250B1 (en) 1999-07-27 2004-08-03 Samsung Electronics Co., Ltd. Portable computer system having application program launcher for low power consumption and method of operating the same
US6317140B1 (en) 1999-08-02 2001-11-13 Hewlett-Packard Company Displaying interactive bitmap images within a display space
US6349410B1 (en) 1999-08-04 2002-02-19 Intel Corporation Integrating broadcast television pause and web browsing
US6763388B1 (en) 1999-08-10 2004-07-13 Akamai Technologies, Inc. Method and apparatus for selecting and viewing portions of web pages
US6781575B1 (en) * 2000-09-21 2004-08-24 Handspring, Inc. Method and apparatus for organizing addressing elements
US7007239B1 (en) 2000-09-21 2006-02-28 Palm, Inc. Method and apparatus for accessing a contacts database and telephone services
US20020173721A1 (en) 1999-08-20 2002-11-21 Novasonics, Inc. User interface for handheld imaging devices
GB9920327D0 (en) 1999-08-28 1999-11-03 Koninkl Philips Electronics Nv Menu display for a graphical user interface
US6976210B1 (en) 1999-08-31 2005-12-13 Lucent Technologies Inc. Method and apparatus for web-site-independent personalization from multiple sites having user-determined extraction functionality
US6898307B1 (en) 1999-09-22 2005-05-24 Xerox Corporation Object identification method and system for an augmented-reality display
JP4091223B2 (en) 1999-09-27 2008-05-28 富士フイルム株式会社 Image display method and apparatus for confirmation
AU7621300A (en) 1999-09-28 2001-04-30 Chameleon Network Inc. Portable electronic authorization system and associated method
US6950949B1 (en) 1999-10-08 2005-09-27 Entrust Limited Method and apparatus for password entry using dynamic interface legitimacy information
US7134095B1 (en) 1999-10-20 2006-11-07 Gateway, Inc. Simulated three-dimensional navigational menu system
US7028264B2 (en) 1999-10-29 2006-04-11 Surfcast, Inc. System and method for simultaneous display of multiple information sources
US6466198B1 (en) 1999-11-05 2002-10-15 Innoventions, Inc. View navigation and magnification of a hand-held device with a display
US6844871B1 (en) 1999-11-05 2005-01-18 Microsoft Corporation Method and apparatus for computer input using six degrees of freedom
US6600497B1 (en) 1999-11-15 2003-07-29 Elliot A. Gottfurcht Apparatus and method to navigate interactive television using unique inputs with a remote control
JP2001256050A (en) 1999-11-30 2001-09-21 Texas Instr Inc Graphical development system and method
US6820111B1 (en) 1999-12-07 2004-11-16 Microsoft Corporation Computer user interface architecture that saves a user's non-linear navigation history and intelligently maintains that history
US6978127B1 (en) 1999-12-16 2005-12-20 Koninklijke Philips Electronics N.V. Hand-ear user interface for hand-held device
US7958457B1 (en) 1999-12-20 2011-06-07 Wireless Agents, Llc Method and apparatus for scheduling presentation of digital content on a personal communication device
US7434177B1 (en) 1999-12-20 2008-10-07 Apple Inc. User interface for providing consolidation and access
JP2001175386A (en) 1999-12-21 2001-06-29 Fujitsu Ltd Display, display method and storage medium
JP2001184842A (en) 1999-12-28 2001-07-06 Hitachi Ltd Information reproducing device
US7362331B2 (en) 2000-01-05 2008-04-22 Apple Inc. Time-based, non-constant translation of user interface objects between states
US6396520B1 (en) 2000-01-05 2002-05-28 Apple Computer, Inc. Method of transition between window states
US6597378B1 (en) 2000-01-18 2003-07-22 Seiko Epson Corporation Display device, portable information processing apparatus, information storage medium, and electronic apparatus
US6809724B1 (en) 2000-01-18 2004-10-26 Seiko Epson Corporation Display apparatus and portable information processing apparatus
JP2003531418A (en) 2000-02-02 2003-10-21 イージーログイン・ドット・コム・インコーポレイテッド Clipping and manipulation of elements contained in web pages
US6313855B1 (en) 2000-02-04 2001-11-06 Browse3D Corporation System and method for web browsing
US9129034B2 (en) 2000-02-04 2015-09-08 Browse3D Corporation System and method for web browsing
US7240296B1 (en) 2000-02-11 2007-07-03 Microsoft Corporation Unified navigation shell user interface
GB2365676B (en) 2000-02-18 2004-06-23 Sensei Ltd Mobile telephone with improved man-machine interface
AU2001235940A1 (en) 2000-02-23 2001-09-03 Eyal, Yehoshua Systems and methods for generating and providing previews of electronic files such as web files
US6859909B1 (en) 2000-03-07 2005-02-22 Microsoft Corporation System and method for annotating web-based documents
US6874128B1 (en) 2000-03-08 2005-03-29 Zephyr Associates, Inc. Mouse driven splitter window
US20020038299A1 (en) 2000-03-20 2002-03-28 Uri Zernik Interface for presenting information
JP2001265481A (en) 2000-03-21 2001-09-28 Nec Corp Method and device for displaying page information and storage medium with program for displaying page information stored
JP3763389B2 (en) 2000-03-24 2006-04-05 シャープ株式会社 Image data editing operation method and information processing apparatus
DE10016753A1 (en) 2000-04-04 2001-10-11 Definiens Ag Procedure for navigating between sections in a display room
AU2001253161A1 (en) * 2000-04-04 2001-10-15 Stick Networks, Inc. Method and apparatus for scheduling presentation of digital content on a personal communication device
EP1143334A3 (en) 2000-04-06 2005-03-30 Microsoft Corporation Theme aware graphical user interface
US20010048448A1 (en) 2000-04-06 2001-12-06 Raiz Gregory L. Focus state themeing
US20040049737A1 (en) 2000-04-26 2004-03-11 Novarra, Inc. System and method for displaying information content with selective horizontal scrolling
US7403910B1 (en) * 2000-04-28 2008-07-22 Netflix, Inc. Approach for estimating user ratings of items
JP2001312347A (en) 2000-05-01 2001-11-09 Sony Corp Device and method for processing information and program storage medium
US7917869B2 (en) 2000-05-06 2011-03-29 Anderson Thomas G Human-computer interface incorporating personal and application domains
US7287232B2 (en) 2000-05-08 2007-10-23 Fujitsu Limited Information display system having graphical user interface switchingly controlling information display on display screen
JP3539553B2 (en) 2000-05-30 2004-07-07 シャープ株式会社 Animation creation method, animation creation device, and computer-readable recording medium recording animation creation program
US7210099B2 (en) 2000-06-12 2007-04-24 Softview Llc Resolution independent vector display of internet content
US6628310B1 (en) 2000-06-16 2003-09-30 Chapelle Planning Co., Ltd. Method of and system for turning over a window that is laid over another window, and recording medium having program of turning over a window that is laid over another window
US6714222B1 (en) 2000-06-21 2004-03-30 E2 Home Ab Graphical user interface for communications
US7624356B1 (en) 2000-06-21 2009-11-24 Microsoft Corporation Task-sensitive methods and systems for displaying command sets
US7155667B1 (en) 2000-06-21 2006-12-26 Microsoft Corporation User interface for integrated spreadsheets and word processing tables
US7149549B1 (en) 2000-10-26 2006-12-12 Ortiz Luis M Providing multiple perspectives for a venue activity through an electronic hand held device
US6525997B1 (en) 2000-06-30 2003-02-25 International Business Machines Corporation Efficient use of display real estate in a wrist watch display
US6477117B1 (en) 2000-06-30 2002-11-05 International Business Machines Corporation Alarm interface for a smart watch
SE0002472L (en) 2000-06-30 2001-12-31 Nokia Corp Method and apparatus for selection control
EP1354263A2 (en) 2000-07-07 2003-10-22 Openwave Systems Inc. Graphical user interface features of a browser in a hand-held wireless communication device
US7071943B2 (en) 2000-07-18 2006-07-04 Incredimail, Ltd. System and method for visual feedback of command execution in electronic mail systems
US20020104096A1 (en) 2000-07-19 2002-08-01 Cramer Allen Brett System and methods for providing web-based multimedia presentations
GB0017793D0 (en) 2000-07-21 2000-09-06 Secr Defence Human computer interface
AU2001283004A1 (en) 2000-07-24 2002-02-05 Vivcom, Inc. System and method for indexing, searching, identifying, and editing portions of electronic multimedia files
US20050204385A1 (en) 2000-07-24 2005-09-15 Vivcom, Inc. Processing and presentation of infomercials for audio-visual programs
JP2002041197A (en) 2000-07-24 2002-02-08 Matsushita Electric Ind Co Ltd Electronic display method and its device
JP2002041206A (en) 2000-07-25 2002-02-08 Sega Corp Image display method
CA2349649A1 (en) 2000-07-31 2002-01-31 International Business Machines Corporation Switching between virtual desktops
US20020015064A1 (en) 2000-08-07 2002-02-07 Robotham John S. Gesture-based user interface to multi-level and multi-modal sets of bit-maps
US6704024B2 (en) 2000-08-07 2004-03-09 Zframe, Inc. Visual content browsing using rasterized representations
JP3949912B2 (en) 2000-08-08 2007-07-25 株式会社エヌ・ティ・ティ・ドコモ Portable electronic device, electronic device, vibration generator, notification method by vibration and notification control method
US7103838B1 (en) 2000-08-18 2006-09-05 Firstrain, Inc. Method and apparatus for extracting relevant data
US6915294B1 (en) 2000-08-18 2005-07-05 Firstrain, Inc. Method and apparatus for searching network resources
US6563913B1 (en) 2000-08-21 2003-05-13 Koninklijke Philips Electronics N.V. Selective sending of portions of electronic content
JP2002062966A (en) 2000-08-21 2002-02-28 Seiko Epson Corp Information processor and control method thereof
EP1311803B8 (en) 2000-08-24 2008-05-07 VDO Automotive AG Method and navigation device for querying target information and navigating within a map view
TW466415B (en) 2000-08-28 2001-12-01 Compal Electronics Inc Hand-held device with zooming display function
GB2366696B (en) 2000-08-31 2004-03-10 Nokia Mobile Phones Ltd Reminders for a communication terminal
US20020054090A1 (en) 2000-09-01 2002-05-09 Silva Juliana Freire Method and apparatus for creating and providing personalized access to web content and services from terminals having diverse capabilities
CA2317336A1 (en) 2000-09-06 2002-03-06 David Cowperthwaite Occlusion resolution operators for three-dimensional detail-in-context
JP2002082745A (en) 2000-09-07 2002-03-22 Sony Corp Device and method for information processing, and program storage medium
US6915490B1 (en) 2000-09-29 2005-07-05 Apple Computer Inc. Method for dragging and dropping between multiple layered windows
US7688306B2 (en) 2000-10-02 2010-03-30 Apple Inc. Methods and apparatuses for operating a portable device based on an accelerometer
US7218226B2 (en) 2004-03-01 2007-05-15 Apple Inc. Acceleration-based theft detection system for portable electronic devices
JP4385511B2 (en) 2000-10-12 2009-12-16 ソニー株式会社 Information processing apparatus and method, and program storage medium
US7076275B1 (en) 2000-10-13 2006-07-11 Palmsource, Inc. Method and system for single-step enablement of telephony functionality for a portable computer system
JP2002132412A (en) 2000-10-26 2002-05-10 Denso Corp Display method for pocket telephone icon
US6990452B1 (en) 2000-11-03 2006-01-24 At&T Corp. Method for sending multi-media messages using emoticons
AU2002226886A1 (en) * 2000-11-09 2002-05-21 Change Tools, Inc. A user definable interface system, method and computer program product
JP3890880B2 (en) 2000-11-10 2007-03-07 株式会社日立製作所 Information retrieval terminal
US6897853B2 (en) 2000-11-10 2005-05-24 Microsoft Corp. Highlevel active pen matrix
US7134092B2 (en) 2000-11-13 2006-11-07 James Nolen Graphical user interface method and apparatus
US6961731B2 (en) 2000-11-15 2005-11-01 Kooltorch, L.L.C. Apparatus and method for organizing and/or presenting data
US7174512B2 (en) 2000-12-01 2007-02-06 Thomson Licensing S.A. Portal for a communications system
WO2002046903A1 (en) 2000-12-07 2002-06-13 Siemens Aktiengesellschaft Method for selection and activation of a function from an operation menu and operation system for carrying out said method
US7584278B2 (en) 2000-12-11 2009-09-01 Microsoft Corporation Method and system for task based management of multiple network resources
KR100377936B1 (en) 2000-12-16 2003-03-29 삼성전자주식회사 Method for inputting emotion icon in mobile telecommunication terminal
US6727916B1 (en) 2000-12-21 2004-04-27 Sprint Spectrum, L.P. Method and system for assisting a user to engage in a microbrowser-based interactive chat session
US6944830B2 (en) 2000-12-21 2005-09-13 Xerox Corporation System and method for browsing hierarchically based node-link structures based on an estimated degree of interest
US7139982B2 (en) * 2000-12-21 2006-11-21 Xerox Corporation Navigation methods, systems, and computer program products for virtual three-dimensional books
US7017118B1 (en) 2000-12-29 2006-03-21 International Business Machines Corp. Method and apparatus for reordering data items
US7133859B1 (en) 2001-01-05 2006-11-07 Palm, Inc. Category specific sort and display instructions for an electronic device
US20020093531A1 (en) 2001-01-17 2002-07-18 John Barile Adaptive display for video conferences
FR2819675B1 (en) 2001-01-17 2003-05-16 Sagem Portable telephone with capture browser and reminder of computer addresses
US6928461B2 (en) 2001-01-24 2005-08-09 Raja Singh Tuli Portable high speed internet access device with encryption
US6677932B1 (en) 2001-01-28 2004-01-13 Finger Works, Inc. System and method for recognizing touch typing under limited tactile feedback conditions
US20050183017A1 (en) 2001-01-31 2005-08-18 Microsoft Corporation Seekbar in taskbar player visualization mode
US6570557B1 (en) 2001-02-10 2003-05-27 Finger Works, Inc. Multi-touch system and method for emulating modifier keys via fingertip chords
US7030861B1 (en) 2001-02-10 2006-04-18 Wayne Carl Westerman System and method for packing multi-touch gestures onto a hand
JP3881179B2 (en) 2001-02-14 2007-02-14 三菱電機株式会社 User interface design device
US7216305B1 (en) 2001-02-15 2007-05-08 Denny Jaeger Storage/display/action object for onscreen use
US7735021B2 (en) 2001-02-16 2010-06-08 Microsoft Corporation Shortcut system for use in a mobile electronic device and method thereof
JP2002244635A (en) 2001-02-20 2002-08-30 Fujitsu General Ltd Picture display device
US7506256B2 (en) 2001-03-02 2009-03-17 Semantic Compaction Systems Device and method for previewing themes and categories of sequenced symbols
TWI243320B (en) 2001-03-28 2005-11-11 Ulead Systems Inc Method for manipulating multiple multimedia objects
JP2002297514A (en) 2001-03-29 2002-10-11 Sony Corp Receiver and method, recording medium, and program
US6798429B2 (en) 2001-03-29 2004-09-28 Intel Corporation Intuitive mobile device interface to virtual spaces
US6987512B2 (en) 2001-03-29 2006-01-17 Microsoft Corporation 3D navigation techniques
DE10117457A1 (en) 2001-04-06 2002-10-17 T Mobile Deutschland Gmbh Method for displaying standardized, large-format Internet pages with, for example, HTML protocol in one-hand-held devices with a mobile radio connection
US7039643B2 (en) 2001-04-10 2006-05-02 Adobe Systems Incorporated System, method and apparatus for converting and integrating media files
US6901585B2 (en) 2001-04-12 2005-05-31 International Business Machines Corporation Active ALT tag in HTML documents to increase the accessibility to users with visual, audio impairment
JP2002312105A (en) 2001-04-17 2002-10-25 Toshiba Corp Input device, key function guidance method and input method
JP3618303B2 (en) 2001-04-24 2005-02-09 松下電器産業株式会社 Map display device
EP1393189A4 (en) 2001-05-02 2007-06-13 Bitstream Inc Methods, systems, and programming for displaying media scaled-down by a variable scale factor
CA2385401C (en) 2001-05-07 2012-09-25 Vizible.Com Inc. Method of representing information on a three-dimensional user interface
US20040109031A1 (en) 2001-05-11 2004-06-10 Kenneth Deaton Method and system for automatically creating and displaying a customizable three-dimensional graphical user interface (3D GUI) for a computer system
JP2003037731A (en) 2001-05-14 2003-02-07 Canon Inc Image processing apparatus and method
US7730401B2 (en) 2001-05-16 2010-06-01 Synaptics Incorporated Touch screen with user interface enhancement
US20050024341A1 (en) 2001-05-16 2005-02-03 Synaptics, Inc. Touch screen with user interface enhancement
US7246329B1 (en) 2001-05-18 2007-07-17 Autodesk, Inc. Multiple menus for use with a graphical user interface
US20030164827A1 (en) 2001-05-18 2003-09-04 Asaf Gottesman System and method for displaying search results in a three-dimensional virtual environment
US7010758B2 (en) 2001-05-21 2006-03-07 Leap Wireless International, Inc. Dynamically defined context sensitive jump menu
US7185290B2 (en) 2001-06-08 2007-02-27 Microsoft Corporation User interface for a system and process for providing dynamic communication access and information awareness in an interactive peripheral display
US20020186257A1 (en) 2001-06-08 2002-12-12 Cadiz Jonathan J. System and process for providing dynamic communication access and information awareness in an interactive peripheral display
US7434246B2 (en) 2001-06-08 2008-10-07 Digeo, Inc. Systems and methods for automatic personalizing of channel favorites in a set top box
JP2003005912A (en) 2001-06-20 2003-01-10 Hitachi Ltd Display device with touch panel and display method
US6976228B2 (en) 2001-06-27 2005-12-13 Nokia Corporation Graphical user interface comprising intersecting scroll bar for selection of content
US20030013483A1 (en) 2001-07-06 2003-01-16 Ausems Michiel R. User interface for handheld communication device
US20030117427A1 (en) 2001-07-13 2003-06-26 Universal Electronics Inc. System and method for interacting with a program guide displayed on a portable electronic device
US20050134578A1 (en) 2001-07-13 2005-06-23 Universal Electronics Inc. System and methods for interacting with a control environment
US8063923B2 (en) 2001-07-13 2011-11-22 Universal Electronics Inc. System and method for updating information in an electronic portable device
US6819340B2 (en) 2001-07-23 2004-11-16 Paul E. Burke Adding a shortcut to a web site
US20040205492A1 (en) 2001-07-26 2004-10-14 Newsome Mark R. Content clipping service
US20030025676A1 (en) 2001-08-02 2003-02-06 Koninklijke Philips Electronics N.V. Sensor-based menu for a touch screen panel
US20030030664A1 (en) 2001-08-13 2003-02-13 Parry Travis J. Customizable control panel software
US6987991B2 (en) 2001-08-17 2006-01-17 Wildseed Ltd. Emoticon input method and apparatus
JP2003066941A (en) 2001-08-28 2003-03-05 Fuji Photo Film Co Ltd Display control method, image processor and recording medium
US7093201B2 (en) 2001-09-06 2006-08-15 Danger, Inc. Loop menu navigation apparatus and method
JP2003091347A (en) 2001-09-18 2003-03-28 Sony Corp Information processor, screen display method, screen display program and recording medium recording the screen display program
DE10146471A1 (en) 2001-09-21 2003-04-17 3Dconnexion Gmbh 3D input device with integrated touchscreen
US7032188B2 (en) 2001-09-28 2006-04-18 Nokia Corporation Multilevel sorting and displaying of contextual objects
US20040205496A1 (en) 2001-10-11 2004-10-14 International Business Machines Corporation Displaying subheadings and hyperlinks in a scrollable long document
US7606819B2 (en) 2001-10-15 2009-10-20 Maya-Systems Inc. Multi-dimensional locating system and method
US7680817B2 (en) 2001-10-15 2010-03-16 Maya-Systems Inc. Multi-dimensional locating system and method
US7221933B2 (en) 2001-10-22 2007-05-22 Kyocera Wireless Corp. Messaging system for mobile communication
US6970200B2 (en) 2001-10-26 2005-11-29 Hewlett-Packard Development Company, L.P. System and method for a simplified digital camera interface for viewing images and controlling camera operation
US7171626B2 (en) 2001-10-29 2007-01-30 Microsoft Corporation System and method for presenting the contents of a content collection based on content type
US7146576B2 (en) 2001-10-30 2006-12-05 Hewlett-Packard Development Company, L.P. Automatically designed three-dimensional graphical environments for information discovery and visualization
JP3891335B2 (en) 2001-10-31 2007-03-14 独立行政法人情報通信研究機構 Navigation device
US8095879B2 (en) 2002-12-10 2012-01-10 Neonode Inc. User interface for mobile handheld computer unit
US7714880B2 (en) 2001-11-16 2010-05-11 Honeywell International Inc. Method and apparatus for displaying images on a display
US20030142136A1 (en) 2001-11-26 2003-07-31 Carter Braxton Page Three dimensional graphical user interface
US20040205498A1 (en) 2001-11-27 2004-10-14 Miller John David Displaying electronic content
US7075550B2 (en) 2001-11-27 2006-07-11 Bonadio Allan R Method and system for graphical file management
JP2003162356A (en) 2001-11-28 2003-06-06 Nec Corp Scroll control device, scroll control method, and communication terminal using the same
US7158175B2 (en) 2001-11-30 2007-01-02 Eastman Kodak Company System including a digital camera and a docking unit for coupling to the internet
GB2407900B (en) 2001-12-04 2005-08-24 Hewlett Packard Co Generation and usage of workflows for processing data on a printing device
AUPR947701A0 (en) 2001-12-14 2002-01-24 Activesky, Inc. Digital multimedia publishing system for wireless devices
US7346855B2 (en) 2001-12-21 2008-03-18 Microsoft Corporation Method and system for switching between multiple computer applications
US6690387B2 (en) 2001-12-28 2004-02-10 Koninklijke Philips Electronics N.V. Touch-screen image scrolling system and method
JP3955969B2 (en) 2001-12-28 2007-08-08 株式会社ケンウッド Mobile phone
US7043701B2 (en) 2002-01-07 2006-05-09 Xerox Corporation Opacity desktop with depth perception
US7310636B2 (en) 2002-01-15 2007-12-18 International Business Machines Corporation Shortcut enabled, context aware information management
JP2003209771A (en) 2002-01-16 2003-07-25 Hitachi Ltd Digital video reproducing device and reproducing method
US6934911B2 (en) 2002-01-25 2005-08-23 Nokia Corporation Grouping and displaying of contextual objects
US7075512B1 (en) 2002-02-07 2006-07-11 Palmsource, Inc. Method and system for navigating a display screen for locating a desired item of information
WO2003071410A2 (en) 2002-02-15 2003-08-28 Canesta, Inc. Gesture recognition system using depth perceptive sensors
JP2003242178A (en) 2002-02-20 2003-08-29 Fuji Photo Film Co Ltd Folder icon display control device
DE10207703B4 (en) 2002-02-22 2005-06-09 Kathrein-Werke Kg Antenna for a receiving and / or transmitting device, in particular as a roof antenna for motor vehicles
US7370281B2 (en) 2002-02-22 2008-05-06 Bea Systems, Inc. System and method for smart drag-and-drop functionality
JP2003248538A (en) 2002-02-25 2003-09-05 Gakken Co Ltd Program, information processing device using the same and information processing method
JP3847641B2 (en) 2002-02-28 2006-11-22 株式会社ソニー・コンピュータエンタテインメント Information processing apparatus, information processing program, computer-readable recording medium storing information processing program, and information processing method
US6907576B2 (en) 2002-03-04 2005-06-14 Microsoft Corporation Legibility of selected content
US8972890B2 (en) 2002-03-06 2015-03-03 Apple Inc. Aminated menu bar
JP2003271310A (en) 2002-03-13 2003-09-26 Canon Inc Information inputting and outputting device, method for controlling the device, and program for realizing the method
US7607102B2 (en) 2002-03-14 2009-10-20 Apple Inc. Dynamically changing appearances for user interface elements during drag-and-drop operations
KR100833229B1 (en) 2002-03-16 2008-05-28 삼성전자주식회사 Multi-layer focusing method and apparatus therefor
US20030179240A1 (en) 2002-03-20 2003-09-25 Stephen Gest Systems and methods for managing virtual desktops in a windowing environment
US7249327B2 (en) 2002-03-22 2007-07-24 Fuji Xerox Co., Ltd. System and method for arranging, manipulating and displaying objects in a graphical user interface
US20030184552A1 (en) 2002-03-26 2003-10-02 Sanja Chadha Apparatus and method for graphics display system for markup languages
JP2003295994A (en) 2002-03-29 2003-10-17 Casio Comput Co Ltd Information equipment, control program and control method
US6931601B2 (en) 2002-04-03 2005-08-16 Microsoft Corporation Noisy operating system user interface
US7203909B1 (en) 2002-04-04 2007-04-10 Microsoft Corporation System and methods for constructing personalized context-sensitive portal pages or views by analyzing patterns of users' information access activities
US7433546B2 (en) 2004-10-25 2008-10-07 Apple Inc. Image scaling arrangement
US7010755B2 (en) 2002-04-05 2006-03-07 Microsoft Corporation Virtual desktop manager
US7689673B2 (en) * 2002-04-23 2010-03-30 Canon Kabushiki Kaisha Remote creation of printer instances on a workstation
US6629793B1 (en) 2002-04-26 2003-10-07 Westie Intellectual Properties Limited Partnership Emoticon keyboard
US7810038B2 (en) 2002-05-03 2010-10-05 International Business Machines Corporation Method for modifying a GUI for an application
US20030206197A1 (en) 2002-05-06 2003-11-06 Mcinerney John Personal information management devices with persistent application information and methods
US8947543B2 (en) 2002-05-08 2015-02-03 Hewlett-Packard Development Company, L.P. System and method of personalizing a user interface of a portable electronic device
US7458034B2 (en) 2002-05-08 2008-11-25 Kabushiki Kaisha Toshiba Data organization support method and program product therefor
JP2003339079A (en) 2002-05-20 2003-11-28 Ntt Docomo Inc Mobile communication terminal, program, and recording medium
US6996798B2 (en) 2002-05-29 2006-02-07 Sun Microsystems, Inc. Automatically deriving an application specification from a web-based application
US7415677B2 (en) 2002-06-05 2008-08-19 Sap Aktiengesellschaft Temporary communication areas for a computer user interface
CN1464719A (en) 2002-06-06 2003-12-31 翁延鸣 Screen selection type mobile phone
US7456823B2 (en) 2002-06-14 2008-11-25 Sony Corporation User interface apparatus and portable information apparatus
FI20021162A0 (en) 2002-06-14 2002-06-14 Nokia Corp Electronic device and a method for administering its keypad
US7171625B1 (en) 2002-06-18 2007-01-30 Actify, Inc. Double-clicking a point-and-click user interface apparatus to enable a new interaction with content represented by an active visual display element
US7194527B2 (en) 2002-06-18 2007-03-20 Microsoft Corporation Media variations browser
FI20021655A (en) 2002-06-19 2003-12-20 Nokia Corp Method of deactivating locking and a portable electronic device
JP2004023651A (en) 2002-06-19 2004-01-22 Matsushita Electric Ind Co Ltd Telephone set
US7546548B2 (en) 2002-06-28 2009-06-09 Microsoft Corporation Method and system for presenting menu commands for selection
JP2004038310A (en) 2002-06-28 2004-02-05 Kyocera Corp Personal digital assistant and control program to be used for the same
JP2004038260A (en) 2002-06-28 2004-02-05 Clarion Co Ltd Information processor, information processing method and program
JP2004062645A (en) 2002-07-30 2004-02-26 Kyocera Corp Personal digital assistant
US7656393B2 (en) 2005-03-04 2010-02-02 Apple Inc. Electronic device having display and surrounding touch sensitive bezel for user interface and control
US11275405B2 (en) 2005-03-04 2022-03-15 Apple Inc. Multi-functional hand-held device
US7292243B1 (en) 2002-07-02 2007-11-06 James Burke Layered and vectored graphical user interface to a knowledge and relationship rich data source
US7080326B2 (en) 2002-07-11 2006-07-18 International Business Machines Corporation Method and system for managing multi-paned windowed environments
US7166791B2 (en) 2002-07-30 2007-01-23 Apple Computer, Inc. Graphical user interface and methods of use thereof in a multimedia player
JP4115198B2 (en) 2002-08-02 2008-07-09 株式会社日立製作所 Display device with touch panel
US7406666B2 (en) 2002-08-26 2008-07-29 Palm, Inc. User-interface features for computers with contact-sensitive displays
US20040109025A1 (en) 2002-08-28 2004-06-10 Jean-Marie Hullot Computer program comprising a plurality of calendars
US20040041849A1 (en) 2002-08-30 2004-03-04 Von Mock Display screen saver with two way messaging capability and method therefor
FI115255B (en) 2002-09-02 2005-03-31 Myorigo Oy Monitor control method for a mobile terminal and a mobile terminal
US7739604B1 (en) 2002-09-25 2010-06-15 Apple Inc. Method and apparatus for managing windows
JP2004118917A (en) 2002-09-25 2004-04-15 Clarion Co Ltd Electronic equipment and navigation apparatus
JP4606692B2 (en) 2002-09-26 2011-01-05 ソニー株式会社 Information processing apparatus and method, recording medium, and program
JP2004128766A (en) 2002-10-01 2004-04-22 Pioneer Electronic Corp Information recording medium, apparatus and method for information recording, apparatus and method for information reproducing, apparatus and method for information recording and reproducing, information recording program, and information reproduction program
JP2004132741A (en) 2002-10-08 2004-04-30 Kenwood Corp Navigation device
US7913183B2 (en) 2002-10-08 2011-03-22 Microsoft Corporation System and method for managing software applications in a graphical user interface
US7519910B2 (en) 2002-10-10 2009-04-14 International Business Machines Corporation Method for transferring files from one machine to another using adjacent desktop displays in a virtual network
US7373612B2 (en) 2002-10-21 2008-05-13 Battelle Memorial Institute Multidimensional structured data visualization method and apparatus, text visualization method and apparatus, method and apparatus for visualizing and graphically navigating the world wide web, method and apparatus for visualizing hierarchies
JP2004152075A (en) 2002-10-31 2004-05-27 Casio Comput Co Ltd Electronic equipment and program
US20040093582A1 (en) 2002-11-01 2004-05-13 Segura Tim E. Method for allowing a computer to be used as an information kiosk while locked
US7124125B2 (en) 2002-11-01 2006-10-17 Loudeye Corp. System and method for providing media samples on-line in response to media related searches on the internet
US6690623B1 (en) 2002-11-08 2004-02-10 Arnold K. Maano Multi-functional time indicating device with a multi-colored fiber optic display
JP4117352B2 (en) 2002-11-12 2008-07-16 株式会社ソニー・コンピュータエンタテインメント File processing method and apparatus capable of using this method
US7511710B2 (en) 2002-11-25 2009-03-31 Microsoft Corporation Three-dimensional program guide
US7266776B2 (en) * 2002-11-25 2007-09-04 Aol Llc Facilitating communications between computer users across a network
US7203901B2 (en) 2002-11-27 2007-04-10 Microsoft Corporation Small form factor web browsing
US7113809B2 (en) 2002-12-19 2006-09-26 Nokia Corporation Apparatus and a method for providing information to a user
JP2004206230A (en) 2002-12-24 2004-07-22 Casio Comput Co Ltd Electronic apparatus
JP2004208217A (en) 2002-12-26 2004-07-22 Tu-Ka Cellular Tokyo Inc Calling unit of portable telephone
FI20022282A0 (en) 2002-12-30 2002-12-30 Nokia Corp Method for enabling interaction in an electronic device and an electronic device
US7898529B2 (en) 2003-01-08 2011-03-01 Autodesk, Inc. User interface having a placement and layout suitable for pen-based computers
US7117453B2 (en) 2003-01-21 2006-10-03 Microsoft Corporation Media frame object visualization system
US7509321B2 (en) 2003-01-21 2009-03-24 Microsoft Corporation Selection bins for browsing, annotating, sorting, clustering, and filtering media objects
US7383497B2 (en) 2003-01-21 2008-06-03 Microsoft Corporation Random access editing of media
JP2004227393A (en) 2003-01-24 2004-08-12 Sony Corp Icon drawing system, icon drawing method and electronic device
US20040155909A1 (en) 2003-02-07 2004-08-12 Sun Microsystems, Inc. Scroll tray mechanism for cellular telephone
US7403211B2 (en) 2003-02-13 2008-07-22 Lumapix, Inc. Method and system for interactive region segmentation
JP4074530B2 (en) 2003-02-28 2008-04-09 京セラ株式会社 Portable information terminal device
US7231229B1 (en) 2003-03-16 2007-06-12 Palm, Inc. Communication device interface
US7054965B2 (en) 2003-03-18 2006-05-30 Oqo Incorporated Component for use as a portable computing device and pointing device
US7769794B2 (en) 2003-03-24 2010-08-03 Microsoft Corporation User interface for a file system shell
US7650575B2 (en) 2003-03-27 2010-01-19 Microsoft Corporation Rich drag drop user interface
US7587411B2 (en) 2003-03-27 2009-09-08 Microsoft Corporation System and method for filtering and organizing items based on common elements
JP4215549B2 (en) 2003-04-02 2009-01-28 富士通株式会社 Information processing device that operates in touch panel mode and pointing device mode
US7480872B1 (en) 2003-04-06 2009-01-20 Apple Inc. Method and apparatus for dynamically resizing windows
US20040215719A1 (en) 2003-04-09 2004-10-28 Altshuler Dennis Wayne Method and system for designing, editing and publishing web page content in a live internet session
US20040201595A1 (en) 2003-04-11 2004-10-14 Microsoft Corporation Self-orienting display
JP4046000B2 (en) 2003-04-16 2008-02-13 日本電信電話株式会社 Structured document extraction method, apparatus and program
US9223426B2 (en) 2010-10-01 2015-12-29 Z124 Repositioning windows in the pop-up window
US7702811B2 (en) 2003-04-30 2010-04-20 International Business Machines Corporation Method and apparatus for marking of web page portions for revisiting the marked portions
US7233316B2 (en) 2003-05-01 2007-06-19 Thomson Licensing Multimedia user interface
US20050044509A1 (en) 2003-05-07 2005-02-24 Hunleth Frank A. Item selection using helical menus
EP1623301A2 (en) 2003-05-15 2006-02-08 Comcast Cable Holdings LLC Method and system for playing video
JP2004341886A (en) 2003-05-16 2004-12-02 Casio Comput Co Ltd File management device and file management method
JP2004341892A (en) 2003-05-16 2004-12-02 Fuji Xerox Co Ltd Instruction input device, instruction input method, and program
JP2004343662A (en) 2003-05-19 2004-12-02 Sony Corp Imaging apparatus
US9607092B2 (en) 2003-05-20 2017-03-28 Excalibur Ip, Llc Mapping method and system
US7660817B2 (en) 2003-05-22 2010-02-09 Microsoft Corporation System and method for representing content in a file system
US20040243307A1 (en) 2003-06-02 2004-12-02 Pieter Geelen Personal GPS navigation device
JP2004363892A (en) 2003-06-04 2004-12-24 Canon Inc Portable apparatus
JP2005004419A (en) 2003-06-11 2005-01-06 Fuji Photo Film Co Ltd File browsing device and method, and program
JP2005004396A (en) 2003-06-11 2005-01-06 Sony Corp Information display method, information display unit, and computer program
US7051282B2 (en) 2003-06-13 2006-05-23 Microsoft Corporation Multi-layer graphical user interface
JP2006527439A (en) 2003-06-13 2006-11-30 ユニヴァーシティ オブ ランカスター User interface
JP2005018229A (en) 2003-06-24 2005-01-20 Seiko Epson Corp Document browsing terminal, document display control method, and document display control program
US20040268400A1 (en) 2003-06-26 2004-12-30 Microsoft Corporation Quick starting video content
JP2007526548A (en) 2003-06-27 2007-09-13 ソフトスコープ エルエルシー Virtual desktop-meta organization and control system
KR100512616B1 (en) 2003-07-18 2005-09-05 엘지전자 주식회사 (An) image display device for having (a) variable screen ratio and method of controlling the same
US7164410B2 (en) 2003-07-28 2007-01-16 Sig G. Kupka Manipulating an on-screen object using zones surrounding the object
US20050026644A1 (en) * 2003-07-28 2005-02-03 Inventec Appliances Corp. Cellular phone for specific person
US7343564B2 (en) 2003-08-11 2008-03-11 Core Mobility, Inc. Systems and methods for displaying location-based maps on communication devices
US20050039134A1 (en) 2003-08-11 2005-02-17 Sony Corporation System and method for effectively implementing a dynamic user interface in an electronic network
US8065618B2 (en) 2003-08-18 2011-11-22 Sap Ag Customization of an interaction center manager's graphical dashboard
US7325204B2 (en) 2003-08-29 2008-01-29 Yahoo! Inc. Slideout windows
KR20050022117A (en) 2003-08-29 2005-03-07 엘지전자 주식회사 Power saving apparatus and method of mobile communication terminal
US20050060653A1 (en) 2003-09-12 2005-03-17 Dainippon Screen Mfg. Co., Ltd. Object operation apparatus, object operation method and object operation program
US7480873B2 (en) 2003-09-15 2009-01-20 Sun Microsystems, Inc. Method and apparatus for manipulating two-dimensional windows within a three-dimensional display model
US7411575B2 (en) 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US7594194B2 (en) 2003-09-24 2009-09-22 Nokia Corporation Portrayal of navigation objects
US20050071736A1 (en) 2003-09-26 2005-03-31 Fuji Xerox Co., Ltd. Comprehensive and intuitive media collection and management tool
US20050071778A1 (en) 2003-09-26 2005-03-31 Nokia Corporation Method for dynamic key size prediction with touch displays and an electronic device using the method
US20050071738A1 (en) 2003-09-30 2005-03-31 Park David J. Scan document identification-send scanning using a template so that users can handwrite the destination and identification information
US7290006B2 (en) 2003-09-30 2007-10-30 Microsoft Corporation Document representation for scalable structure
US7620894B1 (en) 2003-10-08 2009-11-17 Apple Inc. Automatic, dynamic user interface configuration
US7719542B1 (en) 2003-10-10 2010-05-18 Adobe Systems Incorporated System, method and user interface controls for communicating status information
JP2005115896A (en) 2003-10-10 2005-04-28 Nec Corp Communication apparatus and method
US7231231B2 (en) 2003-10-14 2007-06-12 Nokia Corporation Method and apparatus for locking a mobile telephone touch screen
FR2861206B1 (en) 2003-10-16 2006-11-24 Michel Rissons Method and device for automatically adapting display
US6990637B2 (en) 2003-10-23 2006-01-24 Microsoft Corporation Graphical user interface for 3-dimensional view of a data collection based on an attribute of the data
US8527896B2 (en) 2003-10-23 2013-09-03 Microsoft Corporation User interface menu with hovering icons
KR100537280B1 (en) 2003-10-29 2005-12-16 삼성전자주식회사 Apparatus and method for inputting character using touch screen in portable terminal
US20050097089A1 (en) 2003-11-05 2005-05-05 Tom Nielsen Persistent user interface for providing navigational functionality
US6970749B1 (en) 2003-11-12 2005-11-29 Adobe Systems Incorporated Grouped palette stashing
JP4408039B2 (en) 2003-11-13 2010-02-03 キヤノン株式会社 Information processing apparatus, printing system, information processing method, and printing method
KR100617827B1 (en) 2003-11-14 2006-08-28 삼성전자주식회사 Apparatus and method for displaying menu of hierarchy structures in mobile terminal equipment
US7814419B2 (en) 2003-11-26 2010-10-12 Nokia Corporation Changing an orientation of a user interface via a course of motion
EP1690176A1 (en) 2003-12-01 2006-08-16 Research In Motion Limited Previewing a new event on a small screen device
US7787971B2 (en) 2003-12-02 2010-08-31 Thermo Fisher Scientific (Asheville) Llc Rotor selection interface and method
US8434027B2 (en) 2003-12-15 2013-04-30 Quantum Matrix Holdings, Llc System and method for multi-dimensional organization, management, and manipulation of remote data
WO2005059699A2 (en) 2003-12-15 2005-06-30 Quantum Matrix Holdings, Llc System and method for multi-dimensional organization, management, and manipulation of data
KR20050060379A (en) 2003-12-16 2005-06-22 (주)모비언스 Button-type device for three dimensional rotation and translation control
US7667703B2 (en) 2003-12-19 2010-02-23 Palo Alto Research Center Incorporated Systems and method for turning pages in a three-dimensional electronic document
US7707503B2 (en) 2003-12-22 2010-04-27 Palo Alto Research Center Incorporated Methods and systems for supporting presentation tools using zoomable user interface
US7085590B2 (en) 2003-12-31 2006-08-01 Sony Ericsson Mobile Communications Ab Mobile terminal with ergonomic imaging functions
US7355593B2 (en) 2004-01-02 2008-04-08 Smart Technologies, Inc. Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region
WO2005067604A2 (en) 2004-01-05 2005-07-28 Oqo Incorporated Docking station for mobile computing device
JP4239090B2 (en) 2004-01-08 2009-03-18 富士フイルム株式会社 File management program
US7401300B2 (en) 2004-01-09 2008-07-15 Nokia Corporation Adaptive user interface input device
KR100588042B1 (en) 2004-01-14 2006-06-09 한국과학기술연구원 Interactive presentation system
JP2005202703A (en) 2004-01-15 2005-07-28 Olympus Corp File management device
US8171084B2 (en) 2004-01-20 2012-05-01 Microsoft Corporation Custom emoticons
US8156175B2 (en) 2004-01-23 2012-04-10 Tiversa Inc. System and method for searching for specific types of people or information on a peer-to-peer network
JP2007525115A (en) 2004-01-29 2007-08-30 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ On-screen control of video playback equipment
US20050172239A1 (en) 2004-01-30 2005-08-04 International Business Machines Corporation Modeless interaction with GUI widget applications
JP2005227826A (en) 2004-02-10 2005-08-25 Seiko Epson Corp Device and method for image sequencing, and computer program
US7551187B2 (en) 2004-02-10 2009-06-23 Microsoft Corporation Systems and methods that utilize a dynamic digital zooming interface in connection with digital inking
JP2005227951A (en) 2004-02-12 2005-08-25 Sony Corp Device, method, and program for information processing
US7340678B2 (en) 2004-02-12 2008-03-04 Fuji Xerox Co., Ltd. Systems and methods for creating an interactive 3D visualization of indexed media
JP2005228091A (en) 2004-02-13 2005-08-25 Fuji Xerox Co Ltd Folder management device, method, and program
JP2005228088A (en) 2004-02-13 2005-08-25 Sony Corp Information processor and method, and program
US7437005B2 (en) * 2004-02-17 2008-10-14 Microsoft Corporation Rapid visual sorting of digital files and data
JP2005234291A (en) 2004-02-20 2005-09-02 Nissan Motor Co Ltd Display apparatus and display method
JP4438448B2 (en) 2004-02-26 2010-03-24 セイコーエプソン株式会社 Structured document display processing device, structured document display method, structured document display program
US7788583B1 (en) 2004-03-04 2010-08-31 Google Inc. In-page full screen internet video method
JP2005267049A (en) 2004-03-17 2005-09-29 Sharp Corp Portable information apparatus, and its control method and program
US20050210369A1 (en) 2004-03-18 2005-09-22 Damm John A Jr Methods of updating spreadsheets
JP4325449B2 (en) 2004-03-19 2009-09-02 ソニー株式会社 Display control device, display control method, and recording medium
US7328411B2 (en) 2004-03-19 2008-02-05 Lexmark International, Inc. Scrollbar enhancement for browsing data
US20050216913A1 (en) * 2004-03-23 2005-09-29 Gemmell David J Annotating / rating / organizing / relating content rendered on computer device during idle mode thereof
US7173604B2 (en) 2004-03-23 2007-02-06 Fujitsu Limited Gesture identification of controlled devices
US7546554B2 (en) 2004-03-31 2009-06-09 Fuji Xerox Co., Ltd. Systems and methods for browsing multimedia content on small mobile devices
US7948448B2 (en) 2004-04-01 2011-05-24 Polyvision Corporation Portable presentation system and methods for use therewith
CN100437464C (en) 2004-04-05 2008-11-26 松下电器产业株式会社 Display screen management unit
US7349677B2 (en) 2004-04-08 2008-03-25 Broadcom Corporation Hardware efficient RF transceiver I/Q imbalance compensation based upon taylor approximation
US20050229102A1 (en) 2004-04-12 2005-10-13 Microsoft Corporation System and method for providing an interactive display
CN1257247C (en) 2004-04-13 2006-05-24 杨毅男 Composite type sleet melting and snow removing liquid
JP4241484B2 (en) 2004-04-14 2009-03-18 日本電気株式会社 Portable terminal device, incoming response message transmission method, and server device
US20050231512A1 (en) 2004-04-16 2005-10-20 Niles Gregory E Animation of an object using behaviors
JP2005309933A (en) 2004-04-23 2005-11-04 Canon Inc Enhancement control device, image processing system, method for displaying application icon, program, and storage medium
US20070250768A1 (en) 2004-04-30 2007-10-25 Raiko Funakami Method, Terminal Device and Program for Dynamic Image Scaling Display in Browsing
EP2343699A1 (en) 2004-04-30 2011-07-13 Hillcrest Laboratories, Inc. Methods and devices for identifying users based on tremor
EP1741088B1 (en) 2004-04-30 2012-03-21 Hillcrest Laboratories, Inc. Free space pointing devices with tilt compensation and improved usability
US7565625B2 (en) * 2004-05-06 2009-07-21 Pixar Toolbar slot method and apparatus
JP2005321915A (en) 2004-05-07 2005-11-17 Sony Corp Information processor, information processing method and program
US20050250438A1 (en) 2004-05-07 2005-11-10 Mikko Makipaa Method for enhancing communication, a terminal and a telecommunication system
JP4063246B2 (en) 2004-05-11 2008-03-19 日本電気株式会社 Page information display device
JP5055684B2 (en) 2004-05-13 2012-10-24 ソニー株式会社 Image folder switching device
JP4327016B2 (en) 2004-05-14 2009-09-09 アルパイン株式会社 Input device
US20050278757A1 (en) 2004-05-28 2005-12-15 Microsoft Corporation Downloadable watch faces
JP4148187B2 (en) 2004-06-03 2008-09-10 ソニー株式会社 Portable electronic device, input operation control method and program thereof
JP5132028B2 (en) 2004-06-11 2013-01-30 三菱電機株式会社 User interface device
KR100490373B1 (en) 2004-06-12 2005-05-18 (주)모비솔 Method and apparatus for operating a user interface of a mobile terminal having a pointing device
JP2005352943A (en) 2004-06-14 2005-12-22 Matsushita Electric Ind Co Ltd Information terminal and display control program
US7358962B2 (en) 2004-06-15 2008-04-15 Microsoft Corporation Manipulating association of data with a physical object
US8230358B1 (en) 2004-06-22 2012-07-24 Apple Inc. Defining motion in a computer system with a graphical user interface
US7873916B1 (en) 2004-06-22 2011-01-18 Apple Inc. Color labeling in a graphical user interface
US20050285880A1 (en) 2004-06-23 2005-12-29 Inventec Appliances Corporation Method of magnifying a portion of display
US7490295B2 (en) 2004-06-25 2009-02-10 Apple Inc. Layer for accessing user interface elements
US7546543B2 (en) 2004-06-25 2009-06-09 Apple Inc. Widget authoring and editing environment
US7761800B2 (en) 2004-06-25 2010-07-20 Apple Inc. Unified interest layer for user interface
US7730012B2 (en) 2004-06-25 2010-06-01 Apple Inc. Methods and systems for managing data
US8281241B2 (en) 2004-06-28 2012-10-02 Nokia Corporation Electronic device and method for providing extended user interface
FI20045245A0 (en) 2004-06-28 2004-06-28 Nokia Corp Boost browsing on your electronic device
EP1763732A2 (en) 2004-06-29 2007-03-21 Koninklijke Philips Electronics N.V. Discontinuous zoom
US7464110B2 (en) 2004-06-30 2008-12-09 Nokia Corporation Automated grouping of image and other user data
JP2006018645A (en) 2004-07-02 2006-01-19 Sharp Corp Display apparatus
JP4210936B2 (en) 2004-07-08 2009-01-21 ソニー株式会社 Information processing apparatus and program used therefor
US7669135B2 (en) 2004-07-15 2010-02-23 At&T Mobility Ii Llc Using emoticons, such as for wireless devices
KR100608589B1 (en) 2004-07-24 2006-08-03 삼성전자주식회사 Three dimensional motion graphic user interface and method and apparutus for providing this user interface
US20060020903A1 (en) 2004-07-26 2006-01-26 Shih-Yang Wang Window split system and method
US20060025110A1 (en) 2004-07-28 2006-02-02 Jun Liu Password protection for mobile phones
US20060035628A1 (en) 2004-07-30 2006-02-16 Microsoft Corporation Weather channel
US7653883B2 (en) 2004-07-30 2010-01-26 Apple Inc. Proximity detector in handheld device
JP2008508601A (en) 2004-07-30 2008-03-21 アップル インコーポレイテッド Gestures for touch-sensitive input devices
WO2006020304A2 (en) 2004-07-30 2006-02-23 Apple Computer, Inc. Mode-based graphical user interfaces for touch sensitive input devices
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
US7178111B2 (en) 2004-08-03 2007-02-13 Microsoft Corporation Multi-planar three-dimensional user interface
US7728821B2 (en) 2004-08-06 2010-06-01 Touchtable, Inc. Touch detecting interactive display
US7719523B2 (en) 2004-08-06 2010-05-18 Touchtable, Inc. Bounding box gesture recognition on a touch detecting interactive display
US7724242B2 (en) 2004-08-06 2010-05-25 Touchtable, Inc. Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
KR100755684B1 (en) 2004-08-07 2007-09-05 삼성전자주식회사 Three dimensional motion graphic user interface and method and apparutus for providing this user interface
JP2006053678A (en) 2004-08-10 2006-02-23 Toshiba Corp Electronic equipment with universal human interface
US8560972B2 (en) 2004-08-10 2013-10-15 Microsoft Corporation Surface UI for gesture-based interaction
US7721197B2 (en) 2004-08-12 2010-05-18 Microsoft Corporation System and method of displaying content on small screen computing devices
US7071939B2 (en) 2004-08-12 2006-07-04 Broadcom Corporation Unique method for performing zoom-in and zoom-out operations with horizontal and vertical video decimation within a wireless device having a video display
JP4701027B2 (en) 2004-09-02 2011-06-15 キヤノン株式会社 Information processing apparatus, control method, and program
KR100677129B1 (en) 2004-09-03 2007-02-02 삼성전자주식회사 Storage medium recording interactive graphic stream and reproducing apparatus and method thereof
CA2481065C (en) 2004-09-07 2014-01-21 Research In Motion Limited System and method for inserting a graphic object in to a text based message
US7761814B2 (en) 2004-09-13 2010-07-20 Microsoft Corporation Flick gesture
WO2006030862A1 (en) 2004-09-17 2006-03-23 Nikon Corporation Electronic apparatus
US20060064647A1 (en) 2004-09-23 2006-03-23 Tapuska David F Web browser graphical user interface and method for implementing same
BRPI0419168B1 (en) 2004-09-24 2017-05-16 Nokia Corp electronic device comprising detecting a user's input during an idle operating mode
NO20044073D0 (en) 2004-09-27 2004-09-27 Isak Engquist Information Processing System and Procedures
US7995078B2 (en) 2004-09-29 2011-08-09 Noregin Assets, N.V., L.L.C. Compound lenses for multi-source data presentation
WO2006039516A2 (en) 2004-09-30 2006-04-13 Millennium It (Usa) Inc. System and method for configurable trading system
JP2006134288A (en) 2004-10-06 2006-05-25 Sharp Corp Interface and interface program executed by computer
US7561157B2 (en) 2004-10-06 2009-07-14 Apple Inc. Compare mode for variable number of images
US7778671B2 (en) 2004-10-08 2010-08-17 Nokia Corporation Mobile communications terminal having an improved user interface and method therefor
EP1821182B1 (en) 2004-10-12 2013-03-27 Nippon Telegraph And Telephone Corporation 3d pointing method, 3d display control method, 3d pointing device, 3d display control device, 3d pointing program, and 3d display control program
US20060080616A1 (en) * 2004-10-13 2006-04-13 Xerox Corporation Systems, methods and user interfaces for document workflow construction
US7345688B2 (en) 2004-10-18 2008-03-18 Microsoft Corporation Semantic thumbnails
US8169410B2 (en) 2004-10-20 2012-05-01 Nintendo Co., Ltd. Gesture inputs for a portable display device
US20060090022A1 (en) 2004-10-22 2006-04-27 Intergraph Hardware Technologies Company Input device for controlling movement in a three-dimensional virtual environment
US20060092770A1 (en) 2004-10-30 2006-05-04 Demas Theodore J Information displays and methods associated therewith
US7683883B2 (en) 2004-11-02 2010-03-23 Pierre Touma 3D mouse and game controller based on spherical coordinates system and system for use
EP1659766B1 (en) 2004-11-09 2007-02-28 Research In Motion Limited Dynamic bar oriented user interface
KR100697072B1 (en) 2004-11-10 2007-03-20 엘지전자 주식회사 Mobile Communication Terminal enable to arrange the indicator icons in display panel
US7603105B2 (en) 2004-11-12 2009-10-13 Research In Motion Limited Method of making phone calls from a locked out handheld electronic device and a handheld electronic device incorporating the same
US7657842B2 (en) 2004-11-12 2010-02-02 Microsoft Corporation Sidebar tile free-arrangement
US8001476B2 (en) 2004-11-16 2011-08-16 Open Text Inc. Cellular user interface
US7925996B2 (en) 2004-11-18 2011-04-12 Microsoft Corporation Method and system for providing multiple input connecting user interface
US7530030B2 (en) 2004-11-24 2009-05-05 Microsoft Corporation Facilitating target acquisition by expanding targets
JP2006155232A (en) 2004-11-29 2006-06-15 Fuji Xerox Co Ltd Operation display device
US20060123360A1 (en) 2004-12-03 2006-06-08 Picsel Research Limited User interfaces for data processing devices and systems
US20060123359A1 (en) 2004-12-03 2006-06-08 Schatzberger Richard J Portable electronic device having user interactive visual interface
US7665031B2 (en) 2004-12-08 2010-02-16 Microsoft Corporation Method and system of taskbar button interfaces
AU2004240229B2 (en) 2004-12-20 2011-04-07 Canon Kabushiki Kaisha A radial, three-dimensional, hierarchical file system view
US7683889B2 (en) 2004-12-21 2010-03-23 Microsoft Corporation Pressure based selection
US7489306B2 (en) 2004-12-22 2009-02-10 Microsoft Corporation Touch screen accuracy
EP1677182B1 (en) 2004-12-28 2014-04-23 Sony Mobile Communications Japan, Inc. Display method, portable terminal device, and display program
TW200622661A (en) 2004-12-30 2006-07-01 Tatung Co Ltd Method of zooming out a display image for a portable electrical device
KR101085447B1 (en) 2004-12-31 2011-11-21 삼성전자주식회사 Touch position detecting device and method of touch position detecting the same and touch screen display device having the same
TWI254558B (en) 2005-01-18 2006-05-01 Asustek Comp Inc Mobile communication device with a transition effect function
US8190466B2 (en) 2005-01-21 2012-05-29 Hntb Holdings Ltd Methods and systems for identifying safe havens for hazardous transports
US8302011B2 (en) 2005-01-24 2012-10-30 A9.Com, Inc. Technique for modifying presentation of information displayed to end users of a computer system
KR101034439B1 (en) 2005-01-25 2011-05-12 엘지전자 주식회사 Multimedia device control system based on pattern recognition in touch screen
US20060164418A1 (en) 2005-01-25 2006-07-27 Hao Ming C Method and system for automated visualization using common scale
US7404151B2 (en) 2005-01-26 2008-07-22 Attenex Corporation System and method for providing a dynamic user interface for a dense three-dimensional scene
GB0502891D0 (en) 2005-02-12 2005-03-16 Next Device Ltd User interfaces
US7770125B1 (en) 2005-02-16 2010-08-03 Adobe Systems Inc. Methods and apparatus for automatically grouping graphical constructs
US7952564B2 (en) 2005-02-17 2011-05-31 Hurst G Samuel Multiple-touch sensor
US8819569B2 (en) 2005-02-18 2014-08-26 Zumobi, Inc Single-handed approach for navigation of application tiles using panning and zooming
KR100617821B1 (en) 2005-02-24 2006-08-28 삼성전자주식회사 User interfacing apparatus and method
DE602005025700D1 (en) 2005-03-03 2011-02-10 Nokia Corp User interface component
JP4254732B2 (en) 2005-03-17 2009-04-15 ヤマハ株式会社 Electronic music apparatus and program
US20060209035A1 (en) 2005-03-17 2006-09-21 Jenkins Phillip D Device independent specification of navigation shortcuts in an application
US8046714B2 (en) 2005-03-17 2011-10-25 Clarion Co., Ltd. Method, program and device for displaying menu
US7984381B2 (en) 2005-03-18 2011-07-19 Nokia Corporation User interface
US7710423B2 (en) 2005-03-21 2010-05-04 Microsoft Corporation Automatic layout of items along an embedded one-manifold path
US8147248B2 (en) 2005-03-21 2012-04-03 Microsoft Corporation Gesture training
US7340686B2 (en) * 2005-03-22 2008-03-04 Microsoft Corporation Operating system program launch menu search
US8205172B2 (en) 2005-03-31 2012-06-19 Microsoft Corporation Graphical web browser history toolbar
US7506268B2 (en) 2005-04-07 2009-03-17 Microsoft Corporation User interface with visual tracking feature
US7512898B2 (en) 2005-04-07 2009-03-31 Microsoft Corporation User interface with multi-state menu
JP4533791B2 (en) 2005-04-19 2010-09-01 株式会社日立製作所 Information browsing device
US7856602B2 (en) 2005-04-20 2010-12-21 Apple Inc. Updatable menu items
US7614016B2 (en) * 2005-04-21 2009-11-03 Microsoft Corporation Multiple roots in navigation pane
US20060252442A1 (en) 2005-05-04 2006-11-09 Nokia Corporation Method for establishing a PoC connection in a terminal device with a touch-screen display, an application used in the method and a terminal device
KR101307716B1 (en) 2005-05-04 2013-09-11 힐크레스트 래보래토리스, 인크. Methods and systems for scrolling and pointing in user interfaces
US9176934B2 (en) 2005-05-06 2015-11-03 Leo Baschy User interface for nonuniform access control system and methods
US20060250578A1 (en) 2005-05-06 2006-11-09 Pohl Garrick G Systems and methods for controlling, monitoring, and using remote applications
KR100606803B1 (en) 2005-05-16 2006-08-01 엘지전자 주식회사 Mobile communication terminal with performing function using scroll wheel device and method of performing function using this
US7587671B2 (en) 2005-05-17 2009-09-08 Palm, Inc. Image repositioning, storage and retrieval
WO2006125133A2 (en) 2005-05-19 2006-11-23 Hillcrest Laboratories, Inc. Global navigation objects in user interfaces
US20070024646A1 (en) 2005-05-23 2007-02-01 Kalle Saarinen Portable electronic apparatus and associated method
US20060267966A1 (en) 2005-05-24 2006-11-30 Microsoft Corporation Hover widgets: using the tracking state to extend capabilities of pen-operated devices
US7530029B2 (en) * 2005-05-24 2009-05-05 Microsoft Corporation Narrow mode navigation pane
US8020110B2 (en) 2005-05-26 2011-09-13 Weisermazars Llp Methods for defining queries, generating query results and displaying same
US20060271867A1 (en) 2005-05-27 2006-11-30 Wang Kong Q Mobile communications terminal and method therefore
US7797641B2 (en) 2005-05-27 2010-09-14 Nokia Corporation Mobile communications terminal and method therefore
JP2006338205A (en) 2005-05-31 2006-12-14 Casio Comput Co Ltd Electronic equipment and control program thereof
US8365084B1 (en) 2005-05-31 2013-01-29 Adobe Systems Incorporated Method and apparatus for arranging the display of sets of information while preserving context
US20060277588A1 (en) 2005-06-01 2006-12-07 Madison Software Inc. Method for making a Web-DVD
US20060277486A1 (en) 2005-06-02 2006-12-07 Skinner David N File or user interface element marking system
US20060277481A1 (en) 2005-06-03 2006-12-07 Scott Forstall Presenting clips of content
US20060277460A1 (en) 2005-06-03 2006-12-07 Scott Forstall Webview applications
US9098597B2 (en) 2005-06-03 2015-08-04 Apple Inc. Presenting and managing clipped content
US7195170B2 (en) * 2005-06-09 2007-03-27 Fuji Xerox Co., Ltd. Post-bit: multimedia ePaper stickies
US7685530B2 (en) 2005-06-10 2010-03-23 T-Mobile Usa, Inc. Preferred contact group centric interface
WO2006131780A1 (en) 2005-06-10 2006-12-14 Nokia Corporation Re-configuring the standby screen of an electronic device
US7432928B2 (en) 2005-06-14 2008-10-07 Microsoft Corporation User interface state reconfiguration through animation
US7676767B2 (en) 2005-06-15 2010-03-09 Microsoft Corporation Peel back user interface to show hidden functions
US7487467B1 (en) 2005-06-23 2009-02-03 Sun Microsystems, Inc. Visual representation and other effects for application management on a device with a small screen
US20070004451A1 (en) 2005-06-30 2007-01-04 C Anderson Eric Controlling functions of a handheld multifunction device
KR100800995B1 (en) 2005-07-11 2008-02-05 삼성전자주식회사 Apparatus and method for displaying icon
US20070016958A1 (en) 2005-07-12 2007-01-18 International Business Machines Corporation Allowing any computer users access to use only a selection of the available applications
JP4590320B2 (en) 2005-07-14 2010-12-01 キヤノン株式会社 Information management apparatus, control method therefor, and computer program
AU2005203074A1 (en) 2005-07-14 2007-02-01 Canon Information Systems Research Australia Pty Ltd Image browser
JP2007052403A (en) 2005-07-19 2007-03-01 Canon Inc Display apparatus, method, and program, and storage medium
US20070022388A1 (en) 2005-07-20 2007-01-25 Cisco Technology, Inc. Presence display icon and method
US7559033B2 (en) 2005-07-21 2009-07-07 International Business Machines Corporation Method and system for improving selection capability for user interface
JP4815927B2 (en) 2005-07-27 2011-11-16 ソニー株式会社 Display device, menu display method, menu display method program, and recording medium containing menu display method program
US20100169357A1 (en) 2005-08-01 2010-07-01 Michael Ingrassia Method, Apparatus, and Computer Program Product for Automatically Obtaining Custom Interface Elements When Changing UI Themes by Querying a Remote Repository
US7360166B1 (en) 2005-08-17 2008-04-15 Clipmarks Llc System, method and apparatus for selecting, displaying, managing, tracking and transferring access to content of web pages and other sources
US7644391B2 (en) 2005-08-18 2010-01-05 Microsoft Corporation Sidebar engine, object model and schema
JP2007058785A (en) 2005-08-26 2007-03-08 Canon Inc Information processor, and operating method for drag object in the same
US20070055947A1 (en) 2005-09-02 2007-03-08 Microsoft Corporation Animations and transitions
JP2009508205A (en) 2005-09-08 2009-02-26 パワー2ビー,インコーポレイティド Display and information input device
US20070061745A1 (en) 2005-09-09 2007-03-15 Microsoft Corporation Nested views in an electronic file system
EP1768053A1 (en) 2005-09-12 2007-03-28 Honda Research Institute Europe GmbH Evolutionary search for robust solutions
US8026920B2 (en) 2005-09-13 2011-09-27 Microsoft Corporation Extensible visual effects on active content in user interfaces
CN101300621B (en) 2005-09-13 2010-11-10 时空3D公司 System and method for providing three-dimensional graphical user interface
US20080259057A1 (en) 2005-09-14 2008-10-23 Johannes Brons Electronic Reading Device Mimicking a Reading experience of a Paper Document
WO2007031816A1 (en) 2005-09-14 2007-03-22 Nokia Corporation A device, method, computer program and user interface for enabling a user to vary which items are displayed to the user
US7694231B2 (en) 2006-01-05 2010-04-06 Apple Inc. Keyboards for portable electronic devices
US20070152980A1 (en) 2006-01-05 2007-07-05 Kenneth Kocienda Touch Screen Keyboards for Portable Electronic Devices
US7873356B2 (en) 2005-09-16 2011-01-18 Microsoft Corporation Search interface for mobile devices
US20070067738A1 (en) 2005-09-16 2007-03-22 Microsoft Corporation Extensible, filtered lists for mobile device user interface
CN1940833A (en) 2005-09-26 2007-04-04 鸿富锦精密工业(深圳)有限公司 Multilevel menu display device and method
JP4982065B2 (en) 2005-09-26 2012-07-25 株式会社東芝 Video content display system, video content display method and program thereof
US7633076B2 (en) 2005-09-30 2009-12-15 Apple Inc. Automated response to and sensing of user activity in portable devices
US7714265B2 (en) 2005-09-30 2010-05-11 Apple Inc. Integrated proximity sensor and light sensor
US7966577B2 (en) 2005-10-11 2011-06-21 Apple Inc. Multimedia control center
WO2007042857A1 (en) 2005-10-07 2007-04-19 Nokia Corporation A graphical user interface, a method, a device and a computer program for providing a menu and/or inputting an access code
US8769408B2 (en) 2005-10-07 2014-07-01 Apple Inc. Intelligent media navigation
US8037421B2 (en) 2005-10-11 2011-10-11 Research In Motion Limited System and method for organizing application indicators on an electronic device
KR100679039B1 (en) 2005-10-21 2007-02-05 삼성전자주식회사 Three dimensional graphic user interface, method and apparatus for providing the user interface
US7954064B2 (en) 2005-10-27 2011-05-31 Apple Inc. Multiple dashboards
KR100837162B1 (en) 2005-10-28 2008-06-11 엘지전자 주식회사 Communication Terminal with Multi-input Device
US8634425B2 (en) 2005-11-04 2014-01-21 At&T Intellectual Property I, L.P. Profile sharing across persona
JP2007132676A (en) 2005-11-08 2007-05-31 Xanavi Informatics Corp Navigation device
US8943035B2 (en) 2005-11-14 2015-01-27 Patrick J. Ferrel Distributing web applications across a pre-existing web
US7725839B2 (en) 2005-11-15 2010-05-25 Microsoft Corporation Three-dimensional active file explorer
US20070113207A1 (en) 2005-11-16 2007-05-17 Hillcrest Laboratories, Inc. Methods and systems for gesture classification in 3D pointing devices
US7730425B2 (en) 2005-11-30 2010-06-01 De Los Reyes Isabelo Function-oriented user interface
US7788607B2 (en) 2005-12-01 2010-08-31 Navisense Method and system for mapping virtual coordinates
US7663620B2 (en) 2005-12-05 2010-02-16 Microsoft Corporation Accessing 2D graphic content using axonometric layer views
US7958456B2 (en) 2005-12-23 2011-06-07 Apple Inc. Scrolling list with floating adjacent index symbols
US20070132789A1 (en) 2005-12-08 2007-06-14 Bas Ording List scrolling in response to moving contact over list of index symbols
US8607147B2 (en) 2005-12-09 2013-12-10 International Business Machines Corporation System and methods for previewing alternative compositions and arrangements when composing a strictly-structured flow diagram
KR100801089B1 (en) 2005-12-13 2008-02-05 삼성전자주식회사 Mobile device and operation method control available for using touch and drag
EP1801711A1 (en) * 2005-12-21 2007-06-27 Transmedia Communications Sàrl Method for remotely organizing audio-visual items stored in a central database
US7480870B2 (en) 2005-12-23 2009-01-20 Apple Inc. Indication of progress towards satisfaction of a user input condition
US7650137B2 (en) 2005-12-23 2010-01-19 Apple Inc. Account information display for portable communication device
US7657849B2 (en) 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
EP1804154A3 (en) 2005-12-27 2012-08-08 Poston Timothy Computer input device enabling three degrees of freedom and related input and feedback methods
US9557887B2 (en) 2005-12-27 2017-01-31 International Business Machines Corporation Integrated multidimensional view of hierarchical objects
US7552399B2 (en) 2005-12-27 2009-06-23 International Business Machines Corporation Extensible icons with multiple drop zones
US7503009B2 (en) 2005-12-29 2009-03-10 Sap Ag Multifunctional icon in icon-driven computer system
US7509588B2 (en) 2005-12-30 2009-03-24 Apple Inc. Portable electronic device with interface reconfiguration mode
US7812826B2 (en) 2005-12-30 2010-10-12 Apple Inc. Portable electronic device with multi-touch input
US20070157228A1 (en) 2005-12-30 2007-07-05 Jason Bayer Advertising with video ad creatives
US7596761B2 (en) 2006-01-05 2009-09-29 Apple Inc. Application user interface with navigation bar showing current and prior application contexts
US7860536B2 (en) 2006-01-05 2010-12-28 Apple Inc. Telephone interface for a portable communication device
EP1977312A2 (en) 2006-01-16 2008-10-08 Zlango Ltd. Iconic communication
US7587684B2 (en) 2006-01-23 2009-09-08 Nokia Corporation Mobile communication terminal and method therefore
US20070174384A1 (en) 2006-01-25 2007-07-26 John Abd-El-Malek Sidebar communication system and method
US7612786B2 (en) 2006-02-10 2009-11-03 Microsoft Corporation Variable orientation input mode
WO2007095504A1 (en) 2006-02-10 2007-08-23 Access Systems Americas, Inc. Improved user-interface and architecture portable processing device
US8139514B2 (en) 2006-02-24 2012-03-20 Yahoo! Inc. Method and system for communicating with multiple users via a map over the internet
US7450003B2 (en) 2006-02-24 2008-11-11 Yahoo! Inc. User-defined private maps
US7557804B1 (en) 2006-03-06 2009-07-07 Adobe Systems Inc. Methods and apparatus for three-dimensional isographic navigation
US7828772B2 (en) 2006-03-15 2010-11-09 Bioquiddity, Inc. Fluid dispensing device
KR100877829B1 (en) 2006-03-21 2009-01-12 엘지전자 주식회사 Terminal with scrolling function and scrolling method thereof
JP2007257336A (en) 2006-03-23 2007-10-04 Sony Corp Information processor, information processing method and program thereof
US20080229254A1 (en) 2006-03-24 2008-09-18 Ervin-Dawson Warner Method and system for enhanced cursor control
US7720893B2 (en) 2006-03-31 2010-05-18 Research In Motion Limited Methods and apparatus for providing map locations in user applications using URL strings
US20070239760A1 (en) 2006-04-09 2007-10-11 Daniel Simon System for providing an interactive intelligent internet based knowledgebase
US8548452B2 (en) 2006-04-13 2013-10-01 Blackberry Limited System and method for controlling device usage
US8968077B2 (en) 2006-04-13 2015-03-03 Idt Methods and systems for interfacing with a third-party application
US20070245250A1 (en) 2006-04-18 2007-10-18 Microsoft Corporation Desktop window manager using an advanced user interface construction framework
KR100771626B1 (en) 2006-04-25 2007-10-31 엘지전자 주식회사 Terminal device and method for inputting instructions thereto
US8279180B2 (en) 2006-05-02 2012-10-02 Apple Inc. Multipoint touch surface controller
JP2007300565A (en) 2006-05-03 2007-11-15 Sony Computer Entertainment Inc Multimedia reproduction device, and menu screen display method
US7783990B2 (en) * 2006-05-05 2010-08-24 Microsoft Corporation Association of display elements
TW200743350A (en) 2006-05-08 2007-11-16 Mediatek Inc System and method for controlling a portable electronic device
WO2007134164A2 (en) 2006-05-10 2007-11-22 Google Inc. Managing and accessing data in web notebooks
CN102081645B (en) 2006-05-10 2014-11-26 谷歌公司 WEB notebook tools
US7783085B2 (en) 2006-05-10 2010-08-24 Aol Inc. Using relevance feedback in face recognition
EP1920314A4 (en) 2006-05-16 2008-09-03 Research In Motion Ltd System and method of skinning the user interface of an application
US20070271532A1 (en) 2006-05-19 2007-11-22 Nguyen Loc V Method and apparatus for displaying layered user interface
SE0601216L (en) 2006-05-31 2007-12-01 Abb Technology Ltd Virtual workplace
US8571580B2 (en) 2006-06-01 2013-10-29 Loopt Llc. Displaying the location of individuals on an interactive map display on a mobile communication device
JP4759743B2 (en) 2006-06-06 2011-08-31 国立大学法人 東京大学 Object display processing device, object display processing method, and object display processing program
KR100827230B1 (en) 2006-06-09 2008-05-07 삼성전자주식회사 Portable device and method for providing menu icon
JP2007334984A (en) 2006-06-14 2007-12-27 Toshiba Corp Video library management method and apparatus
US7496595B2 (en) 2006-06-16 2009-02-24 International Business Machines Corporation Methodology for directory categorization for categorized files
US7552402B2 (en) 2006-06-22 2009-06-23 Microsoft Corporation Interface orientation using shadows
US8086971B2 (en) 2006-06-28 2011-12-27 Nokia Corporation Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications
US7880728B2 (en) 2006-06-29 2011-02-01 Microsoft Corporation Application switching via a touch screen interface
JP4612902B2 (en) 2006-07-04 2011-01-12 キヤノン株式会社 File display device, control method therefor, and program
US20080062126A1 (en) 2006-07-06 2008-03-13 Algreatly Cherif A 3D method and system for hand-held devices
US8972902B2 (en) 2008-08-22 2015-03-03 Northrop Grumman Systems Corporation Compound gesture recognition
US9696808B2 (en) 2006-07-13 2017-07-04 Northrop Grumman Systems Corporation Hand-gesture recognition method
US20080016471A1 (en) 2006-07-14 2008-01-17 Samsung Electronics Co., Ltd. Electronic device for providing 3D user interface and method of providing a 3D user interface
JP4912214B2 (en) 2006-07-25 2012-04-11 キヤノン株式会社 Information processing apparatus and object display method
US20080024454A1 (en) 2006-07-31 2008-01-31 Paul Everest Three-dimensional touch pad input device
US20080034309A1 (en) 2006-08-01 2008-02-07 Louch John O Multimedia center including widgets
US10503342B2 (en) 2006-08-04 2019-12-10 Apple Inc. User interface spaces
US7856424B2 (en) 2006-08-04 2010-12-21 Apple Inc. User interface for backup management
US7996789B2 (en) 2006-08-04 2011-08-09 Apple Inc. Methods and apparatuses to control application programs
US7908569B2 (en) 2006-08-10 2011-03-15 Nokia Corporation Creating virtual targets in directory structures
DE102006037510B3 (en) 2006-08-10 2008-04-10 Infineon Technologies Austria Ag A method for producing a trench structure, the use of this method for producing a semiconductor device and semiconductor device having a trench structure
KR100781706B1 (en) 2006-08-16 2007-12-03 삼성전자주식회사 Device and method for scrolling list in mobile terminal
US7665033B2 (en) 2006-08-31 2010-02-16 Sun Microsystems, Inc. Using a zooming effect to provide additional display space for managing applications
US8051388B2 (en) 2006-08-31 2011-11-01 Access Co., Ltd. Device having bookmark thumbnail management function
US7805684B2 (en) 2006-09-01 2010-09-28 Nokia Corporation Mobile communications terminal
US8316324B2 (en) 2006-09-05 2012-11-20 Navisense Method and apparatus for touchless control of a device
US7956849B2 (en) 2006-09-06 2011-06-07 Apple Inc. Video manager for portable multifunction device
US8253695B2 (en) 2006-09-06 2012-08-28 Apple Inc. Email client for a portable multifunction device
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US10313505B2 (en) 2006-09-06 2019-06-04 Apple Inc. Portable multifunction device, method, and graphical user interface for configuring and displaying widgets
US7940250B2 (en) 2006-09-06 2011-05-10 Apple Inc. Web-clip widgets on a portable multifunction device
US8395658B2 (en) 2006-09-07 2013-03-12 Sony Computer Entertainment Inc. Touch screen-like user interface that does not require actual touching
US7853972B2 (en) 2006-09-11 2010-12-14 Apple Inc. Media preview user interface
US8736557B2 (en) 2006-09-11 2014-05-27 Apple Inc. Electronic device with image based browsers
US20080062137A1 (en) 2006-09-11 2008-03-13 Apple Computer, Inc. Touch actuation controller for multi-state media presentation
US8564543B2 (en) 2006-09-11 2013-10-22 Apple Inc. Media player with imaged based browsing
JP4311432B2 (en) 2006-09-29 2009-08-12 ブラザー工業株式会社 Information processing apparatus and program
KR100783552B1 (en) 2006-10-11 2007-12-07 삼성전자주식회사 Input control method and device for mobile phone
SG161266A1 (en) 2006-10-13 2010-05-27 Quipa Holdings Ltd A method for sharing functionality and/or data between two or more linked entities
JP2008102860A (en) 2006-10-20 2008-05-01 Nec Corp Small electronic device and menu display program
US20080104515A1 (en) 2006-10-30 2008-05-01 Dan Dumitru System and method for slide presentation
KR100785066B1 (en) 2006-11-06 2007-12-12 삼성전자주식회사 Phone book group managing method for portable terminal
US7642934B2 (en) 2006-11-10 2010-01-05 Research In Motion Limited Method of mapping a traditional touchtone keypad on a handheld electronic device and associated apparatus
US20080120568A1 (en) 2006-11-20 2008-05-22 Motorola, Inc. Method and device for entering data using a three dimensional position of a pointer
US8705081B2 (en) 2006-11-22 2014-04-22 Fuji Xerox Co., Ltd. System and method for including input method information on a printed document
EP1959407A3 (en) 2006-11-27 2008-09-03 Aristocrat Technologies Australia PTY Ltd Gaming machine with touch screen
US7518959B2 (en) 2006-12-01 2009-04-14 Seiko Epson Corporation Display device and display method
US20080134088A1 (en) 2006-12-05 2008-06-05 Palm, Inc. Device for saving results of location based searches
US7692629B2 (en) 2006-12-07 2010-04-06 Microsoft Corporation Operating touch screen interfaces
KR100822295B1 (en) 2006-12-07 2008-04-16 삼성전자주식회사 Method and system for transferring message in mobile communication terminal
US8006002B2 (en) 2006-12-12 2011-08-23 Apple Inc. Methods and systems for automatic configuration of peripherals
US7836475B2 (en) 2006-12-20 2010-11-16 Verizon Patent And Licensing Inc. Video access
US7940604B2 (en) 2006-12-21 2011-05-10 Seiko Epson Corporation Dial indicator display device
US20080161045A1 (en) 2006-12-29 2008-07-03 Nokia Corporation Method, Apparatus and Computer Program Product for Providing a Link to Contacts on the Idle Screen
US8970501B2 (en) 2007-01-03 2015-03-03 Apple Inc. Proximity and multi-touch sensor detection and demodulation
US7855718B2 (en) 2007-01-03 2010-12-21 Apple Inc. Multi-touch input discrimination
US7692274B2 (en) 2007-01-04 2010-04-06 Taiwan Semiconductor Manufacturing Co., Ltd. Reinforced semiconductor structures
US7924271B2 (en) 2007-01-05 2011-04-12 Apple Inc. Detecting gestures on multi-event sensitive devices
US8214768B2 (en) 2007-01-05 2012-07-03 Apple Inc. Method, system, and graphical user interface for viewing multiple application windows
US9001047B2 (en) 2007-01-07 2015-04-07 Apple Inc. Modal change based on orientation of a portable multifunction device
US20080222545A1 (en) 2007-01-07 2008-09-11 Lemay Stephen O Portable Electronic Device with a Global Setting User Interface
US20080168368A1 (en) 2007-01-07 2008-07-10 Louch John O Dashboards, Widgets and Devices
US20080168367A1 (en) 2007-01-07 2008-07-10 Chaudhri Imran A Dashboards, Widgets and Devices
US8519964B2 (en) 2007-01-07 2013-08-27 Apple Inc. Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US20080168382A1 (en) 2007-01-07 2008-07-10 Louch John O Dashboards, Widgets and Devices
US8788954B2 (en) 2007-01-07 2014-07-22 Apple Inc. Web-clip widgets on a portable multifunction device
US7957762B2 (en) 2007-01-07 2011-06-07 Apple Inc. Using ambient light sensor to augment proximity sensor output
US20080168478A1 (en) 2007-01-07 2008-07-10 Andrew Platzer Application Programming Interfaces for Scrolling
US8060825B2 (en) 2007-01-07 2011-11-15 Apple Inc. Creating digital artwork based on content file metadata
JP4189004B2 (en) 2007-01-10 2008-12-03 シャープ株式会社 Portable information terminal and control program
US7742774B2 (en) 2007-01-11 2010-06-22 Virgin Mobile Usa, L.P. Location-based text messaging
US7986324B2 (en) 2007-01-12 2011-07-26 Fujitsu Limited Display device, display program storage medium and display method
KR20080068781A (en) 2007-01-20 2008-07-24 엘지전자 주식회사 Electronic device with touch screen and method of displaying information using same
US20080182628A1 (en) 2007-01-26 2008-07-31 Matthew Lee System and method for previewing themes
US8996045B2 (en) 2007-01-29 2015-03-31 Blackberry Limited Method of e-mailing a map location using predefined context-sensitive messages
EP1956472A1 (en) 2007-01-31 2008-08-13 Research In Motion Limited System and method for organizing icons for applications on a mobile device
WO2008095137A2 (en) 2007-01-31 2008-08-07 Perceptive Pixel, Inc. Methods of interfacing with multi-point input devices and multi-point input systems employing interfacing techniques
US8601370B2 (en) 2007-01-31 2013-12-03 Blackberry Limited System and method for organizing icons for applications on a mobile device
US20080189108A1 (en) 2007-02-05 2008-08-07 Comverse Ltd. Text messaging in a telephony network
US8972898B2 (en) 2007-02-06 2015-03-03 Novell Intellectual Properties, Inc. Techniques for representing and navigating information in three dimensions
KR101239797B1 (en) 2007-02-07 2013-03-06 엘지전자 주식회사 Electronic Device With Touch Screen And Method Of Providing Analog Clock Using Same
EP2127321A2 (en) 2007-02-09 2009-12-02 Novarra, Inc. Method and system for providing portions of information content to a client device
KR101450584B1 (en) 2007-02-22 2014-10-14 삼성전자주식회사 Method for displaying screen in terminal
CN101636720A (en) 2007-03-20 2010-01-27 株式会社爱可信 Terminal having application update managing function, and application update managing program and system
US7765266B2 (en) 2007-03-30 2010-07-27 Uranus International Limited Method, apparatus, system, medium, and signals for publishing content created during a communication
JP4547633B2 (en) 2007-03-30 2010-09-22 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
ES2606396T3 (en) 2007-03-30 2017-03-23 Microsoft Technology Licensing, Llc Method for controlling a mobile communication device equipped with a touch screen, communication device and method for executing its functions
JP4999523B2 (en) 2007-04-10 2012-08-15 三菱電機株式会社 Information processing apparatus, information processing method, and information processing program
KR20080096134A (en) 2007-04-27 2008-10-30 엘지전자 주식회사 Mobile communication terminal and webpage controlling method thereof
US20080268882A1 (en) 2007-04-30 2008-10-30 Palm, Inc. Short message service enhancement techniques for added communication options
US7979809B2 (en) 2007-05-11 2011-07-12 Microsoft Corporation Gestured movement of object to display edge
US20080294981A1 (en) 2007-05-21 2008-11-27 Advancis.Com, Inc. Page clipping tool for digital publications
US9317110B2 (en) 2007-05-29 2016-04-19 Cfph, Llc Game with hand motion control
US20080300572A1 (en) 2007-06-01 2008-12-04 Medtronic Minimed, Inc. Wireless monitor for a personal medical device system
JP4900058B2 (en) 2007-06-05 2012-03-21 ブラザー工業株式会社 Label data creation device, program, and recording medium
US8423914B2 (en) 2007-06-08 2013-04-16 Apple Inc. Selection user interface
US20080307362A1 (en) 2007-06-08 2008-12-11 Apple Inc. Desktop Filter
US7917846B2 (en) 2007-06-08 2011-03-29 Apple Inc. Web clip using anchoring
US9232042B2 (en) 2007-07-20 2016-01-05 Broadcom Corporation Method and system for utilizing and modifying user preference information to create context data tags in a wireless system
JP5070579B2 (en) 2007-06-11 2012-11-14 シャープ株式会社 Information communication terminal and processing program
US8218734B2 (en) 2007-06-12 2012-07-10 Microsoft Corporation Messaging with a locked communication device
US7971180B2 (en) 2007-06-13 2011-06-28 International Business Machines Corporation Method and system for evaluating multi-dimensional project plans for implementing packaged software applications
US8055606B2 (en) 2007-06-13 2011-11-08 International Business Machines Corporation Method and system for self-calibrating project estimation models for packaged software applications
US8681104B2 (en) 2007-06-13 2014-03-25 Apple Inc. Pinch-throw and translation gestures
US8059101B2 (en) 2007-06-22 2011-11-15 Apple Inc. Swipe gestures for touch screen keyboards
JP2009009350A (en) 2007-06-27 2009-01-15 Nec Corp Health management system, information processing unit, health management method, and control program
US20090002324A1 (en) 2007-06-27 2009-01-01 Nokia Corporation Method, Apparatus and Computer Program Product for Providing a Scrolling Mechanism for Touch Screen Devices
US8214793B1 (en) 2007-06-28 2012-07-03 Adobe Systems Incorporated Automatic restoration of tool configuration while navigating layers of a composition
US8127254B2 (en) 2007-06-29 2012-02-28 Nokia Corporation Unlocking a touch screen device
US9772751B2 (en) 2007-06-29 2017-09-26 Apple Inc. Using gestures to slide between user interfaces
US20090019385A1 (en) 2007-07-12 2009-01-15 Nvidia Corporation Management of Icons in a Display Interface
US7956869B1 (en) 2007-07-13 2011-06-07 Adobe Systems Incorporated Proximity based transparency of windows aiding in obscured window selection
US8589811B2 (en) 2007-07-20 2013-11-19 International Business Machines Corporation Techniques for organizing information accessed through a web browser
JP5007625B2 (en) 2007-08-15 2012-08-22 ソニー株式会社 Display interface, display control apparatus, display method, and program
US8069404B2 (en) 2007-08-22 2011-11-29 Maya-Systems Inc. Method of managing expected documents and system providing same
JP5245065B2 (en) 2007-08-27 2013-07-24 名古屋市 Water repellent material, water repellent film forming method using the same, and water repellent coating composition
US20090063971A1 (en) 2007-08-31 2009-03-05 Yahoo! Inc. Media discovery interface
US8619038B2 (en) 2007-09-04 2013-12-31 Apple Inc. Editing interface
US11126321B2 (en) 2007-09-04 2021-09-21 Apple Inc. Application menu user interface
US9619143B2 (en) 2008-01-06 2017-04-11 Apple Inc. Device, method, and graphical user interface for viewing application launch icons
AU2012202140B2 (en) 2007-09-04 2014-06-12 Apple Inc. Editing interface
US20090066648A1 (en) 2007-09-07 2009-03-12 Apple Inc. Gui applications for use with 3d remote controller
US20090070708A1 (en) 2007-09-12 2009-03-12 Palm, Inc. Display of Information of Interest
US8122384B2 (en) 2007-09-18 2012-02-21 Palo Alto Research Center Incorporated Method and apparatus for selecting an object within a user interface by performing a gesture
US8130211B2 (en) 2007-09-24 2012-03-06 Microsoft Corporation One-touch rotation of virtual objects in virtual workspace
US20090122018A1 (en) 2007-11-12 2009-05-14 Leonid Vymenets User Interface for Touchscreen Device
JP2009123022A (en) 2007-11-15 2009-06-04 Canon Inc Document management apparatus, document management method, program and storage medium
JP5046881B2 (en) 2007-11-16 2012-10-10 キヤノン株式会社 Information processing apparatus, display control method, and program
US20090128581A1 (en) 2007-11-20 2009-05-21 Microsoft Corporation Custom transition framework for application state transitions
US20100333017A1 (en) 2007-11-27 2010-12-30 David J. Ortiz Computer graphic user interface and display system
JP5304172B2 (en) 2007-12-04 2013-10-02 株式会社リコー File management apparatus, file management method, and file management program
JP2009136456A (en) 2007-12-05 2009-06-25 Nec Corp Mobile terminal device
US9513765B2 (en) 2007-12-07 2016-12-06 Sony Corporation Three-dimensional sliding object arrangement method and system
US8965787B2 (en) 2007-12-17 2015-02-24 Smooth Productions Inc. Communications system and method for serving electronic content
US8446371B2 (en) 2007-12-19 2013-05-21 Research In Motion Limited Method and apparatus for launching activities
JP4605478B2 (en) 2007-12-19 2011-01-05 ソニー株式会社 Information processing apparatus, display control method, and display control program
JP4364273B2 (en) 2007-12-28 2009-11-11 パナソニック株式会社 Portable terminal device, display control method, and display control program
US8327277B2 (en) 2008-01-14 2012-12-04 Microsoft Corporation Techniques to automatically manage overlapping objects
KR101482103B1 (en) 2008-01-14 2015-01-13 엘지전자 주식회사 Mobile Terminal Capable of Expressing Weather Information
US8166421B2 (en) 2008-01-14 2012-04-24 Primesense Ltd. Three-dimensional user interface
US20090184936A1 (en) 2008-01-22 2009-07-23 Mathematical Inventing - Silicon Valley 3D touchpad
JP2009181321A (en) 2008-01-30 2009-08-13 Sony Corp Display device, display method, and program
US8356258B2 (en) 2008-02-01 2013-01-15 Microsoft Corporation Arranging display areas utilizing enhanced window states
US9092134B2 (en) 2008-02-04 2015-07-28 Nokia Technologies Oy User touch display interface providing an expanded selection area for a user selectable object
EP2469399B1 (en) 2008-02-11 2019-09-11 Idean Enterprises Oy Layer-based user interface
US8112722B2 (en) 2008-02-21 2012-02-07 Honeywell International Inc. Method and system of controlling a cursor in a three-dimensional graphical environment
US20090222765A1 (en) 2008-02-29 2009-09-03 Sony Ericsson Mobile Communications Ab Adaptive thumbnail scrollbar
US8205157B2 (en) 2008-03-04 2012-06-19 Apple Inc. Methods and graphical user interfaces for conducting searches on a portable multifunction device
US8259163B2 (en) 2008-03-07 2012-09-04 Intellectual Ventures Holding 67 Llc Display with built in 3D sensing
EP2104024B1 (en) 2008-03-20 2018-05-02 LG Electronics Inc. Portable terminal capable of sensing proximity touch and method for controlling screen using the same
US8723811B2 (en) 2008-03-21 2014-05-13 Lg Electronics Inc. Mobile terminal and screen displaying method thereof
JP5130979B2 (en) 2008-03-21 2013-01-30 ソニー株式会社 Information processing apparatus and search result display method
US8125481B2 (en) 2008-03-21 2012-02-28 Google Inc. Lightweight three-dimensional display
US8280732B2 (en) 2008-03-27 2012-10-02 Wolfgang Richter System and method for multidimensional gesture analysis
US20090247112A1 (en) 2008-03-28 2009-10-01 Sprint Communications Company L.P. Event disposition control for mobile communications device
US20090254799A1 (en) 2008-04-04 2009-10-08 Michael Unger System for creating graphical display from text
US9019237B2 (en) 2008-04-06 2015-04-28 Lester F. Ludwig Multitouch parameter and gesture user interface employing an LED-array tactile sensor that can also operate as a display
US8209628B1 (en) 2008-04-11 2012-06-26 Perceptive Pixel, Inc. Pressure-sensitive manipulation of displayed objects
US8949743B2 (en) 2008-04-22 2015-02-03 Apple Inc. Language input interface on a device
US8799821B1 (en) 2008-04-24 2014-08-05 Pixar Method and apparatus for user inputs for three-dimensional animation
JP4171770B1 (en) 2008-04-24 2008-10-29 任天堂株式会社 Object display order changing program and apparatus
US20090267909A1 (en) 2008-04-27 2009-10-29 Htc Corporation Electronic device and user interface display method thereof
KR101461954B1 (en) 2008-05-08 2014-11-14 엘지전자 주식회사 Terminal and method for controlling the same
US20100177053A2 (en) 2008-05-09 2010-07-15 Taizo Yasutake Method and apparatus for control of multiple degrees of freedom of a display
JP5171386B2 (en) 2008-05-19 2013-03-27 キヤノン株式会社 Content management apparatus, content management method, program, and recording medium
US8266550B1 (en) 2008-05-28 2012-09-11 Google Inc. Parallax panning of mobile device desktop
GB0810179D0 (en) 2008-06-04 2008-07-09 Elliptic Laboratories As Object location
US8477139B2 (en) 2008-06-09 2013-07-02 Apple Inc. Touch screen device, method, and graphical user interface for manipulating three-dimensional virtual objects
DE102008028023A1 (en) 2008-06-12 2009-12-17 Siemens Aktiengesellschaft Method for displaying a plurality of image data sets and user interface for displaying a plurality of image data sets
KR101477743B1 (en) 2008-06-16 2014-12-31 삼성전자 주식회사 Terminal and method for performing function thereof
US9092053B2 (en) 2008-06-17 2015-07-28 Apple Inc. Systems and methods for adjusting a display based on the user's position
US20090319928A1 (en) 2008-06-20 2009-12-24 Microsoft Corporation Generating previews for themes that personalize an operating environment
US9030418B2 (en) 2008-06-24 2015-05-12 Lg Electronics Inc. Mobile terminal capable of sensing proximity touch
US20090327969A1 (en) 2008-06-27 2009-12-31 Microsoft Corporation Semantic zoom in a virtual three-dimensional graphical user interface
US10095375B2 (en) 2008-07-09 2018-10-09 Apple Inc. Adding a contact to a home screen
US8345014B2 (en) 2008-07-12 2013-01-01 Lester F. Ludwig Control of the operating system on a computing device via finger angle using a high dimensional touchpad (HDTP) touch user interface
US8169414B2 (en) 2008-07-12 2012-05-01 Lim Seung E Control of electronic games via finger angle using a high dimensional touchpad (HDTP) touch user interface
JP4632102B2 (en) 2008-07-17 2011-02-16 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
KR101546774B1 (en) 2008-07-29 2015-08-24 엘지전자 주식회사 Mobile terminal and operation control method thereof
US20100031202A1 (en) 2008-08-04 2010-02-04 Microsoft Corporation User-defined gesture set for surface computing
KR101095106B1 (en) 2008-08-11 2011-12-16 에스케이플래닛 주식회사 UI design modification system and UI design modification method
US20100053151A1 (en) 2008-09-02 2010-03-04 Samsung Electronics Co., Ltd In-line mediation for manipulating three-dimensional content on a display device
KR100969790B1 (en) 2008-09-02 2010-07-15 엘지전자 주식회사 Mobile terminal and method for synthersizing contents
JP5308747B2 (en) 2008-09-03 2013-10-09 オリンパス株式会社 Information presentation system, program, and information storage medium
US8306969B2 (en) 2008-09-23 2012-11-06 Microsoft Corporation Linking search queries to rich media themes
US8701040B2 (en) 2008-09-29 2014-04-15 Microsoft Corporation Panoramic graphical user interface
KR20100041006A (en) 2008-10-13 2010-04-22 엘지전자 주식회사 A user interface controlling method using three dimension multi-touch
US9760234B2 (en) 2008-10-14 2017-09-12 International Business Machines Corporation Desktop icon management and grouping using desktop containers
KR101510738B1 (en) 2008-10-20 2015-04-10 삼성전자주식회사 Apparatus and method for composing idle screen in a portable terminal
JP5343505B2 (en) 2008-10-20 2013-11-13 日本電気株式会社 Icon display device, icon display method and program
US8024667B2 (en) 2008-10-24 2011-09-20 Microsoft Corporation In-document floating object re-ordering
KR101609162B1 (en) 2008-11-13 2016-04-05 엘지전자 주식회사 Mobile Terminal With Touch Screen And Method Of Processing Data Using Same
US20100124152A1 (en) 2008-11-18 2010-05-20 Gilbert Kye Lee Image Clock
US20110302513A1 (en) 2008-11-24 2011-12-08 Fredrik Ademar Methods and apparatuses for flexible modification of user interfaces
US8255808B2 (en) 2008-12-12 2012-08-28 Nokia Corporation Controlling data transfer between devices
US8762885B2 (en) 2008-12-15 2014-06-24 Verizon Patent And Licensing Inc. Three dimensional icon stacks
US8522163B2 (en) 2008-12-19 2013-08-27 Verizon Patent And Licensing Inc. Systems and methods for radial display of time based information
US9274505B2 (en) 2008-12-19 2016-03-01 Verizon Patent And Licensing Inc. Systems and methods for radial display of time based information
US8693993B2 (en) 2008-12-24 2014-04-08 Microsoft Corporation Personalized cloud of mobile tasks
CH700242B1 (en) 2009-01-12 2014-02-28 Peter Meszaros Clock with adjustment of the mechanical display of time and additional functions that do not affect the appearance of the watch.
JP4723656B2 (en) 2009-02-03 2011-07-13 京セラ株式会社 Input device
US9152292B2 (en) 2009-02-05 2015-10-06 Hewlett-Packard Development Company, L.P. Image collage authoring
JP5419486B2 (en) 2009-02-10 2014-02-19 キヤノン株式会社 Data processing apparatus, data processing method, and program
US9176747B2 (en) 2009-02-17 2015-11-03 Sandisk Il Ltd. User-application interface
JP5734546B2 (en) 2009-02-25 2015-06-17 京セラ株式会社 Object display device
US8108791B2 (en) 2009-02-27 2012-01-31 Microsoft Corporation Multi-screen user interface
US20100223563A1 (en) 2009-03-02 2010-09-02 Apple Inc. Remotely defining a user interface for a handheld device
US20100229129A1 (en) 2009-03-04 2010-09-09 Microsoft Corporation Creating organizational containers on a graphical user interface
US8631354B2 (en) 2009-03-06 2014-01-14 Microsoft Corporation Focal-control user interface
EP2406708A1 (en) 2009-03-11 2012-01-18 Fugoo Llc A graphical user interface for the representation of and interaction with one or more objects
US20100241999A1 (en) 2009-03-19 2010-09-23 Microsoft Corporation Canvas Manipulation Using 3D Spatial Gestures
TWI459280B (en) 2009-03-23 2014-11-01 Alpha Networks Inc Setting and modifying method of user's operating interface and digital audio/video playback system using the method
US20100241955A1 (en) 2009-03-23 2010-09-23 Microsoft Corporation Organization and manipulation of content items on a touch-sensitive display
US20100251085A1 (en) 2009-03-25 2010-09-30 Microsoft Corporation Content and subfolder navigation control
KR101640460B1 (en) 2009-03-25 2016-07-18 삼성전자 주식회사 Operation Method of Split Window And Portable Device supporting the same
US20100257468A1 (en) 2009-04-06 2010-10-07 Francisco Javier Gonzalez Bernardo Method and system for an enhanced interactive visualization environment
US9141087B2 (en) 2009-04-26 2015-09-22 Nike, Inc. Athletic watch
KR20090100320A (en) 2009-04-28 2009-09-23 엘지전자 주식회사 User interface for a hand-held device and controll method thereof
US8669945B2 (en) 2009-05-07 2014-03-11 Microsoft Corporation Changing of list views on mobile device
US8713459B2 (en) 2009-05-29 2014-04-29 Jason Philip Yanchar Graphical planner
US20100315413A1 (en) 2009-06-16 2010-12-16 Microsoft Corporation Surface Computer User Interaction
US8171401B2 (en) 2009-06-19 2012-05-01 Microsoft Corporation Resizing an editable area in a web page
US8799253B2 (en) 2009-06-26 2014-08-05 Microsoft Corporation Presenting an assembled sequence of preview videos
US8799777B1 (en) 2009-07-13 2014-08-05 Sprint Communications Company L.P. Selectability of objects on a touch-screen display
US8497884B2 (en) 2009-07-20 2013-07-30 Motorola Mobility Llc Electronic device and method for manipulating graphic user interface elements
US8656314B2 (en) 2009-07-30 2014-02-18 Lenovo (Singapore) Pte. Ltd. Finger touch gesture for joining and unjoining discrete touch objects
US10198854B2 (en) 2009-08-14 2019-02-05 Microsoft Technology Licensing, Llc Manipulation of 3-dimensional graphical objects for view in a multi-touch display
US8335784B2 (en) 2009-08-31 2012-12-18 Microsoft Corporation Visual search and three-dimensional results
US20110055722A1 (en) 2009-09-02 2011-03-03 Ludwig Lester F Data Visualization Environment with DataFlow Processing, Web, Collaboration, Advanced User Interfaces, and Spreadsheet Visualization
US8966375B2 (en) 2009-09-07 2015-02-24 Apple Inc. Management of application programs on a portable electronic device
KR20110026809A (en) 2009-09-08 2011-03-16 엘지전자 주식회사 Mobile terminal and control method thereof
KR101411593B1 (en) 2009-09-14 2014-06-25 삼성전자주식회사 Method for providing User Interface and display apparatus applying the same
US8438500B2 (en) 2009-09-25 2013-05-07 Apple Inc. Device, method, and graphical user interface for manipulation of user interface objects with activation regions
US8381118B2 (en) 2009-10-05 2013-02-19 Sony Ericsson Mobile Communications Ab Methods and devices that resize touch selection zones while selected on a touch sensitive display
US8261212B2 (en) 2009-10-20 2012-09-04 Microsoft Corporation Displaying GUI elements on natural user interfaces
WO2011060382A1 (en) 2009-11-13 2011-05-19 Google Inc. Live wallpaper
US9128602B2 (en) 2009-11-25 2015-09-08 Yahoo! Inc. Gallery application for content viewing
EP2360665A3 (en) 2009-11-26 2012-03-28 LG Electronics Mobile terminal and control method thereof
US9046991B2 (en) 2009-11-30 2015-06-02 Hewlett-Packard Development Company, L.P. System and method for dynamically displaying structurally dissimilar thumbnail images of an electronic document
US20110145758A1 (en) 2009-12-10 2011-06-16 International Business Machines Corporation Display navigation system, method and computer program product
US20110148786A1 (en) 2009-12-18 2011-06-23 Synaptics Incorporated Method and apparatus for changing operating modes
KR101626621B1 (en) 2009-12-30 2016-06-01 엘지전자 주식회사 Method for controlling data in mobile termina having circle type display unit and mobile terminal thereof
US20110167365A1 (en) 2010-01-04 2011-07-07 Theodore Charles Wingrove System and method for automated interface configuration based on habits of user in a vehicle
US8232990B2 (en) 2010-01-05 2012-07-31 Apple Inc. Working with 3D objects
US8525839B2 (en) 2010-01-06 2013-09-03 Apple Inc. Device, method, and graphical user interface for providing digital content products
US8862576B2 (en) 2010-01-06 2014-10-14 Apple Inc. Device, method, and graphical user interface for mapping directions between search results
US10007393B2 (en) 2010-01-19 2018-06-26 Apple Inc. 3D view of file structure
KR20110093729A (en) 2010-02-12 2011-08-18 삼성전자주식회사 Method and apparatus of providing widget
US9417787B2 (en) 2010-02-12 2016-08-16 Microsoft Technology Licensing, Llc Distortion effects to indicate location in a movable data collection
KR101677621B1 (en) 2010-03-12 2016-11-18 엘지전자 주식회사 Content control apparatus and method thereof
US8957866B2 (en) 2010-03-24 2015-02-17 Microsoft Corporation Multi-axis navigation
US8386950B2 (en) 2010-04-05 2013-02-26 Sony Ericsson Mobile Communications Ab Methods, systems and computer program products for arranging a plurality of icons on a touch sensitive display
US10788976B2 (en) 2010-04-07 2020-09-29 Apple Inc. Device, method, and graphical user interface for managing folders with multiple pages
US8458615B2 (en) 2010-04-07 2013-06-04 Apple Inc. Device, method, and graphical user interface for managing folders
AU2015100115C4 (en) 2010-04-07 2018-07-05 Apple Inc. Device, method, and graphical user interface for managing folders
US8789131B2 (en) 2010-05-14 2014-07-22 Lg Electronics Inc. Electronic device and method of sharing contents thereof with other devices
US20110298723A1 (en) 2010-06-07 2011-12-08 Christopher Brian Fleizach Devices, Methods, and Graphical User Interfaces for Accessibility via a Touch-Sensitive Surface
US20110310005A1 (en) 2010-06-17 2011-12-22 Qualcomm Incorporated Methods and apparatus for contactless gesture recognition
US8682971B2 (en) 2010-06-22 2014-03-25 International Business Machines Corporation Relationship management in a social network service
US8782577B2 (en) 2010-07-24 2014-07-15 Cadence Design Systems, Inc. Method, apparatus, and article of manufacture for providing in situ, customizable information in designing electronic circuits with electrical awareness
US8799815B2 (en) 2010-07-30 2014-08-05 Apple Inc. Device, method, and graphical user interface for activating an item in a folder
KR101657122B1 (en) 2010-09-15 2016-09-30 엘지전자 주식회사 Mobile terminal and method for controlling the same
US8890818B2 (en) 2010-09-22 2014-11-18 Nokia Corporation Apparatus and method for proximity based input
KR101708821B1 (en) 2010-09-30 2017-02-21 엘지전자 주식회사 Mobile terminal and method for controlling thereof
WO2012058015A1 (en) 2010-10-26 2012-05-03 Barnes & Noble, Inc. System and method for organizing user interface for categories of recently used digital material
US8581997B2 (en) 2010-10-28 2013-11-12 Intellectual Ventures Fund 83 Llc System for locating nearby picture hotspots
US9011292B2 (en) 2010-11-01 2015-04-21 Nike, Inc. Wearable device assembly having athletic functionality
KR101281448B1 (en) 2010-11-29 2013-07-03 팅크웨어(주) Icon adjusting method and terminal equipment using the same
EP2649782A2 (en) 2010-12-10 2013-10-16 Yota Devices IPR Ltd Mobile device with user interface
US8730188B2 (en) 2010-12-23 2014-05-20 Blackberry Limited Gesture input on a portable electronic device and method of controlling the same
KR101755376B1 (en) 2010-12-23 2017-07-26 엘지전자 주식회사 Method for controlling using voice action and the mobile terminal
US8683349B2 (en) 2010-12-31 2014-03-25 Verizon Patent And Licensing Inc. Media content user interface systems and methods
US20120169617A1 (en) 2011-01-04 2012-07-05 Nokia Corporation Controlling of user input device
CN102081502A (en) 2011-01-24 2011-06-01 中兴通讯股份有限公司 Method for managing icons on standby interface of mobile terminal and mobile terminal
US9152312B1 (en) 2011-01-26 2015-10-06 Google Inc. Displaying related content in a content stream
AU2012215303B2 (en) 2011-02-10 2016-09-15 Samsung Electronics Co., Ltd Portable device comprising a touch-screen display, and method for controlling same
US20120216146A1 (en) 2011-02-17 2012-08-23 Nokia Corporation Method, apparatus and computer program product for integrated application and task manager display
JP5784944B2 (en) 2011-03-29 2015-09-24 京セラ株式会社 Electronics
US9104307B2 (en) 2011-05-27 2015-08-11 Microsoft Technology Licensing, Llc Multi-application environment
US20130194066A1 (en) 2011-06-10 2013-08-01 Aliphcom Motion profile templates and movement languages for wearable devices
US20120324390A1 (en) 2011-06-16 2012-12-20 Richard Tao Systems and methods for a virtual watch
US20130019175A1 (en) 2011-07-14 2013-01-17 Microsoft Corporation Submenus for context based menu system
JP5799628B2 (en) 2011-07-15 2015-10-28 ソニー株式会社 Information processing apparatus, information processing method, and program
JP5172997B2 (en) 2011-07-15 2013-03-27 シャープ株式会社 Information processing apparatus, operation screen display method, control program, and recording medium
EP2737688A4 (en) 2011-07-31 2015-03-04 Nokia Corp Method and apparatus for providing zone-based device interaction
JP2013047919A (en) 2011-08-29 2013-03-07 Kyocera Corp Device, method, and program
US20130067411A1 (en) 2011-09-08 2013-03-14 Google Inc. User gestures indicating rates of execution of functions
JP5848932B2 (en) 2011-09-27 2016-01-27 京セラ株式会社 Mobile terminal, folder management program, and folder management method
JP2013105202A (en) 2011-11-10 2013-05-30 Kyocera Corp Device, method, and program
JP5703196B2 (en) 2011-11-15 2015-04-15 株式会社東海理化電機製作所 Portable machine
JP5929145B2 (en) 2011-12-07 2016-06-01 株式会社ニコン Electronic device, information processing method and program
CN104159508B (en) 2012-01-04 2018-01-30 耐克创新有限合伙公司 Sports watch
US9524272B2 (en) 2012-02-05 2016-12-20 Apple Inc. Navigating among content items in a browser using an array mode
US9189062B2 (en) 2012-03-07 2015-11-17 Google Technology Holdings LLC Portable electronic device and method for controlling operation thereof based on user motion
JP2013206274A (en) 2012-03-29 2013-10-07 Toshiba Corp Portable electronic apparatus and display control method
JP5941315B2 (en) 2012-03-29 2016-06-29 富士通テン株式会社 Vehicle control apparatus and vehicle control system
EP2648386B1 (en) 2012-04-08 2021-08-25 Samsung Electronics Co., Ltd. Management Server and Method for Controlling Device, User Terminal Apparatus and Method for Controlling Device, and User Terminal Apparatus and Control Method Thereof
CN104246677A (en) 2012-04-20 2014-12-24 索尼公司 Information processing device, information processing method, and program
US9547425B2 (en) 2012-05-09 2017-01-17 Apple Inc. Context-specific user interfaces
US9459781B2 (en) 2012-05-09 2016-10-04 Apple Inc. Context-specific user interfaces for displaying animated sequences
US9430120B2 (en) 2012-06-08 2016-08-30 Apple Inc. Identification of recently downloaded content
US9042971B2 (en) 2012-06-22 2015-05-26 Fitbit, Inc. Biometric monitoring device with heart rate measurement activated by a single user-gesture
US8948832B2 (en) 2012-06-22 2015-02-03 Fitbit, Inc. Wearable heart rate monitor
CN102801649A (en) 2012-08-11 2012-11-28 上海量明科技发展有限公司 Method and terminal for establishing shortcut of instant messaging interactive interface
US20140108978A1 (en) 2012-10-15 2014-04-17 At&T Mobility Ii Llc System and Method For Arranging Application Icons Of A User Interface On An Event-Triggered Basis
US11237719B2 (en) 2012-11-20 2022-02-01 Samsung Electronics Company, Ltd. Controlling remote electronic device with wearable electronic device
US8994827B2 (en) 2012-11-20 2015-03-31 Samsung Electronics Co., Ltd Wearable electronic device
US8863487B2 (en) 2012-11-30 2014-10-21 Marion Calmer Narrow row head unit
US20140195972A1 (en) 2013-01-07 2014-07-10 Electronics And Telecommunications Research Institute Method and apparatus for managing programs or icons
US9062617B2 (en) 2013-01-16 2015-06-23 General Motors Llc Autostarting a vehicle based on user criteria
US11210076B2 (en) 2013-01-28 2021-12-28 Samsung Electronics Co., Ltd. Downloading and launching an app on a second device from a first device
KR102049855B1 (en) 2013-01-31 2019-11-28 엘지전자 주식회사 Mobile terminal and controlling method thereof
EP2972816A4 (en) 2013-03-13 2016-11-09 Owaves Inc Lifestyle management system
US8826170B1 (en) 2013-03-15 2014-09-02 Google Inc. Window switching interface
US9857193B2 (en) 2013-06-08 2018-01-02 Apple Inc. Mapping application with turn-by-turn navigation mode for output to vehicle display
US20140293755A1 (en) 2013-03-28 2014-10-02 Meta Watch Oy Device with functional display and method for time management
KR102148809B1 (en) 2013-04-22 2020-08-27 삼성전자주식회사 Apparatus, method and computer readable recording medium for displaying shortcut window
US9354613B2 (en) 2013-05-01 2016-05-31 Rajendra Serber Proportional hour time display
JP2013191234A (en) 2013-06-05 2013-09-26 Pioneer Electronic Corp Car sharing system
KR102044701B1 (en) 2013-07-10 2019-11-14 엘지전자 주식회사 Mobile terminal
KR102138506B1 (en) 2013-07-15 2020-07-28 엘지전자 주식회사 Mobile terminal
US9386432B2 (en) 2013-08-12 2016-07-05 Yahoo! Inc. Displaying location-based images that match the weather conditions
KR20150022599A (en) 2013-08-23 2015-03-04 삼성전자주식회사 Method for setting configuration of electronic device and apparatus for the same
CN103530220A (en) 2013-10-09 2014-01-22 华为技术有限公司 Display method and system and terminal for application program icons
US9993913B2 (en) 2013-10-14 2018-06-12 Senco Brands, Inc. Clenching adapter for automatic nailers
US9794397B2 (en) 2013-10-16 2017-10-17 Lg Electronics Inc. Watch type mobile terminal and method for controlling the same
US9721218B2 (en) 2013-10-21 2017-08-01 Sap Se Determining the user-specific relevance of applications
KR101952928B1 (en) 2013-10-30 2019-02-27 애플 인크. Displaying relevant user interface objects
US9082314B2 (en) 2013-10-30 2015-07-14 Titanium Marketing, Inc. Time teaching watch and method
US9430758B2 (en) 2013-12-05 2016-08-30 Cisco Technology, Inc. User interface component with a radial clock and integrated schedule
KR102131829B1 (en) 2013-12-17 2020-08-05 엘지전자 주식회사 Mobile terminal and method for controlling thereof
EP2911377B1 (en) 2014-02-24 2019-06-12 Samsung Electronics Co., Ltd Method of providing preview image regarding display setting for device
KR102208115B1 (en) 2014-03-27 2021-01-27 엘지전자 주식회사 Mobile terminal and method for controlling the same
US20150301506A1 (en) 2014-04-22 2015-10-22 Fahad Koumaiha Transparent capacitive touchscreen device overlying a mechanical component
US10478127B2 (en) 2014-06-23 2019-11-19 Sherlock Solutions, LLC Apparatuses, methods, processes, and systems related to significant detrimental changes in health parameters and activating lifesaving measures
CN116301544A (en) 2014-06-27 2023-06-23 苹果公司 Reduced size user interface
EP3742272B1 (en) 2014-08-02 2022-09-14 Apple Inc. Context-specific user interfaces
US20160048296A1 (en) 2014-08-12 2016-02-18 Motorola Mobility Llc Methods for Implementing a Display Theme on a Wearable Electronic Device
KR102418119B1 (en) 2014-08-25 2022-07-07 삼성전자 주식회사 Method for organizing a clock frame and an wearable electronic device implementing the same
KR102258579B1 (en) 2014-08-29 2021-05-31 엘지전자 주식회사 Watch type terminal
JP6667233B2 (en) 2014-09-02 2020-03-18 ナイキ イノベイト シーブイ Monitoring health using mobile devices
US9547419B2 (en) 2014-09-02 2017-01-17 Apple Inc. Reduced size configuration interface
KR101776098B1 (en) 2014-09-02 2017-09-07 Apple Inc. Physical activity and workout monitor
US10261672B1 (en) 2014-09-16 2019-04-16 Amazon Technologies, Inc. Contextual launch interfaces
KR20160047273A (en) 2014-10-22 2016-05-02 LG Electronics Inc. Watch type terminal
KR102354769B1 (en) 2014-10-29 2022-01-25 Samsung Electronics Co., Ltd. Terminal apparatus and method for controlling thereof
US9456123B2 (en) 2014-12-18 2016-09-27 Xerox Corporation Method and system to configure mobile electronic device settings using remote data store analytics
KR20170016262A (en) 2015-08-03 2017-02-13 LG Electronics Inc. Mobile terminal and control method thereof
KR20170033062A (en) 2015-09-16 2017-03-24 LG Electronics Inc. Mobile terminal and method for controlling the same
US20170357427A1 (en) 2016-06-10 2017-12-14 Apple Inc. Context-specific user interfaces
DK201670595A1 (en) 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
US11816325B2 (en) 2016-06-12 2023-11-14 Apple Inc. Application shortcuts for carplay
US9942418B2 (en) 2016-06-28 2018-04-10 Kyocera Document Solutions Inc. Methods for configuring settings for an image forming apparatus with template sheet
WO2018184154A1 (en) 2017-04-05 2018-10-11 Microsoft Technology Licensing, Llc Desktop launcher
WO2019000232A1 (en) 2017-06-27 2019-01-03 Microsoft Technology Licensing, Llc Page navigation in desktop launcher
US10684592B2 (en) 2017-11-27 2020-06-16 LG Electronics Inc. Watch type terminal
CN110321047B (en) 2018-03-30 2021-08-20 Huawei Technologies Co., Ltd. Display control method and device
CN109363430A (en) 2018-11-08 2019-02-22 广州三拾七度智能家居有限公司 Parent-child dining table
US11675476B2 (en) 2019-05-05 2023-06-13 Apple Inc. User interfaces for widgets

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5760773A (en) * 1995-01-06 1998-06-02 Microsoft Corporation Methods and apparatus for interacting with data objects using action handles
US6590568B1 (en) * 2000-11-20 2003-07-08 Nokia Corporation Touch screen drag and drop input technique
WO2005041020A1 (en) * 2003-10-24 2005-05-06 Nokia Corporation Method for shifting a shortcut in an electronic device, a display unit of the device, and an electronic device

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010112089A1 (en) * 2009-03-30 2010-10-07 Sony Ericsson Mobile Communications Ab Navigation among media files represented by graphics in portable communication devices
DE102009017078A1 (en) * 2009-04-08 2010-11-25 Jurasoft Gmbh & Co. Kg Display device i.e. monitor, controlling method for use in computer system i.e. personal computer, involves representing transient graphic element in display area of additional symbol, and linking element with object based on user selection
WO2013045708A1 (en) * 2011-09-30 2013-04-04 Promethean Limited Transforming displayed objects on a gui
US9420108B1 (en) 2015-08-11 2016-08-16 International Business Machines Corporation Controlling conference calls
US9537911B1 (en) 2015-08-11 2017-01-03 International Business Machines Corporation Controlling conference calls
US9591141B1 (en) 2015-08-11 2017-03-07 International Business Machines Corporation Controlling conference calls
US9621731B2 (en) 2015-08-11 2017-04-11 International Business Machines Corporation Controlling conference calls

Also Published As

Publication number Publication date
US20200348814A1 (en) 2020-11-05
WO2008086305A3 (en) 2008-10-09
US20140068483A1 (en) 2014-03-06
US10732821B2 (en) 2020-08-04
US20160253065A1 (en) 2016-09-01
US20080165153A1 (en) 2008-07-10
US20220137765A1 (en) 2022-05-05
US10254949B2 (en) 2019-04-09
US11586348B2 (en) 2023-02-21
US8519964B2 (en) 2013-08-27
US11169691B2 (en) 2021-11-09
US9367232B2 (en) 2016-06-14
US20190235724A1 (en) 2019-08-01

Similar Documents

Publication Publication Date Title
US11586348B2 (en) Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display
US20210397336A1 (en) Portable multifunction device, method, and graphical user interface for providing maps and directions
US9933937B2 (en) Portable multifunction device, method, and graphical user interface for playing online videos
US9575646B2 (en) Modal change based on orientation of a portable multifunction device
AU2008100011B4 (en) Positioning a slider icon on a portable multifunction device
US9817436B2 (en) Portable multifunction device, method, and graphical user interface for displaying user interface objects adaptively
US8116807B2 (en) Airplane mode indicator on a portable multifunction device
US7966578B2 (en) Portable multifunction device, method, and graphical user interface for translating displayed content
CA2661856C (en) Voicemail manager for portable multifunction device
US8631357B2 (en) Dual function scroll wheel input
US20080165143A1 (en) Portable Multifunction Device, Method, and Graphical User Interface for Interacting with User Input Elements in Displayed Content
US20080082930A1 (en) Portable Multifunction Device, Method, and Graphical User Interface for Configuring and Displaying Widgets
WO2008085742A2 (en) Portable multifunction device, method and graphical user interface for interacting with user input elements in displayed content
AU2007342102A1 (en) System and method for moving list items on a touch screen

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 08713628; Country of ref document: EP; Kind code of ref document: A2
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 08713628; Country of ref document: EP; Kind code of ref document: A2