US20080088599A1 - Data entry for personal computing devices
- Publication number: US20080088599A1 (application US 11/871,904)
- Authority: United States (US)
- Prior art keywords: user, computer, completion, completion candidates, text entry
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
- G06F3/0236—Character input methods using selection techniques to select from displayed items
- G06F3/0237—Character input methods using prediction or retrieval techniques
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- the present invention relates generally to computer-assisted data entry and more particularly to a method, system, and apparatus for computer-assisted text generation and entry using a pointing device with a personal computing device, and to computer-readable media having executable instructions for supporting text generation and entry using a pointing device.
- Pen-based computing allows users to enter text and commands into hand-held personal computers and personal digital assistants (PDAs) via a touch-sensitive screen. While such pen-based computing is popular, especially with the increasing power of miniature computing devices, it does present challenges to a user entering data in an application running on the hand-held device. For instance, many hand-held computers and personal digital assistants require that the user enter data according to a predetermined scripting style, such as with the PalmPilot™ series of PDAs.
- Other hand-held devices provide a handwriting recognition system which requires that the computer learn the user's handwriting style. While such data entry mechanisms are useful, they are relatively difficult to use and complex to learn and can be prone to error in the event the user deviates from the predetermined scripting style or the user's traditional handwriting style.
- On-screen digital keyboards are typically miniaturized replicas of conventional full-sized physical keyboards, such as QWERTY keyboards. Many on-screen keyboards have shown themselves to be less than efficient for entering text.
- With a pointing device such as a pen, a user is typically required to enter text one character at a time by tapping out individual character selections from the on-screen keyboard. This “hunt-and-peck” method of typing with a single pointing device is time-consuming, especially when a user is entering large amounts of data.
- Text completion systems have been developed in an effort to assist users with text entry. In general, these systems predict and suggest a complete word completion based on a partial text entry entered by a user. These systems allow a user to type in the partial text entry and then accept a predicted text completion for the partial text entry. This avoids the keystrokes that would otherwise be required to type the complete text desired by a user. While such text completion systems provide some basic assistance for users to more rapidly enter text than would be required if every character of the desired text had to be typed in independently, there remains a need in the art for a more flexible text completion system for use with a single pointing device. It would also be desirable for such a text completion system to employ a convenient selection technique which would reduce the amount of movement of the pointing device required to enter text into a computer. It would further be desirable if such a system were applicable to both large and small personal computing devices.
- the user can rapidly enter and search for text using a data entry system through a combination of entering one or more characters on a digitally displayed keyboard with a pointing device and using a search list to obtain a list of completion candidates.
- the user can activate the search list to obtain a list of completion candidates at any time while entering a partial text entry with the data entry system.
- a list of completion candidates is displayed on a graphical user interface for the user to select from and the user can perform one of several actions.
- the user can deactivate the search list and return to modifying the current partial text entry and other text.
- the user can select one of the completion candidates in the search list and use the selected completion candidate to replace the partial text entry which the user is currently entering.
- the user can immediately continue adding to or modifying the current partial text entry being entered, and may re-invoke the search list to further search for completion candidates based on the modified partial text entry.
- the selected completion candidate is used to replace the partial text entry that the user is currently entering, and the data entry system begins monitoring for a new partial text entry from the user.
- the user may use one of the completion candidates in the search list to initiate a further automated search to obtain a more refined list of completion candidates.
- multi-level search lists and searching are available to help accelerate completion of a partial text entry.
- the user can automatically initiate an iterative search wherein a completion candidate listed in the search list is used as the new partial text entry to dynamically obtain a new list of completion candidates, which is then displayed in the search list.
- the automated ability to use the search list to obtain a refined list of completion candidates allows the user to quickly make good use of search results that are only partially successful.
- the user can then choose one of the completion candidates in the new list, or the user can repeat the iterative search process once again by choosing one of the completion candidates in the new list and activating a further iterative search.
- the user may return to keyboard entry with the last completion candidate selected by the user in the previous iteration of the search list.
- This latter feature provides the user with the convenience of being able to automatically and seamlessly continue entering the desired word, phrase, or character sequence using the last completion candidate selected by the user in the previous iteration of the interactive search list.
- the user can quickly and easily replace the user's current partial text entry with a partially successful completion candidate and continue building upon or modifying this partial completion candidate using the keyboard while at the same time allowing the user the flexibility to re-enter the search list at any time.
- a method of processing text entered into a personal computing device with a pointing device is provided. With this method, a partial text entry is received and used to obtain a dynamically generated list of completion candidates. The list of completion candidates is displayed in a search list within a graphical user interface. A user input signal associated with the pointing device is received. If the user input signal corresponds to a first type of user selection with the pointing device, then the search list is deactivated. If the user input signal corresponds to a second type of user selection with the pointing device, then the partial text entry is replaced with a completion candidate from the search list.
- a refined list of completion candidates is dynamically obtained based on one of the completion candidates from the search list.
- the refined list is displayed in the search list for further user selection.
- a method of processing an input string at least partially entered into a personal computing device with a pointing device includes performing a search of a set of completion candidates to locate a plurality of possible completion candidates for completing the input string in response to a prior located possible completion candidate or a character selectable by a user. At least one of the plurality of possible completion candidates and characters selectable by the user are displayed.
- a method of user-based text entry is provided.
- a set of position coordinates for a pointing device is monitored relative to a user interface and a digital keyboard is displayed on the user interface at a last known set of coordinates for the pointing device whenever the digital keyboard is activated for user input.
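As a rough illustration of the positioning behaviour described above, the sketch below (illustrative only; the screen and keyboard dimensions, names, and clamping rule are assumptions, not taken from the patent) records the pointer's last known coordinates and centres the keyboard on them when it is activated:

```python
# Hypothetical sketch: show the digital keyboard at the pointer's last known
# coordinates when the keyboard is activated. All names and sizes are assumed.

SCREEN_W, SCREEN_H = 320, 240        # display size in pixels (assumed)
KEYBOARD_W, KEYBOARD_H = 200, 120    # keyboard image size in pixels (assumed)

last_pointer_pos = (SCREEN_W // 2, SCREEN_H // 2)   # default: screen centre

def on_pointer_moved(x, y):
    """Record the most recent pointer coordinates."""
    global last_pointer_pos
    last_pointer_pos = (x, y)

def keyboard_origin_on_activation():
    """Top-left corner of the keyboard, centred on the last known pointer
    position and clamped so the keyboard stays fully on-screen."""
    px, py = last_pointer_pos
    x = min(max(px - KEYBOARD_W // 2, 0), SCREEN_W - KEYBOARD_W)
    y = min(max(py - KEYBOARD_H // 2, 0), SCREEN_H - KEYBOARD_H)
    return (x, y)

on_pointer_moved(300, 20)
print(keyboard_origin_on_activation())   # clamped to (120, 0) for this example
```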
- a method for interchanging the display of a digital keyboard and a search list.
- the digital keyboard is displayed on a user interface when a user is entering text a keystroke at a time.
- user input is monitored. If the user input corresponds to activating the search list, then the digital keyboard is replaced with the search list. If the user input corresponds to terminating use of the search list once activated, then the search list is replaced with the digital keyboard.
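A minimal sketch of this interchange, assuming a simple two-mode state machine and hypothetical event names (none of which appear in the patent):

```python
# Hypothetical sketch of interchanging the digital keyboard and the search list
# in the same display area, driven by user input events.

KEYBOARD_MODE, SEARCH_MODE = "keyboard", "search"

class DataEntryView:
    def __init__(self):
        self.mode = KEYBOARD_MODE       # start in keystroke-at-a-time entry

    def show_keyboard(self):
        print("displaying digital keyboard")

    def show_search_list(self):
        print("displaying search list of completion candidates")

    def handle_input(self, event):
        # "activate_search" and "terminate_search" are illustrative event names.
        if self.mode == KEYBOARD_MODE and event == "activate_search":
            self.mode = SEARCH_MODE
            self.show_search_list()     # search list replaces the keyboard
        elif self.mode == SEARCH_MODE and event == "terminate_search":
            self.mode = KEYBOARD_MODE
            self.show_keyboard()        # keyboard replaces the search list

view = DataEntryView()
view.handle_input("activate_search")
view.handle_input("terminate_search")
```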
- a digital keyboard is configured to include a plurality of characters assigned to predetermined locations within a layout for the digital keyboard according to a predetermined frequency distribution associated with the plurality of characters.
- the plurality of characters includes less commonly used characters and more commonly used characters based on the predetermined frequency distribution.
- the digital keyboard is displayed on a graphical user interface with the less commonly used characters displayed substantially further from a center of the digital keyboard than the more commonly used characters.
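One way to realise such a layout, sketched below under the assumption that relative character frequencies are known and that keys sit on a square grid (the grid size and sample frequencies are illustrative; the patent's actual layouts are those shown in the figures):

```python
# Hypothetical sketch: assign more frequently used characters to key positions
# nearer the centre of the keyboard and less frequent ones toward the edges.
import math

def layout_by_frequency(freq, grid=6):
    """Map each character to a (row, col) cell on a grid x grid keyboard."""
    centre = (grid - 1) / 2.0
    # Grid cells ordered by distance from the centre of the keyboard.
    cells = sorted(((r, c) for r in range(grid) for c in range(grid)),
                   key=lambda rc: math.hypot(rc[0] - centre, rc[1] - centre))
    # Characters ordered from most to least frequently used.
    chars = sorted(freq, key=freq.get, reverse=True)
    return dict(zip(chars, cells))

# Example with a few made-up relative frequencies:
sample = {"e": 0.127, "t": 0.091, "a": 0.082, "o": 0.075, "q": 0.001, "z": 0.0007}
print(layout_by_frequency(sample, grid=3))
```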
- a system for computer-assisted text generation and entry includes an input interface, a processing unit and a computer-readable medium.
- the input interface receives user input signals based on actions with a pointing device.
- the computer-readable medium contains computer-readable instructions for directing the processing unit to assist with text generation and entry based on user input received via the input interface with the pointing device.
- the computer-readable medium includes instructions for receiving a partial text entry; for obtaining a dynamically generated list of completion candidates based on the partial text entry; for displaying the list of completion candidates in a search list on a display device; and for receiving a user input signal associated with the pointing device from the input interface.
- if the user input signal corresponds to a first type of user selection with the pointing device, the computer-readable instructions are programmed to deactivate the search list. If the user input signal corresponds to a second type of user selection with the pointing device, the computer-readable instructions are programmed to replace the partial text entry with a completion candidate from the search list.
- FIG. 1 is a schematic diagram of a personal computing device loaded with a data entry system, according to a first embodiment of the invention
- FIG. 2 is a schematic diagram of the data entry system of the first embodiment
- FIG. 3 is a schematic representation illustrating the display of a digital keyboard on a graphical user interface within the personal computing device of the first embodiment
- FIG. 4 is a schematic representation of a data structure for a dictionary according to the first embodiment
- FIG. 5 is a schematic representation illustrating the interchangeable display of the digital keyboard and an interactive search list according to the first embodiment
- FIG. 5A is a schematic representation illustrating potential completion candidates for retrieval and display in the interactive search list according to an example of the use of the first embodiment
- FIG. 6 is a flow diagram illustrating, by way of example from the user's perspective, the use of the data entry system of the first embodiment
- FIGS. 7 to 9 are flow diagrams illustrating the flow of operation of the data entry system according to the first embodiment
- FIG. 10 is a schematic representation of an alternative embodiment of a digital keyboard layout according to the present invention.
- FIG. 11 is a schematic representation of another alternative embodiment of a digital keyboard layout according to the present invention.
- FIG. 12 is a schematic representation of another alternative embodiment of a digital keyboard layout according to the present invention.
- FIG. 13 is a schematic representation of another configuration of a digital keyboard according to an embodiment of the present invention.
- FIG. 14 is a schematic representation of the layout for the interactive search list according to the first embodiment
- FIGS. 15 to 18 are schematic representations of alternative layouts for the interactive search list according to alternate embodiments of the present invention.
- FIG. 19 is a schematic representation of a configuration for the digital keyboard and the interactive search list according to an embodiment of the present invention.
- FIG. 20 is a schematic representation of an alternative configuration for the digital keyboard according to an embodiment of the present invention.
- FIGS. 21 and 22 are flow diagrams illustrating the flow of operation of the data entry system according to an alternative embodiment of the present invention.
- FIGS. 23 and 24 are flow diagrams illustrating the flow of operation of the data entry system according to an alternative embodiment of the present invention.
- FIGS. 25 and 26 are flow diagrams illustrating the flow of operation of the data entry system according to an alternative embodiment of the present invention.
- FIG. 27 is a schematic representation of an alternative configuration for the digital keyboard according to an embodiment of the present invention.
- the user can rapidly enter and search for text using a data entry system through a combination of entering one or more characters on a digitally displayed keyboard with a pointing device and using an interactive search list to dynamically obtain a list of completion candidates.
- the user can activate the interactive search list to obtain a dynamically generated list of completion candidates at any time while entering a partial text entry with the data entry system.
- partial text entry means a sequence of one or more characters making up a leading portion of a word, phrase or character sequence.
- the list of completion candidates is retrieved from at least one dictionary by a candidate prediction system which retrieves completion candidates that are most likely to contain the completion candidate desired by the user.
- Candidate prediction is based on statistical measures ranking the entries within the dictionary relative to each other.
- when the user deactivates the interactive search list, the user can immediately continue adding to or modifying the current partial text entry being entered, and may re-invoke the interactive search list to further search for completion candidates based on the modified partial text entry.
- the selected completion candidate is used to replace the partial text entry that the user is currently entering, and the data entry system begins monitoring for a new partial text entry (of a word, phrase, or character sequence) from the user.
- a third action available to the user when the interactive search list is activated is to use one of the completion candidates in the list to initiate a further automated search, as follows.
- multi-level search lists and searching are available to help accelerate completion of a partial text entry.
- the user can automatically initiate an iterative search wherein a completion candidate listed in the interactive search list is used as the new partial text entry to dynamically obtain a new list of completion candidates, which is then displayed in an updated interactive search list.
- the automated ability to use the interactive search list to dynamically obtain a refined list of completion candidates allows the user to quickly make good use of search results that are only partially successful.
- the interactive search list is updated with a new list of completion candidates, the user can then choose one of the completion candidates in the new list, or the user can repeat the iterative search process once again by choosing one of the completion candidates in the new list and activating a further iterative search.
- the user may return to keyboard entry with the last completion candidate selected by the user in the previous iteration of the interactive search list.
- This latter feature provides the user with the convenience of being able to automatically and seamlessly continue entering the desired word, phrase, or character sequence using the last completion candidate selected by the user in the previous iteration of the interactive search list.
- the user can quickly and easily replace the user's current partial text entry with a partially successful completion candidate and continue building upon or modifying this partial completion candidate using the keyboard while at the same time allowing the user the flexibility to re-enter the interactive search list at any time.
- FIG. 1 shows a schematic diagram of a personal computing device 10 for text entry according to a first embodiment of the invention.
- the personal computing device 10 shown in FIG. 1 contains at least one processing unit 12 (such as a CPU or a similar processor or multiprocessor) connected by a bus to a computer-readable medium 16 .
- the computer-readable medium 16 provides a memory store for software and data residing within the personal computing device 10 .
- the computer-readable medium 16 can include one or more types of computer-readable media including volatile memory such as Random Access Memory (RAM), and non-volatile memory, such as a hard disk or Read Only Memory (ROM).
- the computer-readable medium 16 includes a combination of volatile and non-volatile memory.
- the computer-readable medium 16 contains an operating system, a data entry system 26 and an application 27 receptive to user-based text entry such as a word processor.
- the computer-readable medium 16 may also store alternative or other applications such as a browser or micro-browser, an e-mail application, and/or other end-user applications.
- the operating system can be any of several well-known operating systems depending on the personal computing device used.
- the operating system can be PalmOS™, Windows CE™, or an equivalent operating system.
- a more robust operating system may be used such as, for example, Windows 95™, Windows 98™, Windows NT™, Windows 2000™, MacOS™, UNIX, Linux or the like.
- the operating system is PalmOS™.
- the data entry system 26 is implemented as software that runs on the processing unit 12 to support computer-assisted text generation and entry for the user, although in other alternatives the data entry system 26 can be implemented as computer-readable instructions in firmware or embedded in hardware components.
- electronic text and documents are generated and maintained by the application 27 and the user authors and edits the electronic text and documents with the data entry system 26 which communicates with the application 27 through an application programming interface (API).
- the data entry system 26 may be integrated into part of an application.
- the personal computing device 10 includes a graphical display device 15 and a hardware input interface 17 receptive to user input from a pointing device.
- pointing device means an input device that allows a user to select one choice amongst one or many choices (a user-based selection).
- Some pointing devices enable a user to make user-based selections by pointing to a desired choice and include, by way of example, a pen, stylus, or finger. More generally, pointing devices capable of supporting user-based selections include, by way of example, the pointing devices above capable of pointing, as well as other input devices such as a mouse, trackball or the like.
- the graphical display device 15 is connected to and controlled by the processing unit 12 via a video display circuit 13 .
- the graphical display device 15 may be a CRT, a liquid crystal display, or an equivalent computer display.
- the personal computing device 10 is a personal digital assistant wherein the graphical display device 15 and the hardware input interface 17 are combined in the form of a touch-sensitive screen 14 that serves both as a graphical display 15 and as an input interface receptive to generating coordinate position signals in response to contact from a pointing device such as a pen or stylus.
- the personal computing device 10 is represented in the following discussion as a personal digital assistant for illustration purposes only, and that the invention may be practised with other personal computing devices including hand-held devices, personal computers and other microprocessor-based electronic devices, mobile telephones, internet appliances, and embedded devices, having a suitable graphical display and an input interface receptive to user input via a pen, stylus, finger, mouse, or an equivalent pointing device that allows the user to select one choice from many.
- Other types of equivalent personal computing devices to which the features and aspects of the present invention are applicable include, by way of example, an internet appliance controlled via a remote control (for instance, running an Internet service through a television via a set top box and a remote control).
- the hardware input interface 17 may be a digitising tablet, a pressure-sensitive input surface or a proximity sensing input surface.
- the personal computing device 10 may be powered by an internal ( 18 ) or external power source.
- the data entry system 26 includes computer-readable instructions for a digital keyboard 28 , a candidate prediction system 32 , a dictionary 20 , and an interactive search list 30 .
- the digital keyboard 28 provides an interface for the user to enter text and data into the personal computing device 10 ( FIG. 1 ).
- the characters entered by the user are stored in a search string as a partial text entry.
- the search string is used by the candidate prediction system 32 to search the dictionary 20 for completion candidates that begin with the current partial text entry being stored in the search string.
- the interactive search list 30 is used to display for user selection the list of completion candidates retrieved by the candidate prediction system 32 .
- the data entry system 26 supports gesture-based user input for the selection of completion candidates from the interactive search list 30 as will be described in further detail below.
- an image of the digital keyboard 28 is displayed on a graphical user interface 34 within the screen area of the touch-sensitive screen 14 once the data entry system 26 is initialised and ready to receive input from the user.
- the digital keyboard 28 contains a plurality of keys each of which is associated with at least one character from a set of characters. For example, when the English alphabet or a character set containing the English alphabet is used, each key on the digital keyboard 28 can contain one or more letters from the English alphabet. It should be noted that reference to the English alphabet is by way of example only, and that the digital keyboard 28 can be configured to contain and display any set of characters which the user may then select and use to enter text into the personal computing device 10 .
- “character set” and “set of characters” refer in this specification to a set containing a plurality of letters, numbers and/or other typographic symbols.
- Examples of character sets include, but are not limited to, one or more alphabets of a written language (e.g. English, French, German, Spanish, Chinese, or Japanese), and binary-coded character sets such as ASCII (American Standard Code for Information Interchange), EBCDIC (Extended Binary Coded Decimal Interchange Code), and BCD (Binary Coded Decimal).
- the digital keyboard 28 displays digital keys containing characters from the English alphabet along with special characters selected from the ASCII character set. Words, phrases, and character sequences can be typed into an electronic document or electronic text by simply using the pointing device to tap or select in sequence each key for the desired word, phrase, or character sequence. As will be discussed further below, the user can also use the digital keyboard 28 to initiate an automated search for completion candidates to more rapidly and flexibly enter words, phrases, and/or character sequences in an automated manner. As will also be discussed later in this specification, several enhancements to the digital keyboard 28 may be implemented to further enhance the user's ability to quickly, efficiently, and flexibly enter text using a pointing device.
- FIG. 4 shows a sample data structure for the dictionary 20 .
- the dictionary 20 contains completion candidates with weight values for ranking completion candidates relative to each other.
- the dictionary 20 contains a plurality of entries, with each entry having a completion candidate field 22 for storing a completion candidate and a weight field 24 for storing a numeric value associated with the completion candidate stored in a corresponding completion candidate field 22 .
- Each completion candidate stored in the dictionary 20 represents a word, phrase, or character sequence according to a particular language. Character sequences may include, but are not limited to, word continuations. Word continuations represent a leading part of a word, but not an entire word.
- word continuations may include, by way of example, such common word prefixes as “com”, “con”, “dis”, “expl”, “inter”, “mis”, “para”, “pre” “syn”, “tele”, “trans” and “univers”.
- Each weight field 24 stores a weight value for ranking the corresponding completion candidate stored in the corresponding completion candidate field 22 with other completion candidates in the dictionary 20 .
- the weight value stored in a weight field 24 may be based on one of many metrics.
- each weight field 24 may contain a value representing a degree of common usage of the completion candidate stored in the corresponding completion candidate field 22 relative to the other completion candidates in the dictionary 20 .
- the weight fields 24 may contain numeric values based on what words or phrases came before the completion candidate stored in the corresponding completion candidate field 22 , as determined from an analysis of a large corpus of text.
- Each weight field 24 may also be supplemented with one or more related fields.
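A minimal in-memory sketch of such a dictionary, assuming each entry is simply a completion candidate paired with a numeric weight (the sample entries and weights below are invented for illustration):

```python
# Hypothetical sketch of the dictionary data structure: each entry pairs a
# completion candidate (a word, phrase, or character sequence such as a common
# word continuation) with a weight used to rank it against the other entries.
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    candidate: str    # completion candidate field
    weight: float     # weight field, e.g. relative frequency of usage

dictionary = [
    DictionaryEntry("end", 820.0),
    DictionaryEntry("ending", 310.0),
    DictionaryEntry("endless", 95.0),
    DictionaryEntry("endlessly", 40.0),
    DictionaryEntry("inter", 600.0),   # a word continuation (prefix), not a full word
]
```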
- the candidate prediction system 32 is programmed to dynamically search the dictionary 20 for completion candidates that begin with the partial text entry entered by the user.
- the candidate prediction system 32 retrieves completion candidates from the dictionary 20 by determining which dictionary entries are more likely to be the ones that the user is attempting to type.
- the completion candidates are obtained from the dictionary 20 on the basis of frequency values stored in the weight field 24 for each entry.
- completion candidates having the highest weight values are retrieved.
- the frequency values represent the frequency of usage of the entries relative to each other.
- frequency values may be predetermined on the basis of an analysis of a large corpus of text or may be dynamically generated or modified on the basis of the specific user's usage of words, phrases, and/or character sequences through the data entry system 26 .
- Other statistical measures may also be employed to enhance the candidate prediction system 32 , such as ranking information or identifying the frequency with which an entry in the dictionary 20 follows any of the other entries in the dictionary 20 .
- the candidate prediction system 32 retrieves all possible completion candidates limited by one or more predetermined metrics.
- the total number of completion candidates retrieved is limited by a predetermined maximum number of displayable completion candidates.
- the maximum number of completion candidates retrieved by the candidate prediction system 32 is preferably a small but significant number, sufficient to provide the user with as many potential candidates as possible without saturating the user with an excessive number of candidates and without unduly delaying the user's ability to quickly review and select from the candidates presented.
- the maximum number of completion candidates is also preferably large enough to provide a variety of completion candidates (if available) for the user to choose from, so as to avoid an excessive amount of multi-level searching to complete a partial text entry.
- the maximum number of displayable completion candidates is set to five.
- the data entry system 26 can be configured by the user to present a greater or lesser number of completion candidates.
- the completion candidates are displayable in a configuration that can also be selected by the user. Making the number of completion candidates that can be displayed a user-configurable feature provides enhanced capabilities for computer-assisted text generation and entry with the data entry system 26 .
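A sketch of this retrieval step, assuming the dictionary is held as (candidate, weight) pairs, matches are ranked by weight, and the number of displayable candidates is capped at a user-configurable maximum (five by default, as in the first embodiment):

```python
# Hypothetical sketch of candidate prediction: return completion candidates
# beginning with the partial text entry, highest weight first, capped at a
# user-configurable maximum number of displayable candidates.

def predict_candidates(partial, dictionary, max_candidates=5):
    matches = [(cand, weight) for cand, weight in dictionary
               if cand.startswith(partial) and cand != partial]
    matches.sort(key=lambda cw: cw[1], reverse=True)   # most likely first
    return [cand for cand, _ in matches[:max_candidates]]

# Example usage with an illustrative dictionary of (candidate, weight) pairs:
sample_dictionary = [("end", 820.0), ("ending", 310.0), ("endless", 95.0),
                     ("endlessly", 40.0), ("entry", 500.0), ("enter", 450.0)]
print(predict_candidates("e", sample_dictionary))
# -> ['end', 'entry', 'enter', 'ending', 'endless']
print(predict_candidates("end", sample_dictionary))
# -> ['ending', 'endless', 'endlessly']
```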
- the interactive search list 30 receives completion candidates retrieved from the dictionary 20 by the candidate prediction system 32 and presents the user with a list of these completion candidates.
- the interactive search list 30 can be displayed to the user in any of several different ways depending on which options the data entry system 26 has been programmed with and which of those options have been selected by the user, as will be discussed later in this specification.
- the interactive search list 30 is displayed on the touch-sensitive screen 14 as an interactive vertical list of completion candidates.
- the interactive search list 30 is programmed to support multi-level searches so that the user can quickly use a completion candidate from one level of search results to drill deeper into the dictionary 20 for a narrower set of completion candidates.
- the digital keyboard 28 and the interactive search list 30 are also interchangeably displayed on the graphical user interface 34 , as further illustrated in FIG. 5 .
- because the digital keyboard 28 and the interactive search list 30 are interchangeable, the user can easily and quickly swap between entering characters or otherwise modifying a partial text entry from the digital keyboard 28 and using the interactive search list 30 to rapidly and flexibly complete the entry of words, phrases, and/or character sequences.
- the image of the digital keyboard 28 and the image of the interactive search list 30 share substantially the same display area on the graphical user interface 34 .
- the data entry system 26 for the first embodiment is preferably programmed to automatically swap between the digital keyboard 28 and the interactive search list 30 depending upon the input provided by the user from the pointing device.
- This latter feature minimizes disruption to the user's attention to the data entry process since the user's attention remains focussed on the same region of the graphical user interface 34 for both the digital keyboard 28 and the interactive search list 30 .
- Interchanging the display of the digital keyboard 28 and the interactive search list 30 within the graphical user interface 34 provides for a space-efficient layout for the use of the data entry system 26 on the touch-sensitive screen 14 .
- the interchangeability of the digital keyboard 28 and the interactive search list 30 can also minimize hand movement as the user switches between using the digital keyboard 28 and the interactive search list 30 without having to move the pointing device to another part of the touch-sensitive screen 14 (or, more generally, without having to move the pointing device to another part of the hardware input interface). This arrangement can be particularly useful for smaller personal computing devices such as PDAs or other hand-held devices, or where the amount of space on the graphical user interface 34 used by the data entry system 26 needs to be minimized.
- the interactive search list 30 enables the user to more rapidly and flexibly complete the entry of words, phrases, and/or character sequences than would otherwise be required if the user were to simply type in each individual character of the desired entry.
- the user interfaces with the personal computing device 10 via the touch-sensitive screen 14 using a pen as a pointing device.
- the following methodology can be used with various other pointing devices, such as a stylus, finger, track ball, or mouse. If a mouse or an equivalent pointing device is used in place of a pen, stylus, or finger, then for the first embodiment the act of depressing a mouse button should be considered equivalent to touching the touch-sensitive screen 14 or touch pad with a stylus, pen, or finger, and similarly, releasing a depressed mouse button should be considered equivalent to lifting the stylus, pen, finger or other pointing device from the touch-sensitive screen 14 (or a touch pad or pressure sensitive pad).
- in a keyboard mode, the user can enter text a character (or keystroke) at a time by simply pointing at and selecting keys on the digital keyboard 28 with the pointing device. With each selection of a key on the digital keyboard 28 , the one or more characters that are associated with that key are forwarded to the application 27 for entry into text in, for example, a document or data entry field.
- in the search mode, the user can search for and select amongst completion candidate suggestions for the completion of a word, phrase, or character sequence as further described below.
- when the user begins entering a word, phrase or character sequence using the digital keyboard 28 , the candidate prediction system 32 automatically begins searching the dictionary 20 for candidate words, phrases, and/or character sequences that the user may be attempting to enter.
- the searching of the dictionary 20 performed by the candidate prediction system 32 begins automatically in response to the emerging character sequence entered by the user. This is done by searching the dictionary 20 for completion candidates that begin with the leading character or characters which the user has entered (i.e. the partial text entry).
- the leading characters manually entered by the user with the pointing device are stored temporarily as a search string.
- the search string is used by the candidate prediction system 32 to search the dictionary 20 for potential completion candidates.
- the candidate prediction system 32 retrieves completion candidates on the basis of which entries in the dictionary 20 are most likely to contain the completion candidate that the user is attempting to type based on the partial text entry currently entered as indicated by the weight fields in the dictionary.
- the completion candidates retrieved from the dictionary 20 by the candidate prediction system 32 are provided to the interactive search list logic which causes the interactive search list 30 to be produced.
- the interactive search list logic produces a revised list of completion candidates as the user adds or deletes characters to or from the partial text entry with the digital keyboard 28 .
- whether the interactive search list 30 is displayed on the graphical user interface 34 , however, depends on whether or not the interactive search list 30 has been activated.
- the user may activate the interactive search list 30 by pausing for a predetermined delay period L 1 with the pointing device in a selection made by touching down on a key on the digital keyboard 28 containing one or more characters.
- the amount of time that the user must pause with the key selected is a user-configurable option. Typically, the delay chosen will be less than one second, although the delay may be configured to any time period the user desires.
- the interactive search list 30 becomes active. When the interactive search list 30 is activated, the entry mode for the data entry system 26 changes from the keyboard mode to the search mode. In addition, when the interactive search list 30 is activated in the first embodiment, the image of the digital keyboard 28 is cleared from the graphical user interface 34 and replaced by an image of the interactive search list 30 .
- while the interactive search list 30 is updated continuously in the first embodiment, in an alternative configuration the interactive search list 30 may be generated once the user activates the interactive search list 30 by pausing on a selected key on the digital keyboard 28 . In this alternative configuration, the interactive search list 30 is revised each time the user calls up a new or modified interactive search list 30 .
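The dwell-based activation can be sketched roughly as follows, assuming pointer-down and pointer-up callbacks are available and that the delay period L1 is user-configurable (the threshold value here is only an example):

```python
# Hypothetical sketch of activating the interactive search list by pausing on a
# selected key: if the pointing device stays down on the same key for at least
# the configurable delay L1, the entry mode changes from keyboard to search.
import time

L1_DELAY = 0.5   # seconds; user-configurable, typically under one second

class KeyDwellTracker:
    def __init__(self, delay=L1_DELAY):
        self.delay = delay
        self.down_since = None

    def on_key_down(self):
        self.down_since = time.monotonic()   # pointer touched down on a key

    def on_key_up(self):
        self.down_since = None               # pointer lifted before activation

    def search_list_activated(self):
        """Poll while the key is held; True once the dwell reaches L1."""
        return (self.down_since is not None and
                time.monotonic() - self.down_since >= self.delay)
```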
- the data entry system 26 provides the user with the flexibility to proceed with one of several operations using the pointing device.
- once the interactive search list 30 is activated, the user can perform any one of several operations, as described below.
- the user's action with the pointing device generates user input signals which are monitored and analyzed by the data entry system 26 to determine the type of user selection or action being made. Which operation is executed by the data entry system 26 once the interactive search list 30 is activated depends on what action the user takes with the pointing device.
- the user may lift the pointing device without any significant movement or may lift the pointing device after dragging it to or returning it to a dead zone. Either of these actions causes the hardware input interface 17 to generate a user input signal which serves as an indication to the data entry system 26 that the user wishes to deactivate the interactive search list 30 and return to modifying the current partial text entry manually with the digital keyboard 28 .
- the image of the interactive search list 30 is cleared from the graphical user interface 34 and replaced with the image of the digital keyboard 28 which is enabled for further use by the user. This allows the user to smoothly return to using the digital keyboard 28 in keyboard mode without having to relocate the pointing device. Once in keyboard mode, the user may reinitiate searching with the interactive search list 30 to obtain further completion candidates by pausing once again on a subsequently selected key (or character) on the digital keyboard 28 with the pointing device.
- a completion candidate from the interactive search list 30 is selected by the user by generating a gesture with the pointing device.
- gesture refers to a motion with the pointing device when the pointing device is in an active state. In general, motions making up gestures may be linear or in another computer-recognizable pattern.
- a gesture is a motion with the pointing device in a particular direction for at least a minimum distance when the pointing device is in an active state. The minimum distance is used by the data entry system 26 in the first embodiment to ignore small, insignificant gestures with the pointing device. This minimizes false selections arising from inadvertent movements with the pointing device.
- the pointing device is in an active state when the pointing device is held to the touch-sensitive screen 14 .
- other conditions can be used to identify when the pointing device is in an active state and are considered equivalent. For example, with a pressure-sensitive pad, the pointing device is in an active state when the pointing device is either in contact with the pressure-sensitive pad or depressed on the pressure-sensitive pad with at least a measurable amount of pressure.
- if the pointing device is a mouse, it is in an active state when a button on the mouse is pressed.
- the data entry system 26 monitors the gestures made with the pointing device to determine when and if a completion candidate in the interactive search list 30 has been selected. In the first embodiment the data entry system 26 does this by monitoring the current position coordinates of the pointing device on the touch-sensitive screen 14 relative to a point of origin generated when the pointing device first activates the interactive search list 30 . The current position coordinates of the pointing device are monitored so long as the pointing device remains in contact with the touch-sensitive screen 14 . As the current position coordinates are received, the data entry system 26 generates a vector using the current position coordinates of the pointing device and the point of origin. This vector is used by the data entry system 26 to determine if and when a particular completion candidate in the interactive search list 30 has been selected.
- the data entry system 26 is programmed to associate certain types of vectors with certain entries in the interactive search list 30 .
- Each entry in the interactive search list 30 may have associated with it one or more predetermined vectors.
- when the vector generated by the user's gesture corresponds to one of these predetermined vectors, the data entry system 26 recognizes that the user is selecting the completion candidate displayed in the corresponding entry of the interactive search list 30 .
- if the data entry system 26 determines that the vector currently being generated by the user's gesture is associated with one of the completion candidates in the interactive search list 30 , then the associated completion candidate is selected from the interactive search list 30 .
- the particular selection is indicated to the user by the data entry system 26 by highlighting the selected completion candidate in the interactive search list 30 .
- the data entry system 26 is programmed to require that the user gesture with the pointing device towards or onto a completion candidate in the interactive search list 30 in order to select that completion candidate.
- the gesture need only cover a minimum distance.
- the movement performed by the user is relative. With this “relative” mode of gesture-based candidate selection, the user need only gesture in a direction associated by the data entry system 26 with a desired completion candidate without the pointing device necessarily moving towards or onto the portion of the graphical user interface 34 where the completion candidate is displayed.
- the predetermined vectors associated by the data entry system 26 with entries in the interactive search list 30 correspond to unique gestures with the pointing device but not necessarily gestures which are towards or onto a particular entry in the interactive search list 30 .
- gestures with the pointing device are used to select amongst the completion candidates on the interactive search list 30 even if the user is not gesturing with the pointing device towards or onto a particular completion candidate within the interactive search list 30 .
- the user need not move the pointing device toward or onto a particular fixed location within the interactive search list 30 in order to select a specific completion candidate.
- Associating a gesture with a particular completion candidate in the interactive search list 30 in the above manner minimizes the amount of movement required with the pointing device to make selections from the interactive search list 30 and provides the user with the flexibility of selecting completion candidates with gestures which do not necessarily need to be towards or onto the desired completion candidate.
- if the selected completion candidate represents the entry that the user wishes to add to the text, the user can accept the selected completion candidate for insertion into the text by lifting the pointing device up from the touch-sensitive screen 14 in less than a predetermined time limit L 2 while keeping the particular completion candidate selected. This latter event generates a user input signal which instructs the data entry system 26 to terminate all searching based on the partial text entry and to signal to the application 27 to use the selected completion candidate to permanently replace the partial text entry.
- the interactive search list 30 is cleared from the graphical user interface 34 and the image of the digital keyboard 28 is re-enabled ready to receive the next keystroke from the pointing device.
- the data entry system 26 begins monitoring anew for text entries by the user.
- the user can move through the list of completion candidates displayed in the interactive search list 30 by gesturing with the pointing device to other completion candidates in the interactive search list 30 . If the user is unsure of which completion candidate to use, or wishes to pause to consider whether to continue in search mode or to return to keyboard mode, the user can gesture to or pause in a “dead zone” within the graphical user interface 34 for any length of time without triggering any further action by the data entry system 26 .
- the data entry system 26 when the interactive search list 30 is first activated by pausing on a key on the digital keyboard 28 , the data entry system 26 begins monitoring the length of the vector that the user thereafter generates to determine when and if the user is selecting a particular completion candidate from the interactive search list 30 or when the user is carrying out another recognized operation such as moving the pointing device to a dead zone. If the data entry system 26 determines that the length of the vector generated from the current position coordinates and the point of origin is less than or equal to a predetermined length, then the pointing device is deemed by the data entry system 26 to be in a dead zone on the graphical user interface 34 .
- the user When the pointing device is in a dead zone, the user has the freedom to pause without activating any further operation by the data entry system 26 and without clearing the interactive search list 30 . This provides the user with the option to pause and consider what their next operation will be.
- Other dead zones may be programmed into the graphical user interface 34 within the data entry system 26 for the user to move to with the pointing device so as to further enhance the user's ability to pause in different parts of the graphical user interface 34 . If the vector being generated through a gesture is found by the data entry system 26 to exceed a predetermined length, the data entry system 26 checks to determine whether or not the particular vector being generated is associated with any of the completion candidates in the interactive search list 30 or with any other operation on the screen such as alternative or additional dead zones.
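The vector and dead-zone tests can be sketched as below. The mapping of gesture directions onto list entries uses equal angular sectors purely for illustration; the patent leaves the association between predetermined vectors and entries open, and the dead-zone radius is an assumed value:

```python
# Hypothetical sketch of gesture-based selection: build a vector from the point
# of origin (where the search list was activated) to the current pointer
# position; vectors shorter than the dead-zone radius select nothing, longer
# vectors select whichever candidate's direction range they fall into.
import math

DEAD_ZONE_RADIUS = 15.0   # minimum gesture length in pixels (assumed)

def select_candidate(origin, current, candidates):
    """Return the selected candidate, or None while in the dead zone."""
    dx, dy = current[0] - origin[0], current[1] - origin[1]
    if math.hypot(dx, dy) <= DEAD_ZONE_RADIUS:
        return None                         # gesture too short: dead zone
    # Map the gesture direction onto equal angular sectors, one per candidate.
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    sector = int(angle // (360.0 / len(candidates)))
    return candidates[sector]

# Example: a 40-pixel gesture to the right falls into the first sector.
print(select_candidate((0, 0), (40, 5), ["end", "entry", "enter", "ending", "endless"]))
```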
- the user can use the selected completion candidate to dynamically initiate a further search for a more refined list of completion candidates from the dictionary 20 .
- the ability to dynamically search for a more refined list of completion candidates based on a selected completion candidate is also referred to in this specification as an iterative search.
- This iterative search tool is provided through the interactive search list 30 when the interactive search list 30 is active and displays a list of potential completion candidates for the user to select from.
- An iterative search is triggered in the first embodiment by the user continuing to keep a completion candidate in the interactive search list 30 selected for more than the predetermined time limit L 2 .
- the automated ability to use the interactive search list 30 to further search the dictionary 20 allows the user to make good use of search results that are only partially successful.
- the data entry system 26 determines that the user input corresponds to a user selection to initiate a new search using the selected completion candidate as the basis for the new search.
- the candidate prediction system 32 dynamically obtains a refined list of completion candidates based on the selected completion candidate.
- the refined list of completion candidates provides a narrower list of completion candidates based on a more specific search string (i.e. the selected completion candidate). Iterative searching enables one to perform multi-level searches with the interactive search list 30 , so that the user can quickly use a completion candidate from one level of search results to drill deeper into the dictionary 20 for a narrower set of completion candidates.
- iterative searching can be initiated when a completion candidate in the interactive search list 30 represents only a first part (i.e. a leading part) of the entry that the user wishes to add to the text.
- the interactive search list 30 is redisplayed with the refined list of completion candidates.
- the point of origin coordinates, used by the data entry system 26 to track vectors generated from gestures with the pointing device, are set to the position coordinates of the pointing device at the time the selected completion candidate is used to initiate the iterative search.
- the user can then choose one of the completion candidates in the refined list by lifting the pointing device up after selecting the particular completion candidate through a gesture, or the user can further repeat the iterative search process by selecting one of the completion candidates in the refined list and pausing with that particular completion candidate selected for the predetermined time limit L 2 .
- the user may also lift the pointing device without selecting any completion candidates obtained from the iterative search. This latter action leaves the search string set to the last completion candidate selected by the user in the previous iteration of the interactive search list 30 and returns the data entry system 26 to keyboard mode.
- the interactive search list 30 is cleared from the graphical user interface 34 , the digital keyboard 28 is redisplayed, and the data entry system 26 sends the completion candidate last selected by the user from the interactive search list 30 to the application 27 for entry into the text, thereby replacing the contents of the partial text entry currently under development by the user. New characters can then be added.
- the user can then, if desired, instruct the application 27 to cancel the entry of the modified partial text entry into the text by selecting a function button displayed on or associated with the digital keyboard 28 .
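A rough sketch of how these outcomes might be dispatched, assuming the currently selected candidate (if any), the time it has been held selected, and whether the pointer was lifted are already tracked (the time limit L2 value and the return convention are illustrative):

```python
# Hypothetical dispatch of the three outcomes while the search list is active:
#   lift with a candidate selected before time limit L2 -> accept the candidate,
#   keep a candidate selected for at least L2           -> iterative search,
#   lift with nothing selected (e.g. from the dead zone)-> back to keyboard.

L2_LIMIT = 0.7   # seconds; illustrative value for the predetermined time limit L2

def on_search_list_update(selected, held_for, pointer_lifted, search_string):
    """Return (next_mode, new_search_string) for the data entry system."""
    if selected is not None and held_for >= L2_LIMIT:
        # Iterative search: the selected candidate becomes the new search string
        # and a refined list of completion candidates is fetched and displayed.
        return ("search", selected)
    if pointer_lifted and selected is not None:
        # Accept: the candidate permanently replaces the partial text entry and
        # the system starts monitoring for a new partial text entry.
        return ("keyboard", "")
    if pointer_lifted:
        # No selection: return to keyboard mode, keeping the current search
        # string (after an iterative search, the last candidate selected).
        return ("keyboard", search_string)
    return ("search", search_string)        # still gesturing or in the dead zone
```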
- FIGS. 6 and 6A show an example 100 from the user's perspective of the operation and flexibility of the data entry system 26 for the first embodiment of FIGS. 1 to 5 .
- the user wishes to enter in the word “endlessly” and begins by entering at block 102 the letter “e” on the digital keyboard 28 .
- the user pauses on the letter “e” for at least the predetermined time limit L 1 which automatically triggers at block 104 the candidate prediction system 32 to obtain a list of completion candidates that are then displayed to the user in the interactive search list 30 .
- the desired completion candidate “endlessly” may not be one of the choices displayed in the initial interactive search list 30 .
- the word “end” is one of the completion candidates displayed in the initial interactive search list 30 .
- the user can select by gesture at block 108 the completion candidate “end” and use it to automatically initiate a further search of the dictionary 20 in order to retrieve a list of prioritized completion candidates which all begin with the prefix “end”.
- this iterative searching technique is performed with the pointing device by simply pausing while selecting a completion candidate (in this case the word “end”) in the interactive search list 30 for the predetermined time limit L 2 at block 116 which thereby automatically initiates a new search using the selected completion candidate as the basis for such a search. If the desired completion candidate “endlessly” appears in the updated interactive search list 30 , the user can then immediately add the desired completion word by selecting it (block 120 ) from the interactive search list 30 and lifting the pointing device up in less than the predetermined time limit L 2 .
- the user has the option of lifting the pointing device up from the touch-sensitive screen 14 at block 106 without selecting any of the completion candidates.
- This type of user input notifies the data entry system 26 to clear the interactive search list 30 from the screen and to re-enable and display the digital keyboard 28 . It should be noted that in this latter operation the search string continues to contain the partial text entry which the user has generated with the digital keyboard 28 (in this case the search string contains only the letter “e”).
- If the partial completion candidate “end” is used to initiate a further search and the new list of completion candidates does not include the word “endlessly”, then the user can choose to continue building upon the partial completion candidate “end” by lifting the pointing device up from the touch-sensitive screen 14 without any of the completion candidates in the new list selected and proceeding to continue entering in characters from the digital keyboard 28 .
- the partial completion candidate “end” permanently replaces the partial text entry that the user was generating with the digital keyboard 28 in the user's electronic text, the search string is cleared and any new user input is automatically treated as being part of a new partial text entry.
- the user can follow the final steps to completing a partial completion candidate by entering in the remaining letters at the end of the partial completion candidate. For instance, if the partial completion candidate “endless” was retrieved, then the user can simply tap on the digital keyboard 28 the letters “l” and “y” followed by a space (or the end-of-entry function button) in order to notify the data entry system 26 of the completion of the current text entry.
- the use of the data entry system 26 to retrieve a partial completion candidate can result in less time and effort being expended than if the user had simply typed in each letter of the desired word, phrase, or sequence of characters.
- the data entry system 26 is programmed with the ability to re-initiate automated searching even once the application 27 is instructed by the data entry system 26 to permanently replace the partial text entry in the text with a partial completion candidate.
- the search string is set to the partial completion candidate when the partial completion candidate replaces the partial text entry in the user's electronic text. The user may then return at any time to the automated search facility of the data entry system 26 by pausing on a key on the digital keyboard 28 for the predetermined time limit L 1 .
- the user can then return to automated searching with the data entry system 26 by, for example, touching on the letter “l” on the digital keyboard 28 for a sufficient period of time to initiate a search on the basis of the prefix “endl”.
- the partial text entry “endl” will then be used by the candidate prediction system 32 to obtain a list of completion candidates that are then displayed in the interactive search list 30 . If the desired completion candidate “endlessly” appears in the new list of completion candidates, the user may then choose that candidate and add it to the text by selecting that candidate and lifting the pointing device before the time limit L 2 is reached.
- the user can simply lift the pointing device without selecting any of the completion candidates and continue building upon the partial text entry “endl” by entering further characters via the digital keyboard 28 .
- Once the desired word is complete, in order to clear the search string and instruct the data entry system 26 to treat any new user input as being part of a new partial text entry in this alternative, the user selects a key or function from the digital keyboard 28 programmed to indicate that entry of the original partial text entry has ended, as discussed further in the section below.
- the data entry system 26 is programmed to recognize that the user has completed the current partial text entry and automatically initializes so that the next character selected from the digital keyboard 28 will be treated as a leading character for a new partial text entry. However, if the user is completing a partial text entry that is not found in the dictionary 20 and completes the partial text entry by simply entering characters from the digital keyboard 28 , then the data entry system 26 may not know when the current partial text entry is completed and when the next partial text entry has begun.
- the data entry system 26 can be programmed to monitor for an “end-of-search” signal from the user via the digital keyboard 28 .
- an end-of-search signal is received by the data entry system 26 when a key or function button programmed to indicate an express “end-of-search” instruction is selected from the digital keyboard 28 .
- the data entry system 26 can be programmed to recognize an implicit end-of-search instruction such as, for example, when the space key on the digital keyboard 28 is selected. Other non-alphabetic characters may also be used to provide an implicit end-of-search instruction.
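- As a hedged sketch of the end-of-search handling described above, the following Python fragment tests a pressed key against an express end-of-search button and a set of implicit terminators; the key name and the particular terminator characters are assumptions for illustration only.

```python
# Rough sketch with hypothetical key identifiers: recognize an explicit or
# implicit end-of-search instruction from the digital keyboard.
END_OF_SEARCH_KEY = "END_OF_SEARCH"                          # assumed name of the express function button
IMPLICIT_TERMINATORS = {" ", ",", ".", ";", ":", "!", "?"}   # space and other non-alphabetic keys

def is_end_of_search(key: str) -> bool:
    """Return True when the pressed key ends the current partial text entry."""
    return key == END_OF_SEARCH_KEY or key in IMPLICIT_TERMINATORS

print(is_end_of_search(" "))    # True: the space key gives an implicit instruction
print(is_end_of_search("a"))    # False: an alphabetic character continues the entry
```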
- FIGS. 7 to 9 are logical flow diagrams illustrating the flow of operation of the data entry system 26 .
- the description of the computer-implemented process illustrated in FIGS. 7 to 9 will be made with reference to the personal computing device 10 and the data entry system 26 shown in FIGS. 1 to 5 .
- the data entry system 26 is initialized at block 202 . This includes the initialization of variables and flags used within the data entry system 26 to track the state of user input, processing, and output. This also involves initializing the user interface for the data entry system 26 including loading and setting up the digital keyboard 28 for display, selecting the dictionary 20 to be used by the data entry system 26 , identifying the type of pointing device that will be used for text entry, and setting up any user-defined configurations for the display and use of the digital keyboard 28 , and the interactive search list 30 . Once the data entry system 26 including the digital keyboard 28 is initialized, the user interface for the data entry system 26 is then displayed on the touch-sensitive screen 14 at block 204 . In its most basic form, the user interface initially displayed comprises the digital keyboard 28 .
- the user interface may also include one or more toolbars or display boxes for the display of the current value of the search string and the current contents of the interactive search list 30 .
- the data entry system 26 awaits user input from the pointing device at block 206 .
- the data entry system 26 determines at block 208 whether the user input received by the data entry system 26 at block 206 corresponds to any of the characters displayed on the digital keyboard 28 . If the user input is found at block 208 to correspond with a character displayed in the digital keyboard 28 , then that character is added to the search string at block 210 .
- the search string is used by the candidate prediction system 32 to search the dictionary 20 for potential completion candidates.
- the candidate prediction system 32 continuously retrieves a list of completion candidates from the dictionary 20 as contents of the search string change. As the user modifies the current partial text entry under construction, the contents of the search string are modified and used by the candidate prediction system 32 at block 212 to obtain a new list of completion candidates from the dictionary 20 .
- the candidate prediction system 32 retrieves the first and last entry from the dictionary 20 that begin with the contents of the search string. The first and last entry retrieved are then used to define a search span. If the search span is greater than the number of completion candidates that the data entry system 26 is programmed to display, then the completion candidates within the search span having the highest corresponding weight values (for example, frequency values) are retrieved up to the maximum number of permissible completion candidates which may be displayed in the interactive search list 30 . The completion candidates retrieved by the candidate prediction system 32 in this manner are then compiled into a list of completion candidates which is used for display in the interactive search list 30 .
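- The search-span retrieval described above can be sketched, purely for illustration, as follows; the sample dictionary, its weight values, and the display limit are assumptions, and a real implementation would operate on the dictionary 20 and its weight fields.

```python
import bisect

# Illustrative sketch only: the dictionary is modelled as a list of
# (entry, weight) pairs sorted by entry; weights stand in for frequency values.
DICTIONARY = sorted([("end", 90), ("endless", 40), ("endlessly", 35),
                     ("endure", 50), ("energy", 80), ("engine", 70)])

MAX_DISPLAYED = 4   # assumed maximum size of the interactive search list

def candidates_for(prefix: str):
    entries = [e for e, _ in DICTIONARY]
    lo = bisect.bisect_left(entries, prefix)
    hi = bisect.bisect_left(entries, prefix + "\uffff")   # end of the search span
    span = DICTIONARY[lo:hi]
    # keep only the highest-weighted candidates, up to the display limit
    best = sorted(span, key=lambda pair: pair[1], reverse=True)[:MAX_DISPLAYED]
    return [entry for entry, _ in best]

print(candidates_for("en"))   # e.g. ['end', 'energy', 'engine', 'endure']
```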
- the interactive search list 30 containing a list of completion candidates may be generated once the user invokes activation of the interactive search list 30 by pausing on a key on the digital keyboard 28 (i.e. following block 214 ).
- the data entry system 26 determines at block 214 whether or not the pointing device has been released from the touch-sensitive screen 14 within the predetermined time limit L 1 . If the data entry system 26 finds that the pointing device has been released within the time limit L 1 , processing returns to block 206 where the data entry system 26 waits for further user input. This allows the user to use the digital keyboard 28 to type out a portion or all of a desired text entry by briefly tapping on keys on the digital keyboard 28 one keystroke at a time.
- the data entry system 26 determines at block 216 whether or not the list of completion candidates (also referred to and shown in the drawings as a candidate list) is empty. If the candidate list is found to be empty at block 216 , then the candidate prediction system 32 has not found any completion candidates in the dictionary 20 which would potentially complete the partial text entry under development by the user. In this case, processing returns to block 206 .
- the user can complete the desired text entry by continuing to type in the remaining characters from the digital keyboard 28 or may otherwise modify the partial text entry under development using other function keys available on the user interface (such as canceling the current partial text entry or backspacing one or more characters in the partial text entry). If, on the other hand, the list of completion candidates is found not to be empty at block 216 , then the digital keyboard 28 is disabled at block 218 , the candidate list obtained in block 212 is displayed within the interactive search list 30 on the touch-sensitive screen 14 at block 222 , and the data entry system 26 waits for further user input at block 224 .
- the user can use the data entry system 26 to take one of several actions.
- the user can deactivate the interactive search list 30 and return to modifying or editing the current partial text entry by lifting the pointing device from the touch-sensitive screen 14 without any significant movement. If this action is detected at block 226 then the candidate list is cleared and the search string contents are preserved at block 262 . Processing then returns to block 204 where the user can continue modifying the current partial text entry using the digital keyboard 28 .
- the gesture is analyzed at block 228 to determine if it is associated with a completion candidate displayed in the interactive search list 30 . If the gesture is found to be associated with a completion candidate in the interactive search list 30 , then that completion candidate is selected from the interactive search list 30 at block 230 . Preferably, when a completion candidate is selected in the interactive search list 30 at block 230 , the selected completion candidate is highlighted or otherwise emphasized in some way to the user.
- a timer T 2 is started.
- the timer T 2 is used in the first embodiment to monitor how long the selected completion candidate remains selected by the user.
- the user can select one of the completion candidates in the interactive search list 30 and use the selected completion candidate to either replace the partial text entry that the user is currently entering or use the selected completion candidate to initiate a further automated search to obtain a more refined list of completion candidates from the dictionary 20 .
- the timer T 2 is used to distinguish between these latter two types of operations which the user may initiate with the pointing device using the selected completion candidate. It should be noted that if the gesture analyzed at block 228 is not found to be associated with the completion candidate, then processing returns to block 224 where the data entry system 26 awaits further user input from the pointing device for analysis at block 226 .
- the data entry system 26 monitors the timer T 2 at block 232 and monitors for further user input. If the data entry system 26 detects further user input from the pointing device at block 236 before the timer T 2 exceeds the predetermined time limit L 2 , the user input is analyzed at block 238 to determine whether the user has initiated a gesture or lift with the pointing device. If a lift is detected at block 238 , then this event serves as an indication to the data entry system 26 that the selected candidate in the interactive search list 30 has been accepted by the user, in which case the completion candidate is added to the text in place of the partial text entry and the search string is cleared at block 240 .
- the data entry system 26 returns to block 204 where the initialized user interface is displayed on the touch-sensitive screen 14 , and the data entry system 26 awaits further user input at block 206 . Any new characters received by the data entry system 26 are then treated as being part of a new partial text entry.
- If a gesture is detected at block 238 , then the gesture is analyzed to determine if it is associated with a different completion candidate in the interactive search list 30 at block 242 , and if the answer to the analysis of block 242 is “YES”, then the data entry system 26 changes the completion candidate selection from the interactive search list 30 and restarts the timer T 2 at block 244 . The data entry system 26 then continues to monitor the timer T 2 at block 232 and user input at block 234 . From the user's perspective, a different completion candidate from the list of completion candidates is highlighted.
- the data entry system 26 determines at block 246 whether the gesture is associated with a dead zone on the user interface.
- dead zones are used to allow the user to deselect a selected completion candidate and to pause to consider what further action the user may wish to take. Dead zones are particularly useful when a timer such as timer T 2 is used as the triggering mechanism to determine when a selected completion candidate is to be used to initiate a further automated search of the dictionary 20 .
- If the timer T 2 exceeds the predetermined time limit L 2 while a completion candidate remains selected, this event serves as an indication to the data entry system 26 that the selected candidate is to be used to initiate a further automated search, in which case processing proceeds to block 250 where the search string is set to equal the selected completion candidate and a new list of completion candidates is obtained from the dictionary 20 at block 252 . This new list of completion candidates is then displayed in the interactive search list 30 at block 254 , and the data entry system 26 then awaits further user input at block 256 .
- User input received at block 256 is analyzed at block 258 and if at block 258 the data entry system 26 determines that the user input corresponds to a gesture with the pointing device, the gesture is analyzed at block 260 to determine if the gesture generated by the pointing device is associated with any of the completion candidates from the new list of completion candidates displayed in the interactive search list 30 . If the gesture is not associated with a completion candidate, then the data entry system 26 returns to block 256 and awaits further user input from the pointing device.
- the data entry system 26 returns to block 230 where the associated completion candidate is selected, the timer T 2 is restarted, and the data entry system 26 then monitors to see, as before, whether or not the user will use the selected completion candidate to either replace the partial text entry or initiate a further automated search.
- It should be noted that when the new list of completion candidates is displayed in the interactive search list 30 at block 254 and the data entry system 26 awaits user input at block 256 , the pointing device remains in contact with the touch-sensitive screen 14 . This situation is similar to the one at blocks 222 and 224 except that the interactive search list 30 has been updated to contain a new list of completion candidates for the user to select from.
- the data entry system 26 may include a variety of features and aspects to further enhance functionality and flexibility of text entry for the user when a single pointing device is used. Furthermore, each of the following features and aspects individually provides a beneficial enhancement and is an embodiment of the present invention. These additional features and aspects of the present invention will now be described below. Many of the features and aspects described below can also be applied in combination with various types of search lists containing completion candidates, including single and multi-level search lists.
- the data entry system 26 is programmed to notify the user of the active entry mode.
- the data entry system 26 is programmed to display on the graphical user interface 34 an express indication of the currently active entry mode (as illustrated in blocks 218 and 240 of FIGS. 23 and 24 ).
- two entry modes are tracked with the data entry system 26 : (1) a keyboard mode to indicate that the digital keyboard 28 is active, and (2) a search mode to indicate that automated searching is active with the interactive search list 30 . Displaying on the graphical user interface 34 an express indication of the current entry mode for the data entry system 26 is achieved by displaying a different color signal (or set of signals) on the graphical user interface 34 depending on which entry mode is currently active.
- specific icons can be assigned to each entry mode and displayed on the graphical user interface 34 when the corresponding entry mode is active. Notifying the user of the entry mode with one or more express indicators on the graphical user interface 34 minimizes the risk of the user losing track of whether the user is in keyboard mode or in search mode and enhances the ease of use of the data entry system 26 . This can be particularly useful when both the digital keyboard 28 and the interactive search list 30 are displayed simultaneously on the graphical user interface 34 .
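- A minimal sketch of an express entry-mode indicator is shown below; the colour assignments are assumptions chosen only to illustrate the idea of signalling keyboard mode versus search mode on the graphical user interface 34 .

```python
# Sketch of an express entry-mode indicator; the colours are assumptions.
MODE_COLOURS = {"keyboard": "green", "search": "blue"}

def mode_indicator(active_mode: str) -> str:
    """Describe the signal to show on the graphical user interface for the active mode."""
    return f"{MODE_COLOURS[active_mode]} indicator: {active_mode} mode active"

print(mode_indicator("keyboard"))   # e.g. shown when the digital keyboard is active
print(mode_indicator("search"))     # e.g. shown when the interactive search list is active
```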
- If the interactive search list 30 has fewer than the predetermined maximum number of displayable completion candidates, then this will serve as an indication to the user that the interactive search list 30 currently displayed contains all of the completion candidates in the dictionary 20 that begin with the partial text entry that the user has entered. If, however, the interactive search list 30 is full when it is activated by the user, it will not be clear from looking at the interactive search list 30 whether any other potential completion candidates for the current partial text entry may reside in the dictionary 20 . In order to remove this ambiguity and expressly indicate if there are any more potential completion candidates and if so, how many, in another aspect the data entry system 26 is programmed to display on the graphical user interface 34 the number of potential completion candidates in the dictionary 20 that have leading characters matching the current partial text entry.
- the number of potential completion candidates is displayed and updated by the data entry system 26 when the digital keyboard 28 is in use and whenever the interactive search list 30 is activated or updated with new completion candidates (as illustrated for example at blocks 213 A and 254 of FIGS. 21 and 22 ).
- the data entry system 26 can be programmed to display on the graphical user interface 34 a graphical indication of whether or not additional completion candidates having leading characters matching the current partial text entry are located in the dictionary 20 , in addition to those candidates displayed in the interactive search list 30 .
- the graphical indication is displayed and updated by the data entry system 26 when the digital keyboard 28 is in use and whenever the interactive search list 30 is activated or updated with new completion candidates.
- This notification feature enhances the user's ability to know, even before attempting to use the interactive search list 30 , when automated searching may retrieve a list of possible completion candidates (or a refined list). With this advanced notification feature, the user can better decide when to continue adding further characters to the partial text entry with the digital keyboard 28 and when to activate and use the interactive search list 30 .
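- The two notifications described above can be sketched together as follows; the sorted word list and the display limit are illustrative assumptions standing in for the dictionary 20 and the size of the interactive search list 30 .

```python
import bisect

# Sketch of computing (a) how many dictionary entries begin with the current
# partial text entry and (b) whether more candidates exist than can be shown.
WORDS = sorted(["end", "endless", "endlessly", "endure", "energy", "engine"])
MAX_DISPLAYED = 3   # assumed size of the interactive search list

def candidate_notifications(prefix: str):
    lo = bisect.bisect_left(WORDS, prefix)
    hi = bisect.bisect_left(WORDS, prefix + "\uffff")
    count = hi - lo
    return count, count > MAX_DISPLAYED

print(candidate_notifications("end"))   # (4, True): four matches, more than can be shown
```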
- the digital keyboard 28 can be programmed to be displayed in a frequency distributed layout.
- the frequency distributed layout takes advantage of the well known principle that certain characters in a character set are more frequently used than other characters within the same character set.
- the digital keyboard may contain the letters of the English alphabet displayed in a frequency distributed layout based on an analysis of a large corpus of text. It will be appreciated, of course, that the characters or symbols in a particular character set may have different relative frequencies depending upon the sample population of data used to rank such characters relative to each other within a particular character set. It will be appreciated that when the data entry system 26 is employed, the frequency of characters entered may be different than that of traditional systems that enter text one character at a time. These general principles are used to generate a frequency distributed layout for the digital keyboard.
- the digital keyboard is programmed to include a plurality of characters assigned to predetermined locations within the layout for the digital keyboard according to a predetermined frequency distribution associated with the plurality of characters.
- the plurality of characters displayed on the digital keyboard include less commonly used characters and more commonly used characters based on the predetermined frequency distribution.
- the digital keyboard is displayed on a graphical user interface with the less commonly used characters displayed substantially further from the center of the digital keyboard than the more commonly used characters.
- An example of this type of digital keyboard is illustrated generally in FIG. 3 except that the “space” key has been located in the outer ring rather than closer to the center of the digital keyboard 28 .
- An example of the digital keyboard 28 having a frequency distributed layout with the space key near the center is shown in FIG. 10 .
- the image of the digital keyboard 28 , when substantially circular or elliptical, has a first group of most frequently used characters (i.e. the most commonly used characters) located substantially near to the center of the digital keyboard 28 with at least one group of less frequently used characters (relative to the first group) displayed at a distance further from the center of the keyboard than the characters of the first group.
- the digital keyboard 28 is preferably configured to be displayed in a frequency distributed layout comprising a plurality of characters arranged into rings.
- When the characters on the digital keyboard 28 are arranged into rings, the characters in a particular ring can be arranged to each be about the same distance from the center of the digital keyboard 28 , providing some uniformity to the movements required to enter text. This can also be useful for certain arrangements including, for example, when the digital keyboard 28 is programmed to be dynamically re-positionable as discussed further below.
- At least one most commonly used character of a pre-selected character set or subset (such as a subset of the ASCII character set) is located substantially in or near the center of the digital keyboard 28 .
- the next most commonly used characters are located within an intermediate ring, and the less commonly used characters of the character set are distributed in an outer ring of the digital keyboard 28 .
- Because the most commonly used characters are located in or close to the center of the digital keyboard 28 , the degree of movement required with a pointing device to select characters displayed within the intermediate or inner rings of the digital keyboard 28 is minimized.
- arranging characters on the digital keyboard 28 in concentric-like rings according to their frequency of use provides an easy and efficient mechanism for retrieving characters and entering data using a pointing device.
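- As a hedged sketch of a frequency distributed ring layout, the fragment below assigns characters to inner, intermediate, and outer rings in descending order of frequency; the letter ordering and ring sizes are assumptions and do not reproduce the layouts of FIG. 3 or FIG. 10 .

```python
# Simplified sketch: assign characters to concentric rings in order of
# descending frequency of use.  The ordering below is a commonly cited
# English letter ranking; the ring sizes are assumed for illustration.
FREQUENCY_ORDER = "etaoinshrdlcumwfgypbvkjxqz"
RING_SIZES = [6, 8, 12]   # assumed sizes of the inner, intermediate and outer rings

def rings_by_frequency(order=FREQUENCY_ORDER, sizes=RING_SIZES):
    rings, start = [], 0
    for size in sizes:
        rings.append(list(order[start:start + size]))
        start += size
    return rings

inner, middle, outer = rings_by_frequency()
print(inner)    # most frequently used letters, placed nearest the centre
print(outer)    # least frequently used letters, placed in the outer ring
```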
- When rings are used with the digital keyboard 28 , it will be appreciated that the arrangement of the characters within each ring is by no means limited to the layout shown in FIG. 3 or 10 .
- the characters within a particular ring may be organized alphabetically in a clockwise (or counter clockwise) order.
- a challenge with many keyboard designs is that they take time to learn.
- the above ordered organization increases the opportunity to quickly learn and recall the location of characters displayed on the digital keyboard 28 , since users are already familiar with this clockwise distribution.
- the characters in one half of a ring may be ordered alphabetically in one direction (for example, clockwise), and all characters in the other half of the same ring (for example, the lower half) may be ordered alphabetically in the other direction (counterclockwise).
- Other characteristics of the digital keyboard 28 may also vary. In general, the type of characters displayed and available, the type and number of characters displayed on particular keys of the digital keyboard 28 , the font size of each character displayed, and the value to be processed when a particular key is contacted (or selected) may all vary from keyboard to keyboard. As well, to minimize clutter the digital keyboard 28 can be displayed with no graphics outlining the keys on the digital keyboard 28 . For circular or ring-like keyboard layouts, several other characteristics may also vary, including the number of rings making up the keyboard layout, the number of keys displayed in each ring and in the keyboard as a whole, and the thickness or width of each ring.
- the digital keyboard 28 layout may be dynamically replaced by the user with another keyboard layout.
- This feature can be particularly advantageous when it is desirable to permit a user to quickly swap between several keyboard layouts (for example, as between the keyboard layouts in FIGS. 10, 11 and 12 ), as in the case where the touch-sensitive screen 14 is relatively small or the number of characters required to enter data exceeds the space available to display the digital keyboard 28 within a location on the touch-sensitive screen 14 . Permitting the user to swap between multiple keyboard layouts provides the user with a significant degree of flexibility when entering characters with the data entry system 26 .
- When multiple keyboard layouts are available, they can be organized according to various subclasses of characters. For instance, a default keyboard layout may contain alphabetic characters.
- a second keyboard layout may contain numeric characters.
- a third keyboard layout may contain special characters. Grouping a character set into logical subgroups and organizing these subgroups on multiple keyboard layouts provides the user with the ability to logically navigate amongst different types of keyboard layouts when desired.
- the user may activate a particular keyboard layout using one or more hot keys each associated with at least one of the available keyboard layouts.
- a hot key may be any key or function associated with the digital keyboard 28 that triggers the display of an alternative keyboard layout. When a hot key associated with a particular keyboard layout is selected by the user from the digital keyboard 28 , the currently displayed keyboard layout is replaced with the keyboard layout associated with the selected hot key.
- a number of different related symbols or characters may be accessed through one key on the digital keyboard 28 .
- Through a punctuation key, for example, a number of different punctuation marks may be displayed, and the user may select one of these choices by gesturing to select the desired symbol or character.
- multiple dictionaries may be stored in the computer-readable medium 16 ( FIG. 1 ), with each dictionary containing completion candidates with associated weight values for ranking completion candidates relative to each other.
- the weight values may represent frequency of use values weighted according to usage in a particular language or a particular field of use (e.g. engineering, general business, law, accounting) or a particular user's use.
- the data entry system 26 (for example, in FIG. 1 to 5 ) can contain multiple simultaneously accessible dictionaries that the user can enable and disable individually.
- the data entry system 26 can have a first dictionary containing completion candidates based on Oxford English and a second dictionary containing completion candidates based on American English, both active at the same time and both accessed and used by the candidate prediction system 32 when a list of completion candidates is to be obtained.
- the data entry system 26 can have a legal dictionary, a civil engineering dictionary, and a regular American English dictionary all active simultaneously. This feature enables the user to obtain a list of completion candidates simultaneously containing variations on particular words, phrases, or character sequences particular to specific areas of practice or particular to specific types of dictionaries.
- the candidate prediction system 32 can be programmed to retrieve completion candidates from two or more dictionaries, each having their own weighting function for completion candidates (as illustrated in blocks 212 and 252 of FIGS. 25 and 26 ). When this is done, the candidate prediction system 32 can generate a final list of completion candidates based on a combining function that takes into account the weight values associated with the completion candidates retrieved from the multiple dictionaries and which also prioritizes the completion candidates based on the source dictionary from which a particular completion candidate is retrieved. By way of example, the candidate prediction system 32 may be programmed to include in the final list the top N completion candidates (where N ≥ 1) from each list of completion candidates retrieved from the multiple dictionaries.
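- One possible combining function of the kind described above is sketched below; the two sample dictionaries, their weight values, and the choice of N are assumptions for illustration only.

```python
# Hedged sketch of combining candidates from multiple dictionaries: take the
# top N candidates from each source, ordered by that source's own weights.
LEGAL = {"endorse": 80, "endorsement": 60, "endow": 40}       # assumed sample dictionary
GENERAL = {"end": 95, "endless": 55, "endlessly": 35}         # assumed sample dictionary

def combine(dictionaries, prefix, n=2):
    final = []
    for source in dictionaries:
        matches = {w: weight for w, weight in source.items() if w.startswith(prefix)}
        top_n = sorted(matches, key=matches.get, reverse=True)[:n]
        final.extend(c for c in top_n if c not in final)   # avoid duplicates across sources
    return final

print(combine([LEGAL, GENERAL], "end"))   # ['endorse', 'endorsement', 'end', 'endless']
```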
- a predefined dictionary may also be modified or generated based on a particular user's usage of particular words or character sequences over the course of using the data entry system 26 .
- Such a “personalized” dictionary may also be used to produce lists of the most common completion candidates used by a user. For example, the actual usage of completion candidates from the dictionary may be tracked by the data entry system 26 .
- a personalized dictionary may also be used in combination with other dictionaries.
- the candidate prediction system 32 may be programmed to give priority first to completion candidates (up to a predetermined limit) beginning with the contents of the search string and recorded in the personalized dictionary as having the highest weight values, and then, if space remains in the interactive search list 30 , to completion candidates having the highest weight values in the standardized dictionary and beginning with the contents of the search string.
- a new dictionary may be generated based on the completion candidates selected by the user through the use of the data entry system 26 over time. The user may activate the new dictionary at any time so that it takes priority over any pre-existing dictionary(ies) if completion candidates beginning with the search string are located in the new dictionary.
- the data entry system 26 may be programmed to monitor a specific user's pattern of usage of completion candidates from the interactive search list 30 over time. For example, as completion candidates are selected by the user and entered into the text using the data entry system 26 , an additional weight field in each entry of the dictionary 20 may be used by the data entry system 26 to track the user's actual frequency of completion candidate usage.
- the candidate prediction system 32 may be configured to find the most common completion candidates in the dictionary 20 beginning with a search string based firstly on the degree of actual user usage tracked in the additional usage fields of the dictionary 20 associated with completion candidates therein, and secondly based on the predefined weight fields 24 if the additional usage fields are null or are less than a predetermined threshold value defining a minimum percentage level of usage for evaluation, or if the list of completion candidates retrieved using the additional usage fields results in a number of completion candidates less than the maximum number which may be displayed with the interactive search list 30 .
- the candidate prediction system 32 tracks the total number of selections made from the dictionary 20 (for example, in a TOTAL_USAGE field in the candidate prediction system 32 ) over time by the user, as well as the total number of occasions on which a particular completion candidate in the dictionary 20 is actually used by the user to replace a partial text entry (for example, in a COMPLETION_CANDIDATE_USAGE field in the candidate prediction system 32 ).
- the data entry system 26 compares the value COMPLETION_CANDIDATE_USAGE/TOTAL_USAGE with the predetermined threshold value.
- In this way, the most commonly used completion candidates retrieved for display may be configured based primarily on the user's actual completion candidate usage as opposed to a predefined frequency distribution preprogrammed into fields 24 of the dictionary 20 .
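- The usage-based weighting described above might be sketched as follows; the threshold value, the variable names, and the scaling between tracked usage and the predefined weight are assumptions introduced for this illustration, not the patent's data layout.

```python
# Sketch: prefer a candidate's tracked usage only when its share of all
# selections exceeds a threshold, otherwise fall back to the predefined weight.
USAGE_THRESHOLD = 0.05   # minimum share of total selections (assumed value)

def effective_weight(candidate_usage: int, total_usage: int, predefined_weight: int) -> float:
    """Return a ranking weight for one completion candidate."""
    if total_usage > 0 and candidate_usage / total_usage >= USAGE_THRESHOLD:
        return candidate_usage / total_usage
    # scaling assumed only so both measures land on a comparable range
    return predefined_weight / 1000.0

print(effective_weight(candidate_usage=12, total_usage=100, predefined_weight=350))  # usage dominates
print(effective_weight(candidate_usage=1, total_usage=100, predefined_weight=350))   # falls back
```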
- the digital keyboard 28 is programmed to be dynamically re-positionable so as to follow the pointing device.
- the digital keyboard 28 is programmed to be dynamically re-positionable, its image follows the movement of the pointing device on the touch-sensitive screen 14 so that the keyboard image remains generally centered beneath the pointing device after each keyboard selection.
- the digital keyboard 28 is programmed to automatically re-center itself on a location within the graphical user interface 34 associated with a last known set of position coordinates for the pointing device. For example, if the character “u” is selected with the pointing device from the digital keyboard 28 in FIG.
- the digital keyboard 28 re-centers itself substantially over the position coordinates which were used by the pointing device to select the character “u”.
- the position and distance of the keys on the digital keyboard 28 relative to the user's pointing device remain substantially constant. This provides a uniform mechanism for consistently selecting the same key on the digital keyboard 28 using substantially the same movement with the pointing device.
- When the digital keyboard 28 is dynamically re-positionable, the degree and frequency with which the user is required to reposition the pointing device after selecting keyboard characters are minimized.
- When this re-positionable feature is combined with a frequency distributed keyboard having the most common characters near the center, the pointing device will generally rest in the center of the most common characters. If the frequency distributed keyboard is made up of rings, then each of the characters in a particular ring will be equidistant from the pointing device when the pointing device is resting in the center of the keyboard, resulting in a uniformity of movement for character entry.
- the digital keyboard 28 When the digital keyboard 28 is programmed to be dynamically re-positionable, it may also be programmed to reposition to a substantially central location within the graphical user interface 34 (or to another user-definable position) when the digital keyboard 28 approaches within a predetermined distance of any of the boundaries of the graphical user interface 34 . Repositioning the digital keyboard 28 in this way provides a mechanism to adjust for circumstances where the digital keyboard 28 drifts too close to a boundary of the touch-sensitive screen 14 . In an alternative repositioning mechanism, a hot key may be used to automatically re-center the digital keyboard 28 .
- the dynamically re-positionable digital keyboard 28 may be programmed to re-center about position coordinates for the pointing device when the position coordinates correspond to a part of the graphical user interface 34 (or screen) that is not currently occupied by the digital keyboard 28 . For example, if the digital keyboard 28 approaches an edge of the graphical user interface 34 the user can simply touch down in a center of the graphical user interface 34 and the digital keyboard 28 will relocate to that point.
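- A simple sketch of the re-centering behaviour, including the snap back toward the middle of the display near a boundary, is given below; the screen dimensions, keyboard radius, and margin are illustrative assumptions.

```python
# Sketch: re-center the keyboard image under the last touch, snapping back to
# the middle of the screen when it drifts too close to a boundary.
SCREEN_W, SCREEN_H = 480, 320   # assumed display size in pixels
KEYBOARD_RADIUS = 60            # assumed half-width of the keyboard image
BOUNDARY_MARGIN = 10            # assumed minimum clearance from any edge

def recenter(last_x: float, last_y: float):
    """Return the new centre of the digital keyboard after a key selection."""
    min_c = KEYBOARD_RADIUS + BOUNDARY_MARGIN
    too_close = (last_x < min_c or last_y < min_c or
                 last_x > SCREEN_W - min_c or last_y > SCREEN_H - min_c)
    if too_close:
        return (SCREEN_W / 2, SCREEN_H / 2)   # snap back to the middle of the display
    return (last_x, last_y)                   # otherwise follow the pointing device

print(recenter(240, 160))   # stays under the pointing device
print(recenter(15, 160))    # near the left edge: re-centres on the screen
```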
- the digital keyboard 28 When the digital keyboard 28 is dynamically re-positionable, it is preferable in general that the amount of keyboard movement, or drift, is minimized. This can be achieved by arranging the keyboard layout so that the keyboard characters are distributed about the digital keyboard 28 in a configuration that reduces the amount of drifting experienced when it is dynamically re-positionable.
- One way of achieving this is by configuring the digital keyboard 28 so that the total of the frequency of use values for characters located within a particular portion (or sector) of the digital keyboard 28 is substantially the same as other similarly shaped portions (sectors) of the digital keyboard 28 . It will be recalled that for the frequency distributed arrangement of keyboard characters discussed earlier, each keyboard character has a predetermined frequency of use value assigned to (or associated with) it.
- the digital keyboard 28 may be divided into notional, substantially equally shaped sectors, and the keyboard characters may be assigned to locations within the digital keyboard 28 such that the total of combined frequency values for characters within a particular sector of the digital keyboard 28 is substantially equal to the total of combined frequency values for characters within any of the other sectors of the digital keyboard 28 .
- the likelihood of selecting a character from any one of the predetermined sectors of the digital keyboard 28 is substantially the same.
- the keyboard characters are distributed such that when the digital keyboard 28 is notionally divided into substantially equally shaped wedge-like sectors, each sector of the keyboard has substantially the same total ‘weight’ of characters, according to their frequency of use, as each of the other sectors.
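- The sector-balancing idea can be sketched with a greedy placement that always adds the next most frequent character to the lightest sector so far; the letter frequencies and the number of sectors are assumptions, and an actual keyboard layout would also respect ring positions.

```python
# Sketch of a greedy assignment that distributes characters over equally shaped
# sectors so that each sector carries a similar total frequency 'weight'.
FREQS = {"e": 12.7, "t": 9.1, "a": 8.2, "o": 7.5, "i": 7.0, "n": 6.7, "s": 6.3, "h": 6.1}

def balance_sectors(freqs, sectors=4):
    totals = [0.0] * sectors
    assignment = [[] for _ in range(sectors)]
    # place the heaviest characters first, always into the lightest sector so far
    for ch, f in sorted(freqs.items(), key=lambda kv: kv[1], reverse=True):
        lightest = min(range(sectors), key=lambda s: totals[s])
        assignment[lightest].append(ch)
        totals[lightest] += f
    return assignment, totals

layout, totals = balance_sectors(FREQS)
print(layout)   # [['e', 'h'], ['t', 's'], ['a', 'n'], ['o', 'i']]
print(totals)   # per-sector totals are as even as the greedy placement allows
```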
- Another way to minimize drift is to configure the digital keyboard 28 in a substantially symmetric layout of characters with pairs of opposing characters displayed on the digital keyboard 28 having substantially similar frequencies of use.
- the frequency of use of one character in a pair of opposing characters is as close as possible to that of the other character in the pair.
- An example of this configuration is shown in FIG. 13 which shows the frequencies (f(X 1 ) and f(X 2 )) of characters X 1 and X 2 being substantially the same as each other, and the frequencies (f(X 3 ) and f(X 4 )) of characters X 3 and X 4 being substantially the same as each other.
- the frequencies of use of the characters displayed in the digital keyboard 28 may be calculated using well-known techniques of analysis on a large corpus of text.
- the dynamically re-positionable digital keyboard 28 minimizes the need for repositioning the pointing device and instead operates on the basis of repositioning the digital keyboard 28 relative to the pointing device.
- Making the digital keyboard 28 dynamically re-positionable also provides uniform movement for a particular character resulting in a more intuitive keyboard and a more intuitive data entry mechanism.
- the character frequency distribution and the dynamically re-positionable aspects of the digital keyboard 28 further reduce the movement required for the pointing device when characters are to be selected from the digital keyboard 28 .
- completion candidates are selected from the interactive search list 30 by way of gestures.
- other forms of candidate selection may be performed with pointing devices.
- candidates may be selected based on their location in the interactive search list 30 .
- the data entry system 26 when the data entry system 26 is programmed to receive input from a mouse having two or more buttons, the data entry system 26 can be programmed to use input from one mouse button to toggle between activating and deactivating the interactive search list 30 , and to use input from a second mouse button to insert a completion candidate from the interactive search list 30 into the text when the interactive search list 30 is active and the mouse has been used to highlight that completion candidate.
- the data entry system 26 may also be programmed to use input from the second mouse button as a trigger to select a key from the digital keyboard 28 if the mouse's cursor position (i.e. the mouse's position coordinates) on the graphical user interface 34 is associated with a key on the digital keyboard 28 at the time input from the second mouse button is received.
- candidate selection using the interactive search list 30 may be modified to replace the time delay-based technique for triggering the activation of the interactive search list 30 or for triggering iterative searching, with other forms of input indicators from the pointing device. For instance, with a mouse, the trigger can be an input signal from a mouse button received while the mouse position is located over a particular function button or location on the graphical user interface 34 , or a double-click signal from that mouse button received by the data entry system 26 .
- In the first embodiment ( FIGS. 1 to 5 ), the interactive search list 30 is displayed as a vertical list of completion candidates.
- the interactive search list 30 can be displayed in several different ways depending upon which options the data entry system 26 has been programmed with and which of those options have been selected by the user.
- the first consideration is where the interactive search list 30 is to be positioned within the graphical user interface 34 .
- the second consideration is whether the interactive search list 30 is continuously visible or not.
- the third consideration is the type of interactive search list, more specifically, how the completion candidates in the interactive search list 30 are arranged visually within the graphical user interface 34 .
- the fourth consideration is whether the interactive search list 30 replaces the digital keyboard 28 or whether the interactive search list 30 , when active, temporarily appears remote from or superimposed over a portion of the digital keyboard 28 .
- the interactive search list 30 may be displayed in a fixed location within the graphical user interface 34 .
- the interactive search list 30 may be docked with the digital keyboard 28 , when it is repositionable, and displayed continuously. With either the docked or fixed location interactive search list 30 , the results of automated searching are continuously displayed within the interactive search list 30 as the user enters characters with the digital keyboard 28 or uses the interactive search list 30 itself (as illustrated by block 213 B and 254 of FIGS. 23 and 24 ). Activating a docked or fixed location interactive search list 30 can be achieved by pausing with the pointing device on a keyboard character selected within the digital keyboard 28 .
- the interactive search list 30 becomes active. At this point, if the user wishes, the user can select one of the completion candidates (if any) within the interactive search list 30 or the user can return to keyboard mode and continue adding to or otherwise modifying the current partial text entry from the digital keyboard 28 .
- the interactive search list 30 when arranged in a docked or fixed location, may be continuously updated with potential completion candidates based on the current contents of the search string being constructed by the user via the digital keyboard 28 .
- the user can simply continue adding characters to the end of the current partial text entry one character at a time via the digital keyboard 28 so as to continue building the desired word, phrase, or character sequence until such time as the desired completion candidate or a partial completion candidate thereof appears in the interactive search list 30 .
- the interactive search list 30 when activated, may be shown superimposed over a portion of the digital keyboard 28 .
- the digital keyboard 28 may be instructed to make itself visible or invisible to view on the graphical user interface 34 .
- the digital keyboard 28 may be programmed to be displayed on the graphical user interface 34 in response to a user selection on the personal computing device, and to be hidden (or cleared) from view in response to another user selection.
- This feature also provides, for example, the option for the application 27 to instruct the digital keyboard 28 when to be visible and when to be invisible.
- the application 27 is programmed to decide when and where the digital keyboard 28 is to be displayed.
- This feature can be applied to many types of personal computing devices including, for example, where a touch-sensitive screen is used, or where the digital keyboard 28 is displayed on a display device that is separate from the hardware input interface 17 such as with a data tablet, a proximity sensing input surface or an equivalent input interface.
- the hardware input interface can be located on a remote control device used to control when the digital keyboard 28 is displayed on a television or a remotely located computer display.
- the digital keyboard can be displayed when the pointing device is detected within a set predetermined distance of a proximity sensing input surface, and the digital keyboard can be hidden when the pointing device is not detected within the set predetermined distance of the proximity sensing input surface.
- the application 27 can instruct the digital keyboard 28 to become invisible so as to swap to full text mode.
- the application 27 reactivates the display of the digital keyboard 28 over position coordinates associated with the position of the pointing device over the proximity sensing input surface or to an area remote to the pointing device. Variations on handling text entry with the proximity sensing input surface are discussed further below.
- the API for the data entry system 26 also allows the application 27 to programmatically change the partial text entry which is used for searching. For example, the user of a text editor might place the cursor after a word or character sequence and the application 27 could then tell the data entry system 26 to use that word or character sequence as a partial text entry for further searching.
- completion candidates within the interactive search list 30 may be displayed in an X configuration ( FIG. 15 ), in a rectangular configuration ( FIG. 16 ), in a cross configuration ( FIG. 17 ), in a T configuration ( FIG. 18 ), or in a horizontal configuration.
- In the X configuration, one completion candidate is preferably located slightly offset in the x-axis or y-axis from a central location within the X configuration and surrounded by four or more completion candidates located in the north-west, north-east, south-west, and south-east directions (relative to the central completion candidate displayed).
- a unique direction is provided for each of the five completion candidates displayed in the list, so as to minimize pen movement.
- In the cross configuration, a substantially centrally displayed completion candidate within the interactive search list 30 is surrounded by up to four completion candidates in the north, east, south, and west directions (relative to the central completion candidate displayed).
- the data entry system 26 is also programmed to display all completion candidates near the last known position coordinates for the pointing device so that they are slightly off-set from the x-axis or y-axis of the last known position coordinates so as to minimize the degree to which such completion candidates in the interactive search list 30 are obscured from the user's view by the pointing device.
- This feature can be particularly useful when the pointing device is a pen or finger and the user interfaces with a touch-sensitive screen 14 . In this way, the interactive search list may be displayed in a location which makes it easily visible and accessible to the user.
- the list of completion candidates within the interactive search list 30 can be displayed such that the most common of the completion candidates is displayed closest to the last known position coordinates of the pointing device while the other completion candidates within the interactive search list 30 are displayed further away from the last known position coordinates of the pointing device relative to the most common of the completion candidates.
- This variation results in a frequency distributed interactive search list 30 which can assist in further minimizing the amount of motion required with the pointing device in order to use and select from the interactive search list 30 .
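- A sketch of a frequency distributed placement of completion candidates around the last known position coordinates of the pointing device is shown below; the spacing, angles, and candidate ordering are assumptions for illustration.

```python
import math

# Sketch: place candidates around the last known pointing-device coordinates,
# with the most common candidate nearest; distances and angles are assumptions.
def place_candidates(candidates_by_frequency, origin, step=30):
    """candidates_by_frequency lists the most common candidate first.
    Returns a mapping of candidate -> (x, y) display coordinates."""
    ox, oy = origin
    placed = {}
    for rank, candidate in enumerate(candidates_by_frequency):
        distance = step * (rank + 1)             # the most common candidate is nearest
        angle = math.radians(45 + 90 * rank)     # spread the remaining candidates around
        placed[candidate] = (ox + distance * math.cos(angle),
                             oy + distance * math.sin(angle))
    return placed

print(place_candidates(["end", "endless", "endlessly"], origin=(200, 150)))
```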
- both the digital keyboard 28 and the interactive search list 30 may be continuously displayed within fixed separate locations on the graphical user interface 34 , along with a search string window 40 used to display the current contents of the search string.
- a tool bar 42 may also be displayed to identify predefined functions and commands that may be selected by the user while using the data entry system 26 .
- the tool bar 42 may be repositioned dynamically along with the digital keyboard 28 , or the tool bar 42 may be located and remain in a fixed location within the graphical user interface 34 .
- a commonly used word or character sequence may appear in the same position each time such a word or character sequence is displayed in a search list. This helps the user become familiar with the location of such a word or character sequence within the search list, and thereby helps the user to access such a word or character sequence more readily.
- the user may begin to know which gesture is required to enter a certain word even before the predetermined delay period L 1 has expired and the search list is displayed.
- the data entry system 26 may be programmed to recognize such gestures even before the predetermined delay period L 1 has expired and the interactive search list 30 is displayed.
- the interactive search list may display completion candidates with the part of each completion candidate matching the search string displayed in a different manner (for example a different color, font, or boldness) than the remaining parts of the completion candidates. For example, if the remaining parts were significantly bolder than the part of the completion candidates matching the search string, the user's eye can be drawn to those portions which distinguish the completion candidates on the interactive search list from one another, therefore facilitating selection of the desired completion candidate.
- the data entry system 26 may be programmed to determine whether the position of a cursor displayed on the graphical user interface 34 tracks the stylus position precisely or whether it moves relative to the stylus movement. In the first case, if the cursor position tracks the stylus position precisely, then the stylus and cursor function like a mouse and cursor on a conventional user interface, and the position of the cursor tracks precisely the position of the stylus tip on the hardware input surface (i.e. the last known position coordinates for the stylus).
- In the second case, the cursor displayed on the graphical user interface 34 is moved by a distance proportional to the movement of the stylus. This latter behaviour can come into effect when the interactive search list 30 is displayed. For instance, when moving up a vertical list of completion candidates, the cursor can move up faster than the actual physical movement of the stylus.
- the stylus (or other pointing device) can be used locally on the display device if it is a touch-sensitive screen, or remotely such as with a data tablet, a proximity-sensing input interface, or with the character input space on a Palm Pilot™ or another hand-held personal computing device.
- Using the cursor to track the position coordinates of the pointing device can help the user keep their attention on the digital keyboard 28 or the interactive search list 30 displayed on the display device 15 without having to be distracted with looking at the physical position of the pointing device (see, for instance, cursor 48 as illustrated in FIG. 27 ). This can be helpful when, for example, a data tablet or input pad is used and is located remote from the display area of the graphical display device 15 where the digital keyboard 28 or the interactive search list 30 (or both) are displayed. Also, using the cursor to remotely track the movement of the stylus, pen or finger provides a mechanism for using the digital keyboard 28 and the interactive search list 30 without obscuring them from the user's view with the stylus, pen or finger.
- the cursor may be displayed over the digital keyboard 28 when the data entry system 26 is in keyboard mode, and the cursor may be programmed to relocate to the center of the digital keyboard 28 whenever a character from the keyboard or a completion candidate from the interactive search list 30 is selected.
- the cursor is centered in the digital keyboard 28 , further movements with the pointing device can be used to make selections from the digital keyboard 28 as if the pointing device were physically centered about the center of the digital keyboard 28 .
- the digital keyboard 28 is displayed in a fixed remote location on the graphical user interface 34 .
- the user is not visually distracted by movement of the digital keyboard 28 , while enjoying many of the advantages of the dynamically re-positionable digital keyboard 28 .
- Because the cursor relocates to the center of the digital keyboard 28 when the keyboard is active and waiting for user input, a particular character on the digital keyboard 28 remains the same distance and direction from the pointing device no matter what input was made last with the pointing device.
- This feature of the cursor enables the user to incorporate unconscious learning and therefore, learned efficiency.
- When a frequency distributed keyboard layout is used with the most frequently used characters located near a central location, relocating the cursor to the center of the digital keyboard 28 enables ready access to the characters most likely to be chosen next, thereby reducing finger movement and increasing efficiency.
- the movement of the cursor need not necessarily be directly proportional to the movement of the pointing device.
- the data entry system 26 is programmed so that moving the pointing device a small distance equates to moving the cursor a larger distance on the digital keyboard 28 or the interactive search list 30 .
- This variation uses scaling to minimize the movement required to accurately distinguish and select characters from the digital keyboard 28 and completion candidates from the interactive search list 30 .
- the distance of the cursor movement may be related by the data entry system 26 to the speed that the pointing device moves, so that the faster the movement with the pointing device, the greater the distance traveled by the cursor, and the slower the movement of the pointing device, the less distance traveled by the cursor on the graphical user interface 34 .
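- The speed-dependent cursor movement described above might be sketched as follows; the gain curve and its constants are assumptions rather than values taken from the described embodiments.

```python
# Sketch of speed-dependent cursor movement: faster pointing-device motion
# yields a larger cursor displacement on the graphical user interface.
def cursor_delta(dx: float, dy: float, dt: float,
                 base_gain: float = 1.0, accel: float = 0.05) -> tuple:
    """Scale the raw pointing-device displacement (dx, dy) measured over dt seconds."""
    speed = ((dx ** 2 + dy ** 2) ** 0.5) / max(dt, 1e-6)   # pixels per second
    gain = base_gain + accel * (speed / 100.0)             # gain grows with speed (assumed curve)
    return (dx * gain, dy * gain)

print(cursor_delta(5, 0, 0.05))    # slow movement: close to one-to-one tracking
print(cursor_delta(50, 0, 0.05))   # fast movement: the cursor travels further
```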
- This arrangement is particularly useful when the pointing device is used with a digital keyboard that is displayed in a location remote from the pointing device, such as where the pointing device is a mouse, a finger on a touch-sensitive palette, or a stylus.
- the digital keyboard is not displayed under the pointing device, but is viewed on a display device, and pointer motion is seen as relative motions of a cursor displayed on the digital keyboard.
- characters on the digital keyboard are not obscured.
- a special display area containing a series of numbers is displayed as part of or in association with the digital keyboard 28 to enable the user to rapidly instruct the data entry system 26 to obtain and display in the interactive search list 30 completion candidates having at least a minimum number of characters.
- the data entry system 26 is programmed to have the candidate prediction system 32 obtain from the dictionary 20 completion candidates beginning with the selected character and having at least as many characters as the number that was selected by the user from the special display area.
- the user may select a number from the special display area and then hold down on one of the characters on the digital keyboard 28 to instruct the data entry system 26 to have the candidate prediction system 32 obtain from the dictionary 20 completion candidates that begin with the selected character and have at least as many characters as the selected number.
- the data entry system 26 may be programmed so that when the user touches a number from the special display area and lifts the pointing device, the data entry system 26 retrieves a list of completion candidates having a number of characters equal to the number touched on the special display area.
- the data entry system 26 may be programmed to obtain completion candidates of at least a predetermined length when the user selects a number from the special display area with the pointing device, gestures a significant distance in a predetermined direction (for example, to the right), lifts up the pointing device, touches down on a character on the digital keyboard 28 and then pauses on that character.
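- As a non-limiting illustration (this sketch is not part of the original disclosure), retrieving completion candidates with a minimum length from a weighted dictionary could look as follows; the dictionary entries and weights are invented for the example:

```python
# Hypothetical sketch of retrieving completion candidates that begin with a
# selected character and have at least a user-selected minimum length.

DICTIONARY = [  # (completion candidate, weight)
    ("the", 98), ("table", 40), ("telephone", 35), ("to", 90),
    ("transmission", 22), ("tea", 30),
]

def candidates(prefix, min_length, max_results=5):
    matches = [(word, w) for word, w in DICTIONARY
               if word.startswith(prefix) and len(word) >= min_length]
    matches.sort(key=lambda pair: pair[1], reverse=True)  # highest weight first
    return [word for word, _ in matches[:max_results]]

print(candidates("t", 5))  # -> ['table', 'telephone', 'transmission']
```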
- another special display area may be included with the digital keyboard 28 from which the category of completion candidates can be narrowed.
- the data entry system 26 may be programmed to display general identifiers for nouns, verbs, adjectives, etc.
- the data entry system 26 in this variation is programmed to have the candidate prediction system 32 obtain completion candidates that are identified in the dictionary 20 as falling within the category associated with the selected identifier (for example, only nouns, or only verbs).
- This variation may be combined with the other aspects herein to assist the user in obtaining completion candidates of one or more specific categories identified in the dictionary 20 .
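- For illustration only (not part of the original disclosure), category narrowing could be sketched by storing a tag with each dictionary entry; the tags and entries below are assumptions:

```python
# Hypothetical sketch of narrowing completion candidates to a selected category
# (for example, only nouns) using tags assumed to be stored with each entry.

TAGGED_DICTIONARY = [  # (completion candidate, weight, category)
    ("run", 80, "verb"), ("runner", 45, "noun"), ("running", 60, "verb"),
    ("rung", 20, "noun"), ("runway", 25, "noun"),
]

def candidates_in_category(prefix, category, max_results=5):
    matches = [(w, wt) for w, wt, cat in TAGGED_DICTIONARY
               if w.startswith(prefix) and cat == category]
    matches.sort(key=lambda pair: pair[1], reverse=True)
    return [w for w, _ in matches[:max_results]]

print(candidates_in_category("run", "noun"))  # -> ['runner', 'runway', 'rung']
```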
- a physical button or switch located on the personal computing device, or on the pointing device, and within easy reach of a user's finger or hand may be used to easily activate certain features of the data entry system 26 .
- the data entry system 26 may be programmed to toggle the digital keyboard 28 between visible and invisible with each press of the button or switch.
- the data entry system 26 may be programmed to recognize that if the button or switch is pressed, the interactive search list 30 , when displayed, should display only certain types of completion candidates available within the dictionary.
- the data entry system 26 may be programmed to activate the interactive search list 30 .
- the data entry system 26 may be programmed to require that the interactive search list 30 display completion candidates of a certain minimum length of characters.
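- As a non-limiting illustration (not part of the original disclosure), button bindings of this kind could be sketched as below; the particular features and filter lengths are hypothetical:

```python
# Hypothetical sketch of binding physical buttons to data entry system features:
# one button toggles keyboard visibility, another cycles a minimum-length filter
# applied to the interactive search list.

class ButtonBindings:
    def __init__(self):
        self.keyboard_visible = True
        self.min_candidate_length = 0

    def on_visibility_button(self):
        self.keyboard_visible = not self.keyboard_visible
        return self.keyboard_visible

    def on_filter_button(self):
        # cycle through "no filter", 4-character, and 6-character minimums
        self.min_candidate_length = {0: 4, 4: 6, 6: 0}[self.min_candidate_length]
        return self.min_candidate_length

b = ButtonBindings()
print(b.on_visibility_button())  # False: keyboard hidden
print(b.on_filter_button())      # 4: search list limited to longer candidates
```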
- a stylus, pen, finger, or like hand-held pointing device may also be used with a proximity sensing input surface.
- a proximity sensing input surface can detect the proximity of a pointing device to the input surface as well as the location of the pointing device over the proximity sensing input surface.
- the proximity sensing input surface may also detect the distance and angle that a pointing device is being held relative to the input surface.
- the data entry system 26 can be programmed so as to display the digital keyboard 28 (or another digital keyboard) with the cursor displayed over it when the stylus approaches within a set predetermined distance of the proximity sensing input surface.
- the proximity sensing input surface detects the position of the stylus over the proximity sensing input surface when the stylus is within the set predetermined distance. As the user moves the pointing device over the proximity sensing input surface, the cursor moves correspondingly.
- the digital keyboard 28 can be displayed directly beneath the stylus in some embodiments or, for other embodiments, remote from the stylus. When the stylus is moved away from the proximity sensing input surface further than the set predetermined distance, the data entry system 26 is programmed to hide (or clear) the digital keyboard 28 from the graphical user interface 34 . This variation enables the entire screen to be used to display text while the digital keyboard 28 is hidden.
- This variation also avoids screen clutter by displaying the digital keyboard 28 only when the stylus is found to be within the set predetermined distance of the proximity sensing input surface.
- the user can quickly and intuitively return to adding to or deleting from the text using the digital keyboard 28 by bringing the stylus within the set predetermined distance of the proximity sensitive input surface.
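- For illustration only (this sketch is not part of the original disclosure), the show/hide behaviour tied to the set predetermined distance could be modelled as follows; the distance threshold is an assumed value:

```python
# Hypothetical sketch of showing the digital keyboard only while the stylus is
# within a predetermined distance of a proximity sensing input surface.

SHOW_DISTANCE_MM = 15.0  # assumed threshold; in practice this would be configurable

class ProximityKeyboard:
    def __init__(self):
        self.visible = False

    def on_proximity_sample(self, distance_mm):
        """Call with each distance reading from the proximity sensing surface."""
        self.visible = distance_mm <= SHOW_DISTANCE_MM
        return "keyboard shown" if self.visible else "keyboard hidden"

kb = ProximityKeyboard()
print(kb.on_proximity_sample(40.0))  # stylus far away -> keyboard hidden
print(kb.on_proximity_sample(8.0))   # stylus close    -> keyboard shown
```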
- the digital keyboard 28 is displayed when the user's hand controlling the stylus (the “typing hand”) is placed in a natural position for continuing text and data entry.
- the location where the digital keyboard 28 is displayed on the graphical user interface 34 may be near and possibly follow the line of text under construction by the user, so as to facilitate the eye following the digital keyboard 28 and the entered text simultaneously.
- the digital keyboard 28 can be displayed in the same location as the stylus.
- the digital keyboard 28 is programmed to be displayed just below or above the line of text that the user is creating or editing on a personal computing device.
- the data entry system 26 may be programmed to allow for the cursor to be repositioned within previously typed text with the stylus while the stylus is within the minimum distance, provided the stylus is detected as approaching the proximity sensing input surface from a particular side of the input surface (for example, the right side of the proximity sensing input surface). Once the cursor is repositioned, the user can then approach the proximity sensing input surface from another direction (for example, from above) to trigger the display of the digital keyboard 28 to assist with further text entry and modification.
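- As a non-limiting illustration (not part of the original disclosure), choosing a behaviour from the side of approach could be sketched as a simple mapping; the sides and actions shown are assumptions:

```python
# Hypothetical sketch of selecting a behaviour based on the side from which the
# stylus approaches the proximity sensing input surface.

def action_for_approach(side):
    return {
        "right": "reposition text cursor",
        "top": "display digital keyboard",
    }.get(side, "no action")

print(action_for_approach("right"))  # -> reposition text cursor
print(action_for_approach("top"))    # -> display digital keyboard
```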
- the data entry system 26 is application independent and communicates with applications via an API.
- the data entry system 26 may be embedded in an application.
- digital keyboards and keyboard layouts may contain other symbols that could encode a language.
- One example of this would be a digital keyboard that contains regions representing the strokes used in writing an East Asian language. The user would select the strokes by pointing to them, and the characters would be constructed from the strokes.
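- For illustration only (not part of the original disclosure), stroke-based construction could be sketched with a lookup from selected strokes to a character; the stroke names and table are invented for the example:

```python
# Hypothetical sketch of constructing characters from selected stroke regions.

STROKE_TABLE = {
    ("horizontal",): "一",                 # one horizontal stroke
    ("horizontal", "vertical"): "十",      # horizontal plus vertical stroke
}

def build_character(selected_strokes):
    return STROKE_TABLE.get(tuple(selected_strokes), "?")

print(build_character(["horizontal", "vertical"]))  # -> 十
```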
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Input From Keyboards Or The Like (AREA)
Abstract
In one aspect of the present invention the user can rapidly enter and search for text using a data entry system through a combination of entering one or more characters on a digitally displayed keyboard with a pointing device and using a search list to obtain a list of completion candidates. The user can activate the search list to obtain a list of completion candidates at any time while entering a partial text entry with the data entry system. When the search list is active, a list of completion candidates is displayed on a graphical user interface for the user to select from, and the user can perform one of several actions. First, the user can deactivate the search list and return to modifying the current partial text entry and other text. Second, the user can select one of the completion candidates in the search list and use the selected completion candidate to replace the partial text entry which the user is currently entering. In the first case, when the user deactivates the search list, the user can immediately continue adding to or modifying the current partial text entry being entered, and may re-invoke the search list to further search for completion candidates based on the modified partial text entry. In the second case, the selected completion candidate is used to replace the partial text entry that the user is currently entering, and the data entry system begins monitoring for a new partial text entry from the user.
Description
- This application is a continuation of International Application No. PCT/CA00/00285 filed Mar. 15, 2000, which is designated, for the purposes of the United States of America, a continuation-in-part of U.S. patent application Ser. No. 09/272,700 filed Mar. 18, 1999. This application claims the benefit of the aforementioned International Application No. PCT/CA00/00285 and U.S. patent application Ser. No. 09/272,700.
- The present invention relates generally to computer-assisted data entry and more particularly to a method, system, and apparatus for computer-assisted text generation and entry using a pointing device with a personal computing device, and to computer-readable media having executable instructions for supporting text generation and entry using a pointing device.
- The widespread adoption of miniaturized personal computing devices, such as hand-held devices and personal digital assistants (PDAs), has led to an increasing use of such devices to send and receive text and data. One example of this trend is pen-based computing, wherein users enter text and commands into hand-held personal computers via a touch-sensitive screen. While such pen-based computing is popular, especially with the increasing power of miniature computing devices, it does present challenges to a user entering data in an application running on the hand-held device. For instance, many hand-held computers and personal digital assistants, such as the PalmPilot™ series of PDAs, require that the user enter data according to a predetermined scripting style. Other hand-held devices provide a handwriting recognition system which requires that the computer learn the user's handwriting style. While such data entry mechanisms are useful, they are relatively difficult to use, complex to learn, and prone to error if the user deviates from the predetermined scripting style or from the user's traditional handwriting style.
- Many pen-based computing systems, both large and small, offer the user the option to enter text using an on-screen digital keyboard. On-screen digital keyboards are typically miniaturized replicas of conventional full-sized physical keyboards, such as QWERTY keyboards. Many on-screen keyboards have proven to be less than efficient for entering text. When using a pointing device such as a pen, a user is typically required to enter text one character at a time by tapping out individual character selections from the on-screen keyboard. This “hunt-and-peck” method of typing with a single pointing device is time-consuming, especially when a user is entering large amounts of data.
- Another common challenge when entering data into a personal computing device with a single pointing device such as a pen or stylus, and in particular when entering text, is that each letter making up the word or phrase must be entered manually. The longer the word or phrase, the greater the amount of manual entry required.
- Text completion systems have been developed in an effort to assist users with text entry. In general, these systems predict and suggest a complete word completion based on a partial text entry entered by a user. These systems allow a user to type in the partial text entry and then accept a predicted text completion for the partial text entry. This avoids the keystrokes that would otherwise be required to type the complete text desired by a user. While such text completion systems provide some basic assistance for users to more rapidly enter text than would be required if every character of the desired text had to be typed in independently, there remains a need in the art for a more flexible text completion system for use with a single pointing device. It would also be desirable for such a text completion system to employ a convenient selection technique which would reduce the amount of movement of the pointing device required to enter text into a computer. It would further be desirable if such a system were applicable to both large and small personal computing devices.
- Another problem in the art is that soft, or digital, keyboards have tended to be continually displayed so as to permanently consume screen space or have needed to be manually invoked and dismissed by the user. It would be desirable if a digital keyboard could automatically appear and disappear as required.
- The above and related desires are addressed in the present invention by providing a novel and non-obvious method, system and computer-readable instructions for computer-assisted text generation and entry using a pointing device with a personal computing device.
- In one aspect of the present invention the user can rapidly enter and search for text using a data entry system through a combination of entering one or more characters on a digitally displayed keyboard with a pointing device and using a search list to obtain a list of completion candidates. The user can activate the search list to obtain a list of completion candidates at any time while entering a partial text entry with the data entry system. When the search list is active, a list of completion candidates is displayed on a graphical user interface for the user to select from, and the user can perform one of several actions. First, the user can deactivate the search list and return to modifying the current partial text entry and other text. Second, the user can select one of the completion candidates in the search list and use the selected completion candidate to replace the partial text entry which the user is currently entering. In the first case, when the user deactivates the search list, the user can immediately continue adding to or modifying the current partial text entry being entered, and may re-invoke the search list to further search for completion candidates based on the modified partial text entry. In the second case, the selected completion candidate is used to replace the partial text entry that the user is currently entering, and the data entry system begins monitoring for a new partial text entry from the user.
- In one embodiment, when the search list is active the user may use one of the completion candidates in the search list to initiate a further automated search to obtain a more refined list of completion candidates. In this embodiment, multi-level search lists and searching are available to help accelerate completion of a partial text entry. As a result, the user can automatically initiate an iterative search wherein a completion candidate listed in the search list is used as the new partial text entry to dynamically obtain a new list of completion candidates, which is then displayed in the search list. The automated ability to use the search list to obtain a refined list of completion candidates allows the user to quickly make good use of search results that are only partially successful. When the search list is revised with a new list of completion candidates, the user can then choose one of the completion candidates in the new list, or the user can repeat the iterative search process once again by choosing one of the completion candidates in the new list and activating a further iterative search. In addition, the user may return to keyboard entry with the last completion candidate selected by the user in the previous iteration of the search list. This latter feature provides the user with the convenience of being able to automatically and seamlessly continue entering the desired word, phrase, or character sequence using the last completion candidate selected by the user in the previous iteration of the interactive search list. Thus, the user can quickly and easily replace the user's current partial text entry with a partially successful completion candidate and continue building upon or modifying this partial completion candidate using the keyboard while at the same time allowing the user the flexibility to re-enter the search list at any time.
- In another aspect of the present invention, there is provided a method of processing text entered into a personal computing device with a pointing device. With this method, a partial text entry is received and used to obtain a dynamically generated list of completion candidates. The list of completion candidates is displayed in a search list within a graphical user interface. A user input signal associated with the pointing device is received. If the user input signal corresponds to a first type of user selection with the pointing device, then the search list is deactivated. If the user input signal corresponds to a second type of user selection with the pointing device, then the partial text entry is replaced with a completion candidate from the search list.
- In one embodiment, if the user input signal corresponds to a third type of user selection with the pointing device, then a refined list of completion candidates is dynamically obtained based on one of the completion candidates from the search list. The refined list is displayed in the search list for further user selection.
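- As a non-limiting illustration (this sketch is not part of the original disclosure), dispatching the three types of user selection described above could be modelled as follows; the signal names and dictionary contents are assumptions:

```python
# Hypothetical sketch of handling the three types of user selection: deactivate
# the search list, replace the partial text entry, or refine the candidate list.

def handle_selection(signal, partial_entry, chosen_candidate, refine_search):
    if signal == "first":    # first type: deactivate the search list
        return ("keyboard", partial_entry)
    if signal == "second":   # second type: replace the partial entry
        return ("keyboard", chosen_candidate)
    if signal == "third":    # third type: refine using the chosen candidate
        return ("search", refine_search(chosen_candidate))
    raise ValueError(f"unknown signal: {signal}")

WORDS = ["inter", "interest", "interesting", "internal", "internet"]
refine = lambda prefix: [w for w in WORDS if w.startswith(prefix)]

print(handle_selection("second", "int", "inter", refine))
# -> ('keyboard', 'inter')
print(handle_selection("third", "int", "inter", refine))
# -> ('search', ['inter', 'interest', 'interesting', 'internal', 'internet'])
```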
- In another aspect of the present invention, there is provided a method of processing an input string at least partially entered into a personal computing device with a pointing device. This aspect includes performing a search of a set of completion candidates to locate a plurality of possible completion candidates for completing the input string in response to a prior located possible completion candidate or a character selectable by a user. At least one of the plurality of possible completion candidates and characters selectable by the user are displayed.
- In another aspect of the present invention, a method is provided of user-based text entry. With this aspect, a set of position coordinates for a pointing device is monitored relative to a user interface and a digital keyboard is displayed on the user interface at a last known set of coordinates for the pointing device whenever the digital keyboard is activated for user input.
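- For illustration only (not part of the original disclosure), displaying the keyboard at the last known pointer coordinates could be sketched as follows; the coordinate values are arbitrary:

```python
# Hypothetical sketch of displaying the digital keyboard at the last known
# pointer coordinates whenever keyboard input is activated.

class PointerTracker:
    def __init__(self):
        self.last_position = (0, 0)

    def on_pointer_move(self, x, y):
        self.last_position = (x, y)

    def activate_keyboard(self):
        # the keyboard is drawn centred on wherever the pointer was last seen
        return {"widget": "digital_keyboard", "centre": self.last_position}

tracker = PointerTracker()
tracker.on_pointer_move(120, 310)
print(tracker.activate_keyboard())
# -> {'widget': 'digital_keyboard', 'centre': (120, 310)}
```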
- In another aspect of the present invention, a method is provided for interchanging the display of a digital keyboard and a search list. The digital keyboard is displayed on a user interface when a user is entering text a keystroke at a time. As the digital keyboard is displayed, user input is monitored. If the user input corresponds to activating the search list, then the digital keyboard is replaced with the search list. If the user input corresponds to terminating use of the search list once activated, then the search list is replaced with the digital keyboard.
- In another aspect of the present invention, a digital keyboard is configured to include a plurality of characters assigned to predetermined locations within a layout for the digital keyboard according to a predetermined frequency distribution associated with the plurality of characters. The plurality of characters includes less commonly used characters and more commonly used characters based on the predetermined frequency distribution. The digital keyboard is displayed on a graphical user interface with the less commonly used characters displayed substantially further from a center of the digital keyboard than the more commonly used characters.
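- As a non-limiting illustration (this sketch is not part of the original disclosure), a frequency-distributed placement could assign characters to rings around the keyboard centre; the frequency ordering and ring sizes below are rough assumptions:

```python
# Hypothetical sketch of a frequency-distributed layout: characters are placed
# in rings around the keyboard centre, with more frequent characters closer in.

FREQUENCY_ORDER = "etaoinshrdlcumwfgypbvkjxqz"  # rough frequency order for English

def ring_for(character, ring_sizes=(6, 8, 8)):
    """Return 0 for the innermost ring, 1 for the next, and so on."""
    rank = FREQUENCY_ORDER.index(character)
    boundary = 0
    for ring, size in enumerate(ring_sizes):
        boundary += size
        if rank < boundary:
            return ring
    return len(ring_sizes)  # least common characters fall in the outermost ring

print(ring_for("e"))  # -> 0 (nearest the centre)
print(ring_for("z"))  # -> 3 (outermost ring)
```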
- In another aspect of the present invention there is provided a system for computer-assisted text generation and entry. The system includes an input interface, a processing unit and a computer-readable medium. The input interface receives user input signals based on actions with a pointing device. The computer-readable medium contains computer-readable instructions for directing the processing unit to assist with text generation and entry based on user input received via the input interface with the pointing device. The computer-readable medium includes instructions for receiving a partial text entry; for obtaining a dynamically generated list of completion candidates based on the partial text entry; for displaying the list of completion candidates in a search list on a display device; and for receiving a user input signal associated with the pointing device from the input interface. If the user input signal corresponds to a first type of user selection with the pointing device, the computer-readable instructions are programmed to deactivate the search list. If the user input signal corresponds to a second type of user selection with the pointing device, the computer-readable instructions are programmed to replace the partial text entry with a completion candidate from the search list. Other aspects and features of the present invention will become apparent to those ordinarily skilled in the art upon review of the following description of specific embodiments of the invention in conjunction with the accompanying drawings.
- In the accompanying drawings which illustrate embodiments of the invention:
- FIG. 1 is a schematic diagram of a personal computing device loaded with a data entry system, according to a first embodiment of the invention;
- FIG. 2 is a schematic diagram of the data entry system of the first embodiment;
- FIG. 3 is a schematic representation illustrating the display of a digital keyboard on a graphical user interface within the personal computing device of the first embodiment;
- FIG. 4 is a schematic representation of a data structure for a dictionary according to the first embodiment;
- FIG. 5 is a schematic representation illustrating the interchangeable display of the digital keyboard and an interactive search list according to the first embodiment;
- FIG. 5A is a schematic representation illustrating potential completion candidates for retrieval and display in the interactive search list according to an example of the use of the first embodiment;
- FIG. 6 is a flow diagram illustrating, by way of example from the user's perspective, the use of the data entry system of the first embodiment;
- FIGS. 7 to 9 are flow diagrams illustrating the flow of operation of the data entry system according to the first embodiment;
- FIG. 10 is a schematic representation of an alternative embodiment of a digital keyboard layout according to the present invention;
- FIG. 11 is a schematic representation of another alternative embodiment of a digital keyboard layout according to the present invention;
- FIG. 12 is a schematic representation of another alternative embodiment of a digital keyboard layout according to the present invention;
- FIG. 13 is a schematic representation of another configuration of a digital keyboard according to an embodiment of the present invention;
- FIG. 14 is a schematic representation of the layout for the interactive search list according to the first embodiment;
- FIGS. 15 to 18 are schematic representations of alternative layouts for the interactive search list according to alternate embodiments of the present invention;
- FIG. 19 is a schematic representation of a configuration for the digital keyboard and the interactive search list according to an embodiment of the present invention;
- FIG. 20 is a schematic representation of an alternative configuration for the digital keyboard according to an embodiment of the present invention;
- FIGS. 21 to 22 are flow diagrams illustrating the flow of operation of the data entry system according to an alternative embodiment of the present invention;
- FIGS. 23 to 24 are flow diagrams illustrating the flow of operation of the data entry system according to an alternative embodiment of the present invention;
- FIGS. 25 to 26 are flow diagrams illustrating the flow of operation of the data entry system according to an alternative embodiment of the present invention;
- FIG. 27 is a schematic representation of an alternative configuration for the digital keyboard according to an embodiment of the present invention.
- It will be appreciated that for simplicity and clarity of illustration, elements illustrated in the accompanying drawings have not necessarily been drawn to scale. For example, the dimensions of some of the elements are exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals and labels have been repeated among the drawings to indicate corresponding or analogous elements and features.
- Reference will now be made in detail to implementations and embodiments of the invention, examples of which are illustrated in the accompanying drawings.
- Introduction
- In one aspect of the present invention the user can rapidly enter and search for text using a data entry system through a combination of entering one or more characters on a digitally displayed keyboard with a pointing device and using an interactive search list to dynamically obtain a list of completion candidates. The user can activate the interactive search list to obtain a dynamically generated list of completion candidates at any time while entering a partial text entry with the data entry system. In this specification “partial text entry” means a sequence of one or more characters making up a leading portion of a word, phrase or character sequence. When the interactive search list is active, a dynamically generated list of completion candidates is displayed on a graphical user interface for the user to select from. The list of completion candidates is retrieved from at least one dictionary by a candidate prediction system which retrieves completion candidates that are most likely to contain the completion candidate desired by the user. Candidate prediction is based on statistical measures ranking the entries within the dictionary relative to each other. When the interactive search list is active, the user can perform one of several actions with the data entry system including any of the following:
-
- (1) the user can deactivate the interactive search list and return to modifying the current partial text entry and other text; or
- (2) the user can select one of the completion candidates in the interactive search list and use the selected completion candidate to replace the partial text entry which the user is currently entering.
- In the first case, when the user deactivates the interactive search list, the user can immediately continue adding to or modifying the current partial text entry being entered, and may re-invoke the interactive search list to further search for completion candidates based on the modified partial text entry. In the second case, the selected completion candidate is used to replace the partial text entry that the user is currently entering, and the data entry system begins monitoring for a new partial text entry (of a word, phrase, or character sequence) from the user.
- In certain embodiments of the present invention, as illustrated further below, a third action available to the user when the interactive search list is activated is to:
-
- (3) use one of the completion candidates in the interactive search list to initiate a further automated search to obtain a more refined dynamic list of completion candidates from the dictionary.
- In the third case, multi-level search lists and searching are available to help accelerate completion of a partial text entry. In this latter case, the user can automatically initiate an iterative search wherein a completion candidate listed in the interactive search list is used as the new partial text entry to dynamically obtain a new list of completion candidates, which is then displayed in an updated interactive search list. The automated ability to use the interactive search list to dynamically obtain a refined list of completion candidates allows the user to quickly make good use of search results that are only partially successful. When the interactive search list is updated with a new list of completion candidates, the user can then choose one of the completion candidates in the new list, or the user can repeat the iterative search process once again by choosing one of the completion candidates in the new list and activating a further iterative search. In addition, the user may return to keyboard entry with the last completion candidate selected by the user in the previous iteration of the interactive search list. This latter feature provides the user with the convenience of being able to automatically and seamlessly continue entering the desired word, phrase, or character sequence using the last completion candidate selected by the user in the previous iteration of the interactive search list. Thus, the user can quickly and easily replace the user's current partial text entry with a partially successful completion candidate and continue building upon or modifying this partial completion candidate using the keyboard while at the same time allowing the user the flexibility to re-enter the interactive search list at any time.
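- As a non-limiting illustration (this sketch is not part of the original disclosure), the iterative, multi-level search described above could be modelled with a weighted prefix lookup in which a selected candidate becomes the new partial text entry; the dictionary contents and weights are invented for the example:

```python
# Hypothetical sketch of iterative, multi-level searching: a selected completion
# candidate is reused as the new partial text entry to obtain a narrower list.

DICTIONARY = {  # completion candidate -> weight
    "inter": 50, "interest": 40, "interesting": 30, "interested": 28,
    "internal": 35, "international": 25,
}

def search(partial_entry, max_results=5):
    hits = [w for w in DICTIONARY if w.startswith(partial_entry)]
    hits.sort(key=lambda w: DICTIONARY[w], reverse=True)  # highest weight first
    return hits[:max_results]

first_level = search("int")        # initial partial text entry
second_level = search("interest")  # refine using a candidate from the first list
print(first_level)   # -> ['inter', 'interest', 'internal', 'interesting', 'interested']
print(second_level)  # -> ['interest', 'interesting', 'interested']
```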
- As discussed further in this specification, other actions may also be carried out by the user when the interactive search list is active.
- Operation Environment
-
FIG. 1 shows a schematic diagram of apersonal computing device 10 for text entry according to a first embodiment of the invention. Thepersonal computing device 10 shown inFIG. 1 contains at least one processing unit 12 (such as a CPU or a similar processor or multiprocessor) connected by a bus to a computer-readable medium 16. The computer-readable medium 16 provides a memory store for software and data residing within thepersonal computing device 10. The computer-readable medium 16 can include one or more types of computer-readable media including volatile memory such as Random Access Memory (RAM), and non-volatile memory, such as a hard disk or Read Only Memory (ROM). Preferably, the computer-readable medium 16 includes a combination of volatile and non-volatile memory. In the first embodiment, the computer-readable medium 16 contains an operating system, adata entry system 26 and anapplication 27 receptive to user-based text entry such as a word processor. The computer-readable medium 16 may also store alternative or other applications such as a browser or micro-browser, an e-mail application, and/or other end-user applications. - The operating system can be any of several well-known operating systems depending on the personal computing device used. For example, for hand-held devices, the operating system can be PalmOS™, Windows CE™, or an equivalent operating system. For larger systems, such as with work stations or desktop computers, a more robust operating system may be used such as, for example, Windows 95™, Windows 98™, Windows NT™, Windows 2000™, MacOS™, UNIX, Linux or the like. For the purposes of the first embodiment, the operating system is PalmOS™.
- The
data entry system 26 is implemented as software that runs on theprocessing unit 12 to support computer-assisted text generation and entry for the user, although in other alternatives thedata entry system 26 can be implemented as computer-readable instructions in firmware or embedded in hardware components. In the first embodiment, electronic text and documents are generated and maintained by theapplication 27 and the user authors and edits the electronic text and documents with thedata entry system 26 which communicates with theapplication 27 through an application programming interface (API). This allows thedata entry system 26 to be portable so that it can be used by one or more applications to accept text and data entry from the user. As an alternative, thedata entry system 26 may be integrated into part of an application. - The
personal computing device 10 includes agraphical display device 15 and ahardware input interface 17 receptive to user input from a pointing device. In this specification, the term “pointing device” means an input device that allows a user to select one choice amongst one or many choices (a user-based selection). Some pointing devices enable a user to make user-based selections by pointing to a desired choice and include, by way of example, a pen, stylus, or finger. More generally, pointing devices capable of supporting user-based selections include, by way of example, the pointing devices above capable of pointing, as well as other input devices such as a mouse, trackball or the like. - The
graphical display device 15 is connected to and controlled by theprocessing unit 12 via avideo display circuit 13. Thegraphical display device 15 may be a CRT, a liquid crystal display, or an equivalent computer display. - In the first embodiment, the
personal computing device 10 is a personal digital assistant wherein thegraphical display device 15 and thehardware input interface 17 are combined in the form of a touch-sensitive screen 14 that serves both as agraphical display 15 and as an input interface receptive to generating coordinate position signals in response to contact from a pointing device such as a pen or stylus. It will be appreciated by those skilled in the art that thepersonal computing device 10 is represented in the following discussion as a personal digital assistant for illustration purposes only, and that the invention may be practised with other personal computing devices including hand-held devices, personal computers and other microprocessor-based electronic devices, mobile telephones, internet appliances, and embedded devices, having a suitable graphical display and an input interface receptive to user input via a pen, stylus, finger, mouse, or an equivalent pointing device that allows the user to select one choice from many. Other types of equivalent personal computing devices to which the features and aspects of the present invention are applicable include, by way of example, an internet appliance controlled via a remote control (for instance, running an Internet service through a television via a set top box and a remote control). In other embodiments, thehardware input interface 17 may be a digitising tablet, a pressure-sensitive input surface or a proximity sensing input surface. It will also be appreciated that thepersonal computing device 10 may be powered by an internal (18) or external power source. - As shown in
FIG. 2 , thedata entry system 26 includes computer-readable instructions for adigital keyboard 28, acandidate prediction system 32, adictionary 20, and aninteractive search list 30. Thedigital keyboard 28 provides an interface for the user to enter text and data into the personal computing device 10 (FIG. 1 ). As the user enters in characters via thedigital keyboard 28 to construct a word, phrase, or character sequence, the characters entered by the user are stored in a search string as a partial text entry. The search string is used by thecandidate prediction system 32 to search thedictionary 20 for completion candidates that begin with the current partial text entry being stored in the search string. Theinteractive search list 30 is used to display for user selection the list of completion candidates retrieved by thecandidate prediction system 32. In the first embodiment, thedata entry system 26 supports gesture-based user input for the selection of completion candidates from theinteractive search list 30 as will be described in further detail below. - Digital Keyboard
- As illustrated in
FIG. 3 , an image of thedigital keyboard 28 is displayed on agraphical user interface 34 within the screen area of the touch-sensitive screen 14 once thedata entry system 26 is initialised and ready to receive input from the user. Thedigital keyboard 28 contains a plurality of keys each of which is associated with at least one character from a set of characters. For example, when the English alphabet or a character set containing the English alphabet is used, each key on thedigital keyboard 28 can contain one or more letters from the English alphabet. It should be noted that reference to the English alphabet is by way of example only, and that thedigital keyboard 28 can be configured to contain and display any set of characters which the user may then select and use to enter text into thepersonal computing device 10. The terms “character set” and “set of characters” refer in this specification to a set containing a plurality of letters, numbers and/or other typographic symbols. Examples of character sets include, but are not limited to, one or more alphabets of a written language (e.g. English, French, German, Spanish, Chinese, or Japanese), and binary-coded character sets such as ASCII (American Standard Code for Information Interchange), EBCDIC (Extended Binary Coded Decimal Interexchange Code), and BCD (Binary Coded Decimal). - In the first embodiment, the
digital keyboard 28 displays digital keys containing characters from the English alphabet along with special characters selected from the ASCII character set. Words, phrases, and character sequences can be typed into an electronic document or electronic text by simply using the pointing device to tap or select in sequence each key for the desired word, phrase, or character sequence. As will be discussed further below, the user can also use thedigital keyboard 28 to initiate an automated search for completion candidates to more rapidly and flexibly enter words, phrases, and/or character sequences in an automated manner. As will also be discussed later in this specification, several enhancements to thedigital keyboard 28 may be implemented to further enhance the user's ability to quickly, efficiently, and flexibly enter text using a pointing device. - Dictionary
-
FIG. 4 shows a sample data structure for thedictionary 20. Preferably, as in the first embodiment, thedictionary 20 contains completion candidates with weight values for ranking completion candidates relative to each other. Thedictionary 20 contains a plurality of entries, with each entry having acompletion candidate field 22 for storing a completion candidate and aweight field 24 for storing a numeric value associated with the completion candidate stored in a correspondingcompletion candidate field 22. Each completion candidate stored in thedictionary 20 represents a word, phrase, or character sequence according to a particular language. Character sequences may include, but are not limited to, word continuations. Word continuations represent a leading part of a word, but not an entire word. In the first embodiment where the dictionary is based on American english, word continuations may include, by way of example, such common word prefixes as “com”, “con”, “dis”, “expl”, “inter”, “mis”, “para”, “pre” “syn”, “tele”, “trans” and “univers”. Obtaining a list of completion candidates which can include common word prefixes reduces user effort in terms of the number of steps required to generate a desired text entry. Furthermore, certain character combinations such as word prefixes are shared across multiple completion candidates which can speed up learning of the locations of completion candidates within sub-levels of theinteractive search list 30 when multi-level searching (i.e. iterative searching) is available. Eachweight field 24 stores a weight value for ranking the corresponding completion candidate stored in the correspondingcompletion candidate field 22 with other completion candidates in thedictionary 20. The weight value stored in aweight field 24 may be based on one of many metrics. By way of example, eachweight field 24 may contain a value representing a degree of common usage of the completion candidate stored in the correspondingcompletion candidate field 22 relative to the other completion candidates in thedictionary 20. As another example, the weight fields 24 may contain numeric values based on what words or phrase came before the completion candidate stored in the corresponding completion candidate from an analysis of a large corpus of text. Eachweight field 24 may also be supplemented with one or more related fields. - Candidate Prediction System
- Referring to
FIG. 2 , thecandidate prediction system 32 is programmed to dynamically search thedictionary 20 for completion candidates that begin with the partial text entry entered by the user. Thecandidate prediction system 32 retrieves completion candidates from thedictionary 20 by determining which dictionary entries are more likely to be the ones that the user is attempting to type. In the first embodiment, the completion candidates are obtained from thedictionary 20 on the basis of frequency values stored in theweight field 24 for each entry. In the first embodiment, completion candidates having the highest weight values are retrieved. The frequency values represent the frequency of usage of the entries relative to each other. These frequency values may be predetermined on the basis of an analysis of a large corpus of text or may be dynamically generated or modified on the basis of the specific user's usage of words, phrases, and/or character sequences through thedata entry system 26. Other statistical measures may also be employed to enhance thecandidate prediction system 32, such as ranking information or identifying the frequency with which an entry in thedictionary 20 follows any of the other entries in thedictionary 20. - Preferably, the
candidate prediction system 32 retrieves all possible completion candidates limited by one or more predetermined metrics. In the first embodiment, the total number of completion candidates retrieved is limited by a predetermined maximum number of displayable completion candidates. The maximum number of completion candidates retrieved by thecandidate prediction system 32 is preferably a small but significant number sufficient enough to provide the user with as many potential candidates as possible without unduly saturating the user with an excessive number of candidates and without unduly delaying the user's ability to quickly review and select from the candidates selected. The maximum number of completion candidates is also preferably sufficiently large enough to provide a variety of completion candidates (if available) for the user to choose from so as to avoid an excessive amount of multi-level searching to complete a partial text entry. In the case of the first embodiment, the maximum number of displayable completion candidates is set to five. Thedata entry system 26 can be configured by the user to present a greater or lesser number of completion candidates. Preferably, the completion candidates are displayable in a configuration that can also be selected by the user. Making the number of completion candidates that can be displayed a user-configurable feature provides enhanced capabilities for computer-assisted text generation and entry with thedata entry system 26. - Interactive Search List
- The
interactive search list 30 receives completion candidates retrieved from thedictionary 20 by thecandidate prediction system 32 and presents the user with a list of these completion candidates. Theinteractive search list 30 can be displayed to the user in any of several different ways depending on which options thedata entry system 26 has been programmed with and which of those options have been selected by the user, as will be discussed later in this specification. In the first embodiment, theinteractive search list 30 is displayed on the touch-sensitive screen 14 as an interactive vertical list of completion candidates. As described further below, theinteractive search list 30 is programmed to support multi-level searches so that the user can quickly use a completion candidate from one level of search results to drill deeper into thedictionary 20 for a narrower set of completion candidates. - In the first embodiment, the
digital keyboard 28 and theinteractive search list 30 are also interchangeably displayed on thegraphical user interface 34, as further illustrated inFIG. 5 . When thedigital keyboard 28 and theinteractive search list 30 are interchangeable, the user can easily and quickly swap between entering characters or otherwise modifying a partial text entry from thedigital keyboard 28 and using theinteractive search list 30 to rapidly and flexibly complete the entry of words, phrases, and/or character sequences. When thedigital keyboard 28 and the interactive search list are interchangeable, the image of thedigital keyboard 28 and the image of theinteractive search list 30 share substantially the same display area on thegraphical user interface 34. Thedata entry system 26 for the first embodiment is preferably programmed to automatically swap between thedigital keyboard 28 and theinteractive search list 30 depending upon the input provided by user from the pointing device. This latter feature minimizes disruption to the user's attention to the data entry process since the user's attention remains focussed on the same region of thegraphical user interface 34 for both thedigital keyboard 28 and theinteractive search list 30. Interchanging the display of thedigital keyboard 28 and theinteractive search list 30 within thegraphical user interface 34 provides for a space-efficient layout for the use of thedata entry system 26 on the touch-sensitive screen 14. The interchangeability of thedigital keyboard 28 and theinteractive search list 30 can also minimize hand movement as the user switches between using thedigital keyboard 28 and theinteractive search list 30 without having to move the pointing device to another part of the touch-sensitive screen 14 (or, more generally, without having to move the pointing device to another part of the hardware input interface). This arrangement can be particularly useful for smaller personal computing devices such as PDAs or other hand-held devices, or where the amount of space on thegraphical user interface 34 used by thedata entry system 26 needs to be minimized. - The
interactive search list 30 enables the user to more rapidly and flexibly complete the entry of words, phrases, and/or character sequences than would otherwise be required if the user were to simply type in each individual character of the desired entry. - Overview of Methodology
- The basic data entry methodology of the present invention as applied to a
personal computing device 10 of the first embodiment will now be described. In what follows, reference is made toFIG. 1 to 5. - In the first embodiment, the user interfaces with the
personal computing device 10 via the touch-sensitive screen 14 using a pen as a pointing device. However, it should be noted that the following methodology can be used with various other pointing devices, such as a stylus, finger, track ball, or mouse. If a mouse or an equivalent pointing device is used in place of a pen, stylus, or finger, then for the first embodiment the act of depressing a mouse button should be considered equivalent to touching the touch-sensitive screen 14 or touch pad with a stylus, pen, or finger, and similarly, releasing a depressed mouse button should be considered equivalent to lifting the stylus, pen, finger or other pointing device from the touch-sensitive screen 14 (or a touch pad or pressure sensitive pad). - With the
data entry system 26 of the first embodiment two primary entry modes are available to the user: a keyboard mode and a search mode. In the keyboard mode, the user can enter text a character (or keystroke) at a time by simply pointing and selecting on keys on thedigital keyboard 28 with the pointing device. With each selection of a key on thedigital keyboard 28, the one or more characters that are associated with that key are forwarded to theapplication 27 for entry into text in, for example, a document or data entry field. In the search mode, the user can search for and select amongst completion candidate suggestions for the completion of a word, phrase, or character sequence as further described below. - In the first embodiment, when the user begins entering a word, phrase or character sequence using the
digital keyboard 28, thecandidate prediction system 32 automatically begins searching thedictionary 20 for candidate words, phrases, and/or character sequences that the user may be attempting to enter. The searching of thedictionary 20 performed by thecandidate prediction system 20 begins automatically in response to the emerging character sequence entered by the user. This is done by searching thedictionary 20 for completion candidates that begin with the leading character or characters which the user has entered (i.e. the partial text entry). The leading characters manually entered by the user with the pointing device are stored temporarily as a search string. The search string is used by thecandidate prediction system 32 to search thedictionary 20 for potential completion candidates. Thecandidate prediction system 32 retrieves completion candidates on the basis of which entries in thedictionary 20 are most likely to contain the completion candidate that the user is attempting to type based on the partial text entry currently entered as indicated by the weight fields in the dictionary. - The completion candidates retrieved from the
dictionary 20 by thecandidate prediction system 32 are provided to the interactive search list logic which causes theinteractive search list 30 to be produced. As each new list of completion candidates is produced by thecandidate prediction system 32 the interactive search list logic produces a revised list of completion candidates as the user adds or deletes characters to or from the partial text entry with thedigital keyboard 28. Whether theinteractive search list 30 is displayed on thegraphical user interface 34, however, depends on whether or not theinteractive search list 30 has been activated. In the first embodiment, the user may activate theinteractive search list 30 by pausing for a predetermined delay period L1 with the pointing device in a selection made by touching down on a key on thedigital keyboard 28 containing one or more characters. The amount of time that the user must pause with the key selected is a user-configurable option. Typically, the delay chosen will be less than one second, although the delay may be configured to any time period the user desires. As soon as the user pauses on a selected key for the delay period L1, theinteractive search list 30 becomes active. When theinteractive search list 30 is activated, the entry mode for thedata entry system 26 changes from the keyboard mode to the search mode. In addition, when theinteractive search list 30 is activated in the first embodiment, the image of thedigital keyboard 28 is cleared from thegraphical user interface 34 and replaced by an image of theinteractive search list 30. - Although the
interactive search list 30 is updated continuously in the first embodiment, in an alternative configuration theinteractive search list 30 may be generated once the user activates theinteractive search list 30 by pausing on a selected key on thedigital keyboard 28. In this alternative configuration, theinteractive search list 30 is revised each time the user calls up a new or modifiedinteractive search list 30. - Once the user has activated the display of the
interactive search list 30 containing a list of completion candidates, thedata entry system 26 provides the user with the flexibility to proceed with one of several operations using the pointing device. With theinteractive search list 30 displayed and active, the user can do any one of the following: -
- (1) the user can deactivate the
interactive search list 30 and return to modifying the current partial text entry and other text; or - (2) the user can select one of the completion candidates in the
interactive search list 30 and use the selected completion candidate to replace the partial text entry which the user is currently entering; or - (3) the user can use one of the completion candidates in the
interactive search list 30 to initiate a further automated search to dynamically obtain a more refined list of completion candidates from thedictionary 20; or - (4) the user can scroll or cycle through the list of completion candidates displayed in the
interactive search list 30, selecting and deselecting one completion candidate at a time as the user decides which of the completion candidates (if any) will be used to complete the current partial text entry or to initiate further searching; or - (5) the user can gesture to or pause in a “dead zone” within the
graphical user interface 34 for any length of time without triggering any further action by thedata entry system 26 so as to pause to consider whether to continue using the displayedinteractive search list 30 or to carry out one of the other operations listed above.
- (1) the user can deactivate the
- The user's action with the pointing device generates user input signals which are monitored and analyzed by the
data entry system 26 to determine the type of user selection or action being made. Which operation is executed by thedata entry system 26 once theinteractive search list 30 is activated depends on what action the user takes with the pointing device. - If the user does not want to select any of the completion candidates presented in the
interactive search list 30 and wishes to continue entering further characters or to otherwise modify the partial text entry from thedigital keyboard 28, then with theinteractive search list 30 active and displayed on thegraphical user interface 34, the user may lift the pointing device without any significant movement or may lift the pointing device after dragging it to or returning it to a dead zone. Either of these actions causes thehardware input interface 17 to generate a user input signal which serves as an indication to thedata entry system 26 that the user wishes to deactivate theinteractive search list 30 and return to modifying the current partial text entry manually with thedigital keyboard 28. In the first embodiment, when theinteractive search list 30 is deactivated, the image of theinteractive search list 30 is cleared from thegraphical user interface 34 and replaced with the image of thedigital keyboard 28 which is enabled for further use by the user. This allows the user to smoothly return to using thedigital keyboard 28 in keyboard mode without having to relocate the pointing device. Once in keyboard mode, the user may reinitiate searching with theinteractive search list 30 to obtain further completion candidates by pausing once again on a subsequently selected key (or character) on thedigital keyboard 28 with the pointing device. - A completion candidate from the
interactive search list 30 is selected by the user by generating a gesture with the pointing device. For the purposes of this specification the term “gesture” refers to a motion with the pointing device when the pointing device is in an active state. In general, motions making up gestures may be linear or in another computer-recognizable pattern. For the first embodiment shown inFIG. 1 to 5 and for variations thereof, a gesture is a motion with the pointing device in a particular direction for at least a minimum distance when the pointing device is in an active state. The minimum distance is used by thedata entry system 26 in the first embodiment to ignore small, insignificant gestures with the pointing device. This minimizes false selections arising from inadvertent movements with the pointing device. - In the first embodiment, the pointing device is in an active state when the pointing device is held to the touch-
sensitive screen 14. In alternative variations of thegraphical display device 15 and thehardware input interface 17, other conditions can be used to identify when the pointing device is in an active state and are considered equivalent. For example, with a pressure-sensitive pad, the pointing device is in an active state when the pointing device is either in contact with the pressure-sensitive pad or depressed on the pressure-sensitive pad with at least a measurable amount of pressure. Alternatively, when the pointing device is a mouse, it is in an active state when a button on the mouse is pressed. - The
data entry system 26 monitors the gestures made with the pointing device to determine when and if a completion candidate in the interactive search list 30 has been selected. In the first embodiment, the data entry system 26 does this by monitoring the current position coordinates of the pointing device on the touch-sensitive screen 14 relative to a point of origin generated when the pointing device first activates the interactive search list 30. The current position coordinates of the pointing device are monitored so long as the pointing device remains in contact with the touch-sensitive screen 14. As the current position coordinates are received, the data entry system 26 generates a vector using the current position coordinates of the pointing device and the point of origin. This vector is used by the data entry system 26 to determine if and when a particular completion candidate in the interactive search list 30 has been selected. The data entry system 26 is programmed to associate certain types of vectors with certain entries in the interactive search list 30. Each entry in the interactive search list 30 may have associated with it one or more predetermined vectors. When the vector formed by the user matches such a predetermined vector, the data entry system 26 recognizes that the user is selecting the completion candidate displayed in the corresponding entry of the interactive search list 30. If the data entry system 26 determines that the vector currently being generated by the user's gesture is associated with one of the completion candidates in the interactive search list 30, then the associated completion candidate is selected from the interactive search list 30. Preferably, when a completion candidate is selected from the interactive search list 30, the particular selection is indicated to the user by the data entry system 26 by highlighting the selected completion candidate in the interactive search list 30.
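- By way of illustration only, the following minimal Python sketch shows one way the vector matching described above could be realized: a drag vector measured from the point of origin is ignored while it stays inside a dead zone or below a minimum length, and is otherwise matched by direction against a set of slot angles, one per displayed completion candidate. The function name, thresholds, and slot angles are assumptions chosen for demonstration and are not taken from the specification.

```python
import math

# Illustrative sketch only: mapping a drag vector to a candidate list entry.
# Slot angles and thresholds are assumptions, not the patented implementation.

DEAD_ZONE_RADIUS = 10      # pixels; inside this radius no selection is made
MIN_GESTURE_LENGTH = 18    # pixels; shorter movements are ignored as jitter

def select_candidate(origin, current, slot_angles, tolerance_deg=25):
    """Return the index of the candidate whose direction matches the drag
    vector from `origin` to `current`, or None if the pointer is in the
    dead zone, the gesture is too short, or no direction matches."""
    dx = current[0] - origin[0]
    dy = current[1] - origin[1]
    length = math.hypot(dx, dy)
    if length <= DEAD_ZONE_RADIUS or length < MIN_GESTURE_LENGTH:
        return None                      # pause / insignificant movement
    angle = math.degrees(math.atan2(dy, dx)) % 360
    for index, slot_angle in enumerate(slot_angles):
        delta = abs((angle - slot_angle + 180) % 360 - 180)
        if delta <= tolerance_deg:
            return index                 # gesture points at this entry
    return None

# Example: four entries laid out to the right, below, left and above the origin.
print(select_candidate((100, 100), (140, 105), slot_angles=[0, 90, 180, 270]))
```
- In the first embodiment, the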
data entry system 26 is programmed to require that the user gesture with the pointing device towards or onto a completion candidate in theinteractive search list 30 in order to select that completion candidate. The gesture need only be a minimum distance. In an alternative arrangement, however, when a user makes selections from theinteractive search list 30 using gestures with the pointing device, the movement performed by the user is relative. With this “relative” mode of gesture-based candidate selection, the user need only gesture in a direction associated by thedata entry system 26 with a desired completion candidate without the pointing device necessarily moving towards or onto the portion of thegraphical user interface 34 where the completion candidate is displayed. In this case, the predetermined vectors associated by thedata entry system 26 with entries in theinteractive search list 30 correspond to unique gestures with the pointing device but not necessarily gestures which are towards or onto a particular entry in theinteractive search list 30. With this relative mode of selection, gestures with the pointing device are used to select amongst the completion candidates on theinteractive search list 30 even if the user is not gesturing with the pointing device towards or onto a particular completion candidate within theinteractive search list 30. In this case, the user need not move the pointing device toward or onto a particular fixed location within theinteractive search list 30 in order to select a specific completion candidate. Associating a gesture with a particular completion candidate in theinteractive search list 30 in the above manner minimizes the amount of movement required with the pointing device to make selections from theinteractive search list 30 and provides the user with the flexibility of selecting completion candidates with gestures which do not necessarily need to be towards or onto the desired completion candidate. - Once the user has selected a completion candidate from the
interactive search list 30, several options are available to the user. If the completion candidate represents the entry that the user wishes to add to the text, then the user can accept the selected completion candidate for insertion into the text by lifting the pointing device up from the touch-sensitive screen 14 in less than a predetermined time limit L2 while keeping the particular completion candidate selected. This latter event generates a user input signal which instructs thedata entry system 26 to terminate all searching based on the partial text entry and to signal to theapplication 27 to use the selected completion candidate to permanently replace the partial text entry. When the selected completion candidate is used to permanently replace the partial text entry at this point, theinteractive search list 30 is cleared from thegraphical user interface 34 and the image of thedigital keyboard 28 is re-enabled ready to receive the next keystroke from the pointing device. With the partial text entry thus completed, thedata entry system 26 begins monitoring anew for text entries by the user. - If the user wishes to change selections, the user can move through the list of completion candidates displayed in the
interactive search list 30 by gesturing with the pointing device to other completion candidates in theinteractive search list 30. If the user is unsure of which completion candidate to use, or wishes to pause to consider whether to continue in search mode or to return to keyboard mode, the user can gesture to or pause in a “dead zone” within thegraphical user interface 34 for any length of time without triggering any further action by thedata entry system 26. In the first embodiment, when theinteractive search list 30 is first activated by pausing on a key on thedigital keyboard 28, thedata entry system 26 begins monitoring the length of the vector that the user thereafter generates to determine when and if the user is selecting a particular completion candidate from theinteractive search list 30 or when the user is carrying out another recognized operation such as moving the pointing device to a dead zone. If thedata entry system 26 determines that the length of the vector generated from the current position coordinates and the point of origin is less than or equal to a predetermined length, then the pointing device is deemed by thedata entry system 26 to be in a dead zone on thegraphical user interface 34. When the pointing device is in a dead zone, the user has the freedom to pause without activating any further operation by thedata entry system 26 and without clearing theinteractive search list 30. This provides the user with the option to pause and consider what their next operation will be. Other dead zones may be programmed into thegraphical user interface 34 within thedata entry system 26 for the user to move to with the pointing device so as to further enhance the user's ability to pause in different parts of thegraphical user interface 34. If the vector being generated through a gesture is found by thedata entry system 26 to exceed a predetermined length, thedata entry system 26 checks to determine whether or not the particular vector being generated is associated with any of the completion candidates in theinteractive search list 30 or with any other operation on the screen such as alternative or additional dead zones. - If the completion candidate selected on the
interactive search list 30 represents only part of the entry that the user wishes to add to the text, then the user can use the selected completion candidate to dynamically initiate a further search for a more refined list of completion candidates from the dictionary 20. The ability to dynamically search for a more refined list of completion candidates based on a selected completion candidate is also referred to in this specification as an iterative search. This iterative search tool is provided through the interactive search list 30 when the interactive search list 30 is active and displays a list of potential completion candidates for the user to select from. An iterative search is triggered in the first embodiment by the user continuing to keep a completion candidate in the interactive search list 30 selected for more than the predetermined time limit L2. The automated ability to use the interactive search list 30 to further search the dictionary 20 allows the user to make good use of search results that are only partially successful. When the user continues to keep a completion candidate selected in the interactive search list 30 for more than the predetermined time limit L2, the data entry system 26 determines that the user input corresponds to a user selection to initiate a new search using the selected completion candidate as the basis for the new search. When this happens, the candidate prediction system 32 dynamically obtains a refined list of completion candidates based on the selected completion candidate. As illustrated in FIG. 5A, the refined list of completion candidates provides a narrower list of completion candidates based on a more specific search string (i.e. the selected completion candidate). Iterative searching enables one to perform multi-level searches with the interactive search list 30, so that the user can quickly use a completion candidate from one level of search results to drill deeper into the dictionary 20 for a narrower set of completion candidates. - In the first embodiment, iterative searching can be initiated when a completion candidate in the
interactive search list 30 represents only a first part (i.e. a leading part) of the entry that the user wishes to add to the text. - Once the
candidate prediction system 32 has obtained the refined list of completion candidates, theinteractive search list 30 is redisplayed with the refined list of completion candidates. When theinteractive search list 30 is redisplayed, the point of origin coordinates, used by thedata entry system 26 to track vectors generated from gestures with the pointing device, are set to the position coordinates of the pointing device at the time the selected completion candidate is used to initiate the iterative search. With theinteractive search list 30 redisplayed, the user can then choose one of the completion candidates in the refined list by lifting the pointing device up after selecting the particular completion candidate through a gesture, or the user can further repeat the iterative search process by selecting one of the completion candidates in the refined list and pausing with that particular completion candidate selected for the predetermined time limit L2. With the refined list of completion candidates displayed, the user may also lift the pointing device without selecting any completion candidates obtained from the iterative search. This latter action leaves the search string set to the last completion candidate selected by the user in the previous iteration of theinteractive search list 30 and returns thedata entry system 26 to keyboard mode. At this point, theinteractive search list 30 is cleared from thegraphical user interface 34, thedigital keyboard 28 is redisplayed, and thedata entry system 26 sends the completion candidate last selected by the user from theinteractive search list 30 to theapplication 27 for entry into the text, thereby replacing the contents of the partial text entry currently under development by the user. New characters can then be added. As a variation, using thedigital keyboard 28, the user can then, if desired, instruct theapplication 27 to cancel the entry of the modified partial text entry into the text by selecting a function button displayed on or associated with thedigital keyboard 28. -
FIGS. 6 and 6 A show an example 100 from the user's perspective of the operation and flexibility of thedata entry system 26 for the first embodiment inFIG. 1 to 5. For the example shown inFIGS. 6 and 6 A, suppose the user wishes to enter in the word “endlessly” and begins by entering atblock 102 the letter “e” on thedigital keyboard 28. In order to activate theinteractive search list 30, the user pauses on the letter “e” for at least the predetermined time limit L1 which automatically triggers atblock 104 thecandidate prediction system 32 to obtain a list of completion candidates that are then displayed to the user in theinteractive search list 30. Depending upon the contents of thedictionary 20 and the ranking system used to rank completion candidates stored within thedictionary 20, the desired completion candidate “endlessly” may not be one of the choices displayed in the initialinteractive search list 30. Suppose for the moment, however, that the word “end” is one of the completion candidates displayed in the initialinteractive search list 30. The user can select by gesture atblock 108 the completion candidate “end” and use it to automatically initiate a further search of thedictionary 20 in order to retrieve a list of prioritized completion candidates which all begin with the prefix “end”. As discussed, this iterative searching technique is performed with the pointing device by simply pausing while selecting a completion candidate (in this case the word “end”) in theinteractive search list 30 for the predetermined time limit L2 atblock 116 which thereby automatically initiates a new search using the selected completion candidate as the basis for such a search. If the desired completion candidate “endlessly” appears in the updatedinteractive search list 30, the user can then immediately add the desired completion word by selecting it (block 120) from theinteractive search list 30 and lifting the pointing device up in less than the predetermined time limit L2. - If the
interactive search list 30 displayed afterblock 104 does not include the prefix “end” or any other prefix that would lead the user to rapidly enter the desired word “endlessly” using thedata entry system 26, then the user has the option of lifting the pointing device up from the touch-sensitive screen 14 atblock 106 without selecting any of the completion candidates. This type of user input notifies thedata entry system 26 to clear theinteractive search list 30 from the screen and to re-enable and display thedigital keyboard 28. It should be noted that in this latter operation the search string continues to contain the partial text entry which the user has generated with the digital keyboard 28 (in this case the search string contains only the letter “e”). - As a further illustration of the flexibility of the present
data entry system 26, if the partial completion candidate "end" is used to initiate a further search and the new list of completion candidates does not include the word "endlessly", then the user can choose to continue building upon the partial completion candidate "end" by lifting the pointing device up from the touch-sensitive screen 14 without any of the completion candidates in the new list selected and then continuing to enter characters from the digital keyboard 28. In this example, the partial completion candidate "end" permanently replaces the partial text entry that the user was generating with the digital keyboard 28 in the user's electronic text, the search string is cleared, and any new user input is automatically treated as being part of a new partial text entry. - It will be appreciated from the above example that the user can follow the final steps to completing a partial completion candidate by entering the remaining letters at the end of the partial completion candidate. For instance, if the partial completion candidate "endless" was retrieved, then the user can simply tap on the digital keyboard 28 the letters "l" and "y" followed by a space (or the end-of-entry function button) in order to notify the data entry system 26 of the completion of the current text entry. Thus, even when a complete word, phrase, or sequence of characters is not found in the dictionary 20, the use of the data entry system 26 to retrieve a partial completion candidate can result in less time and effort being expended than if the user had simply typed in each letter of the desired word, phrase, or sequence of characters.
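- As a rough illustration of how the end of a partial text entry might be detected in code, the following Python sketch treats an explicit end-of-search key or an implicit delimiter (such as a space or another non-alphabetic character) as the signal to commit the current entry and clear the search string. The key names and the delimiter set are assumptions, not details taken from the specification.

```python
# Illustrative sketch only: deciding when a typed character ends the current
# partial text entry. Key names and the delimiter rule are assumptions.

END_OF_SEARCH_KEYS = {"SPACE", "ENTER", "END"}   # explicit or implicit end keys

def handle_keystroke(key, search_string):
    """Return (text_to_commit, new_search_string) for one keystroke."""
    if key in END_OF_SEARCH_KEYS or (len(key) == 1 and not key.isalpha()):
        # End of the current entry: commit it and start a new partial entry.
        return search_string, ""
    return None, search_string + key

print(handle_keystroke("l", "end"))            # -> (None, 'endl')
print(handle_keystroke("SPACE", "endlessly"))  # -> ('endlessly', '')
```
- In an alternative embodiment, the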
data entry system 26 is programmed with the ability to re-initiate automated searching even once the application 27 is instructed by the data entry system 26 to permanently replace the partial text entry in the text with a partial completion candidate. In this alternative, the search string is set to the partial completion candidate when the partial completion candidate replaces the partial text entry in the user's electronic text. The user may then return at any time to the automated search facility of the data entry system 26 by pausing on a key on the digital keyboard 28 for the predetermined time limit L1. For instance, if the partial completion candidate "end" has already been used to initiate a further search and the user then lifts up the pointing device without selecting any of the completion candidates in the new list of completion candidates, the user can then return to automated searching with the data entry system 26 by, for example, touching on the letter "l" on the digital keyboard 28 for a sufficient period of time to initiate a search on the basis of the prefix "endl". The partial text entry "endl" will then be used by the candidate prediction system 32 to obtain a list of completion candidates that are then displayed in the interactive search list 30. If the desired completion candidate "endlessly" appears in the new list of completion candidates, the user may then choose that candidate and add it to the text by selecting that candidate and lifting the pointing device before the time limit L2 is reached. If the desired completion candidate does not appear in the interactive search list 30, the user can simply lift the pointing device without selecting any of the completion candidates and continue building upon the partial text entry "endl" by entering further characters via the digital keyboard 28. Once the desired word is complete, in order to clear the search string and instruct the data entry system 26 to treat any new user input as being part of a new partial text entry in this alternative, the user selects a key or function from the digital keyboard 28 programmed to indicate that entry of the original partial text entry has ended, as discussed further in the section below. - Forcing the End of a Partial Text Entry
- As can be seen from the previous example, there will be times when the user is entering a word, phrase, or character sequence that does not appear in the
dictionary 20. In this case, if the user has activated theinteractive search list 30, it will be empty once the user has entered enough characters for thecandidate prediction system 32 to determine that thedictionary 20 does not have any words, phrase or character sequences which could be completion candidates for the partial text entry that is currently being entered by the user. When this is the case, or when the user is completing the entry of a word, phrase, or character sequence by entering on thedigital keyboard 28, thedata entry system 26 needs to know when the current partial text entry being generated by the user has been completed and when another partial text entry has begun. If the user has selected and accepted a completion candidate from theinteractive search list 30 in the first embodiment, then thedata entry system 26 is programmed to recognize that the user has completed the current partial text entry and automatically initializes so that the next character selected from thedigital keyboard 28 will be treated as a leading character for a new partial text entry. However, if the user is completing a partial text entry that is not found in thedictionary 20 and completes the partial text entry by simply entering characters from thedigital keyboard 28, then thedata entry system 26 may not know when the current partial text entry is completed and when the next partial text entry has begun. In order to assist in the identification of when a user has completed an entry for a word, phrase, or sequence of characters, thedata entry system 26 can be programmed to monitor for an “end-of-search” signal from the user via thedigital keyboard 28. In the first embodiment, an end-of-search signal is received by thedata entry system 26 when a key or function button programmed to indicate an express “end-of-search” instruction is selected from thedigital keyboard 28. Alternatively, thedata entry system 26 can be programmed to recognize an implicit end-of-search instruction such as, for example, when the space key on thedigital keyboard 28 is selected. Other non-alphabetic characters may also be used to provide an implicit end-of-search instruction. - System Flow
- In the discussion that follows, the processing performed by the
data entry system 26 running on theprocessing unit 12 is described in further detail. For this discussion, reference is made toFIG. 7 to 9 which are logical flow diagrams illustrating the flow of operation of thedata entry system 26. The description of the computer-implemented process illustrated inFIG. 7 to 9 will be made with reference to thepersonal computing device 10 and thedata entry system 26 shown inFIG. 1 to 5. - The
data entry system 26 is initialized atblock 202. This includes the initialization of variables and flags used within thedata entry system 26 to track the state of user input, processing, and output. This also involves initializing the user interface for thedata entry system 26 including loading and setting up thedigital keyboard 28 for display, selecting thedictionary 20 to be used by thedata entry system 26, identifying the type of pointing device that will be used for text entry, and setting up any user-defined configurations for the display and use of thedigital keyboard 28, and theinteractive search list 30. Once thedata entry system 26 including thedigital keyboard 28 is initialized, the user interface for thedata entry system 26 is then displayed on the touch-sensitive screen 14 atblock 204. In its most basic form, the user interface initially displayed comprises thedigital keyboard 28. The user interface may also include one or more toolbars or display boxes for the display of the current value of the search string and the current contents of theinteractive search list 30. With the interface initialized and displayed on the touch-sensitive screen 14, thedata entry system 26 awaits for user input from the pointing device atblock 206. Once user input is received atblock 206, thedata entry system 26 determines atblock 208 whether the user input received by thedata entry system 26 atblock 206 corresponds to any of the characters displayed on thedigital keyboard 28. If the user input is found atblock 208 to correspond with a character displayed in thedigital keyboard 28, then that character is added to the search string atblock 210. As indicated earlier, the search string is used by thecandidate prediction system 32 to search thedictionary 20 for potential completion candidates. In the first embodiment, thecandidate prediction system 32 continuously retrieves a list of completion candidates from thedictionary 20 as contents of the search string change. As the user modifies the current partial text entry under construction, the contents of the search string are modified and used by thecandidate prediction system 32 atblock 212 to obtain a new list of completion candidates from thedictionary 20. - The operation of the
candidate prediction system 32 for the first embodiment is further illustrated in FIG. 9. As illustrated in FIG. 9 for the first embodiment, the candidate prediction system 32 retrieves the first and last entries from the dictionary 20 that begin with the contents of the search string. The first and last entries retrieved are then used to define a search span. If the search span is greater than the number of completion candidates that the data entry system 26 is programmed to display, then the completion candidates within the search span having the highest corresponding weight values (for example, frequency values) are retrieved, up to the maximum number of permissible completion candidates which may be displayed in the interactive search list 30. The completion candidates retrieved by the candidate prediction system 32 in this manner are then compiled into a list of completion candidates which is used for display in the interactive search list 30. In the case of the first embodiment, this involves very little processing as the list of completion candidates is updated whenever the partial text entry currently under development by the user is modified. As indicated earlier, however, in an alternative embodiment, the interactive search list 30 containing a list of completion candidates may be generated once the user invokes activation of the interactive search list 30 by pausing on a key on the digital keyboard 28 (i.e. following block 214).
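- As a loose illustration of this prefix-span approach, the Python sketch below keeps a small sorted dictionary of (entry, weight) pairs, locates the span of entries sharing the current search string as a prefix, and returns the highest-weighted entries in that span. The sample data, weights, and display limit are invented for demonstration and do not reflect the actual dictionary 20 or its storage format.

```python
from bisect import bisect_left

# Illustrative sketch only: pulling the most likely completion candidates
# for a prefix out of a sorted (entry, weight) dictionary.

DICTIONARY = sorted([
    ("end", 120), ("ending", 80), ("endless", 45),
    ("endlessly", 30), ("endorse", 25), ("energy", 95),
])

def completion_candidates(prefix, max_candidates=4):
    """Return up to max_candidates entries beginning with prefix,
    ordered by descending weight (e.g. frequency of use)."""
    entries = [e for e, _ in DICTIONARY]
    first = bisect_left(entries, prefix)               # first entry >= prefix
    last = bisect_left(entries, prefix + "\uffff")     # just past the span
    span = DICTIONARY[first:last]                      # all entries with the prefix
    span.sort(key=lambda pair: pair[1], reverse=True)  # highest weight first
    return [entry for entry, _ in span[:max_candidates]]

print(completion_candidates("end"))   # ['end', 'ending', 'endless', 'endlessly']
```
- Once a character has been added to the search string at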
block 210, thedata entry system 26 determines atblock 214 whether or not the pointing device has been released from the touch-sensitive screen 14 within the predetermined time limit L1. If thedata entry system 26 finds that the pointing device has been released within the time limit L1, processing returns to block 206 where thedata entry system 26 waits for further user input. This allows the user to use thedigital keyboard 28 to type out a portion or all of a desired text entry by briefly tapping on keys on thedigital keyboard 28 one keystroke at a time. On the other hand, if thedata entry system 26 determines atblock 214 that the pointing device has not been released within the time limit L1, thedata entry system 26 determines atblock 216 whether or not the list of completion candidates (also referred to and shown in the drawings as a candidate list) is empty. If the candidate list is found to be empty atblock 216, then thecandidate prediction system 32 has not found any completion candidates in thedictionary 20 which would potentially complete the partial text entry under development by the user. In this case, processing returns to block 206. From the user's perspective, with no possible candidates having been retrieved from thedictionary 20, the user can complete the desired text entry by continuing to type in the remaining characters from thedigital keyboard 28 or may otherwise modify the partial text entry under development using other function keys available on the user interface (such as canceling the current partial text entry or backspacing one or more characters in the partial text entry). If, on the other hand, the list of completion candidates is found not to be empty atblock 216, then thedigital keyboard 28 is disabled atblock 218 and theinteractive search list 30 containing the candidate list obtained inblock 212 is displayed within theinteractive search list 30 on the touch-sensitive screen 14 atblock 222 and thedata entry system 26 waits for further user input atblock 224. - It will be recalled that once the user has activated the display of the
interactive search list 30 containing a list of completion candidates, the user can use thedata entry system 26 to take one of several actions. The user can deactivate theinteractive search list 30 and return to modifying or editing the current partial text entry by lifting the pointing device from the touch-sensitive screen 14 without any significant movement. If this action is detected atblock 226 then the candidate list is cleared and the search string contents are preserved atblock 262. Processing then returns to block 204 where the user can continue modifying the current partial text entry using thedigital keyboard 28. - If, on the other hand, the user input received at
block 224 is identified atblock 226 as being a gesture, then the gesture is analyzed atblock 228 to determine if it is associated with a completion candidate displayed in theinteractive search list 30. If the gesture is found to be associated with a completion candidate in theinteractive search list 30, then that completion candidate is selected from theinteractive search list 30 atblock 230. Preferably, when a completion candidate is selected in theinteractive search list 30 atblock 230, the selected completion candidate is highlighted or otherwise emphasized in some way to the user. - When a completion candidate is selected at
block 230, a timer T2 is started. The timer T2 is used in the first embodiment to monitor how long the selected completion candidate remains selected by the user. As will be recalled, the user can select one of the completion candidates in theinteractive search list 30 and use the selected completion candidate to either replace the partial text entry that the user is currently entering or use the selected completion candidate to initiate a further automated search to obtain a more refined list of completion candidates from thedictionary 20. In the first embodiment, the timer T2 is used to distinguish between these latter two types of operations which the user may initiate with the pointing device using the selected completion candidate. It should be noted that if the gesture analyzed atblock 228 is not found to be associated with the completion candidate, then processing returns to block 224 where thedata entry system 26 awaits further user input from the pointing device for analysis atblock 226. - Once a completion candidate has been selected from the
interactive search list 30 at block 230, the data entry system 26 monitors the timer T2 at block 232 and monitors for further user input. If the data entry system 26 detects further user input from the pointing device at block 236 before the timer T2 has exceeded the predetermined time limit L2, the user input is analyzed at block 238 to determine whether the user has initiated a gesture or lift with the pointing device. If a lift is detected at block 238, then this event serves as an indication to the data entry system 26 that the selected candidate in the interactive search list 30 has been accepted by the user, in which case the completion candidate is added to the text in place of the partial text entry and the search string is cleared at block 240. Once a selected completion candidate has been added to the text at block 240, the data entry system 26 returns to block 204 where the initialized user interface is displayed on the touch-sensitive screen 14, and the data entry system 26 awaits further user input at block 206. Any new characters received by the data entry system 26 are then treated as being part of a new partial text entry.
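- The following Python fragment is a minimal sketch of how the timer T2 could be used to tell these outcomes apart; the time limit, event names, and return values are assumptions chosen only to illustrate the decision, not values taken from the specification.

```python
# Illustrative sketch only: how the hold timer T2 might separate "accept the
# selected candidate" from "use it to refine the search".

TIME_LIMIT_L2 = 0.8   # seconds a candidate may stay selected before refining

def interpret(selected_at, event_time, event):
    """Classify the next input for the currently selected candidate.
    event is 'lift', 'gesture' or 'none' (no input yet)."""
    if event_time - selected_at > TIME_LIMIT_L2:
        return "refine"      # held past L2: start an iterative search
    if event == "lift":
        return "accept"      # released in time: candidate replaces the partial entry
    if event == "gesture":
        return "reselect"    # pointer moved on: highlight a different candidate
    return "wait"

print(interpret(0.0, 0.3, "lift"))   # accept
print(interpret(0.0, 1.0, "none"))   # refine
```
- If, on the other hand, a gesture is detected at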
block 238, then the gesture is analyzed to determine if it is associated with a different completion candidate in theinteractive search list 30 atblock 242, and if the answer to the analysis ofblock 242 is “YES”, then thedata entry system 26 changes the completion candidate selection from theinteractive search list 30 and restarts the timer T2 atblock 244. Thedata entry system 26 then continues to monitor the timer T2 atblock 232 and user input atblock 234. From the user's perspective, a different completion candidate from the list of completion candidates is highlighted. If a gesture is detected atblock 238, but that gesture is not found to be associated with a different candidate atblock 242, then thedata entry system 26 determines atblock 246 whether the gesture is associated with a dead zone on the user interface. As discussed, dead zones are used to allow the user to deselect a selected completion candidate and to pause to consider what further action the user may wish to take. Dead zones are particularly useful when a timer such as timer T2 is used as the triggering mechanism to determine when a selected completion candidate is to be used to initiate a further automated search of thedictionary 20. If a gesture is found to be associated with a dead zone atblock 246, the currently selected completion candidate from theinteractive search list 30 is deselected atblock 248, the timer T2 is disabled, and thedata entry system 26 awaits for further user input atblock 256. Otherwise, processing returns to block 232. - If the timer T2 is found at any time at
block 232 to have exceeded the predetermined time limit L2, then this event serves as an indication to the data entry system 26 that the selected candidate is to be used to initiate a further automated search, in which case processing proceeds to block 250 where the search string is set to equal the selected completion candidate and a new list of completion candidates is obtained from the dictionary 20 at block 252. This new list of completion candidates is then displayed in the interactive search list 30 at block 254, and the data entry system 26 then awaits further user input at block 256. User input received at block 256 is analyzed at block 258, and if at block 258 the data entry system 26 determines that the user input corresponds to a gesture with the pointing device, the gesture is analyzed at block 260 to determine if the gesture generated by the pointing device is associated with any of the completion candidates from the new list of completion candidates displayed in the interactive search list 30. If the gesture is not associated with a completion candidate, then the data entry system 26 returns to block 256 and awaits further user input from the pointing device. If, on the other hand, the gesture is associated with a completion candidate in the interactive search list 30, then the data entry system 26 returns to block 230 where the associated completion candidate is selected, the timer T2 is restarted, and the data entry system 26 then monitors to see, as before, whether or not the user will use the selected completion candidate to either replace the partial text entry or initiate a further automated search. It should be noted that when the new list of completion candidates is displayed in the interactive search list 30 at block 254, and the data entry system 26 awaits user input at block 256, the pointing device remains in contact with the touch-sensitive screen 14. This situation is similar to the one described above, except that the interactive search list 30 has been updated to contain a new list of completion candidates for the user to select from. - Data Entry System Features
- The
data entry system 26 may include a variety of features and aspects to further enhance functionality and flexibility of text entry for the user when a single pointing device is used. Furthermore, each of the following features and aspects individually provides a beneficial enhancement and is an embodiment of the present invention. These additional features and aspects of the present invention will now be described below. Many of the features and aspects described below can also be applied in combination with various types of search lists containing completion candidates, including single and multi-level search lists. - As before, the following features and aspects can be applied to many types of personal computing devices and may be stored as computer-readable instructions in one or more types of computer-readable media.
- Notification of Active Entry Mode
- In one alternative embodiment, the
data entry system 26 is programmed to notify the user of the active entry mode. In this variation, the data entry system 26 is programmed to display on the graphical user interface 34 an express indication of the currently active entry mode (as illustrated in FIGS. 23 and 24). Two entry modes are tracked by the data entry system 26: (1) a keyboard mode to indicate that the digital keyboard 28 is active, and (2) a search mode to indicate that automated searching is active with the interactive search list 30. Displaying on the graphical user interface 34 an express indication of the current entry mode for the data entry system 26 is achieved by displaying a different color signal (or set of signals) on the graphical user interface 34 depending on which entry mode is currently active. Alternatively, specific icons can be assigned to each entry mode and displayed on the graphical user interface 34 when the corresponding entry mode is active. Notifying the user of the entry mode with one or more express indicators on the graphical user interface 34 minimizes the risk of the user losing track of whether the user is in keyboard mode or in search mode and enhances the ease of use of the data entry system 26. This can be particularly useful when both the digital keyboard 28 and the interactive search list 30 are displayed simultaneously on the graphical user interface 34. - Notification of Completion Candidates
- If the
interactive search list 30 has fewer than the predetermined maximum number of displayable completion candidates, then this will serve as an indication to the user that the interactive search list 30 currently displayed contains all of the completion candidates in the dictionary 20 that begin with the partial text entry that the user has entered. If, however, the interactive search list 30 is full when it is activated by the user, it will not be clear from looking at the interactive search list 30 whether any other potential completion candidates for the current partial text entry may reside in the dictionary 20. In order to remove this ambiguity and expressly indicate whether there are any more potential completion candidates and, if so, how many, in another aspect the data entry system 26 is programmed to display on the graphical user interface 34 the number of potential completion candidates in the dictionary 20 that have leading characters matching the current partial text entry. The number of potential completion candidates is displayed and updated by the data entry system 26 when the digital keyboard 28 is in use and whenever the interactive search list 30 is activated or updated with new completion candidates (as illustrated, for example, in FIGS. 21 and 22). Alternatively, the data entry system 26 can be programmed to display on the graphical user interface 34 a graphical indication of whether or not additional completion candidates having leading characters matching the current partial text entry are located in the dictionary 20, in addition to those candidates displayed in the interactive search list 30. Here again, the graphical indication is displayed and updated by the data entry system 26 when the digital keyboard 28 is in use and whenever the interactive search list 30 is activated or updated with new completion candidates. This notification feature enhances the user's ability to know, even before attempting to use the interactive search list 30, when automated searching may retrieve a list of possible completion candidates (or a refined list). With this advanced notification feature, the user can better decide when to continue adding further characters to the partial text entry with the digital keyboard 28 and when to activate and use the interactive search list 30. - Digital Keyboard Features
- A variety of features may be implemented with the
digital keyboard 28 in order to further enhance the user's ability to enter text with thedata entry system 26. In one variation, thedigital keyboard 28 can be programmed to be displayed in a frequency distributed layout. The frequency distributed layout takes advantage of the well known principle that certain characters in a character set are more frequently used than other characters within the same character set. For example, the digital keyboard may contain the letters of the English alphabet displayed in a frequency distributed layout based on an analysis of a large corpus of text. It will be appreciated, of course, that the characters or symbols in a particular character set may have different relative frequencies depending upon the sample population of data used to rank such characters relative to each other within a particular character set. It will be appreciated that when thedata entry system 26 is employed, the frequency of characters entered may be different than that of traditional systems that enter text one character at a time. These general principles are used to generate a frequency distributed layout for the digital keyboard. - In one embodiment of the digital keyboard having a frequency distributed layout, the digital keyboard is programmed to include a plurality of characters assigned to predetermined locations within the layout for the digital keyboard according to a predetermined frequency distribution associated with the plurality of characters. The plurality of characters displayed on the digital keyboard include less commonly used characters and more commonly used characters based on the predetermined frequency distribution. In this embodiment, the digital keyboard is displayed on a graphical user interface with the less commonly used characters displayed substantially further from the center of the digital keyboard than the more commonly used characters. An example of this type of digital keyboard is illustrated generally in
FIG. 3 except that the “space” key has been located in the outer ring rather than closer to the center of thedigital keyboard 28. An example of thedigital keyboard 28 having a frequency distributed layout with the space key near the center is shown inFIG. 10 . - With the frequency distributed layout, it is preferable that the image of the
digital keyboard 28, when substantially circular or elliptical, has a first group of most frequently used characters (i.e. the most commonly used characters) located substantially near to the center of thedigital keyboard 28 with at least one group of less frequently used characters (relative to the first group) displayed at a distance further from the center of the keyboard than the characters of the first group. As illustrated byFIG. 3 , thedigital keyboard 28 is preferably configured to be displayed in a frequency distributed layout comprising a plurality of characters arranged into rings. When the characters on thedigital keyboard 28 are arranged into rings, then the characters in a particular ring can be arranged to each be about the same distance from the center of thedigital keyboard 28 providing some uniformity to the movements required to enter text. This can also be useful for certain arrangements including, for example, when thedigital keyboard 28 is programmed to be dynamically re-positionable as discussed further below. - In the keyboard layouts shown in
FIGS. 3 and 10 , at least one most commonly used character of a pre-selected character set or subset (such as a subset or the ASCII character set) is located substantially in or near the center of thedigital keyboard 28. As also shown, the next most commonly used characters are located within an intermediate ring, and the less commonly used characters of the character set are distributed in an outer ring of thedigital keyboard 28. When the most commonly used characters are located in or close to the center of thedigital keyboard 28, the degree of movement required with a pointing device to select characters displayed within the intermediate or inner rings of thedigital keyboard 28 is minimized. In addition, arranging characters on thedigital keyboard 28 in concentric-like rings according to their frequency of use provides an easy and efficient mechanism for retrieving characters and entering data using a pointing device. - When rings are used with the
digital keyboard 28, it will be appreciated that the arrangement of the characters within each ring is by no means limited to the layout shown in FIG. 3 or 10. For instance, the characters within a particular ring may be organized alphabetically in a clockwise (or counterclockwise) order. A challenge with many keyboard designs is that they take time to learn. The above ordered organization increases the opportunity to quickly learn and recall the location of characters displayed on the digital keyboard 28, since users are already familiar with this ordering. In another variation, the characters in one half of a ring (for example, the upper half) may be ordered alphabetically in one direction (for example, clockwise), and all characters in the other half of the same ring (for example, the lower half) may be ordered alphabetically in the other direction (counterclockwise). These types of organization within the rings can also enable a user to more quickly learn to locate a desired character displayed on the digital keyboard 28.
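- For illustration only, the short Python sketch below shows one way of generating a frequency distributed ring layout of the kind described above: characters are taken in decreasing order of an assumed English letter frequency and filled into a centre key, an inner ring, and an outer ring. The frequency ordering and ring sizes are assumptions for demonstration rather than values from the specification or from FIG. 3 or 10.

```python
# Illustrative sketch only: distributing characters into concentric rings so
# that the most frequent letters sit closest to the centre.

FREQUENCY_ORDER = "etaoinshrdlcumwfgypbvkjxqz"   # most to least frequent (assumed)
RING_SIZES = [1, 8, 17]                          # centre key, inner ring, outer ring

def ring_layout(characters=FREQUENCY_ORDER, ring_sizes=RING_SIZES):
    """Split characters into rings, most frequent characters innermost."""
    rings, start = [], 0
    for size in ring_sizes:
        rings.append(list(characters[start:start + size]))
        start += size
    return rings

for index, ring in enumerate(ring_layout()):
    print(f"ring {index}: {''.join(ring)}")
# ring 0: e
# ring 1: taoinshr
# ring 2: dlcumwfgypbvkjxqz
```
- Several other characteristics of the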
digital keyboard 28 may also vary. In general, the type of characters displayed and available, the type and number of characters displayed on particular keys of thedigital keyboard 28, the font size of each character displayed, and the value to be processed when a particular key is contacted (or selected) may all vary from keyboard to keyboard. As well, to minimize clutter thedigital keyboard 28 can be displayed with no graphics outlining the keys on thedigital keyboard 28. For circular or ring-like keyboard layouts, several other characteristics may also vary, including the number of rings making up the keyboard layout, the number of keys displayed in each ring, and in the keyboard as a whole and the thickness or width of each ring. - In another variation, the
digital keyboard 28 layout may be dynamically replaced by the user with another keyboard layout. This feature can be particularly advantageous when it is desirable to permit a user to quickly swap between several keyboard layouts (for example, as between the keyboard layouts inFIGS. 10, 11 and 12), as in the case where the touch-sensitive screen 14 is relatively small or the number of characters required to enter data exceeds the space available to display thedigital keyboard 28 within a location on the touch-sensitive screen 14. Permitting the user to swap between multiple keyboard layouts provides the user with a significant degree of flexibility when entering characters with thedata entry system 26. In addition, when multiple keyboard layouts are available, they can be organized according to various subclasses of characters. For instance, a default keyboard layout may contain alphabetic characters. A second keyboard layout may contain numeric characters. A third keyboard layout may contain special characters. Grouping a character set into logical subgroups and organizing these subgroups on multiple keyboard layouts provides the user with the ability to logically navigate amongst different types of keyboard layouts when desired. Preferably, the user may activate a particular keyboard layout using one or more hot keys each associated with at least one of the available keyboard layouts. A hot key may be any key or function associated with thedigital keyboard 28 that triggers the display of an alternative keyboard layout. When a hot key associated with a particular keyboard layout is selected by the user from thedigital keyboard 28, the currently displayed keyboard layout is replaced with the keyboard layout associated with the selected hot key. - In another variation, a number of different related symbols or characters may be accessed through one key on the
digital keyboard 28. For example, when the user touches a punctuation key, a number of different punctuation marks may be displayed, and the user may select one of these choices by gesturing to select the desired symbol or character. - Dictionary Features
- In another aspect of the present invention, multiple dictionaries may be stored in the computer-readable medium 16 (
FIG. 1), with each dictionary containing completion candidates with associated weight values for ranking completion candidates relative to each other. For example, the weight values may represent frequency of use values weighted according to usage in a particular language, a particular field of use (e.g. engineering, general business, law, accounting), or a particular user's use. With multiple dictionaries, a user may readily switch between language sets or language subsets or dictionaries for a particular application. - In one variation, the
FIG. 1 to 5) can contain multiple simultaneously accessible dictionaries that the user can enable and disable individually. For instance, thedata entry system 26 can have a first dictionary containing completion candidates based on Oxford English and a second dictionary containing completion candidates based on American English, both active at the same time and both accessed and used by thecandidate prediction system 32 when a list of completion candidates is to be obtained. As another example, thedata entry system 26 can have a legal dictionary, a civil engineering dictionary, and a regular American English dictionary all active simultaneously. This feature enables the user to obtain a list of completion candidates simultaneously containing variations on particular words, phrases, or character sequences particular to specific areas of practice or particular to specific types of dictionaries. - With multiple, simultaneously accessible dictionaries, the
candidate prediction system 32 can be programmed to retrieve completion candidates from two or more dictionaries, each having its own weighting function for completion candidates (as illustrated in FIGS. 25 and 26). When this is done, the candidate prediction system 32 can generate a final list of completion candidates based on a combining function that takes into account the weight values associated with the completion candidates retrieved from the multiple dictionaries and which also prioritizes the completion candidates based on the source dictionary from which a particular completion candidate is retrieved. By way of example, the candidate prediction system 32 may be programmed to include in the final list the top N completion candidates (where N ≥ 1) from each list of completion candidates retrieved from the multiple dictionaries.
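- A minimal Python sketch of such a combining function is given below; it takes the top entries from each of two invented dictionaries and then orders the merged list by an assumed priority of source dictionary followed by weight. The sample words, weights, and priority rule are assumptions used only to illustrate the idea.

```python
# Illustrative sketch only: combining weighted candidates from several
# dictionaries. The priority scheme and sample data are assumptions.

GENERAL = {"end": 120, "ending": 80, "endless": 45}
LEGAL = {"ended": 40, "endorsement": 90, "endowment": 70}

def combined_candidates(prefix, dictionaries, top_n=2, max_total=4):
    """Take the top_n matches from each dictionary, then merge them,
    preferring earlier dictionaries and then higher weights."""
    merged = []
    for priority, dictionary in enumerate(dictionaries):
        matches = [(word, weight) for word, weight in dictionary.items()
                   if word.startswith(prefix)]
        matches.sort(key=lambda pair: pair[1], reverse=True)
        for word, weight in matches[:top_n]:
            merged.append((priority, -weight, word))
    merged.sort()
    return [word for _, _, word in merged[:max_total]]

print(combined_candidates("end", [GENERAL, LEGAL]))
# ['end', 'ending', 'endorsement', 'endowment']
```
- A predefined dictionary may also be modified or generated based on a particular user's usage of particular words or character sequences over the course of using the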
data entry system 26. Such a "personalized" dictionary may also be used to produce lists of the most common completion candidates used by a user. For example, the actual usage of completion candidates from the dictionary may be tracked by the data entry system 26. A personalized dictionary may also be used in combination with other dictionaries. For example, using a standardized dictionary and a personalized dictionary, the candidate prediction system 32 may be programmed to give priority first to completion candidates (up to a predetermined limit) beginning with the contents of the search string and recorded in the personalized dictionary as having the highest weight values, and then, if space remains in the interactive search list 30, to completion candidates having the highest weight values in the standardized dictionary and beginning with the contents of the search string. As another example, a new dictionary may be generated based on the completion candidates selected by the user through the use of the data entry system 26 over time. The user may activate the new dictionary at any time so that it takes priority over any pre-existing dictionary(ies) if completion candidates beginning with the search string are located in the new dictionary. - In another variation of the
dictionary 20 and the use of the dictionary 20 via the data entry system 26 of the present invention, the data entry system 26 may be programmed to monitor a specific user's pattern of usage of completion candidates from the interactive search list 30 over time. For example, as completion candidates are selected by the user and entered into the text using the data entry system 26, an additional weight field in each entry of the dictionary 20 may be used by the data entry system 26 to track the user's actual frequency of completion candidate usage. In this user-oriented variation, the candidate prediction system 32 may be configured to find the most common completion candidates in the dictionary 20 beginning with a search string based firstly on the degree of actual user usage tracked in the additional usage fields of the dictionary 20 associated with completion candidates therein, and secondly based on the predefined weight fields 24 if the additional usage fields are null or are less than a predetermined threshold value defining a minimum percentage level of usage for evaluation, or if the list of completion candidates retrieved using the additional usage fields results in a number of completion candidates less than the maximum number which may be displayed with the interactive search list 30. In such a user-oriented variation, the candidate prediction system 32 tracks the total number of selections made from the dictionary 20 (for example, in a TOTAL_USAGE field in the candidate prediction system 32) over time by the user, as well as the total number of occasions on which a particular completion candidate in the dictionary 20 is actually used by the user to replace a partial text entry (for example, in a COMPLETION_CANDIDATE_USAGE field in the candidate prediction system 32). To determine whether or not an additional usage field for a particular completion candidate in the dictionary is less than the predetermined threshold value for acceptable usage and evaluation, the data entry system 26 compares the value COMPLETION_CANDIDATE_USAGE/TOTAL_USAGE with the predetermined threshold value. In such a variation, the most commonly used completion candidates retrieved for display are thus based primarily on the user's actual completion candidate usage as opposed to a predefined frequency distribution preprogrammed into fields 24 of the dictionary 20.
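- A compact Python sketch of this usage-weighted ranking is given below; it prefers the tracked usage count when a candidate's share of all selections meets an assumed threshold and otherwise falls back to the predefined weight. The field names mirror the TOTAL_USAGE and COMPLETION_CANDIDATE_USAGE fields described above, but the data structure, threshold, and numbers are assumptions.

```python
# Illustrative sketch only: ranking by a user's tracked usage and falling back
# to predefined weights when the usage data is too thin.

USAGE_THRESHOLD = 0.05   # minimum share of all selections to trust usage data

def effective_weight(entry, total_usage):
    """entry = {'word': ..., 'weight': predefined weight, 'usage': times selected}.
    Returns a (tier, value) pair: usage-based entries outrank weight-based ones."""
    if total_usage and entry["usage"] / total_usage >= USAGE_THRESHOLD:
        return ("usage", entry["usage"])        # rank primarily by actual usage
    return ("default", entry["weight"])         # fall back to predefined weight

entries = [
    {"word": "endless", "weight": 45, "usage": 12},
    {"word": "ending", "weight": 80, "usage": 1},
]
for e in entries:
    print(e["word"], effective_weight(e, total_usage=100))
# endless ('usage', 12)
# ending ('default', 80)
```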
- An example of this is if the desired final candidate is the word ‘telescope’. Without the use of prefixes the user might enter ‘t’ but find no words beginning with ‘tele’ in the completion candidate list. This would then require the user to enter the next letter ‘e’ however telescope may still not have a high enough weight to show up in the next list of completion candidates and so the process would continue. The use of prefixes as completion candidates can shorten this process since the combined weight of all completion candidates beginning with ‘tele’ would cause this prefix to show up early in the search process, possibly as soon as ‘t’ is entered, which would then allow the user to immediately narrow the search to only those words beginning with the characters ‘tele’. Because the desired candidate is found after fewer searches or search iterations it reduces the memory load on the user which in turn can ease learning of the necessary sequence of operations the user must perform to enter words that begin with those prefixes.”
- Re-Positionable Keyboard
- In another aspect of the present invention, the
digital keyboard 28 is programmed to be dynamically re-positionable so as to follow the pointing device. When the digital keyboard 28 is programmed to be dynamically re-positionable, its image follows the movement of the pointing device on the touch-sensitive screen 14 so that the keyboard image remains generally centered beneath the pointing device after each keyboard selection. In this aspect, whenever the data entry system 26 is in keyboard mode, the digital keyboard 28 is programmed to automatically re-center itself on a location within the graphical user interface 34 associated with a last known set of position coordinates for the pointing device. For example, if the character "u" is selected with the pointing device from the digital keyboard 28 in FIG. 3, the digital keyboard 28 re-centers itself substantially over the position coordinates which were used by the pointing device to select the character "u". By substantially re-centering the digital keyboard 28 over the last known set of position coordinates for the pointing device, the position and distance of the keys on the digital keyboard 28 relative to the user's pointing device remain substantially constant. This provides a uniform mechanism for consistently selecting the same key on the digital keyboard 28 using substantially the same movement with the pointing device. In addition, when the digital keyboard 28 is dynamically re-positionable, the degree and frequency with which the user is required to reposition the pointing device after selecting keyboard characters is minimized. If this re-positionable feature is combined with a frequency distributed keyboard having the most common characters near the center, the pointing device will generally always rest in the center of the most common characters. If the frequency distributed keyboard is made up of rings, then each of the characters in a particular ring will be equidistant from the pointing device when the pointing device is resting in the center of the keyboard, resulting in a uniformity of movement for character entry.
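- The Python sketch below illustrates, under assumed screen dimensions and margins, one way the keyboard centre could follow the last tap while snapping back toward the middle of the display when it drifts too close to a boundary; the numbers and function name are placeholders rather than details from the specification.

```python
# Illustrative sketch only: keeping a circular keyboard centred under the
# pointing device. Screen size and margin are assumptions.

SCREEN = (320, 240)     # width, height of the touch-sensitive area
MARGIN = 40             # minimum distance the keyboard centre keeps from an edge

def recenter_keyboard(last_tap):
    """Return the new keyboard centre: the last tap position, unless that
    would push the keyboard too close to a screen boundary, in which case
    the keyboard snaps back to the middle of the screen."""
    x, y = last_tap
    if (x < MARGIN or x > SCREEN[0] - MARGIN or
            y < MARGIN or y > SCREEN[1] - MARGIN):
        return SCREEN[0] // 2, SCREEN[1] // 2    # drifted near an edge: re-centre
    return x, y                                   # follow the pointing device

print(recenter_keyboard((150, 120)))   # (150, 120)
print(recenter_keyboard((310, 120)))   # (160, 120), snapped back to the middle
```
- When the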
- When the digital keyboard 28 is programmed to be dynamically re-positionable, it may also be programmed to reposition to a substantially central location within the graphical user interface 34 (or to another user-definable position) when the digital keyboard 28 approaches within a predetermined distance of any of the boundaries of the graphical user interface 34. Repositioning the digital keyboard 28 in this way provides a mechanism to adjust for circumstances where the digital keyboard 28 drifts too close to a boundary of the touch-sensitive screen 14. In an alternative repositioning mechanism, a hot key may be used to automatically re-center the digital keyboard 28. In yet another alternative, the dynamically re-positionable digital keyboard 28 may be programmed to re-center about position coordinates for the pointing device when the position coordinates correspond to a part of the graphical user interface 34 (or screen) that is not currently occupied by the digital keyboard 28. For example, if the digital keyboard 28 approaches an edge of the graphical user interface 34 the user can simply touch down in a center of the graphical user interface 34 and the digital keyboard 28 will relocate to that point.
- When the digital keyboard 28 is dynamically re-positionable, it is preferable in general that the amount of keyboard movement, or drift, is minimized. This can be achieved by arranging the keyboard layout so that the keyboard characters are distributed about the digital keyboard 28 in a configuration that reduces the amount of drifting experienced when it is dynamically re-positionable. One way of achieving this is by configuring the digital keyboard 28 so that the total of the frequency of use values for characters located within a particular portion (or sector) of the digital keyboard 28 is substantially the same as other similarly shaped portions (sectors) of the digital keyboard 28. It will be recalled that for the frequency distributed arrangement of keyboard characters discussed earlier, each keyboard character has a predetermined frequency of use value assigned to (or associated with) it. In order to minimize drifting, the digital keyboard 28 may be divided into notional, substantially equally shaped sectors, and the keyboard characters may be assigned to locations within the digital keyboard 28 such that the total of combined frequency values for characters within a particular sector of the digital keyboard 28 is substantially equal to the total of combined frequency values for characters within any of the other sectors of the digital keyboard 28.
- In this way, the likelihood of selecting a character from any one of the predetermined sectors of the digital keyboard 28 is substantially the same. Thus, if one wishes to minimize drift in the case of the circular-type digital keyboard 28 layout in FIG. 3, it is preferable that the keyboard characters are distributed such that when the digital keyboard 28 is notionally divided into substantially equally shaped wedge-like sectors, each sector of the keyboard has substantially the same total ‘weight’ of characters, according to their frequency of use, as each of the other sectors.
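One straightforward way to approximate the equal-weight sector condition described above is a greedy balancing pass, sketched below with rough English letter frequencies; the sector count and the greedy strategy are illustrative assumptions rather than the layout shown in FIG. 3.

```python
# Sketch: distribute characters among equally shaped keyboard sectors so each
# sector carries roughly the same total frequency of use (to minimize drift).
# Frequencies are rough English letter percentages; the greedy strategy and
# sector count are illustrative assumptions, not the patented layout.

letter_freq = {
    "e": 12.7, "t": 9.1, "a": 8.2, "o": 7.5, "i": 7.0, "n": 6.7, "s": 6.3,
    "h": 6.1, "r": 6.0, "d": 4.3, "l": 4.0, "c": 2.8, "u": 2.8, "m": 2.4,
    "w": 2.4, "f": 2.2, "g": 2.0, "y": 2.0, "p": 1.9, "b": 1.5, "v": 1.0,
    "k": 0.8, "j": 0.15, "x": 0.15, "q": 0.1, "z": 0.07,
}

def balance_sectors(freq, num_sectors=6):
    sectors = [{"chars": [], "total": 0.0} for _ in range(num_sectors)]
    # Greedy bin balancing: place each character (heaviest first) into the
    # sector that currently has the smallest combined frequency.
    for ch in sorted(freq, key=freq.get, reverse=True):
        lightest = min(sectors, key=lambda s: s["total"])
        lightest["chars"].append(ch)
        lightest["total"] += freq[ch]
    return sectors

for sector in balance_sectors(letter_freq):
    print(f'{sector["total"]:5.2f}  {"".join(sector["chars"])}')
```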
- Another way to minimize drift is to configure the digital keyboard 28 in a substantially symmetric layout of characters with pairs of opposing characters displayed on the digital keyboard 28 having substantially similar frequencies of use. With this configuration, the frequency of use of one character in a pair of opposing characters is as close as possible to that of the other character in the pair. An example of this configuration is shown in FIG. 13, which shows the frequencies (f(X1) and f(X2)) of characters X1 and X2 being substantially the same as each other, and the frequencies (f(X3) and f(X4)) of characters X3 and X4 being substantially the same as each other. The frequencies of use of the characters displayed in the digital keyboard 28 may be calculated using well-known techniques of analysis on a large corpus of text.
- The dynamically re-positionable digital keyboard 28 minimizes the need for repositioning the pointing device and instead operates on the basis of repositioning the digital keyboard 28 relative to the pointing device. Making the digital keyboard 28 dynamically re-positionable also provides uniform movement for a particular character, resulting in a more intuitive keyboard and a more intuitive data entry mechanism. When combined in a single embodiment, the character frequency distribution and the dynamically re-positionable aspects of the digital keyboard 28 further reduce the movement required for the pointing device when characters are to be selected from the digital keyboard 28.
- In the first embodiment described earlier above, completion candidates are selected from the
interactive search list 30 by way of gestures. Alternatively, other forms of candidate selection may be performed with pointing devices. For instance, if gesture-based selection is not desired for a particular implementation, candidates may be selected based on their location in the interactive search list 30. As another example, when the data entry system 26 is programmed to receive input from a mouse having two or more buttons, the data entry system 26 can be programmed to use input from one mouse button to toggle between activating and deactivating the interactive search list 30, and to use input from a second mouse button to insert a completion candidate from the interactive search list 30 into the text when the interactive search list 30 is active and the mouse has been used to highlight that completion candidate. In this latter case, the data entry system 26 may also be programmed to use input from the second mouse button as a trigger to select a key from the digital keyboard 28 if the mouse's cursor position (i.e. the mouse's position coordinates) on the graphical user interface 34 is associated with a key on the digital keyboard 28 at the time input from the second mouse button is received.
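A minimal event-handling sketch of the two-button mouse behaviour described above is shown below; the class, the hit-test helper, and the event names are assumptions made for illustration only.

```python
# Sketch of two-button mouse handling for candidate selection, as described
# above: button 1 toggles the interactive search list, button 2 either inserts
# the highlighted completion candidate (list active) or selects the keyboard
# key under the cursor (list inactive).  The state object and the hit-testing
# helper are illustrative assumptions.

class DataEntryState:
    def __init__(self):
        self.search_list_active = False
        self.highlighted_candidate = None   # set while the mouse hovers a candidate
        self.text = []                      # committed text so far

    def key_under_cursor(self, x, y):
        # Placeholder hit test: a real system would map the cursor's position
        # coordinates onto the digital keyboard layout.
        return "u" if y > 100 else None

    def on_button1(self):
        # Toggle between activating and deactivating the interactive search list.
        self.search_list_active = not self.search_list_active

    def on_button2(self, x, y):
        if self.search_list_active and self.highlighted_candidate:
            self.text.append(self.highlighted_candidate)   # insert the candidate
        else:
            key = self.key_under_cursor(x, y)
            if key is not None:
                self.text.append(key)                      # select a keyboard key

state = DataEntryState()
state.on_button2(10, 120)                  # list inactive: selects the "u" key
state.on_button1()                         # activate the search list
state.highlighted_candidate = "understand"
state.on_button2(10, 120)                  # list active: inserts the candidate
print(state.text)                          # ['u', 'understand']
```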
- In another aspect, candidate selection using the interactive search list 30 may be modified to replace the time delay-based technique for triggering the activation of the interactive search list 30, or for triggering iterative searching, with other forms of input indicators from the pointing device. For instance, with a mouse, an input signal from a mouse button received while the mouse position is located over a particular function button or location on the graphical user interface 34, or a double click signal from that mouse button, may be used by the data entry system 26 as the trigger.
- Although the
interactive search list 30 is displayed in the first embodiment (FIGS. 1 to 5) as a vertical list of completion candidates, the interactive search list 30 can be displayed in several different ways depending upon which options the data entry system 26 has been programmed with and which of those options have been selected by the user. There are four main considerations for the display of the interactive search list 30. The first is where the interactive search list 30 is to be positioned within the graphical user interface 34. The second is whether the interactive search list 30 is continuously visible or not. The third consideration is the type of interactive search list, more specifically, how the completion candidates in the interactive search list 30 are arranged visually within the graphical user interface 34. The fourth consideration is whether the interactive search list 30 replaces the digital keyboard 28 or whether the interactive search list 30, when active, temporarily appears remote from or superimposed over a portion of the digital keyboard 28.
- In an alternative to swapping the display of the interactive search list 30 with the digital keyboard 28, the interactive search list 30 may be displayed in a fixed location within the graphical user interface 34. In another variation, the interactive search list 30 may be docked with the digital keyboard 28, when it is repositionable, and displayed continuously. With either the docked or fixed location interactive search list 30, the results of automated searching are continuously displayed within the interactive search list 30 as the user enters characters with the digital keyboard 28 or uses the interactive search list 30 itself (as illustrated in FIGS. 23 and 24). Activating a docked or fixed location interactive search list 30 can be achieved by pausing with the pointing device on a keyboard character selected within the digital keyboard 28. As soon as the predetermined time limit L1 has expired, the interactive search list 30 becomes active. At this point, if the user wishes, the user can select one of the completion candidates (if any) within the interactive search list 30 or the user can return to keyboard mode and continue adding to or otherwise modifying the current partial text entry from the digital keyboard 28. In a further variation, the interactive search list 30, when arranged in a docked or fixed location, may be continuously updated with potential completion candidates based on the current contents of the search string being constructed by the user via the digital keyboard 28. In this variation, the user can simply continue adding characters to the end of the current partial text entry one character at a time via the digital keyboard 28 so as to continue building the desired word, phrase, or character sequence until such time as the desired completion candidate or a partial completion candidate thereof appears in the interactive search list 30. In a further alternative, the interactive search list 30, when activated, may be shown superimposed over a portion of the digital keyboard 28.
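The continuously updated, docked or fixed-location interactive search list described above can be sketched as follows; the dictionary contents, ranking rule, and class name are illustrative assumptions.

```python
# Sketch of a docked or fixed-location interactive search list that is refreshed
# every time the user adds a character to the partial text entry.

WORDS = {"there": 900, "their": 800, "the": 2500, "then": 700, "theory": 150}

class DockedSearchList:
    def __init__(self, max_shown=4):
        self.max_shown = max_shown
        self.candidates = []

    def refresh(self, search_string):
        # Continuously display the best matches for the current search string.
        matches = [w for w in WORDS if w.startswith(search_string)]
        self.candidates = sorted(matches, key=WORDS.get, reverse=True)[:self.max_shown]

search_list = DockedSearchList()
partial = ""
for ch in "the":                 # user builds the partial text entry key by key
    partial += ch
    search_list.refresh(partial)
    print(partial, "->", search_list.candidates)
```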
- The digital keyboard 28 may be instructed to make itself visible or invisible to view on the graphical user interface 34. For instance, in the first embodiment in FIGS. 1 to 5 the digital keyboard 28 may be programmed to be displayed on the graphical user interface 34 in response to a user selection on the personal computing device, and to be hidden (or cleared) from view in response to another user selection. This feature also provides, for example, the option for the application 27 to instruct the digital keyboard 28 when to be visible and when to be invisible. Preferably, the application 27 is programmed to decide when and where the digital keyboard 28 is to be displayed. This feature can be applied to many types of personal computing devices including, for example, where a touch-sensitive screen is used, or where the digital keyboard 28 is displayed on a display device that is separate from the hardware input interface 17, such as with a data tablet, a proximity sensing input surface or an equivalent input interface. For example, the hardware input interface can be located on a remote control device used to control when the digital keyboard 28 is displayed on a television or a remotely located computer display. With a proximity sensing input surface, the digital keyboard can be displayed when the pointing device is detected within a set predetermined distance of the proximity sensing input surface, and the digital keyboard can be hidden when the pointing device is not detected within the set predetermined distance of the proximity sensing input surface. As another example, if a proximity sensing input surface capable of position sensing is used, then when a stylus or the like is lifted a set predetermined distance from the proximity sensing input surface, the application 27 can instruct the digital keyboard 28 to become invisible so as to swap to full text mode. When the stylus or the like is brought back within the set predetermined distance of the proximity sensing input surface, the application 27 reactivates the display of the digital keyboard 28 over position coordinates associated with the position of the pointing device over the proximity sensing input surface or to an area remote to the pointing device. Variations on handling text entry with the proximity sensing input surface are discussed further on below.
- The API for the data entry system 26 also allows the application 27 to programmatically change the partial text entry which is used for searching. For example, the user of a text editor might place the cursor after a word or character sequence and the application 27 could then tell the data entry system 26 to use that word or character sequence as a partial text entry for further searching.
- Several alternative layouts for the
interactive search list 30 may be used by the data entry system 26. In the first embodiment, a vertical list of completion candidates is used, as further illustrated in FIG. 14. In other layouts, completion candidates within the interactive search list 30 may be displayed in an X configuration (FIG. 15), in a rectangular configuration (FIG. 16), in a cross configuration (FIG. 17), in a T configuration (FIG. 18), or in a horizontal configuration. With the X configuration, one completion candidate is preferably located slightly offset in the x-axis or y-axis from a central location within the X configuration and surrounded by four or more completion candidates located within the north-west, north-east, south-west, and south-east directions (relative to the central completion candidate displayed). With an X configuration of the search list (like the one above), a unique direction is provided for each of the five completion candidates displayed in the list, so as to minimize pen movement. For the cross configuration, a substantially centrally displayed completion candidate within the interactive search list 30 is surrounded by up to four completion candidates in the north, east, south, and west directions (relative to the central completion candidate displayed).
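The X and cross configurations can be described by simple screen offsets around a central candidate, as in the sketch below; the spacing value and slot ordering are assumptions for illustration.

```python
# Sketch: screen offsets (in pixels) for displaying five completion candidates
# in the X and cross configurations described above, relative to a central
# candidate near the pointer.

SPACING = 40

X_CONFIG = {                       # one central slot plus NW, NE, SW, SE
    0: (0, 0),
    1: (-SPACING, -SPACING), 2: (SPACING, -SPACING),
    3: (-SPACING,  SPACING), 4: (SPACING,  SPACING),
}
CROSS_CONFIG = {                   # one central slot plus N, E, S, W
    0: (0, 0),
    1: (0, -SPACING), 2: (SPACING, 0),
    3: (0,  SPACING), 4: (-SPACING, 0),
}

def place_candidates(candidates, layout, origin):
    """Return (candidate, screen position) pairs for up to five candidates."""
    ox, oy = origin
    return [(cand, (ox + layout[i][0], oy + layout[i][1]))
            for i, cand in enumerate(candidates[:5])]

print(place_candidates(["the", "there", "their", "then", "theory"],
                       X_CONFIG, origin=(160, 120)))
```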
- When the
digital keyboard 28 is programmed to share the same display space on the graphical user interface 34 as the interactive search list 30, it is preferable that the data entry system 26 is also programmed to display all completion candidates near the last known position coordinates for the pointing device so that they are slightly off-set from the x-axis or y-axis of the last known position coordinates, so as to minimize the degree to which such completion candidates in the interactive search list 30 are obscured from the user's view by the pointing device. This feature can be particularly useful when the pointing device is a pen or finger and the user interfaces with a touch-sensitive screen 14. In this way, the interactive search list may be displayed in a location which makes it easily visible and accessible to the user.
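A small geometric sketch of the offset placement described above follows; the offset values and screen dimensions are illustrative assumptions. The list origin is shifted off the pointer's x and y axes and clamped to stay on screen.

```python
# Sketch: place the interactive search list near the last known pointer
# coordinates but offset from its x and y axes, so a pen or finger does not
# obscure the candidates.

def search_list_origin(pointer, screen=(320, 240), offset=(24, -24), size=(120, 80)):
    px, py = pointer
    x = px + offset[0]          # shift to the right of the pen tip
    y = py + offset[1]          # and above the typical finger or hand position
    # Keep the list fully on screen.
    x = max(0, min(x, screen[0] - size[0]))
    y = max(0, min(y, screen[1] - size[1]))
    return (x, y)

print(search_list_origin(pointer=(150, 130)))   # offset, unclamped
print(search_list_origin(pointer=(310, 10)))    # clamped to the screen edge
```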
- In another variation, when the digital keyboard 28 and the interactive search list 30 are interchangeably displayed, the list of completion candidates within the interactive search list 30 can be displayed such that the most common of the completion candidates is displayed closest to the last known position coordinates of the pointing device while the other completion candidates within the interactive search list 30 are displayed further away from the last known position coordinates of the pointing device relative to the most common of the completion candidates. This variation results in a frequency distributed interactive search list 30 which can assist in further minimizing the amount of motion required with the pointing device in order to use and select from the interactive search list 30.
- As illustrated in FIG. 19, in another variation, both the digital keyboard 28 and the interactive search list 30 may be continuously displayed within fixed separate locations on the graphical user interface 34, along with a search string window 40 used to display the current contents of the search string. A tool bar 42 may also be displayed to identify predefined functions and commands that may be selected by the user while using the data entry system 26. When the tool bar 42 is included in the dynamically re-positionable configuration for the digital keyboard 28, the tool bar 42 may be repositioned dynamically along with the digital keyboard 28, or the tool bar 42 may be located and remain in a fixed location within the graphical user interface 34. As illustrated in FIG. 20, whenever the digital keyboard 28 is displayed, with or without the interactive search list 30, a toolbar 42 and additional character layouts 44 may be used to enhance the functionality for the user while using the digital keyboard 28.
- In another aspect of the present invention, a commonly used word or character sequence may appear in the same position each time such a word or character sequence is displayed in a search list. This helps the user become familiar with the location of such a word or character sequence within the search list, and thereby helps the user to access such a word or character sequence more readily.
- In another alternative embodiment, as the user begins to learn the position of common words within a search list, the user may begin to know which gesture is required to enter a certain word even before the predetermined delay period L1 has expired and the search list is displayed. In this case, the
data entry system 26 may be programmed to recognize such gestures even before the predetermined delay period L1 has expired and the interactive search list 30 is displayed.
- Tracking the Movement of the Pointing Device
- In another variation, when a stylus (or pen, finger or the like) is used with a touch-sensitive or pressure-sensitive input surface (e.g. a touch-sensitive screen, a data tablet or input pad), the
data entry system 26 may be programmed to determine whether the position of a cursor displayed on the graphical user interface 34 tracks the stylus position precisely or whether it moves relative to the stylus movement. In the first case, if the cursor position tracks the stylus position precisely, then the stylus and cursor function like a mouse and cursor on a conventional user interface, and the position of the cursor tracks precisely the position of the stylus tip on the hardware input surface (i.e. the last known position coordinates for the stylus).
- In the second case, the cursor displayed on the graphical user interface 34 is moved by a distance proportional to the movement of the stylus. This latter behaviour can come into effect when the interactive search list 30 is displayed. For instance, when moving up a vertical list of completion candidates, the cursor can move up faster than the actual physical movement of the stylus.
- Using the cursor to track the position coordinates of the pointing device can help the user keep their attention on the
digital keyboard 28 or the interactive search list 30 displayed on the display device 15 without having to be distracted with looking at the physical position of the pointing device (see, for instance, cursor 48 as illustrated in FIG. 27). This can be helpful when, for example, a data tablet or input pad is used and is located remote from the display area of the graphical display device 15 where the digital keyboard 28 or the interactive search list 30 (or both) are displayed. Also, using the cursor to remotely track the movement of the stylus, pen or finger provides a mechanism for using the digital keyboard 28 and the interactive search list 30 without obscuring them from the user's view with the stylus, pen or finger.
- When the data entry system 26 is programmed to use the cursor to remotely track the movement of the stylus, pen or finger, the cursor may be displayed over the digital keyboard 28 when the data entry system 26 is in keyboard mode, and the cursor may be programmed to relocate to the center of the digital keyboard 28 whenever a character from the keyboard or a completion candidate from the interactive search list 30 is selected. Once the cursor is centered in the digital keyboard 28, further movements with the pointing device can be used to make selections from the digital keyboard 28 as if the pointing device were physically centered about the center of the digital keyboard 28. In this variation, the digital keyboard 28 is displayed in a fixed remote location on the graphical user interface 34. With this variation, the user is not visually distracted by movement of the digital keyboard 28, while enjoying many of the advantages of the dynamically re-positionable digital keyboard 28. For instance, when the cursor relocates to the center of the digital keyboard 28 when the keyboard is active and waiting for user input, a particular character on the digital keyboard 28 remains the same distance and direction from the pointing device no matter what input was made last with the pointing device. This feature of the cursor enables the user to incorporate unconscious learning and, therefore, learned efficiency. When a frequency distributed keyboard layout is used with most frequently used characters located near a central location, relocating the cursor to the center of the digital keyboard 28 enables ready access to the characters most likely to be chosen next, thereby reducing finger movement and increasing efficiency.
- In one variation, the movement of the cursor need not necessarily be directly proportional to the movement of the pointing device. In this variation, the data entry system 26 is programmed so that moving the pointing device a small distance equates to moving the cursor a larger distance on the digital keyboard 28 or the interactive search list 30. This variation uses scaling to minimize the movement required to accurately distinguish and select characters from the digital keyboard 28 and completion candidates from the interactive search list 30. In another variation, the distance of the cursor movement may be related by the data entry system 26 to the speed at which the pointing device moves, so that the faster the movement with the pointing device, the greater the distance traveled by the cursor, and the slower the movement of the pointing device, the less distance traveled by the cursor on the graphical user interface 34.
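A sketch of speed-dependent cursor scaling along the lines described above is given below; the gain curve and constants are illustrative assumptions, not values taken from the specification.

```python
# Sketch: cursor movement scaled by the speed of the pointing device, so fast
# strokes cover more screen distance and slow strokes allow fine selection.

def cursor_delta(dx, dy, dt, base_gain=2.0, speed_gain=0.05, max_gain=8.0):
    """Map a pointing-device displacement (dx, dy) over time dt (seconds)
    to a cursor displacement on the graphical user interface."""
    speed = (dx * dx + dy * dy) ** 0.5 / dt          # device units per second
    gain = min(base_gain + speed_gain * speed, max_gain)
    return dx * gain, dy * gain

print(cursor_delta(3, 0, dt=0.10))    # slow, deliberate movement: small gain
print(cursor_delta(30, 0, dt=0.10))   # quick flick: much larger cursor travel
```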
- Obtaining Completion Candidates of Specific Type or Minimum Length
- In another aspect of the present invention, a special display area containing a series of numbers are displayed as part of or in association with the
digital keyboard 28 to enable the user to rapidly instruct the data entry system 26 to obtain and display in the interactive search list 30 completion candidates having at least a minimum number of characters. In this aspect, when a user selects one of the characters on the digital keyboard 28 followed by one of the numbers in the special display area, the data entry system 26 is programmed to have the candidate prediction system 32 obtain from the dictionary 20 completion candidates beginning with the selected character and having at least as many characters as the number that was selected by the user from the special display area. Alternatively, the user may type in a number from the special display area, followed by holding down on one of the characters on the digital keyboard 28, to instruct the data entry system 26 to have the candidate prediction system 32 obtain from the dictionary 20 completion candidates beginning with the selected character and having at least as many characters as the number that was selected by the user from the special display area. In another variation, the data entry system 26 may be programmed so that when the user touches a number from the special display area and lifts the pointing device, the data entry system 26 retrieves a list of completion candidates having a number of characters equal to the number touched on the special display area. In yet another variation, the data entry system 26 may be programmed to obtain completion candidates of at least a predetermined length when the user selects a number from the special display area with the pointing device, gestures a significant distance in a predetermined direction (for example, to the right), lifts up the pointing device, touches down on a character on the digital keyboard 28 and then pauses on that character. In yet a further variation, another special display area may be included with the digital keyboard 28 from which the category of completion candidates can be narrowed. In this further special display area, for example, the data entry system 26 may be programmed to display general identifiers for nouns, verbs, adjectives, etc. If a general identifier is selected by the user before the interactive search list 30 is activated, the data entry system 26 in this variation is programmed to have the candidate prediction system 32 obtain completion candidates that are identified in the dictionary 20 as falling within the category associated with the selected identifier (for example, only nouns, or only verbs). This variation may be combined with the other aspects herein to assist the user in obtaining completion candidates of one or more specific categories identified in the dictionary 20.
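Filtering completion candidates by minimum length and by category, as described above, can be sketched as follows; the dictionary layout, category tags, and function name are illustrative assumptions.

```python
# Sketch of retrieving completion candidates constrained by a minimum length
# (the number chosen from the special display area) and, optionally, by a
# category identifier such as "noun" or "verb".

DICTIONARY = [
    # (word, predefined weight, category)
    ("run", 500, "verb"), ("running", 250, "verb"), ("runner", 120, "noun"),
    ("runway", 90, "noun"), ("rung", 60, "noun"), ("ruminate", 20, "verb"),
]

def candidates(first_char, min_length=0, category=None, max_shown=5):
    matches = [(w, weight) for w, weight, cat in DICTIONARY
               if w.startswith(first_char)
               and len(w) >= min_length
               and (category is None or cat == category)]
    return [w for w, _ in sorted(matches, key=lambda m: m[1], reverse=True)][:max_shown]

print(candidates("r", min_length=6))                  # e.g. user taps "r" then "6"
print(candidates("r", min_length=4, category="noun")) # narrowed to nouns
```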
- In another aspect, a physical button or switch located on the personal computing device, or on the pointing device, and within easy reach of a user's finger or hand, may be used to easily activate certain features of the data entry system 26. As an example, when the button on the personal computing device is pressed, the data entry system 26 may be programmed to make, with each press, the digital keyboard 28 invisible or visible. Alternatively, the data entry system 26 may be programmed to recognize that if the button or switch is pressed, the interactive search list 30, when displayed, should display only certain types of completion candidates available within the dictionary. As another example, when the button is depressed, the data entry system 26 may be programmed to activate the interactive search list 30. As another example, when the button on the personal computing device is pressed, the data entry system 26 may be programmed to require that the interactive search list 30 display completion candidates of a certain minimum length of characters.
- In another variation, a stylus (or pen or finger or like hand-held pointing device) is used with a proximity sensing input surface. Proximity sensing input surfaces can detect the proximity of a pointing device to the input surface as well as the location of the pointing device over the proximity sensing input surface. The proximity sensing input surface may also detect the distance and angle that a pointing device is being held relative to the input surface. When a proximity sensing input surface is used, the
data entry system 26 can be programmed so as to display the digital keyboard 28 (or another digital keyboard) with the cursor displayed over it when the stylus approaches within a set predetermined distance of the proximity sensing input surface. The proximity sensing input surface detects the position of the stylus over the proximity sensing input surface when the stylus is within the set predetermined distance. As the user moves the pointing device over the proximity sensing input surface, the cursor moves correspondingly. The digital keyboard 28 can be displayed directly beneath the stylus in some embodiments or, for other embodiments, remote from the stylus. When the stylus is moved away from the proximity sensing input surface further than the set predetermined distance, the data entry system 26 is programmed to hide (or clear) the digital keyboard 28 from the graphical user interface 34. This variation enables the entire screen to be used to display text while the digital keyboard 28 is hidden. This variation also avoids screen clutter by displaying the digital keyboard 28 only when the stylus is found to be within the set predetermined distance of the proximity sensing input surface. At the same time, the user can quickly and intuitively return to adding to or deleting from the text using the digital keyboard 28 by bringing the stylus within the set predetermined distance of the proximity sensitive input surface. Thus, the digital keyboard 28 is displayed when the user's hand controlling the stylus (the “typing hand”) is placed in a natural position for continuing text and data entry. The location where the digital keyboard 28 is displayed on the graphical user interface 34 may be near and possibly follow the line of text under construction by the user, so as to facilitate the eye following the digital keyboard 28 and the entered text simultaneously. In another variation, the digital keyboard 28 can be displayed in the same location as the stylus. In another variation, the digital keyboard 28 is programmed to be displayed just below or above the line of text that the user is creating or editing on a personal computing device. These variations also allow the user to view the digital keyboard 28 and the text simultaneously.
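A minimal sketch of proximity-triggered showing and hiding of the keyboard is given below; the distance threshold, units, and callback structure are assumptions made for illustration.

```python
# Sketch: show the digital keyboard when the stylus is detected within a set
# predetermined distance of a proximity sensing input surface, and hide it when
# the stylus moves beyond that distance.

SHOW_DISTANCE_MM = 10.0   # set predetermined distance (illustrative value)

class ProximityKeyboardController:
    def __init__(self):
        self.keyboard_visible = False

    def on_proximity_sample(self, distance_mm, x, y):
        """Called by the proximity sensing input surface with the stylus
        distance from the surface and its (x, y) location over it."""
        if distance_mm <= SHOW_DISTANCE_MM and not self.keyboard_visible:
            self.keyboard_visible = True
            print(f"show keyboard near ({x}, {y})")     # e.g. beneath the stylus
        elif distance_mm > SHOW_DISTANCE_MM and self.keyboard_visible:
            self.keyboard_visible = False
            print("hide keyboard, full text mode")

ctrl = ProximityKeyboardController()
ctrl.on_proximity_sample(25.0, 100, 80)   # stylus still far away: nothing happens
ctrl.on_proximity_sample(6.0, 100, 80)    # stylus approaches: keyboard appears
ctrl.on_proximity_sample(30.0, 100, 80)   # stylus lifted away: keyboard hidden
```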
data entry system 26 may be programmed to allow for the cursor to be repositioned within previously typed text with the stylus while the stylus is within the minimum distance, provided the stylus is detected as approaching the proximity sensing input surface from a particular side of the input surface (for example, the right side of the proximity sensing input surface). Once the cursor was repositioned, the user could then approach the proximity sensing input surface from another direction (for example, from above) to trigger the display of thedigital keyboard 28 to assist with further text entry and modification. - In the first embodiment, the
- In the first embodiment, the data entry system 26 is application-independent and communicates with applications via an API. In an alternative embodiment, the data entry system 26 may be embedded in an application.
- Although this invention has been described with reference to illustrative and preferred embodiments of carrying out the invention, this description is not to be construed in a limiting sense. Various modifications of form, arrangement of parts, steps, details and order of operations of the embodiments illustrated, as well as other embodiments of the invention, will be apparent to persons skilled in the art upon reference to this description. It is therefore contemplated that the appended claims will cover such modifications and embodiments as fall within the true scope of the invention.
Claims (26)
1-92. (canceled)
93. A method of computer-assisted text generation, the method comprising:
(a) receiving a user input signal via a computer, the user input signal corresponding to a partial text entry comprising at least a first character;
(b) in response to receipt of the first character of the partial text entry, obtaining via the computer a first plurality of completion candidates from a dictionary based on a set of predetermined metrics; and
(c) providing the first plurality of completion candidates to an internet appliance remote from the computer for display of the first plurality of completion candidates on a display device associated with the internet appliance, wherein the display device is remote from the computer.
94. The method of claim 93 , further comprising:
(a) selecting a particular completion candidate from the first plurality of completion candidates in response to a first type of user input signal associated with the particular completion candidate; and
(b) modifying the partial text entry to correspond to the particular completion candidate selected from among the first plurality of completion candidates at least while the particular completion candidate remains selected.
95. The method of claim 94 , further comprising providing the modified partial text entry to the internet appliance for display on the display device.
96. The method of claim 94 , further comprising displaying the modified partial text entry on the display device associated with the internet appliance.
97. The method of claim 94 , further comprising:
(a) detecting modification of the partial text entry by the user via the computer; and
(b) obtaining a modified plurality of completion candidates from the dictionary based on the modified partial text entry, wherein each of the modified plurality of completion candidates includes a portion matching the modified partial text entry;
wherein the set of predetermined metrics comprises instructions for obtaining the first plurality of completion candidates and the modified plurality of completion candidates from the dictionary on the basis of frequency values stored in the dictionary.
98. The method of claim 97 , further comprising displaying the first plurality of completion candidates on the display device associated with the internet appliance.
99. The method of claim 98 , further comprising displaying the modified plurality of completion candidates on the display device associated with the internet appliance.
100. The method of claim 93 , further comprising:
(a) detecting modification of the partial text entry by the user via the computer; and
(b) obtaining a modified plurality of completion candidates from the dictionary based on the modified partial text entry, wherein each of the modified plurality of completion candidates includes a portion matching the modified partial text entry.
101. The method of claim 100 , wherein each individual completion candidate from either the first plurality of completion candidates or the modified plurality of completion candidates is displayed in a search list with the part of the individual completion candidate matching the partial text entry displayed in a manner different from the remaining part of the individual completion candidate.
102. The method of claim 93 , wherein at least part of the partial text entry is generated by an input device remote from the computer.
103. The method of claim 93 , wherein at least part of the partial text entry is generated by an input device associated with the internet appliance.
104. The method of claim 93 , wherein the internet appliance is controlled via a remote control and wherein at least part of the partial text entry is generated via the remote control.
105. The method of claim 93 , wherein at least part of the partial text entry is generated by a digitizing tablet associated with at least one of the computer and the internet appliance.
106. A computer-readable medium having computer-readable instructions for execution by a processing unit, the computer-readable instructions comprising:
(a) instructions for receiving via a computer a user input signal corresponding to a partial text entry comprising at least a first character;
(b) instructions for, in response to receipt of the first character of the partial text entry, obtaining via the computer a first plurality of completion candidates based on the partial text entry including obtaining the first plurality of completion candidates from a dictionary based on a set of predetermined metrics; and
(c) instructions for providing the first plurality of completion candidates to an internet appliance remote from the computer for display of the first plurality of completion candidates on a display device associated with the internet appliance, wherein the display device is remote from the computer.
107. The computer-readable medium of claim 106 , the computer-readable instructions further comprising:
(a) instructions for detecting modification of the partial text entry by the user via the computer; and
(b) instructions for obtaining a modified plurality of completion candidates from the dictionary based on the modified partial text entry, wherein each of the modified plurality of completion candidates includes a portion matching the modified partial text entry.
108. The computer-readable medium of claim 107 , the computer-readable instructions further comprising instructions for displaying the modified plurality of completion candidates on the display device associated with the internet appliance.
109. The computer-readable medium of claim 106 , the computer-readable instructions further comprising instructions for displaying the first plurality of completion candidates on the display device associated with the internet appliance.
110. The computer-readable medium of claim 106 , the computer-readable instructions further comprising instructions for receiving at least part of the partial text entry from an input device remote from the computer.
111. The computer-readable medium of claim 106 , the computer-readable instructions further comprising instructions for receiving at least part of the partial text entry from an input device associated with the internet appliance.
112. The computer-readable medium of claim 106 , the computer-readable instructions further comprising instructions for receiving at least part of the partial text entry from a remote control associated with the internet appliance.
113. The computer-readable medium of claim 106 , the computer-readable instructions further comprising instructions for receiving at least part of the partial text entry from a digitizing tablet associated with at least one of the computer and the internet appliance.
114. A method of supporting text entry via an input device, the method comprising:
(a) receiving via the input device user input associated with a partial text entry, the partial text entry comprising at least a first character;
(b) in response to receipt of user input associated with the first character of the partial text entry, obtaining a first plurality of completion candidates from a dictionary, wherein each of the first plurality of completion candidates comprises at least a portion matching the partial text entry; and
(c) providing the first plurality of completion candidates to a display device remote from the input device for display of the first plurality of completion candidates on the display device.
115. A computer-readable medium having computer-readable instructions for execution by a processing unit, the computer-readable instructions comprising:
(a) instructions for receiving a partial text entry via an input device, the partial text entry comprising at least a first character;
(b) instructions for, in response to receipt of the first character of the partial text entry, obtaining a first plurality of completion candidates from among a group of completion candidates stored in a dictionary, wherein each of the first plurality of completion candidates comprises at least a portion matching the partial text entry; and
(c) instructions for providing the first plurality of completion candidates to a display device remote from the input device for display of the first plurality of completion candidates on the display device.
116. A system for supporting computer-assisted text generation for remote display, the system comprising:
(a) a computer operable to communicate via the Internet;
(b) a computer-readable medium operable to communicate with the computer, the computer-readable medium comprising computer-readable instructions for directing the computer to assist with text generation by:
(i) receiving a user input signal via the computer, the user input signal associated with a partial text entry comprising at least a first character;
(ii) obtaining, via the computer, a first plurality of completion candidates from a dictionary stored on at least one of the computer and the computer-readable medium, wherein the first plurality of completion candidates are obtained from the dictionary based on a set of predetermined metrics; and
(iii) communicating, via the Internet, the first plurality of completion candidates to a display device remote from the computer for display of the first plurality of completion candidates on the display device.
117. The system of claim 116 , wherein the set of predetermined metrics comprises instructions for obtaining the first plurality of completion candidates from the dictionary on the basis of frequency values stored in the dictionary.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/871,904 US20080088599A1 (en) | 1999-03-18 | 2007-10-12 | Data entry for personal computing devices |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US27270099A | 1999-03-18 | 1999-03-18 | |
PCT/CA2000/000285 WO2000057265A1 (en) | 1999-03-18 | 2000-03-15 | Data entry for personal computing devices |
US09/631,101 US7293231B1 (en) | 1999-03-18 | 2000-08-01 | Data entry for personal computing devices |
US11/871,904 US20080088599A1 (en) | 1999-03-18 | 2007-10-12 | Data entry for personal computing devices |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/631,101 Continuation US7293231B1 (en) | 1999-03-18 | 2000-08-01 | Data entry for personal computing devices |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080088599A1 true US20080088599A1 (en) | 2008-04-17 |
Family
ID=34987821
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/631,101 Expired - Lifetime US7293231B1 (en) | 1999-03-18 | 2000-08-01 | Data entry for personal computing devices |
US11/134,759 Abandoned US20050223308A1 (en) | 1999-03-18 | 2005-05-19 | Data entry for personal computing devices |
US11/133,770 Expired - Lifetime US7716579B2 (en) | 1999-03-18 | 2005-05-19 | Data entry for personal computing devices |
US11/871,900 Abandoned US20080030481A1 (en) | 1999-03-18 | 2007-10-12 | Data entry for personal computing devices |
US11/871,887 Expired - Fee Related US7921361B2 (en) | 1999-03-18 | 2007-10-12 | Data entry for personal computing devices |
US11/871,904 Abandoned US20080088599A1 (en) | 1999-03-18 | 2007-10-12 | Data entry for personal computing devices |
Family Applications Before (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/631,101 Expired - Lifetime US7293231B1 (en) | 1999-03-18 | 2000-08-01 | Data entry for personal computing devices |
US11/134,759 Abandoned US20050223308A1 (en) | 1999-03-18 | 2005-05-19 | Data entry for personal computing devices |
US11/133,770 Expired - Lifetime US7716579B2 (en) | 1999-03-18 | 2005-05-19 | Data entry for personal computing devices |
US11/871,900 Abandoned US20080030481A1 (en) | 1999-03-18 | 2007-10-12 | Data entry for personal computing devices |
US11/871,887 Expired - Fee Related US7921361B2 (en) | 1999-03-18 | 2007-10-12 | Data entry for personal computing devices |
Country Status (1)
Country | Link |
---|---|
US (6) | US7293231B1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090189864A1 (en) * | 2008-01-30 | 2009-07-30 | International Business Machine Corporation | Self-adapting virtual small keyboard apparatus and method |
US20100053092A1 (en) * | 2008-08-26 | 2010-03-04 | Au Optronics Corporation | Control Method for Touch Screen Device |
US20110154193A1 (en) * | 2009-12-21 | 2011-06-23 | Nokia Corporation | Method and Apparatus for Text Input |
US20120066244A1 (en) * | 2010-09-15 | 2012-03-15 | Kazuomi Chiba | Name retrieval method and name retrieval apparatus |
US20120268409A1 (en) * | 2008-10-10 | 2012-10-25 | At&T Intellectual Property I, L.P. | Augmented i/o for limited form factor user-interfaces |
US20130265228A1 (en) * | 2012-04-05 | 2013-10-10 | Seiko Epson Corporation | Input device, display system and input method |
US8599173B2 (en) | 2008-10-23 | 2013-12-03 | At&T Intellectual Property I, L.P. | Tracking approaching or hovering objects for user interfaces |
Families Citing this family (260)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8479122B2 (en) | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US9239673B2 (en) | 1998-01-26 | 2016-01-19 | Apple Inc. | Gesturing with a multipoint sensing device |
US7834855B2 (en) | 2004-08-25 | 2010-11-16 | Apple Inc. | Wide touchpad on a portable computer |
US9292111B2 (en) | 1998-01-26 | 2016-03-22 | Apple Inc. | Gesturing with a multipoint sensing device |
US7614008B2 (en) | 2004-07-30 | 2009-11-03 | Apple Inc. | Operation of a computer with touch screen interface |
ES2202070T3 (en) | 1999-03-18 | 2004-04-01 | 602531 British Columbia Ltd. | DATA ENTRY FOR PERSONAL INFORMATIC DEVICES. |
US7293231B1 (en) * | 1999-03-18 | 2007-11-06 | British Columbia Ltd. | Data entry for personal computing devices |
EP1192716B1 (en) | 1999-05-27 | 2009-09-23 | Tegic Communications, Inc. | Keyboard system with automatic correction |
US7750891B2 (en) | 2003-04-09 | 2010-07-06 | Tegic Communications, Inc. | Selective input system based on tracking of motion parameters of an input device |
US7821503B2 (en) | 2003-04-09 | 2010-10-26 | Tegic Communications, Inc. | Touch screen and graphical user interface |
US7286115B2 (en) | 2000-05-26 | 2007-10-23 | Tegic Communications, Inc. | Directional input system with automatic correction |
US7610194B2 (en) * | 2002-07-18 | 2009-10-27 | Tegic Communications, Inc. | Dynamic database reordering system |
US7030863B2 (en) * | 2000-05-26 | 2006-04-18 | America Online, Incorporated | Virtual keyboard system with automatic correction |
US6922810B1 (en) * | 2000-03-07 | 2005-07-26 | Microsoft Corporation | Grammar-based automatic data completion and suggestion for user input |
US7392326B2 (en) * | 2001-02-16 | 2008-06-24 | Microsoft Corporation | Method for text entry in an electronic device |
JP4084582B2 (en) * | 2001-04-27 | 2008-04-30 | 俊司 加藤 | Touch type key input device |
US6938221B2 (en) * | 2001-11-30 | 2005-08-30 | Microsoft Corporation | User interface for stylus-based user input |
JP4061094B2 (en) * | 2002-03-15 | 2008-03-12 | インターナショナル・ビジネス・マシーンズ・コーポレーション | Speech recognition apparatus, speech recognition method and program thereof |
KR100377432B1 (en) * | 2002-03-29 | 2003-05-09 | 주식회사 네오패드 | Creation method for characters/words and the information and communication service method thereby |
JP2004038896A (en) * | 2002-06-28 | 2004-02-05 | Clarion Co Ltd | Display control means |
US7657423B1 (en) * | 2003-10-31 | 2010-02-02 | Google Inc. | Automatic completion of fragments of text |
US20050195159A1 (en) | 2004-02-23 | 2005-09-08 | Hunleth Frank A. | Keyboardless text entry |
GB0406451D0 (en) | 2004-03-23 | 2004-04-28 | Patel Sanjay | Keyboards |
JP4302568B2 (en) * | 2004-04-06 | 2009-07-29 | 本田技研工業株式会社 | Information retrieval device |
US20050246324A1 (en) * | 2004-04-30 | 2005-11-03 | Nokia Inc. | System and associated device, method, and computer program product for performing metadata-based searches |
CN100437441C (en) * | 2004-05-31 | 2008-11-26 | 诺基亚(中国)投资有限公司 | Method and apparatus for inputting Chinese characters and phrases |
US7836044B2 (en) | 2004-06-22 | 2010-11-16 | Google Inc. | Anticipated query generation and processing in a search engine |
US7487145B1 (en) | 2004-06-22 | 2009-02-03 | Google Inc. | Method and system for autocompletion using ranked results |
US8381135B2 (en) * | 2004-07-30 | 2013-02-19 | Apple Inc. | Proximity detector in handheld device |
US7653883B2 (en) | 2004-07-30 | 2010-01-26 | Apple Inc. | Proximity detector in handheld device |
US7895218B2 (en) | 2004-11-09 | 2011-02-22 | Veveo, Inc. | Method and system for performing searches for television content using reduced text input |
US20060101499A1 (en) * | 2004-11-09 | 2006-05-11 | Veveo, Inc. | Method and system for secure sharing, gifting, and purchasing of content on television and mobile devices |
US7499940B1 (en) * | 2004-11-11 | 2009-03-03 | Google Inc. | Method and system for URL autocompletion using ranked results |
KR100595694B1 (en) * | 2004-11-12 | 2006-07-03 | 엘지전자 주식회사 | Method for registering addiction phase in the mobile terminal |
US20060106769A1 (en) * | 2004-11-12 | 2006-05-18 | Gibbs Kevin A | Method and system for autocompletion for languages having ideographs and phonetic characters |
US8473869B2 (en) * | 2004-11-16 | 2013-06-25 | Koninklijke Philips Electronics N.V. | Touchless manipulation of images for regional enhancement |
US7561740B2 (en) * | 2004-12-10 | 2009-07-14 | Fuji Xerox Co., Ltd. | Systems and methods for automatic graphical sequence completion |
US8552984B2 (en) | 2005-01-13 | 2013-10-08 | 602531 British Columbia Ltd. | Method, system, apparatus and computer-readable media for directing input associated with keyboard-type device |
US7599830B2 (en) * | 2005-03-16 | 2009-10-06 | Research In Motion Limited | Handheld electronic device with reduced keyboard and associated method of providing quick text entry in a message |
US7561145B2 (en) * | 2005-03-18 | 2009-07-14 | Microsoft Corporation | Systems, methods, and computer-readable media for invoking an electronic ink or handwriting interface |
GB0505941D0 (en) * | 2005-03-23 | 2005-04-27 | Patel Sanjay | Human-to-mobile interfaces |
GB0505942D0 (en) | 2005-03-23 | 2005-04-27 | Patel Sanjay | Human to mobile interfaces |
US8185841B2 (en) * | 2005-05-23 | 2012-05-22 | Nokia Corporation | Electronic text input involving a virtual keyboard and word completion functionality on a touch-sensitive display screen |
US20070024646A1 (en) * | 2005-05-23 | 2007-02-01 | Kalle Saarinen | Portable electronic apparatus and associated method |
US7886233B2 (en) * | 2005-05-23 | 2011-02-08 | Nokia Corporation | Electronic text input involving word completion functionality for predicting word candidates for partial word inputs |
US9785329B2 (en) * | 2005-05-23 | 2017-10-10 | Nokia Technologies Oy | Pocket computer and associated methods |
US20060271886A1 (en) * | 2005-05-25 | 2006-11-30 | Wenstrand John S | Character entry system and method for electronic devices |
US8122034B2 (en) | 2005-06-30 | 2012-02-21 | Veveo, Inc. | Method and system for incremental search with reduced text entry where the relevance of results is a dynamically computed function of user input search string character count |
EP1907920A4 (en) * | 2005-07-27 | 2012-12-12 | Nokia Corp | Method and device for entering text |
US7831913B2 (en) * | 2005-07-29 | 2010-11-09 | Microsoft Corporation | Selection-based item tagging |
WO2007019610A1 (en) | 2005-08-12 | 2007-02-22 | Kannuu Pty Ltd | Improved process and apparatus for selecting an item from a database |
US7788266B2 (en) | 2005-08-26 | 2010-08-31 | Veveo, Inc. | Method and system for processing ambiguous, multi-term search queries |
US7779011B2 (en) | 2005-08-26 | 2010-08-17 | Veveo, Inc. | Method and system for dynamically processing ambiguous, reduced text search queries and highlighting results thereof |
US7737999B2 (en) * | 2005-08-26 | 2010-06-15 | Veveo, Inc. | User interface for visual cooperation between text input and display device |
TWI313430B (en) * | 2005-09-16 | 2009-08-11 | Input method for touch screen | |
TW200713060A (en) * | 2005-09-30 | 2007-04-01 | Primax Electronics Ltd | Adaptive input method for touch screen |
US20070076862A1 (en) * | 2005-09-30 | 2007-04-05 | Chatterjee Manjirnath A | System and method for abbreviated text messaging |
US8459885B2 (en) | 2005-10-15 | 2013-06-11 | Byung Kon Min | Clock face keyboard |
US20070086825A1 (en) * | 2005-10-15 | 2007-04-19 | Min Byung K | Circular keyboard |
US20070094024A1 (en) * | 2005-10-22 | 2007-04-26 | International Business Machines Corporation | System and method for improving text input in a shorthand-on-keyboard interface |
US7644054B2 (en) | 2005-11-23 | 2010-01-05 | Veveo, Inc. | System and method for finding desired results by incremental search using an ambiguous keypad with the input containing orthographic and typographic errors |
US7716603B2 (en) * | 2005-12-09 | 2010-05-11 | Sony Corporation | On screen display for alpha-numeric input |
WO2007089663A2 (en) * | 2006-01-27 | 2007-08-09 | Veveo, Inc. | System and method for incremental user query on handheld device |
US20070186159A1 (en) * | 2006-02-08 | 2007-08-09 | Denso International America, Inc. | Universal text input method for different languages |
US8380726B2 (en) | 2006-03-06 | 2013-02-19 | Veveo, Inc. | Methods and systems for selecting and presenting content based on a comparison of preference signatures from multiple users |
US20070226189A1 (en) * | 2006-03-23 | 2007-09-27 | John William Piekos | Dynamically searching and browsing product catalogs with reduced user gestures |
US8073860B2 (en) | 2006-03-30 | 2011-12-06 | Veveo, Inc. | Method and system for incrementally selecting and providing relevant search engines in response to a user query |
US7956844B2 (en) | 2006-04-07 | 2011-06-07 | Research In Motion Limited | Handheld electronic device providing a learning function to facilitate correction of erroneous text entry in environment of text requiring multiple sequential actuations of the same key, and associated method |
US7683885B2 (en) * | 2006-04-07 | 2010-03-23 | Research In Motion Ltd. | Handheld electronic device providing proposed corrected input in response to erroneous text entry in environment of text requiring multiple sequential actuations of the same key, and associated method |
JP5138175B2 (en) * | 2006-04-12 | 2013-02-06 | 任天堂株式会社 | Character input program, character input device, character input system, and character input method |
JP5193183B2 (en) | 2006-04-20 | 2013-05-08 | ベベオ,インク. | User interface method and system for selecting and presenting content |
US7899251B2 (en) * | 2006-06-05 | 2011-03-01 | Microsoft Corporation | Balancing out-of-dictionary and in-dictionary recognition scores |
EP1868064B1 (en) * | 2006-06-14 | 2018-12-19 | BlackBerry Limited | Handheld electronic device with assisted text entry using existing message thread, and associated method |
US7778957B2 (en) | 2006-06-14 | 2010-08-17 | Research In Motion Limited | Handheld electronic device with assisted text entry using existing message thread, and associated method |
US7979469B2 (en) * | 2006-06-14 | 2011-07-12 | Research In Motion Limited | Handheld electronic device and associated method employing a multiple-axis input device and arranging words of an existing message thread in various linguistic categories for selection during text entry |
US20080141125A1 (en) * | 2006-06-23 | 2008-06-12 | Firooz Ghassabian | Combined data entry systems |
US20080046839A1 (en) * | 2006-06-27 | 2008-02-21 | Pixtel Media Technology (P) Ltd. | Input mode switching methods and devices utilizing the same |
US7400288B2 (en) * | 2006-07-15 | 2008-07-15 | Rogitz John L | Target visualization system |
US20100289750A1 (en) * | 2006-08-04 | 2010-11-18 | Hyung Gi Kim | Touch Type Character Input Device |
US7934156B2 (en) * | 2006-09-06 | 2011-04-26 | Apple Inc. | Deletion gestures on a portable multifunction device |
JP5161883B2 (en) | 2006-09-14 | 2013-03-13 | ベベオ,インク. | Method and system for dynamically rearranging search results into hierarchically organized concept clusters |
KR101319871B1 (en) * | 2006-09-29 | 2013-10-18 | 엘지전자 주식회사 | Apparatus of coordinates cognition and Method for generation of key code on the apparatus thereof |
WO2008045690A2 (en) | 2006-10-06 | 2008-04-17 | Veveo, Inc. | Linear character selection display interface for ambiguous text input |
WO2008063987A2 (en) | 2006-11-13 | 2008-05-29 | Veveo, Inc. | Method of and system for selecting and presenting content based on user identification |
US8161395B2 (en) * | 2006-11-13 | 2012-04-17 | Cisco Technology, Inc. | Method for secure data entry in an application |
US8970501B2 (en) | 2007-01-03 | 2015-03-03 | Apple Inc. | Proximity and multi-touch sensor detection and demodulation |
US8091045B2 (en) * | 2007-01-07 | 2012-01-03 | Apple Inc. | System and method for managing lists |
US20080168381A1 (en) * | 2007-01-08 | 2008-07-10 | Aol Llc | Non-modal search box with text-entry ribbon for a portable media player |
US8201087B2 (en) | 2007-02-01 | 2012-06-12 | Tegic Communications, Inc. | Spell-check for a keyboard system with automatic correction |
US8225203B2 (en) | 2007-02-01 | 2012-07-17 | Nuance Communications, Inc. | Spell-check for a keyboard system with automatic correction |
US7895518B2 (en) * | 2007-04-27 | 2011-02-22 | Shapewriter Inc. | System and method for preview and selection of words |
EP2156280A4 (en) * | 2007-05-07 | 2014-09-10 | Fourthwall Media Inc | Context-dependent prediction and learning with a universal re-entrant predictive text input software component |
DE102007023313A1 (en) * | 2007-05-16 | 2008-11-20 | Navigon Ag | Electronic display device and method for operating a display device |
US8549424B2 (en) | 2007-05-25 | 2013-10-01 | Veveo, Inc. | System and method for text disambiguation and context designation in incremental search |
WO2008148009A1 (en) | 2007-05-25 | 2008-12-04 | Veveo, Inc. | Method and system for unified searching across and within multiple documents |
JP2008293403A (en) * | 2007-05-28 | 2008-12-04 | Sony Ericsson Mobilecommunications Japan Inc | Character input device, portable terminal and character input program |
US20080303793A1 (en) * | 2007-06-05 | 2008-12-11 | Microsoft Corporation | On-screen keyboard |
US9251137B2 (en) * | 2007-06-21 | 2016-02-02 | International Business Machines Corporation | Method of text type-ahead |
US9043727B2 (en) * | 2007-07-26 | 2015-05-26 | Microsoft Technology Licensing, Llc | Visualization techniques for imprecise statement completion |
US20090037813A1 (en) * | 2007-07-31 | 2009-02-05 | Palo Alto Research Center Incorporated | Space-constrained marking menus for mobile devices |
US8146003B2 (en) * | 2007-08-17 | 2012-03-27 | Microsoft Corporation | Efficient text input for game controllers and handheld devices |
US20090058823A1 (en) * | 2007-09-04 | 2009-03-05 | Apple Inc. | Virtual Keyboards in Multi-Language Environment |
US8661340B2 (en) | 2007-09-13 | 2014-02-25 | Apple Inc. | Input methods for device having multi-language environment |
CN100592249C (en) * | 2007-09-21 | 2010-02-24 | 上海汉翔信息技术有限公司 | Method for quickly inputting related term |
US8296676B2 (en) * | 2007-10-15 | 2012-10-23 | Harman International Industries, Incorporated | System for a text speller |
US8504566B2 (en) * | 2007-11-02 | 2013-08-06 | Research In Motion Limited | Method of providing a number of search results for a handheld electronic device, and system and handheld electronic device employing the same |
US8839123B2 (en) * | 2007-11-19 | 2014-09-16 | Red Hat, Inc. | Generating a visual user interface |
US8943539B2 (en) | 2007-11-21 | 2015-01-27 | Rovi Guides, Inc. | Enabling a friend to remotely modify user data |
KR100864749B1 (en) * | 2007-11-22 | 2008-10-22 | 김연수 | Characters input method |
KR100958935B1 (en) * | 2007-12-04 | 2010-05-19 | 엔에이치엔(주) | Method and system for providing and using editable personal dictionary |
US8316035B2 (en) * | 2008-01-16 | 2012-11-20 | International Business Machines Corporation | Systems and arrangements of text type-ahead |
US8667413B2 (en) * | 2008-02-14 | 2014-03-04 | Creative Technology Ltd | Apparatus and method for information input in an electronic device with display |
US20090213079A1 (en) * | 2008-02-26 | 2009-08-27 | Microsoft Corporation | Multi-Purpose Input Using Remote Control |
US8908973B2 (en) * | 2008-03-04 | 2014-12-09 | Apple Inc. | Handwritten character recognition interface |
US8289283B2 (en) * | 2008-03-04 | 2012-10-16 | Apple Inc. | Language input interface on a device |
TWI484401B (en) * | 2008-04-24 | 2015-05-11 | 宏達國際電子股份有限公司 | Electronic device and automatically hiding keypad method and digital data storage media |
US8359532B2 (en) * | 2008-04-28 | 2013-01-22 | International Business Machines Corporation | Text type-ahead |
US9355090B2 (en) | 2008-05-30 | 2016-05-31 | Apple Inc. | Identification of candidate characters for text input |
US8542927B2 (en) * | 2008-06-26 | 2013-09-24 | Microsoft Corporation | Character auto-completion for online east asian handwriting input |
US8312032B2 (en) | 2008-07-10 | 2012-11-13 | Google Inc. | Dictionary suggestions for partial user entries |
KR100948124B1 (en) * | 2008-08-14 | 2010-03-18 | 강윤기 | Method of inputting words |
US8234219B2 (en) * | 2008-09-09 | 2012-07-31 | Applied Systems, Inc. | Method, system and apparatus for secure data editing |
US8769427B2 (en) | 2008-09-19 | 2014-07-01 | Google Inc. | Quick gesture input |
EP2350779A4 (en) * | 2008-11-25 | 2018-01-10 | Jeffrey R. Spetalnick | Methods and systems for improved data input, compression, recognition, correction, and translation through frequency-based language analysis |
US20100149190A1 (en) * | 2008-12-11 | 2010-06-17 | Nokia Corporation | Method, apparatus and computer program product for providing an input order independent character input mechanism |
US8669941B2 (en) * | 2009-01-05 | 2014-03-11 | Nuance Communications, Inc. | Method and apparatus for text entry |
US8296680B2 (en) | 2009-01-15 | 2012-10-23 | Research In Motion Limited | Method and handheld electronic device for displaying and selecting diacritics |
US8407599B1 (en) * | 2009-01-30 | 2013-03-26 | Sprint Communications Company L.P. | Address book extension |
RU2011134935A (en) * | 2009-02-04 | 2013-03-10 | Кейлесс Системз Лтд. | DATA INPUT SYSTEM |
US8605039B2 (en) * | 2009-03-06 | 2013-12-10 | Zimpl Ab | Text input |
US8564541B2 (en) * | 2009-03-16 | 2013-10-22 | Apple Inc. | Zhuyin input interface on a device |
US8850472B2 (en) * | 2009-04-01 | 2014-09-30 | Nuance Communications, Inc. | Method and apparatus for customizing user experience |
US20100293457A1 (en) * | 2009-05-15 | 2010-11-18 | Gemstar Development Corporation | Systems and methods for alphanumeric navigation and input |
CN102483752A (en) | 2009-06-03 | 2012-05-30 | 谷歌公司 | Autocompletion for partially entered query |
US20110041177A1 (en) * | 2009-08-14 | 2011-02-17 | Microsoft Corporation | Context-sensitive input user interface |
US20110042102A1 (en) * | 2009-08-18 | 2011-02-24 | Frank's International, Inc. | Method of and kit for installing a centralizer on a pipe segment |
US9110515B2 (en) | 2009-08-19 | 2015-08-18 | Nuance Communications, Inc. | Method and apparatus for text input |
US9166714B2 (en) | 2009-09-11 | 2015-10-20 | Veveo, Inc. | Method of and system for presenting enriched video viewing analytics |
US20110087961A1 (en) * | 2009-10-11 | 2011-04-14 | A.I Type Ltd. | Method and System for Assisting in Typing |
US9047052B2 (en) * | 2009-12-22 | 2015-06-02 | At&T Intellectual Property I, L.P. | Simplified control input to a mobile device |
US20110191330A1 (en) | 2010-02-04 | 2011-08-04 | Veveo, Inc. | Method of and System for Enhanced Content Discovery Based on Network and Device Access Behavior |
US8782556B2 (en) | 2010-02-12 | 2014-07-15 | Microsoft Corporation | User-centric soft keyboard predictive technologies |
US20130111380A1 (en) * | 2010-04-02 | 2013-05-02 | Symantec Corporation | Digital whiteboard implementation |
US8352468B2 (en) * | 2010-04-07 | 2013-01-08 | Apple Inc. | Top search hits based on learned user preferences |
US8327296B2 (en) * | 2010-04-16 | 2012-12-04 | Google Inc. | Extended keyboard user interface |
JP5791236B2 (en) | 2010-05-10 | 2015-10-07 | キヤノン株式会社 | Information processing apparatus, information processing apparatus control method, and program |
US8949725B1 (en) * | 2010-05-27 | 2015-02-03 | Speaktoit, Inc. | Chat information system for portable electronic devices |
US8577915B2 (en) | 2010-09-10 | 2013-11-05 | Veveo, Inc. | Method of and system for conducting personalized federated search and presentation of results therefrom |
WO2012037200A2 (en) | 2010-09-15 | 2012-03-22 | Spetalnick Jeffrey R | Methods of and systems for reducing keyboard data entry errors |
US8730188B2 (en) | 2010-12-23 | 2014-05-20 | Blackberry Limited | Gesture input on a portable electronic device and method of controlling the same |
EP2469384A1 (en) * | 2010-12-23 | 2012-06-27 | Research In Motion Limited | Portable electronic device and method of controlling same |
FR2971066B1 (en) | 2011-01-31 | 2013-08-23 | Nanotec Solution | THREE-DIMENSIONAL MAN-MACHINE INTERFACE. |
US20120200508A1 (en) * | 2011-02-07 | 2012-08-09 | Research In Motion Limited | Electronic device with touch screen display and method of facilitating input at the electronic device |
US9037459B2 (en) * | 2011-03-14 | 2015-05-19 | Apple Inc. | Selection of text prediction results by an accessory |
US8719724B2 (en) | 2011-03-16 | 2014-05-06 | Honeywell International Inc. | Method for enlarging characters displayed on an adaptive touch screen key pad |
WO2012132767A1 (en) * | 2011-03-31 | 2012-10-04 | 株式会社エヌ・ティ・ティ・ドコモ | Mobile terminal |
US9116614B1 (en) * | 2011-04-13 | 2015-08-25 | Google Inc. | Determining pointer and scroll gestures on a touch-sensitive input device |
US8316319B1 (en) | 2011-05-16 | 2012-11-20 | Google Inc. | Efficient selection of characters and commands based on movement-inputs at a user-interface |
US9633109B2 (en) | 2011-05-17 | 2017-04-25 | Etsy, Inc. | Systems and methods for guided construction of a search query in an electronic commerce environment |
ES2735273T3 (en) * | 2011-05-23 | 2019-12-17 | Huawei Device Co Ltd | Input method, input device and terminal device |
EP2715499B1 (en) * | 2011-05-23 | 2020-09-02 | Microsoft Technology Licensing, LLC | Invisible control |
US8656315B2 (en) | 2011-05-27 | 2014-02-18 | Google Inc. | Moving a graphical selector |
US8826190B2 (en) | 2011-05-27 | 2014-09-02 | Google Inc. | Moving a graphical selector |
US20150040055A1 (en) * | 2011-06-07 | 2015-02-05 | Bowen Zhao | Dynamic soft keyboard for touch screen device |
US8478777B2 (en) * | 2011-10-25 | 2013-07-02 | Google Inc. | Gesture-based search |
US9778841B2 (en) | 2012-02-10 | 2017-10-03 | Hand Held Products, Inc. | Apparatus having random ordered keypad |
US9244612B1 (en) | 2012-02-16 | 2016-01-26 | Google Inc. | Key selection of a graphical keyboard based on user input posture |
US8667414B2 (en) | 2012-03-23 | 2014-03-04 | Google Inc. | Gestural input at a virtual keyboard |
KR101946886B1 (en) * | 2012-04-26 | 2019-02-13 | 엘지이노텍 주식회사 | Apparatus and method thereof for inputing information |
US8484573B1 (en) | 2012-05-23 | 2013-07-09 | Google Inc. | Predictive virtual keyboard |
USD722079S1 (en) * | 2012-06-15 | 2015-02-03 | Dassault Systemes | Transitional image for a portion of a display screen |
KR102091710B1 (en) * | 2012-08-28 | 2020-04-14 | 삼성전자주식회사 | Coordinate sensing apparatus and method for controlling thereof |
CN103677299A (en) * | 2012-09-12 | 2014-03-26 | 深圳市世纪光速信息技术有限公司 | Method and device for achievement of intelligent association in input method and terminal device |
CN104685451B (en) | 2012-09-18 | 2018-04-17 | 谷歌有限责任公司 | Posture adapts to selection |
US9081482B1 (en) | 2012-09-18 | 2015-07-14 | Google Inc. | Text input suggestion ranking |
US8656296B1 (en) | 2012-09-27 | 2014-02-18 | Google Inc. | Selection of characters in a string of characters |
US9021380B2 (en) | 2012-10-05 | 2015-04-28 | Google Inc. | Incremental multi-touch gesture recognition |
US8782549B2 (en) | 2012-10-05 | 2014-07-15 | Google Inc. | Incremental feature-based gesture-keyboard decoding |
US9304683B2 (en) * | 2012-10-10 | 2016-04-05 | Microsoft Technology Licensing, Llc | Arced or slanted soft input panels |
US8914751B2 (en) | 2012-10-16 | 2014-12-16 | Google Inc. | Character deletion during keyboard gesture |
US8713433B1 (en) | 2012-10-16 | 2014-04-29 | Google Inc. | Feature-based autocorrection |
US9569107B2 (en) | 2012-10-16 | 2017-02-14 | Google Inc. | Gesture keyboard with gesture cancellation |
US8850350B2 (en) | 2012-10-16 | 2014-09-30 | Google Inc. | Partial gesture text entry |
US8701032B1 (en) | 2012-10-16 | 2014-04-15 | Google Inc. | Incremental multi-word recognition |
US8612213B1 (en) | 2012-10-16 | 2013-12-17 | Google Inc. | Correction of errors in character strings that include a word delimiter |
US9557818B2 (en) | 2012-10-16 | 2017-01-31 | Google Inc. | Contextually-specific automatic separators |
US8843845B2 (en) | 2012-10-16 | 2014-09-23 | Google Inc. | Multi-gesture text input prediction |
US9304595B2 (en) | 2012-10-19 | 2016-04-05 | Google Inc. | Gesture-keyboard decoding using gesture path deviation |
US8994681B2 (en) | 2012-10-19 | 2015-03-31 | Google Inc. | Decoding imprecise gestures for gesture-keyboards |
US8704792B1 (en) | 2012-10-19 | 2014-04-22 | Google Inc. | Density-based filtering of gesture events associated with a user interface of a computing device |
US8819574B2 (en) | 2012-10-22 | 2014-08-26 | Google Inc. | Space prediction for text input |
US9804777B1 (en) | 2012-10-23 | 2017-10-31 | Google Inc. | Gesture-based text selection |
US8806384B2 (en) | 2012-11-02 | 2014-08-12 | Google Inc. | Keyboard gestures for character string replacement |
US11157436B2 (en) | 2012-11-20 | 2021-10-26 | Samsung Electronics Company, Ltd. | Services associated with wearable electronic device |
US10551928B2 (en) | 2012-11-20 | 2020-02-04 | Samsung Electronics Company, Ltd. | GUI transitions on wearable electronic device |
US8994827B2 (en) | 2012-11-20 | 2015-03-31 | Samsung Electronics Co., Ltd | Wearable electronic device |
US10185416B2 (en) | 2012-11-20 | 2019-01-22 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving movement of device |
US11237719B2 (en) | 2012-11-20 | 2022-02-01 | Samsung Electronics Company, Ltd. | Controlling remote electronic device with wearable electronic device |
US11372536B2 (en) | 2012-11-20 | 2022-06-28 | Samsung Electronics Company, Ltd. | Transition and interaction model for wearable electronic device |
US10423214B2 (en) | 2012-11-20 | 2019-09-24 | Samsung Electronics Company, Ltd | Delegating processing from wearable electronic device |
US8832589B2 (en) | 2013-01-15 | 2014-09-09 | Google Inc. | Touch keyboard using language and spatial models |
US9047268B2 (en) | 2013-01-31 | 2015-06-02 | Google Inc. | Character and word level language models for out-of-vocabulary text input |
US10228819B2 (en) | 2013-02-04 | 2019-03-12 | 602531 British Columbia Ltd. | Method, system, and apparatus for executing an action related to user selection |
US9454240B2 (en) | 2013-02-05 | 2016-09-27 | Google Inc. | Gesture keyboard input of non-dictionary character strings |
FR3002052B1 (en) | 2013-02-14 | 2016-12-09 | Fogale Nanotech | METHOD AND DEVICE FOR NAVIGATING A DISPLAY SCREEN AND APPARATUS COMPRISING SUCH A NAVIGATION |
US9697281B1 (en) | 2013-02-26 | 2017-07-04 | Fast Simon, Inc. | Autocomplete search methods |
US8782550B1 (en) | 2013-02-28 | 2014-07-15 | Google Inc. | Character string replacement |
US8701050B1 (en) | 2013-03-08 | 2014-04-15 | Google Inc. | Gesture completion path display for gesture-based keyboards |
US8825474B1 (en) * | 2013-04-16 | 2014-09-02 | Google Inc. | Text suggestion output using past interaction data |
US9665246B2 (en) * | 2013-04-16 | 2017-05-30 | Google Inc. | Consistent text suggestion output |
US9122376B1 (en) * | 2013-04-18 | 2015-09-01 | Google Inc. | System for improving autocompletion of text input |
US8887103B1 (en) | 2013-04-22 | 2014-11-11 | Google Inc. | Dynamically-positioned character string suggestions for gesture typing |
US8756499B1 (en) | 2013-04-29 | 2014-06-17 | Google Inc. | Gesture keyboard input of non-dictionary character strings using substitute scoring |
US9081500B2 (en) | 2013-05-03 | 2015-07-14 | Google Inc. | Alternative hypothesis error correction for gesture typing |
US8997013B2 (en) * | 2013-05-31 | 2015-03-31 | Google Inc. | Multiple graphical keyboards for continuous gesture input |
CN103345308B (en) * | 2013-06-08 | 2016-02-24 | Baidu Online Network Technology (Beijing) Co., Ltd. (百度在线网络技术(北京)有限公司) | Method and apparatus for inputting amendments |
USD778293S1 (en) * | 2013-07-02 | 2017-02-07 | Microsoft Corporation | Display screen with graphical user interface |
USD746831S1 (en) | 2013-09-10 | 2016-01-05 | Apple Inc. | Display screen or portion thereof with graphical user interface |
US10042543B2 (en) * | 2013-09-18 | 2018-08-07 | Lenovo (Singapore) Pte. Ltd. | Indicating a word length using an input device |
USD829221S1 (en) | 2014-02-12 | 2018-09-25 | Google Llc | Display screen with animated graphical user interface |
US10691332B2 (en) | 2014-02-28 | 2020-06-23 | Samsung Electronics Company, Ltd. | Text input on an interactive display |
DE202015009325U1 (en) * | 2014-03-04 | 2017-02-22 | Google Inc. | Schematic representation of geographical locations |
US9983854B2 (en) | 2014-04-21 | 2018-05-29 | LogMeIn, Inc. | Managing and synchronizing views in multi-user application with a canvas |
US9891794B2 (en) | 2014-04-25 | 2018-02-13 | Dropbox, Inc. | Browsing and selecting content items based on user gestures |
US10089346B2 (en) | 2014-04-25 | 2018-10-02 | Dropbox, Inc. | Techniques for collapsing views of content items in a graphical user interface |
TWI603255B (en) * | 2014-05-05 | 2017-10-21 | 志勇無限創意有限公司 | Handheld device and input method thereof |
WO2016011190A1 (en) * | 2014-07-15 | 2016-01-21 | T6 Health Systems Llc | Healthcare information analysis and graphical display presentation system |
US10534532B2 (en) | 2014-08-08 | 2020-01-14 | Samsung Electronics Co., Ltd. | Electronic device and method for processing letter input in electronic device |
KR20160029509A (en) * | 2014-09-05 | 2016-03-15 | 삼성전자주식회사 | Electronic apparatus and application executing method thereof |
US20160092102A1 (en) * | 2014-09-25 | 2016-03-31 | Georgeta Costina Johnson | Smartphone screen touch round keyboard with or without swift, with or without vowels |
USD739476S1 (en) * | 2014-11-11 | 2015-09-22 | William Linden | Puzzle |
US10318575B2 (en) * | 2014-11-14 | 2019-06-11 | Zorroa Corporation | Systems and methods of building and using an image catalog |
US10108335B2 (en) | 2015-06-05 | 2018-10-23 | Apple Inc. | Touch-based interactive learning environment |
CN106293444B (en) * | 2015-06-25 | 2020-07-03 | 小米科技有限责任公司 | Mobile terminal, display control method and device |
US10121471B2 (en) * | 2015-06-29 | 2018-11-06 | Amazon Technologies, Inc. | Language model speech endpointing |
US9996185B2 (en) * | 2015-07-31 | 2018-06-12 | Lenovo (Singapore) Pte. Ltd. | Preventing the automatic display of an onscreen keyboard |
US10637986B2 (en) | 2016-06-10 | 2020-04-28 | Apple Inc. | Displaying and updating a set of application views |
USD803247S1 (en) * | 2016-06-29 | 2017-11-21 | Symantec Corporation | Display screen with graphical user interface |
US20180101599A1 (en) * | 2016-10-08 | 2018-04-12 | Microsoft Technology Licensing, Llc | Interactive context-based text completions |
USD853408S1 (en) * | 2017-03-31 | 2019-07-09 | Harman International Industries, Incorporated | Display screen or portion thereof with graphical user interface |
US10671181B2 (en) * | 2017-04-03 | 2020-06-02 | Microsoft Technology Licensing, Llc | Text entry interface |
USD838729S1 (en) * | 2017-11-21 | 2019-01-22 | Salvatore Guerrieri | Display screen with graphical user interface |
JP7053995B2 (en) * | 2018-04-16 | 2022-04-13 | 富士通株式会社 | Optimization device and control method of optimization device |
JP7243109B2 (en) * | 2018-10-02 | 2023-03-22 | カシオ計算機株式会社 | ELECTRONIC DEVICE, CONTROL METHOD AND PROGRAM FOR ELECTRONIC DEVICE |
US11406286B2 (en) | 2018-10-11 | 2022-08-09 | Masimo Corporation | Patient monitoring device with improved user interface |
USD1041511S1 (en) | 2018-10-11 | 2024-09-10 | Masimo Corporation | Display screen or portion thereof with a graphical user interface |
USD998630S1 (en) * | 2018-10-11 | 2023-09-12 | Masimo Corporation | Display screen or portion thereof with a graphical user interface |
KR102527892B1 (en) * | 2018-11-26 | 2023-05-02 | 삼성전자주식회사 | Electronic device for providing predictive word and operating method thereof |
US11131967B2 (en) | 2019-05-06 | 2021-09-28 | Apple Inc. | Clock faces for an electronic device |
US11568307B2 (en) * | 2019-05-20 | 2023-01-31 | International Business Machines Corporation | Data augmentation for text-based AI applications |
KR20210016752A (en) * | 2019-08-05 | 2021-02-17 | 윤현진 | English input keyboard for critically ill patients |
USD924912S1 (en) | 2019-09-09 | 2021-07-13 | Apple Inc. | Display screen or portion thereof with graphical user interface |
US11727284B2 (en) | 2019-12-12 | 2023-08-15 | Business Objects Software Ltd | Interpretation of machine learning results using feature analysis |
US20210192376A1 (en) * | 2019-12-23 | 2021-06-24 | Sap Se | Automated, progressive explanations of machine learning results |
CN113448430B (en) * | 2020-03-26 | 2023-02-28 | 中移(成都)信息通信科技有限公司 | Text error correction method, device, equipment and computer readable storage medium |
US11580455B2 (en) | 2020-04-01 | 2023-02-14 | Sap Se | Facilitating machine learning configuration |
US11783198B2 (en) * | 2020-04-03 | 2023-10-10 | Baidu Usa Llc | Estimating the implicit likelihoods of generative adversarial networks |
KR20230039741A (en) * | 2020-07-24 | 2023-03-21 | 아길리스 아이즈프리 터치스크린 키보즈 엘티디 | Adaptive touchscreen keypad with dead zone |
US20230333867A1 (en) * | 2022-04-18 | 2023-10-19 | Celligence International Llc | Method and computing apparatus for operating a form-based interface |
US11893048B1 (en) * | 2023-01-17 | 2024-02-06 | VelocityEHS Holdings, Inc. | Automated indexing and extraction of multiple information fields in digital records |
Citations (98)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3644898A (en) * | 1970-04-30 | 1972-02-22 | United Aircraft Corp | Information association through logical functions derived from language |
US4211497A (en) * | 1974-03-01 | 1980-07-08 | Montgomery Edward B | Data input system |
US4330845A (en) * | 1979-12-31 | 1982-05-18 | International Business Machines Corporation | Guess-ahead feature for a keyboard-display terminal data input system |
US4396992A (en) * | 1980-04-08 | 1983-08-02 | Sony Corporation | Word processor |
US4471459A (en) * | 1981-09-30 | 1984-09-11 | System Development Corp. | Digital data processing method and means for word classification by pattern analysis |
US4499553A (en) * | 1981-09-30 | 1985-02-12 | Dickinson Robert V | Locating digital coded words which are both acceptable misspellings and acceptable inflections of digital coded query words |
US4648044A (en) * | 1984-06-06 | 1987-03-03 | Teknowledge, Inc. | Basic expert system tool |
US4689768A (en) * | 1982-06-30 | 1987-08-25 | International Business Machines Corporation | Spelling verification system with immediate operator alerts to non-matches between inputted words and words stored in plural dictionary memories |
US4730252A (en) * | 1985-09-24 | 1988-03-08 | International Business Machines Corp. | Document composition from parts inventory |
US4744050A (en) * | 1984-06-26 | 1988-05-10 | Hitachi, Ltd. | Method for automatically registering frequently used phrases |
US4774666A (en) * | 1985-05-14 | 1988-09-27 | Sharp Kabushiki Kaisha | Translating apparatus |
US4807181A (en) * | 1986-06-02 | 1989-02-21 | Smith Corona Corporation | Dictionary memory with visual scanning from a selectable starting point |
US4847766A (en) * | 1988-01-05 | 1989-07-11 | Smith Corona Corporation | Dictionary typewriter with correction of commonly confused words |
US4891786A (en) * | 1983-02-22 | 1990-01-02 | Goldwasser Eric P | Stroke typing system |
US5040113A (en) * | 1987-01-28 | 1991-08-13 | Mickunas Marshall D | Data manipulation program |
US5096423A (en) * | 1985-12-11 | 1992-03-17 | Goldwasser Eric P | Computer system for teaching abbreviations for text and data processing functions |
US5203704A (en) * | 1990-12-21 | 1993-04-20 | Mccloud Seth R | Method of communication using pointing vector gestures and mnemonic devices to assist in learning point vector gestures |
US5218536A (en) * | 1988-05-25 | 1993-06-08 | Franklin Electronic Publishers, Incorporated | Electronic spelling machine having ordered candidate words |
US5220649A (en) * | 1991-03-20 | 1993-06-15 | Forcier Mitchell D | Script/binary-encoded-character processing method and system with moving space insertion mode |
US5220652A (en) * | 1986-07-21 | 1993-06-15 | Rowley Blair A | Computer application programs data input interface for handicapped persons responsive to multiple push buttons for selecting data stored in binary tree |
US5297041A (en) * | 1990-06-11 | 1994-03-22 | Semantic Compaction Systems | Predictive scanning input system for rapid selection of auditory and visual indicators |
US5305205A (en) * | 1990-10-23 | 1994-04-19 | Weber Maria L | Computer-assisted transcription apparatus |
US5327161A (en) * | 1989-08-09 | 1994-07-05 | Microtouch Systems, Inc. | System and method for emulating a mouse input device with a touchpad input device |
US5329609A (en) * | 1990-07-31 | 1994-07-12 | Fujitsu Limited | Recognition apparatus with function of displaying plural recognition candidates |
US5347295A (en) * | 1990-10-31 | 1994-09-13 | Go Corporation | Control of a computer through a position-sensed stylus |
US5392447A (en) * | 1992-01-10 | 1995-02-21 | Eastman Kodak Company | Image-based electronic pocket organizer with integral scanning unit |
US5487616A (en) * | 1995-06-01 | 1996-01-30 | Jean D. Ichbiah | Method for designing an ergonomic one-finger keyboard and apparatus therefor |
US5500935A (en) * | 1993-12-30 | 1996-03-19 | Xerox Corporation | Apparatus and method for translating graphic objects and commands with direct touch input in a touch based input system |
US5543818A (en) * | 1994-05-13 | 1996-08-06 | Sony Corporation | Method and apparatus for entering text using an input device having a small number of keys |
US5543591A (en) * | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition |
US5545857A (en) * | 1994-07-27 | 1996-08-13 | Samsung Electronics Co. Ltd. | Remote control method and apparatus thereof |
US5559942A (en) * | 1993-05-10 | 1996-09-24 | Apple Computer, Inc. | Method and apparatus for providing a note for an application program |
US5594640A (en) * | 1993-08-02 | 1997-01-14 | Apple Computer, Incorporated | Method and apparatus for correcting words |
US5596699A (en) * | 1994-02-02 | 1997-01-21 | Driskell; Stanley W. | Linear-viewing/radial-selection graphic for menu display |
US5606674A (en) * | 1995-01-03 | 1997-02-25 | Intel Corporation | Graphical user interface for transferring data between applications that support different metaphors |
US5621641A (en) * | 1988-12-21 | 1997-04-15 | Freeman; Alfred B. | Computer assisted text system |
US5623406A (en) * | 1995-03-06 | 1997-04-22 | Jean D. Ichbiah | Method and system for entering text in computer equipment |
US5629733A (en) * | 1994-11-29 | 1997-05-13 | News America Publications, Inc. | Electronic television program guide schedule system and method with display and search of program listings by title |
US5649104A (en) * | 1993-03-19 | 1997-07-15 | Ncr Corporation | System for allowing user of any computer to draw image over that generated by the host computer and replicating the drawn image to other computers |
US5649223A (en) * | 1988-12-21 | 1997-07-15 | Freeman; Alfred B. | Word based text producing system |
US5657397A (en) * | 1985-10-10 | 1997-08-12 | Bokser; Mindy R. | Preprocessing means for use in a pattern classification system |
US5666139A (en) * | 1992-10-15 | 1997-09-09 | Advanced Pen Technologies, Inc. | Pen-based computer copy editing apparatus and method for manuscripts |
US5724457A (en) * | 1994-06-06 | 1998-03-03 | Nec Corporation | Character string input system |
US5734749A (en) * | 1993-12-27 | 1998-03-31 | Nec Corporation | Character string input system for completing an input character string with an incomplete input indicative sign |
US5745116A (en) * | 1996-09-09 | 1998-04-28 | Motorola, Inc. | Intuitive gesture-based graphical user interface |
US5748841A (en) * | 1994-02-25 | 1998-05-05 | Morin; Philippe | Supervised contextual language acquisition system |
US5758324A (en) * | 1995-12-15 | 1998-05-26 | Hartman; Richard L. | Resume storage and retrieval system |
US5790115A (en) * | 1995-09-19 | 1998-08-04 | Microsoft Corporation | System for character entry on a display screen |
US5805158A (en) * | 1996-08-22 | 1998-09-08 | International Business Machines Corporation | Copying predicted input between computer systems |
US5805167A (en) * | 1994-09-22 | 1998-09-08 | Van Cruyningen; Izak | Popup menus with directional gestures |
US5805159A (en) * | 1996-08-22 | 1998-09-08 | International Business Machines Corporation | Mobile client computer interdependent display data fields |
US5805911A (en) * | 1995-02-01 | 1998-09-08 | Microsoft Corporation | Word prediction system |
US5864340A (en) * | 1996-08-22 | 1999-01-26 | International Business Machines Corporation | Mobile client computer programmed to predict input |
US5881169A (en) * | 1996-09-13 | 1999-03-09 | Ericsson Inc. | Apparatus and method for presenting and gathering text entries in a pen-based input device |
US5896321A (en) * | 1997-11-14 | 1999-04-20 | Microsoft Corporation | Text completion system for a miniature computer |
US5911485A (en) * | 1995-12-11 | 1999-06-15 | Unwired Planet, Inc. | Predictive data entry method for a keypad |
US5914708A (en) * | 1996-04-04 | 1999-06-22 | Cirque Corporation | Computer input stylus method and apparatus |
US5926178A (en) * | 1995-06-06 | 1999-07-20 | Silicon Graphics, Inc. | Display and control of menus with radial and linear portions |
US5926566A (en) * | 1996-11-15 | 1999-07-20 | Synaptics, Inc. | Incremental ideographic character input method |
US5943039A (en) * | 1991-02-01 | 1999-08-24 | U.S. Philips Corporation | Apparatus for the interactive handling of objects |
US5953541A (en) * | 1997-01-24 | 1999-09-14 | Tegic Communications, Inc. | Disambiguating system for disambiguating ambiguous input sequences by displaying objects associated with the generated input sequences in the order of decreasing frequency of use |
US6011554A (en) * | 1995-07-26 | 2000-01-04 | Tegic Communications, Inc. | Reduced keyboard disambiguating system |
US6026233A (en) * | 1997-05-27 | 2000-02-15 | Microsoft Corporation | Method and apparatus for presenting and selecting options to modify a programming language statement |
US6037942A (en) * | 1998-03-10 | 2000-03-14 | Magellan Dis, Inc. | Navigation system character input device |
US6084576A (en) * | 1997-09-27 | 2000-07-04 | Leu; Neng-Chyang | User friendly keyboard |
US6088649A (en) * | 1998-08-05 | 2000-07-11 | Visteon Technologies, Llc | Methods and apparatus for selecting a destination in a vehicle navigation system |
US6094197A (en) * | 1993-12-21 | 2000-07-25 | Xerox Corporation | Graphical keyboard |
US6098086A (en) * | 1997-08-11 | 2000-08-01 | Webtv Networks, Inc. | Japanese text input method using a limited roman character set |
US6097841A (en) * | 1996-05-21 | 2000-08-01 | Hitachi, Ltd. | Apparatus for recognizing input character strings by inference |
US6097392A (en) * | 1992-09-10 | 2000-08-01 | Microsoft Corporation | Method and system of altering an attribute of a graphic object in a pen environment |
US6101461A (en) * | 1997-02-28 | 2000-08-08 | Justsystem Corp. | Command inputting method |
US6111985A (en) * | 1997-06-06 | 2000-08-29 | Microsoft Corporation | Method and mechanism for providing partial results in full context handwriting recognition |
US6188789B1 (en) * | 1996-12-05 | 2001-02-13 | Palm, Inc. | Method and apparatus of immediate response handwriting recognition system that handles multiple character sets |
US20010000962A1 (en) * | 1998-06-26 | 2001-05-10 | Ganesh Rajan | Terminal for composing and presenting MPEG-4 video programs |
US6256030B1 (en) * | 1993-11-30 | 2001-07-03 | International Business Machines Corp. | Navigation within a graphical user interface for a compound graphical object using pointing device input |
US6262719B1 (en) * | 1994-09-02 | 2001-07-17 | Packard Bell Nec, Inc. | Mouse emulation with a passive pen |
US6271835B1 (en) * | 1998-09-03 | 2001-08-07 | Nortel Networks Limited | Touch-screen input device |
US6275612B1 (en) * | 1997-06-09 | 2001-08-14 | International Business Machines Corporation | Character data input apparatus and method thereof |
US6282315B1 (en) * | 1990-10-22 | 2001-08-28 | Samsung Electronics, Ltd. | System for entering handwritten data into computer generated forms |
US6369807B1 (en) * | 1997-06-04 | 2002-04-09 | Nec Corporation | Online character entry device |
US6377965B1 (en) * | 1997-11-07 | 2002-04-23 | Microsoft Corporation | Automatic word completion system for partially entered data |
US20020067377A1 (en) * | 1997-11-12 | 2002-06-06 | Mcgovern John | Method of inputting name |
US6405060B1 (en) * | 1995-07-19 | 2002-06-11 | Cirrus Logic, Inc. | User interface with improved data entry features for telephone system |
US6411950B1 (en) * | 1998-11-30 | 2002-06-25 | Compaq Information Technologies Group, Lp | Dynamic query expansion |
US20020087279A1 (en) * | 1998-09-23 | 2002-07-04 | Stuart Hall | Method and apparatus for displaying help screen information for measurement device |
US6424983B1 (en) * | 1998-05-26 | 2002-07-23 | Global Information Research And Technologies, Llc | Spelling and grammar checking system |
US6442295B2 (en) * | 1997-02-12 | 2002-08-27 | Stmicroelectronics S.R.L. | Word recognition device and method |
US6539421B1 (en) * | 1999-09-24 | 2003-03-25 | America Online, Inc. | Messaging application user interface |
US20030137605A1 (en) * | 2002-01-21 | 2003-07-24 | Samsung Electronics Co., Ltd. | Channel tuning method and television using channel name auto completion function |
US20040021691A1 (en) * | 2000-10-18 | 2004-02-05 | Mark Dostie | Method, system and media for entering data in a personal computing device |
US6734881B1 (en) * | 1995-04-18 | 2004-05-11 | Craig Alexander Will | Efficient entry of words by disambiguation |
US6751603B1 (en) * | 2000-05-16 | 2004-06-15 | Sun Microsystems, Inc. | Autocomplete method and apparatus for data file selection |
US6888141B2 (en) * | 2002-12-02 | 2005-05-03 | Multispectral Imaging, Inc. | Radiation sensor with photo-thermal gain |
US6934906B1 (en) * | 1999-07-08 | 2005-08-23 | At&T Corp. | Methods and apparatus for integrating external applications into an MPEG-4 scene |
US7003446B2 (en) * | 2000-03-07 | 2006-02-21 | Microsoft Corporation | Grammar-based automatic data completion and suggestion for user input |
US20070157122A1 (en) * | 1999-02-22 | 2007-07-05 | Stephen Williams | Communication Terminal Having A Predictive Editor Application |
US7257528B1 (en) * | 1998-02-13 | 2007-08-14 | Zi Corporation Of Canada, Inc. | Method and apparatus for Chinese character text input |
US20070188472A1 (en) * | 2003-04-18 | 2007-08-16 | Ghassabian Benjamin F | Systems to enhance data entry in mobile and fixed environment |
Family Cites Families (57)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4559598A (en) | 1983-02-22 | 1985-12-17 | Eric Goldwasser | Method of creating text using a computer |
USRE32773E (en) | 1983-02-22 | 1988-10-25 | | Method of creating text using a computer |
EP0352377A1 (en) | 1988-07-26 | 1990-01-31 | Leonid D. Levin | Word processing apparatus and method |
US4969097A (en) | 1985-09-18 | 1990-11-06 | Levin Leonid D | Method of rapid entering of text into computer equipment |
US4782464A (en) * | 1985-12-26 | 1988-11-01 | Smith Corona Corporation | Compact spelling-check dictionary |
US4783761A (en) * | 1985-12-26 | 1988-11-08 | Smith Corona Corporation | Spelling check dictionary with early error signal |
EP0432139B1 (en) * | 1986-07-23 | 1997-05-28 | Wacom Company, Ltd. | Position designating device |
JPS6359660A (en) * | 1986-08-29 | 1988-03-15 | Brother Ind Ltd | Information processor |
US5060154A (en) * | 1989-01-06 | 1991-10-22 | Smith Corona Corporation | Electronic typewriter or word processor with detection and/or correction of selected phrases |
US5067165A (en) * | 1989-04-19 | 1991-11-19 | Ricoh Company, Ltd. | Character recognition method |
US5261112A (en) * | 1989-09-08 | 1993-11-09 | Casio Computer Co., Ltd. | Spelling check apparatus including simple and quick similar word retrieval operation |
JPH07104765B2 (en) | 1990-08-24 | 1995-11-13 | ゼロックス コーポレイション | Electronic documentation as a user interface to computer-resident software systems |
US5953735A (en) * | 1991-03-20 | 1999-09-14 | Forcier; Mitchell D. | Script character processing method and system with bit-mapped document editing |
US5258748A (en) * | 1991-08-28 | 1993-11-02 | Hewlett-Packard Company | Accessing and selecting multiple key functions with minimum keystrokes |
US5963671A (en) | 1991-11-27 | 1999-10-05 | International Business Machines Corporation | Enhancement of soft keyboard operations using trigram prediction |
GB2266797B (en) * | 1992-05-09 | 1995-06-14 | Nokia Mobile Phones Uk | Data storage apparatus |
DE69429643T2 (en) | 1993-08-31 | 2002-09-12 | Mitsuhiro Aida | Text entry procedure |
US6154758A (en) * | 1994-05-13 | 2000-11-28 | Apple Computer, Inc. | Text conversion method for computer systems |
US5574482A (en) * | 1994-05-17 | 1996-11-12 | Niemeier; Charles J. | Method for data input on a touch-sensitive screen |
US5704029A (en) * | 1994-05-23 | 1997-12-30 | Wright Strategies, Inc. | System and method for completing an electronic form |
US6008799A (en) | 1994-05-24 | 1999-12-28 | Microsoft Corporation | Method and system for entering data using an improved on-screen keyboard |
US5812697A (en) * | 1994-06-10 | 1998-09-22 | Nippon Steel Corporation | Method and apparatus for recognizing hand-written characters using a weighting dictionary |
US6978421B1 (en) * | 1994-06-19 | 2005-12-20 | Mitsuhiro Aida | Handwriting text input system |
US5974558A (en) * | 1994-09-02 | 1999-10-26 | Packard Bell Nec | Resume on pen contact |
JP2741575B2 (en) * | 1994-09-22 | 1998-04-22 | 日本アイ・ビー・エム株式会社 | Character recognition character completion method and computer system |
US5838302A (en) * | 1995-02-24 | 1998-11-17 | Casio Computer Co., Ltd. | Data inputting devices for inputting typed and handwritten data in a mixed manner |
US6295372B1 (en) * | 1995-03-03 | 2001-09-25 | Palm, Inc. | Method and apparatus for handwriting input on a pen based palmtop computing device |
US6005549A (en) | 1995-07-24 | 1999-12-21 | Forest; Donald K. | User interface method and apparatus |
US5818437A (en) | 1995-07-26 | 1998-10-06 | Tegic Communications, Inc. | Reduced keyboard disambiguating computer |
US5963666A (en) * | 1995-08-18 | 1999-10-05 | International Business Machines Corporation | Confusion matrix mediated word prediction |
US6473006B1 (en) * | 1995-12-11 | 2002-10-29 | Openwave Systems Inc. | Method and apparatus for zoomed display of characters entered from a telephone keypad |
JP3980679B2 (en) * | 1996-02-16 | 2007-09-26 | 角田 達雄 | Character / character string input processing device |
US5845300A (en) * | 1996-06-05 | 1998-12-01 | Microsoft Corporation | Method and apparatus for suggesting completions for a partially entered data item based on previously-entered, associated data items |
US5821512A (en) * | 1996-06-26 | 1998-10-13 | Telxon Corporation | Shopping cart mounted portable data collection device with tethered dataform reader |
JPH10154144A (en) | 1996-11-25 | 1998-06-09 | Sony Corp | Document inputting device and method therefor |
JP3889466B2 (en) | 1996-11-25 | 2007-03-07 | ソニー株式会社 | Text input device and method |
US6144378A (en) | 1997-02-11 | 2000-11-07 | Microsoft Corporation | Symbol entry system and methods |
US5982351A (en) * | 1997-09-30 | 1999-11-09 | Motorola, Inc. | Method and apparatus for supplementing a keyboard and for helping a user operate an electronic device |
JPH11167569A (en) | 1997-12-02 | 1999-06-22 | Sony Corp | Text input device and method and recording medium |
WO1999028811A1 (en) | 1997-12-04 | 1999-06-10 | Northern Telecom Limited | Contextual gesture interface |
GB2333386B (en) * | 1998-01-14 | 2002-06-12 | Nokia Mobile Phones Ltd | Method and apparatus for inputting information |
US6157412A (en) * | 1998-03-30 | 2000-12-05 | Sharp Laboratories Of America, Inc. | System for identifying video fields generated from film sources |
KR100327209B1 (en) * | 1998-05-12 | 2002-04-17 | 윤종용 | Software keyboard system using the drawing of stylus and method for recognizing keycode therefor |
US6167411A (en) * | 1998-06-22 | 2000-12-26 | Lucent Technologies Inc. | User interface for entering and editing data in data entry fields |
US6167412A (en) * | 1998-07-14 | 2000-12-26 | Agilent Technologies, Inc. | Handheld medical calculator and medical reference device |
US6266048B1 (en) | 1998-08-27 | 2001-07-24 | Hewlett-Packard Company | Method and apparatus for a virtual display/keyboard for a PDA |
US7293231B1 (en) | 1999-03-18 | 2007-11-06 | British Columbia Ltd. | Data entry for personal computing devices |
ES2202070T3 (en) | 1999-03-18 | 2004-04-01 | 602531 British Columbia Ltd. | Data entry for personal computing devices |
EP1192716B1 (en) * | 1999-05-27 | 2009-09-23 | Tegic Communications, Inc. | Keyboard system with automatic correction |
JP4151158B2 (en) * | 1999-06-14 | 2008-09-17 | ソニー株式会社 | Scene description generation apparatus and method |
US6654733B1 (en) * | 2000-01-18 | 2003-11-25 | Microsoft Corporation | Fuzzy keyboard |
US6661920B1 (en) * | 2000-01-19 | 2003-12-09 | Palm Inc. | Method and apparatus for multiple simultaneously active data entry mechanisms on a computer system |
US20010027468A1 (en) * | 2000-03-09 | 2001-10-04 | Sanyo Electric Co., Ltd. | Transmission system, reception system, and transmission and reception system capable of displaying a scene with high quality |
US6654038B1 (en) | 2000-06-02 | 2003-11-25 | Sun Microsystems, Inc. | Keyboard navigation of non-focusable components |
US6970513B1 (en) * | 2001-06-05 | 2005-11-29 | At&T Corp. | System for content adaptive video decoding |
US6920205B2 (en) | 2003-05-23 | 2005-07-19 | Cisco Technology, Inc. | System and method for interactive communications with an end-user |
US7747690B2 (en) * | 2003-12-29 | 2010-06-29 | International Business Machines Corporation | Method for extracting and managing message addresses |
- 2000
  - 2000-08-01 US US09/631,101 patent/US7293231B1/en not_active Expired - Lifetime
- 2005
  - 2005-05-19 US US11/134,759 patent/US20050223308A1/en not_active Abandoned
  - 2005-05-19 US US11/133,770 patent/US7716579B2/en not_active Expired - Lifetime
- 2007
  - 2007-10-12 US US11/871,900 patent/US20080030481A1/en not_active Abandoned
  - 2007-10-12 US US11/871,887 patent/US7921361B2/en not_active Expired - Fee Related
  - 2007-10-12 US US11/871,904 patent/US20080088599A1/en not_active Abandoned
Patent Citations (101)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3644898A (en) * | 1970-04-30 | 1972-02-22 | United Aircraft Corp | Information association through logical functions derived from language |
US4211497A (en) * | 1974-03-01 | 1980-07-08 | Montgomery Edward B | Data input system |
US4330845A (en) * | 1979-12-31 | 1982-05-18 | International Business Machines Corporation | Guess-ahead feature for a keyboard-display terminal data input system |
US4396992A (en) * | 1980-04-08 | 1983-08-02 | Sony Corporation | Word processor |
US4471459A (en) * | 1981-09-30 | 1984-09-11 | System Development Corp. | Digital data processing method and means for word classification by pattern analysis |
US4499553A (en) * | 1981-09-30 | 1985-02-12 | Dickinson Robert V | Locating digital coded words which are both acceptable misspellings and acceptable inflections of digital coded query words |
US4689768A (en) * | 1982-06-30 | 1987-08-25 | International Business Machines Corporation | Spelling verification system with immediate operator alerts to non-matches between inputted words and words stored in plural dictionary memories |
US4891786A (en) * | 1983-02-22 | 1990-01-02 | Goldwasser Eric P | Stroke typing system |
US4648044A (en) * | 1984-06-06 | 1987-03-03 | Teknowledge, Inc. | Basic expert system tool |
US4744050A (en) * | 1984-06-26 | 1988-05-10 | Hitachi, Ltd. | Method for automatically registering frequently used phrases |
US4774666A (en) * | 1985-05-14 | 1988-09-27 | Sharp Kabushiki Kaisha | Translating apparatus |
US4730252A (en) * | 1985-09-24 | 1988-03-08 | International Business Machines Corp. | Document composition from parts inventory |
US5657397A (en) * | 1985-10-10 | 1997-08-12 | Bokser; Mindy R. | Preprocessing means for use in a pattern classification system |
US5096423A (en) * | 1985-12-11 | 1992-03-17 | Goldwasser Eric P | Computer system for teaching abbreviations for text and data processing functions |
US4807181A (en) * | 1986-06-02 | 1989-02-21 | Smith Corona Corporation | Dictionary memory with visual scanning from a selectable starting point |
US5220652A (en) * | 1986-07-21 | 1993-06-15 | Rowley Blair A | Computer application programs data input interface for handicapped persons responsive to multiple push buttons for selecting data stored in binary tree |
US5040113A (en) * | 1987-01-28 | 1991-08-13 | Mickunas Marshall D | Data manipulation program |
US4847766A (en) * | 1988-01-05 | 1989-07-11 | Smith Corona Corporation | Dictionary typewriter with correction of commonly confused words |
US5218536A (en) * | 1988-05-25 | 1993-06-08 | Franklin Electronic Publishers, Incorporated | Electronic spelling machine having ordered candidate words |
US5649223A (en) * | 1988-12-21 | 1997-07-15 | Freeman; Alfred B. | Word based text producing system |
US5621641A (en) * | 1988-12-21 | 1997-04-15 | Freeman; Alfred B. | Computer assisted text system |
US5327161A (en) * | 1989-08-09 | 1994-07-05 | Microtouch Systems, Inc. | System and method for emulating a mouse input device with a touchpad input device |
US5297041A (en) * | 1990-06-11 | 1994-03-22 | Semantic Compaction Systems | Predictive scanning input system for rapid selection of auditory and visual indicators |
US5329609A (en) * | 1990-07-31 | 1994-07-12 | Fujitsu Limited | Recognition apparatus with function of displaying plural recognition candidates |
US6282315B1 (en) * | 1990-10-22 | 2001-08-28 | Samsung Electronics, Ltd. | System for entering handwritten data into computer generated forms |
US5305205A (en) * | 1990-10-23 | 1994-04-19 | Weber Maria L | Computer-assisted transcription apparatus |
US5347295A (en) * | 1990-10-31 | 1994-09-13 | Go Corporation | Control of a computer through a position-sensed stylus |
US5203704A (en) * | 1990-12-21 | 1993-04-20 | Mccloud Seth R | Method of communication using pointing vector gestures and mnemonic devices to assist in learning point vector gestures |
US5943039A (en) * | 1991-02-01 | 1999-08-24 | U.S. Philips Corporation | Apparatus for the interactive handling of objects |
US5220649A (en) * | 1991-03-20 | 1993-06-15 | Forcier Mitchell D | Script/binary-encoded-character processing method and system with moving space insertion mode |
US5392447A (en) * | 1992-01-10 | 1995-02-21 | Eastman Kodak Company | Image-based electronic pocket organizer with integral scanning unit |
US5543591A (en) * | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition |
US6097392A (en) * | 1992-09-10 | 2000-08-01 | Microsoft Corporation | Method and system of altering an attribute of a graphic object in a pen environment |
US5666139A (en) * | 1992-10-15 | 1997-09-09 | Advanced Pen Technologies, Inc. | Pen-based computer copy editing apparatus and method for manuscripts |
US5649104A (en) * | 1993-03-19 | 1997-07-15 | Ncr Corporation | System for allowing user of any computer to draw image over that generated by the host computer and replicating the drawn image to other computers |
US5559942A (en) * | 1993-05-10 | 1996-09-24 | Apple Computer, Inc. | Method and apparatus for providing a note for an application program |
US5594640A (en) * | 1993-08-02 | 1997-01-14 | Apple Computer, Incorporated | Method and apparatus for correcting words |
US6256030B1 (en) * | 1993-11-30 | 2001-07-03 | International Business Machines Corp. | Navigation within a graphical user interface for a compound graphical object using pointing device input |
US6094197A (en) * | 1993-12-21 | 2000-07-25 | Xerox Corporation | Graphical keyboard |
US5734749A (en) * | 1993-12-27 | 1998-03-31 | Nec Corporation | Character string input system for completing an input character string with an incomplete input indicative sign |
US5500935A (en) * | 1993-12-30 | 1996-03-19 | Xerox Corporation | Apparatus and method for translating graphic objects and commands with direct touch input in a touch based input system |
US5596699A (en) * | 1994-02-02 | 1997-01-21 | Driskell; Stanley W. | Linear-viewing/radial-selection graphic for menu display |
US5748841A (en) * | 1994-02-25 | 1998-05-05 | Morin; Philippe | Supervised contextual language acquisition system |
US5543818A (en) * | 1994-05-13 | 1996-08-06 | Sony Corporation | Method and apparatus for entering text using an input device having a small number of keys |
US5724457A (en) * | 1994-06-06 | 1998-03-03 | Nec Corporation | Character string input system |
US5545857A (en) * | 1994-07-27 | 1996-08-13 | Samsung Electronics Co. Ltd. | Remote control method and apparatus thereof |
US6262719B1 (en) * | 1994-09-02 | 2001-07-17 | Packard Bell Nec, Inc. | Mouse emulation with a passive pen |
US5805167A (en) * | 1994-09-22 | 1998-09-08 | Van Cruyningen; Izak | Popup menus with directional gestures |
US5629733A (en) * | 1994-11-29 | 1997-05-13 | News America Publications, Inc. | Electronic television program guide schedule system and method with display and search of program listings by title |
US5606674A (en) * | 1995-01-03 | 1997-02-25 | Intel Corporation | Graphical user interface for transferring data between applications that support different metaphors |
US5805911A (en) * | 1995-02-01 | 1998-09-08 | Microsoft Corporation | Word prediction system |
US5623406A (en) * | 1995-03-06 | 1997-04-22 | Jean D. Ichbiah | Method and system for entering text in computer equipment |
US6734881B1 (en) * | 1995-04-18 | 2004-05-11 | Craig Alexander Will | Efficient entry of words by disambiguation |
US5487616A (en) * | 1995-06-01 | 1996-01-30 | Jean D. Ichbiah | Method for designing an ergonomic one-finger keyboard and apparatus therefor |
US5926178A (en) * | 1995-06-06 | 1999-07-20 | Silicon Graphics, Inc. | Display and control of menus with radial and linear portions |
US6405060B1 (en) * | 1995-07-19 | 2002-06-11 | Cirrus Logic, Inc. | User interface with improved data entry features for telephone system |
US6011554A (en) * | 1995-07-26 | 2000-01-04 | Tegic Communications, Inc. | Reduced keyboard disambiguating system |
US5790115A (en) * | 1995-09-19 | 1998-08-04 | Microsoft Corporation | System for character entry on a display screen |
US5911485A (en) * | 1995-12-11 | 1999-06-15 | Unwired Planet, Inc. | Predictive data entry method for a keypad |
US5758324A (en) * | 1995-12-15 | 1998-05-26 | Hartman; Richard L. | Resume storage and retrieval system |
US5914708A (en) * | 1996-04-04 | 1999-06-22 | Cirque Corporation | Computer input stylus method and apparatus |
US6097841A (en) * | 1996-05-21 | 2000-08-01 | Hitachi, Ltd. | Apparatus for recognizing input character strings by inference |
US5805159A (en) * | 1996-08-22 | 1998-09-08 | International Business Machines Corporation | Mobile client computer interdependent display data fields |
US5864340A (en) * | 1996-08-22 | 1999-01-26 | International Business Machines Corporation | Mobile client computer programmed to predict input |
US5805158A (en) * | 1996-08-22 | 1998-09-08 | International Business Machines Corporation | Copying predicted input between computer systems |
US5745116A (en) * | 1996-09-09 | 1998-04-28 | Motorola, Inc. | Intuitive gesture-based graphical user interface |
US5881169A (en) * | 1996-09-13 | 1999-03-09 | Ericsson Inc. | Apparatus and method for presenting and gathering text entries in a pen-based input device |
US5926566A (en) * | 1996-11-15 | 1999-07-20 | Synaptics, Inc. | Incremental ideographic character input method |
US6188789B1 (en) * | 1996-12-05 | 2001-02-13 | Palm, Inc. | Method and apparatus of immediate response handwriting recognition system that handles multiple character sets |
US5953541A (en) * | 1997-01-24 | 1999-09-14 | Tegic Communications, Inc. | Disambiguating system for disambiguating ambiguous input sequences by displaying objects associated with the generated input sequences in the order of decreasing frequency of use |
US6442295B2 (en) * | 1997-02-12 | 2002-08-27 | Stmicroelectronics S.R.L. | Word recognition device and method |
US6101461A (en) * | 1997-02-28 | 2000-08-08 | Justsystem Corp. | Command inputting method |
US7322023B2 (en) * | 1997-05-27 | 2008-01-22 | Microsoft Corporation | Computer programming language statement building and information tool with non obstructing passive assist window |
US6026233A (en) * | 1997-05-27 | 2000-02-15 | Microsoft Corporation | Method and apparatus for presenting and selecting options to modify a programming language statement |
US6369807B1 (en) * | 1997-06-04 | 2002-04-09 | Nec Corporation | Online character entry device |
US6111985A (en) * | 1997-06-06 | 2000-08-29 | Microsoft Corporation | Method and mechanism for providing partial results in full context handwriting recognition |
US6275612B1 (en) * | 1997-06-09 | 2001-08-14 | International Business Machines Corporation | Character data input apparatus and method thereof |
US6098086A (en) * | 1997-08-11 | 2000-08-01 | Webtv Networks, Inc. | Japanese text input method using a limited roman character set |
US6084576A (en) * | 1997-09-27 | 2000-07-04 | Leu; Neng-Chyang | User friendly keyboard |
US6377965B1 (en) * | 1997-11-07 | 2002-04-23 | Microsoft Corporation | Automatic word completion system for partially entered data |
US20020067377A1 (en) * | 1997-11-12 | 2002-06-06 | Mcgovern John | Method of inputting name |
US6608639B2 (en) * | 1997-11-12 | 2003-08-19 | Alpine Electronics, Inc. | Method of inputting name |
US5896321A (en) * | 1997-11-14 | 1999-04-20 | Microsoft Corporation | Text completion system for a miniature computer |
US7257528B1 (en) * | 1998-02-13 | 2007-08-14 | Zi Corporation Of Canada, Inc. | Method and apparatus for Chinese character text input |
US6037942A (en) * | 1998-03-10 | 2000-03-14 | Magellan Dis, Inc. | Navigation system character input device |
US6424983B1 (en) * | 1998-05-26 | 2002-07-23 | Global Information Research And Technologies, Llc | Spelling and grammar checking system |
US20010000962A1 (en) * | 1998-06-26 | 2001-05-10 | Ganesh Rajan | Terminal for composing and presenting MPEG-4 video programs |
US6088649A (en) * | 1998-08-05 | 2000-07-11 | Visteon Technologies, Llc | Methods and apparatus for selecting a destination in a vehicle navigation system |
US6271835B1 (en) * | 1998-09-03 | 2001-08-07 | Nortel Networks Limited | Touch-screen input device |
US20020087279A1 (en) * | 1998-09-23 | 2002-07-04 | Stuart Hall | Method and apparatus for displaying help screen information for measurement device |
US6411950B1 (en) * | 1998-11-30 | 2002-06-25 | Compaq Information Technologies Group, Lp | Dynamic query expansion |
US20070157122A1 (en) * | 1999-02-22 | 2007-07-05 | Stephen Williams | Communication Terminal Having A Predictive Editor Application |
US6934906B1 (en) * | 1999-07-08 | 2005-08-23 | At&T Corp. | Methods and apparatus for integrating external applications into an MPEG-4 scene |
US6539421B1 (en) * | 1999-09-24 | 2003-03-25 | America Online, Inc. | Messaging application user interface |
US7003446B2 (en) * | 2000-03-07 | 2006-02-21 | Microsoft Corporation | Grammar-based automatic data completion and suggestion for user input |
US6751603B1 (en) * | 2000-05-16 | 2004-06-15 | Sun Microsystems, Inc. | Autocomplete method and apparatus for data file selection |
US20040021691A1 (en) * | 2000-10-18 | 2004-02-05 | Mark Dostie | Method, system and media for entering data in a personal computing device |
US7224409B2 (en) * | 2002-01-21 | 2007-05-29 | Samsung Electronics Co., Ltd. | Channel tuning method and television using channel name auto completion function |
US20030137605A1 (en) * | 2002-01-21 | 2003-07-24 | Samsung Electronics Co., Ltd. | Channel tuning method and television using channel name auto completion function |
US6888141B2 (en) * | 2002-12-02 | 2005-05-03 | Multispectral Imaging, Inc. | Radiation sensor with photo-thermal gain |
US20070188472A1 (en) * | 2003-04-18 | 2007-08-16 | Ghassabian Benjamin F | Systems to enhance data entry in mobile and fixed environment |
Non-Patent Citations (6)
Title |
---|
Jones, P.,"Virtual keyboard with scanning and augmented by prediction," in Proc. 2nd Euro. Conf. Disability, Virtual Reality & Assoc. Tech., Skövde, Sweden, 1998, pp. 45-51. * |
Madenta, "web page description of Telepathic II," dated 07/08/1997, archived 06/12/1998 at WayBack Machine, https://web.archive.org/web/19980612230414/http://www.madenta.com/tracker3.html">, 2 pages. * |
NCIP, "web page description of Telepathic II," dated 09/1997, downloaded from , 2 pages. * |
Nielsen, J.,"WebTV Usability Review," dated 02/01/1997, downloaded from , 11 pages. * |
SofType Demo Information web page screen dump, archived 01/11/1998, retreived from WayBack Machine, 1 page. * |
SofType On-Screen Keyboard for Windows 3.1 and Windows 95 web page screen dump, archived 01/11/1998, retrieved from Wayback Machine, 3 pages. * |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8456425B2 (en) * | 2008-01-30 | 2013-06-04 | International Business Machines Corporation | Self-adapting keypad |
US9448725B2 (en) | 2008-01-30 | 2016-09-20 | International Business Machines Corporation | Self-adapting keypad |
US20090189864A1 (en) * | 2008-01-30 | 2009-07-30 | International Business Machines Corporation | Self-adapting virtual small keyboard apparatus and method |
US20100053092A1 (en) * | 2008-08-26 | 2010-03-04 | Au Optronics Corporation | Control Method for Touch Screen Device |
US20140189556A1 (en) * | 2008-10-10 | 2014-07-03 | At&T Intellectual Property I, L.P. | Augmented i/o for limited form factor user-interfaces |
US8704791B2 (en) * | 2008-10-10 | 2014-04-22 | At&T Intellectual Property I, L.P. | Augmented I/O for limited form factor user-interfaces |
US20120268409A1 (en) * | 2008-10-10 | 2012-10-25 | At&T Intellectual Property I, L.P. | Augmented i/o for limited form factor user-interfaces |
US10101888B2 (en) * | 2008-10-10 | 2018-10-16 | At&T Intellectual Property I, L.P. | Augmented I/O for limited form factor user-interfaces |
US9110574B2 (en) * | 2008-10-10 | 2015-08-18 | At&T Intellectual Property I, L.P. | Augmented I/O for limited form factor user-interfaces |
US8988395B2 (en) | 2008-10-23 | 2015-03-24 | At&T Intellectual Property I, L.P. | Tracking approaching or hovering objects for user-interfaces |
US8599173B2 (en) | 2008-10-23 | 2013-12-03 | At&T Intellectual Property I, L.P. | Tracking approaching or hovering objects for user interfaces |
US9310935B2 (en) | 2008-10-23 | 2016-04-12 | At&T Intellectual Property I, L.P. | Tracking approaching or hovering objects for user-interfaces |
US9690429B2 (en) | 2008-10-23 | 2017-06-27 | At&T Intellectual Property I, L.P. | Tracking approaching or hovering objects for user-interfaces |
US10114511B2 (en) | 2008-10-23 | 2018-10-30 | At&T Intellectual Property I, L.P. | Tracking approaching or hovering objects for user-interfaces |
US10394389B2 (en) | 2008-10-23 | 2019-08-27 | At&T Intellectual Property I, L.P. | Tracking approaching or hovering objects for user-interfaces |
US20110154193A1 (en) * | 2009-12-21 | 2011-06-23 | Nokia Corporation | Method and Apparatus for Text Input |
US20120066244A1 (en) * | 2010-09-15 | 2012-03-15 | Kazuomi Chiba | Name retrieval method and name retrieval apparatus |
US8306968B2 (en) * | 2010-09-15 | 2012-11-06 | Alpine Electronics, Inc. | Name retrieval method and name retrieval apparatus |
US9134814B2 (en) * | 2012-04-05 | 2015-09-15 | Seiko Epson Corporation | Input device, display system and input method |
US20130265228A1 (en) * | 2012-04-05 | 2013-10-10 | Seiko Epson Corporation | Input device, display system and input method |
Also Published As
Publication number | Publication date |
---|---|
US20050210402A1 (en) | 2005-09-22 |
US7716579B2 (en) | 2010-05-11 |
US7293231B1 (en) | 2007-11-06 |
US20080030481A1 (en) | 2008-02-07 |
US20080030480A1 (en) | 2008-02-07 |
US7921361B2 (en) | 2011-04-05 |
US20050223308A1 (en) | 2005-10-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7716579B2 (en) | Data entry for personal computing devices | |
US7681124B2 (en) | Data entry for personal computing devices | |
US9557916B2 (en) | Keyboard system with automatic correction | |
JP3727399B2 (en) | Screen display type key input device | |
US9400782B2 (en) | Virtual keyboard system with automatic correction | |
US10747334B2 (en) | Reduced keyboard disambiguating system and method thereof | |
US20110010655A1 (en) | Method, system and media for entering data in a personal computing device | |
EP1256871A2 (en) | Reduced keyboard disambiguating system | |
EP1356368B1 (en) | Data entry method and system for personal computer, and corresponding computer readable medium | |
EP1887451A2 (en) | Data entry method and system for personal computer, and corresponding computer readable medium | |
Arif et al. | A survey of text entry techniques for smartwatches | |
US20180260110A1 (en) | Virtual keyboard system and method of operation for the same | |
CA2425799A1 (en) | Data entry system for personal computer | |
Shah | Text entry for Smart Watches | |
JPS61138385A (en) | I/o device of character graphic |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: WORDLOGIC SYSTEMS INC., CANADA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: GUNN, HAROLD DAVID; CHAPMAN, JOHN; REEL/FRAME: 027555/0312
Effective date: 20000315
Owner name: 602531 BRITISH COLUMBIA LTD., CANADA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: WORDLOGIC SYSTEMS INC.; REEL/FRAME: 027555/0321
Effective date: 20000502
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |