
WO2018129720A1 - Expanding functionalities of pointing stick - Google Patents


Info

Publication number
WO2018129720A1
WO2018129720A1 (PCT/CN2017/071179)
Authority
WO
WIPO (PCT)
Prior art keywords
trajectory
gesture
pressure
pointing stick
triggering
Application number
PCT/CN2017/071179
Other languages
French (fr)
Inventor
Masaaki Fukumoto
Original Assignee
Microsoft Technology Licensing, LLC
Application filed by Microsoft Technology Licensing, LLC
Priority to PCT/CN2017/071179
Publication of WO2018129720A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/1633: Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1684: Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675
    • G06F 1/169: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0338: Pointing devices with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Definitions

  • touch pad 18 does not support such a continuous “trajectory” feature, especially during a “short” tap gesture.
  • a touch pad detects an absolute “pointed location” (for example, via an array of sensors with multiple sensor rows and columns) rather than relative displacement as a pointing stick does, and thus no points are reported while the touch pad is untouched. In this case, when shortly tapped, a touch pad may report just one point, or a very narrowly distributed point group. In other words, no “transition” points (for example, from low values to the maximum values) can be reported.
  • the touch pad gives no output while no touch is applied, and when the point (10, 10) at the touch surface of a touch pad is touched (again, given the tapping duration is 50ms and the sampling period is 5ms), it will not continuously report a “trajectory” covering broadly distributed points. Instead, it will report only a single point or at most a very narrowly distributed point group (whose narrow point distribution might be caused by fluctuation of the operation or by environmental noise).
  • a sequence of detected position with respect to time can be represented as below:
  • sequence as above may similarly include the third dimension.
  • a sequence of detected displacement along with the detected force with respect to time can be represented as below:
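As a hedged illustration only, the contrast between the two devices can be sketched in Python. The stream values and the helper function below are assumptions for illustration, not data or an API from the patent: the pointing stick reports a continuous stream of relative displacements containing “transition” points, while the touchpad reports an absolute point only while touched.

```python
# Hypothetical report streams for a 50 ms tap sampled every 5 ms
# (illustrative values, not data from the patent).

# Pointing stick: relative displacement, reported continuously.
# The contact point ramps from the origin to (10, 10) and back,
# so even a short tap yields broadly distributed "transition" points.
stick_stream = [(0, 0), (0, 0), (1, 1), (5, 5), (9, 9), (10, 10),
                (6, 6), (2, 2), (0, 0), (0, 0)]

# Touchpad: absolute position, reported only while touched (None = no report).
# A short tap yields a single point (or a narrowly scattered group).
pad_stream = [None, None, (10, 10), (10, 10), None, None]

def has_transition_points(stream):
    """Return True if the stream contains more than one distinct non-rest
    point, i.e. the "transition" points of a continuous trajectory."""
    points = [p for p in stream if p and p != (0, 0)]
    return len(set(points)) > 1

print(has_transition_points(stick_stream))  # True  -> trajectory available
print(has_transition_points(pad_stream))    # False -> single point only
```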
  • implementations of the subject matter expand the functionality of the pointing stick by allowing various events to be triggered based on the detected trajectories and associated forces.
  • sensors detect a gesture applied on a touch surface of the pointing stick 160 and convert the gesture into a trajectory.
  • the detected gesture then can be provided to the processing unit of the device 10.
  • the processing unit then triggers different events based on different trajectories.
  • a first event can be triggered based on a first trajectory.
  • a second event that is different from the first event can be triggered based on a second trajectory that is different from the first trajectory.
  • the first event may be a left-click event and the second event may be a right-click event.
  • other events such as drag operation, are also possible in accordance with the requirement from the user.
  • a gesture applied on the pointing stick 160 can be a “short” gesture, that is, a “tap” gesture with a duration below a threshold duration, such as 50ms.
  • if a gesture is applied on the pointing stick 160 with a long time duration exceeding the threshold duration, it can be recognized as a normal touch, for example, a long click.
  • examples of the detected gesture include, but are not limited to, a tap, a quick click, and the like.
  • in the following, a tap gesture will be described as an example.
  • a different event can be triggered based on pressure in addition to the trajectory.
  • with additional pressure information on top of the 2D trajectory, more gestures, especially some complex actions, can be achieved, rendering a further expansion of the functionality of pointing sticks.
  • although tap gestures “A” and “B” are both applied on the same point (that is, the center of the sensors 300), different events can still be triggered due to the different applied pressures.
  • the tap gesture “A” may trigger a normal “left-click” event
  • the tap gesture “B” may trigger a “double-click” event.
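A minimal sketch of this pressure-differentiated triggering follows: two taps at the same (center) location trigger different events based on peak pressure. The split value and the function name are illustrative assumptions; the patent does not fix concrete thresholds.

```python
# Pressure-differentiated events at the same touch point, following the
# "A"/"B" example above. Threshold value is a hypothetical assumption.

PRESSURE_SPLIT = 15  # hypothetical split between a light and a firm tap

def event_for_center_tap(peak_pressure):
    # A light tap (gesture "A") triggers a normal left-click, while a
    # firmer tap (gesture "B") at the same point triggers a double-click.
    if peak_pressure < PRESSURE_SPLIT:
        return "left-click"
    return "double-click"

print(event_for_center_tap(10))  # gesture "A" -> left-click
print(event_for_center_tap(20))  # gesture "B" -> double-click
```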
  • different tapping gestures may generate different sensor output sequences (or “trajectories” ) .
  • any available pattern recognition technique for time-synchronized multiple signals can be used as a classifier (for example, machine learning).
  • such kinds of classifiers may require heavy computation and thus sometimes may not be necessary for some applications.
  • a much simpler method can be used for separating a plurality of tapping patterns. For example, some representative points, or even a single point selected from the trajectory curve, might be enough to trigger a specific event.
  • an event can be triggered based on a selected displacement value (or a selected point) from a trajectory.
  • in some implementations, the selected point corresponds to the maximum displacement value (X_max, Y_max).
  • the maximum displacement value (X_max, Y_max) usually can accurately reflect the position on which the user actually taps (also referred to as a representative position). In such a case, the event triggering does not need to rely on the “whole” trajectory, but only on the representative point, that is, the maximum displacement (X_max, Y_max), which enables an easy determination of the event.
  • when the pointing stick supports the pressure detection feature, the selected displacement value may be a displacement value recorded at the time when the maximum pressure value (P_max) during the tap gesture is detected. Normally, such a displacement value corresponding to the maximum pressure value shows a very good approximation of the maximum displacement (X_max, Y_max). Therefore, it enables a quick point determination with little loss of detection accuracy.
  • in some cases, however, the displacement value determined at the time when the peak value of the pressure is detected may no longer show a good approximation of the maximum displacement during the time duration of the pressure. Therefore, in alternative implementations, the actual tap position may be determined by considering both the maximum displacement and the maximum pressure. In this way, the position of the tap on the touch surface can be determined with improved detection accuracy. A sketch of both selection strategies follows below.
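The two selection strategies above can be sketched as follows. The function names are illustrative, not an API defined by the patent, and the sample trajectory mirrors the 3-dimensional example given later in the description.

```python
# Two representative-point strategies over a trajectory of
# (x, y, pressure) samples.

def point_of_max_displacement(trajectory):
    """Select the sample with the largest displacement from the origin."""
    return max(trajectory, key=lambda s: s[0] ** 2 + s[1] ** 2)

def point_at_max_pressure(trajectory):
    """Select the sample recorded when pressure peaked; usually a good
    approximation of the maximum displacement."""
    return max(trajectory, key=lambda s: s[2])

# The example trajectory from the description below (x, y, force):
traj = [(1, 1, 2), (5, 5, 10), (7, 7, 14), (9, 9, 18), (10, 10, 20),
        (8, 8, 16), (6, 6, 12), (4, 4, 8), (2, 2, 4), (1, 1, 2)]

print(point_of_max_displacement(traj))  # (10, 10, 20)
print(point_at_max_pressure(traj))      # (10, 10, 20) -- same sample here
```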
  • a touchpad can determine the finger’s position based on an absolute position of the user’s finger on the touchpad, owing to its relatively large touching area.
  • the pointing stick 160, in contrast, provides only a much smaller surface area for the user’s touch. Therefore, compared with large-size touchpads, it can sometimes be a challenge to precisely detect where the gesture is applied on the pointing stick, and in some circumstances the users may be more interested in or focused on a region where the gesture is applied.
  • touch points or the various trajectories may be grouped or clustered to correspond to various regions on the touch surface of the interacting body, so as to facilitate the user’s operations.
  • any touch points falling within the same region will be regarded as the same point or trajectory, and subsequently the same event will be correspondingly triggered.
  • the first region and the second region can be predefined.
  • the first region or region “A” is located in the center of the touch surface of the pointing stick, and the second region or region “B” laterally surrounds the first region.
  • the first event such as left-click
  • the second event such as right-click event or drag event
  • the first region corresponds to the left half of the touch surface of the pointing stick
  • the second region corresponds to the right half of the touch surface of the pointing stick.
  • the first event such as left-click event
  • the second event such as right-click event or drag event
  • the first region may correspond to the upper half of the touch surface of the pointing stick
  • the second region may correspond to the lower half of the touch surface of the pointing stick. In this way, the user can also easily execute two types of tap operations through the pointing stick, such as left-click and right-click.
  • in some implementations, the first and second regions correspond to only a portion of the touch surface of the pointing stick, and the other portion of the touch surface is not predefined for triggering specific events.
  • specific events can only be triggered by tapping the first and second regions, whereas tapping the other portion of the touch surface of the pointing stick would not trigger any event.
  • the touch surface of the pointing stick can be divided into more than two regions.
  • the touch surface of the pointing stick may include a first region (or region “A” ) located in a center, a second region (or region “B” ) on the lower-left region, and a third region (or region “C” ) on the lower-right region.
  • tapping the first, second, and third regions can trigger different events, respectively.
  • tapping the first region can trigger a left-click event
  • tapping the second region can trigger a right-click event
  • tapping the third region can trigger a drag event.
  • tapping the first, second, and third regions can trigger other events.
  • the touch surface of the pointing stick can be divided into four regions (or regions “A, ” “B, ” “C” and “D” ) that are uniformly and symmetrically distributed.
  • the touch surface of the pointing stick can be further divided into more than four regions, such as five, six, seven, ..., depending on the specific requirements of the user or the applications.
  • the touch surface 1600 of the pointing stick 160 includes a first region, a second region, and a third region.
  • the first region is located in a center of the touch surface 1600
  • the second region is located to bottom right of the first region
  • the third region is located to bottom left of the first region as shown in FIG. 5B.
  • tapping the first region can trigger a left-click event
  • tapping the second region can trigger a right-click event
  • tapping the third region can trigger a drag event.
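A minimal sketch of such region-based dispatch follows, assuming the three-region layout described above (center, bottom-right, bottom-left). The boundary values, the circular shape of the center region, and the convention that negative y points “down” are assumptions for illustration.

```python
# Region-based event dispatch for the three-region layout (regions "A",
# "B", "C"). Boundaries are illustrative; the patent fixes only the
# relative layout of the regions.

CENTER_RADIUS = 3  # hypothetical radius of the central region "A"

def region_for(dx, dy):
    """Map a representative displacement (dx, dy) to a region name."""
    if dx * dx + dy * dy <= CENTER_RADIUS ** 2:
        return "A"                      # center
    if dy < 0:                          # lower half: negative y assumed "down"
        return "B" if dx > 0 else "C"   # bottom-right vs bottom-left
    return None                         # upper half: no event predefined

EVENTS = {"A": "left-click", "B": "right-click", "C": "drag"}

def dispatch(dx, dy):
    region = region_for(dx, dy)
    return EVENTS.get(region)  # None -> tap outside predefined regions

print(dispatch(0, 0))    # left-click (region A)
print(dispatch(8, -6))   # right-click (region B)
print(dispatch(-8, -6))  # drag (region C)
print(dispatch(0, 9))    # None: upper half triggers nothing here
```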
  • the associated coordinate points 610, 710, and 810 of the expected touch position on the interacting body are illustrated in a coordinate system.
  • the first tap position of the user’s finger 2000 on the touch surface 1600 of the pointing stick 160 is located in the first region. Accordingly, as shown in FIG. 6B, the associated coordinate point 610 of the expected touch position on the interacting body is at an original point in the coordinate system. In this case, in response to a tap on the center area of touch surface 1600 of the pointing stick 160, the first event, such as left-click event, can be triggered by the processing unit.
  • the second tap position of the user’s finger 2000 on the touch surface 1600 of the pointing stick 160 is located in the second region. Accordingly, as shown in FIG. 7B, the associated coordinate point 710 of the expected touch position on the interacting body is to bottom right of the original point in the coordinate system.
  • the second event such as a right-click event, can be triggered by the processing unit.
  • the third tap position of the user’s finger 2000 on the touch surface 1600 of the pointing stick 160 is located in the third region. Accordingly, as shown in FIG. 8B, the associated coordinate point 810 of the expected touch position on the interacting body is to bottom left of the original point in the coordinate system.
  • the third event such as a drag event, can be triggered by the processing unit.
  • a gesture may be a tap gesture with a short (or very short) duration.
  • the accurate detection/determination of a tap gesture on the touch surface of the pointing stick is very important and may be implemented in various manners or under various criteria.
  • an example implementation of detecting the tap will be discussed in conjunction with FIGs. 9 and 10.
  • FIG. 9 illustrates a flowchart of a process 900 for detecting a tap of a tool on the touch surface of the pointing stick in accordance with one implementation of the subject matter described herein.
  • a user’s finger will be described as an example of the touch tool. It is to be understood however that this example suggests no limitation as to the scope of the subject matter described herein.
  • Pen, stylus, or any other suitable tool can be used likewise.
  • FIG. 10 is a graph showing a pressure applied on the touch surface by the tool with respect to time.
  • if the time duration T_DUR of the pressure P is below the threshold duration and a peak value P_PEAK of the pressure P at the time T_PEAK exceeds a threshold pressure P_THD, a tap of the user’s finger on the touch surface of the pointing stick is detected.
  • the threshold duration and the threshold pressure are predefined according to the user’s operating habits. Generally, the threshold duration corresponds to a small amount of time, for example, 50ms duration. This means that a touch on the touch surface of the pointing stick for a short time may be regarded as a tap.
  • a threshold pressure P_THD corresponding to a predefined pressure may be set by the user.
  • if the peak value P_PEAK of the pressure P exceeds the threshold pressure P_THD and the touch on the touch surface of the pointing stick is held for only a short time, such a gesture can be regarded as a tap.
  • if the peak value P_PEAK of the pressure P is below the threshold pressure P_THD, even if the touch on the touch surface of the pointing stick is held for a short time, such a gesture would still not be regarded as a tap. In this way, an unintentional touch on the touch surface of the pointing stick would not be interpreted as any predetermined operation. A sketch of this tap test follows below.
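The tap test described above can be sketched as follows, with illustrative threshold values; samples are assumed to be (time, pressure) pairs covering one press/release cycle.

```python
# A gesture counts as a tap only if its pressure peak exceeds P_THD and
# its whole duration stays below the threshold duration. Concrete values
# are illustrative assumptions.

P_THD = 5          # threshold pressure (arbitrary units)
T_THD_MS = 50      # threshold duration, e.g. 50 ms as suggested above

def is_tap(samples):
    """samples: list of (t_ms, pressure) covering one press/release cycle."""
    pressed = [(t, p) for t, p in samples if p > 0]
    if not pressed:
        return False
    duration = pressed[-1][0] - pressed[0][0]      # T_DUR
    peak = max(p for _, p in pressed)              # P_PEAK
    return peak > P_THD and duration < T_THD_MS

# A short, firm press is a tap; a long or too-light press is not.
print(is_tap([(0, 2), (10, 8), (20, 9), (30, 3)]))    # True
print(is_tap([(0, 2), (40, 8), (80, 9), (120, 3)]))   # False: too long
print(is_tap([(0, 1), (10, 2), (20, 2), (30, 1)]))    # False: too light
```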
  • a tap on the touch surface 1600 of the pointing stick 160 may also cause a slight movement of the cursor on the UI.
  • the method 1100 may additionally include some optional acts.
  • FIG. 11 illustrates a flowchart of a method 1100 for controlling a cursor on the UI in accordance with one implementation of the subject matter described herein. It will be appreciated that the acts in the method 1100 can be carried out after the acts involved in the method 900.
  • the movement of the cursor is locked by the processing unit.
  • when the pressure applied on the touch surface of the pointing stick initially reaches the threshold pressure, the cursor on the UI will not move.
  • the coordinate of the expected position of the cursor is recorded by a memory unit of an electronic device, such as the electronic device 10 of FIG. 1.
  • the recorded coordinate is discarded by the memory unit.
  • the movement of the cursor to the expected position is initiated by the processing unit. In this way, if the tap is detected, the movement of the cursor on the UI during the time duration of the tap can be prevented. Furthermore, if it is determined that other gestures with long time duration are detected, the movement of the cursor can be initiated.
  • a timer starts with a status of “not determined yet” 1210. Then, if the pressure value exceeds the threshold pressure P_THD within a threshold window T_THD, the status is changed to “tap candidate” 1220 and the cursor is temporarily stopped; otherwise the status is changed to “non-tap” 1230.
  • if, in the “tap candidate” state, the pressure value returns to zero within the threshold duration D_THD, a tap gesture will be determined/issued (i.e. “tap issued” 1240). After the threshold duration D_THD, the remaining tap candidates are changed to “non-tap” 1230 and the saved cursor movement will be restored. A sketch of this state machine follows below.
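A sketch of this state machine, assuming a per-sample update loop; the class name, method name, and threshold values are illustrative assumptions rather than a mechanism specified by the patent.

```python
# Tap-candidate state machine outlined above (FIG. 12): the cursor is
# frozen as soon as pressure crosses P_THD, the expected cursor position
# is recorded, and the movement is either discarded (tap issued) or
# left for the caller to restore (non-tap).

P_THD = 5       # threshold pressure
D_THD_MS = 50   # threshold duration D_THD

class TapFilter:
    def __init__(self):
        self.state = "not determined yet"   # 1210
        self.t_start = None
        self.saved_position = None          # recorded expected cursor position

    def on_sample(self, t_ms, pressure, expected_pos):
        if self.state == "not determined yet" and pressure > P_THD:
            # Pressure crossed the threshold: lock the cursor and remember
            # where it would have moved.
            self.state = "tap candidate"    # 1220
            self.t_start = t_ms
            self.saved_position = expected_pos
        elif self.state == "tap candidate":
            if pressure == 0 and t_ms - self.t_start < D_THD_MS:
                self.state = "tap issued"   # 1240: discard saved movement
                self.saved_position = None
            elif t_ms - self.t_start >= D_THD_MS:
                self.state = "non-tap"      # 1230: caller restores movement
        return self.state

f = TapFilter()
for t, p in [(0, 0), (5, 8), (10, 9), (30, 0)]:
    state = f.on_sample(t, p, expected_pos=(120, 80))
print(state)  # "tap issued": short press, cursor never moved
```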
  • tapping different regions of the touch surface of the pointing stick can trigger various events, such as left-click event, right-click event, and drag event.
  • events such as left-click event, right-click event, and drag event.
  • tapping different regions of the touch surface of the pointing stick can also trigger other types of events. In this way, the pointing stick can be used for additional control purposes.
  • the electronic device 10 is in a form of a general-purpose computing device.
  • Components of the electronic device 10 may include, but are not limited to, one or more processors or processing units 1310, a memory 1320, one or more input devices 1330, one or more output devices 1340, storage 1350, and one or more communication units 1360.
  • the processing unit 1310 may be a real or a virtual processor and is capable of performing various processes in accordance with a program stored in the memory 1320. In a multi-processing system, multiple processing units execute computer-executable instructions to increase processing power.
  • the electronic device 10 typically includes a variety of machine readable media. Such media may be any available media that are accessible by the electronic device 10, including volatile and non-volatile media, removable and non-removable media.
  • the memory 1320 may be volatile memory (e.g., registers, cache, a random-access memory (RAM) ) , non-volatile memory (e.g., a read only memory (ROM) , an electrically erasable programmable read only memory (EEPROM) , a flash memory) , or some combination thereof.
  • the storage 1350 may be removable or non-removable, and may include machine readable medium such as flash drives, magnetic disks or any other medium which can be used to store information and which can be accessed within the electronic device 10.
  • the electronic device 10 may further include other removable/non-removable, volatile/non-volatile computing system storage medium.
  • a disk drive for reading from or writing to a removable, non-volatile disk (e.g., a “floppy disk” )
  • an optical disk drive for reading from or writing to a removable, non-volatile optical disk
  • the memory 1320 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of various implementations of the subject matter described herein.
  • a program/utility tool 1322 having a set (at least one) of the program modules 1324 may be stored in, for example, the memory 1320.
  • Such program modules 1324 include, but are not limited to, an operating system, one or more applications, other program modules, and program data. Each or a certain combination of these examples may include an implementation of a networking environment.
  • the program modules 1324 generally carry out the functions and/or methodologies of implementations of the subject matter described herein, for example, the method 900 and the method 1100.
  • the input unit (s) 1330 may be one or more of various different input devices.
  • the input unit (s) 1330 may include a user device such as a mouse, keyboard, trackball, a pointing stick, etc.
  • the input unit (s) 1330 may implement one or more natural user interface techniques, such as speech recognition or touch and stylus recognition.
  • the input unit (s) 1330 may include a scanning device, a network adapter, or another device that provides input to the electronic device 10.
  • the output unit (s) 1340 may be a display, printer, speaker, network adapter, or another device that provides output from the electronic device 10.
  • the input unit (s) 1330 and output unit (s) 1340 may be incorporated in a single system or device, such as a touch screen or a virtual reality system.
  • the pointing stick 160 can be touched by a user’s finger. Upon being touched, the pointing stick 160 generates a pressure signal representing a pressure applied by the user on the touch surface of the pointing stick 160 and sends it to the processing unit 1310.
  • the processing unit 1310 can detect the user’s tap on the touch surface of the pointing stick 160 by using the pressure signal. Upon detecting the user’s tap, the processing unit 1310 can trigger different events in response to different regions of the touch surface being tapped. Generally, all the methods described herein can be implemented by the processing unit 1310.
  • the communication unit (s) 1360 enables communication over communication medium to another computing entity. Additionally, functionality of the components of the electronic device 10 may be implemented in a single computing machine or in multiple computing machines that are able to communicate over communication connections. Thus, the electronic device 10 may operate in a networked environment using logical connections to one or more other servers, network personal computers (PCs) , or another common network node.
  • communication media include wired or wireless networking techniques.
  • the electronic device 10 may also communicate, as required, with one or more external devices (not shown) such as a storage device, a display device, and the like, one or more devices that enable a user to interact with the electronic device 10, and/or any device (e.g., network card, a modem, etc. ) that enables the electronic device 10 to communicate with one or more other computing devices. Such communication may be performed via an input/output (I/O) interface (s) (not shown) .
  • the functionality described herein can be performed, at least in part, by one or more hardware logic components.
  • illustrative types of hardware logic components include Field-Programmable Gate Arrays (FPGAs) , Application-specific Integrated Circuits (ASICs) , Application-specific Standard Products (ASSPs) , System-on-a-chip systems (SOCs) , Complex Programmable Logic Devices (CPLDs) , and the like.
  • Program code for carrying out methods of the subject matter described herein may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • the program code may execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
  • a machine readable medium may be any tangible medium that may contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • the machine readable medium may be a machine readable signal medium or a machine readable storage medium.
  • a machine readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • more specific examples of the machine readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a device comprising: a processing unit; and a memory coupled to the processing unit and storing instructions thereon, the instructions, when executed by the processing unit, cause the device to perform acts including: detecting a gesture applied on an interacting body of a pointing stick, a time duration of the gesture on the pointing stick being below a threshold duration; converting the gesture into a trajectory, the trajectory indicating a continuous displacement value during the gesture in a coordinate system defined by a plurality of sensors of the pointing stick; and triggering an event at least based on the trajectory.
  • the triggering an event based on the trajectory further comprises: triggering the event based on the trajectory and a pressure of the gesture associated with the trajectory.
  • the triggering an event based on the trajectory further comprises: selecting a displacement value from the trajectory; and triggering the event based on the selected displacement value.
  • the selecting a displacement value from the trajectory comprises: selecting a maximum displacement value from the trajectory; or selecting a displacement value corresponding to the maximum pressure value during the gesture.
  • the triggering an event based on the trajectory further comprises: determining, based on the selected displacement value, one of a plurality of regions that are defined based on grouping of a plurality of displacement values; and triggering the event based on the determined region.
  • the region includes: a first region that is located in a center of the touch surface, and a second region that laterally surrounds the first region.
  • the detecting a gesture comprises: detecting a pressure applied on the interacting body; and in response to a peak value of the pressure exceeding a threshold pressure and a time duration of the pressure being below the threshold duration, determining that the gesture is detected.
  • the detecting a gesture comprises: stopping the cursor movement in response to the pressure exceeding a threshold pressure; and recording a coordinate of an expected position of a cursor on a display of the device to which the cursor is to move in response to the gesture.
  • the detecting a gesture further comprises: in response to the time duration of the pressure being below the threshold duration, discarding the recorded coordinate; and in response to the time duration of the pressure exceeding the threshold duration, initiating movement of the cursor to the expected position.
  • a computer-implemented method comprises: detecting a gesture applied on an interacting body of a pointing stick, a time duration of the gesture on the pointing stick being below a threshold duration; converting the gesture into a trajectory, the trajectory indicating a continuous displacement value during the gesture in a coordinate system defined by a plurality of sensors of the pointing stick; and triggering an event at least based on the trajectory.
  • the triggering an event based on the trajectory further comprises: triggering the event based on the trajectory and a pressure of the gesture associated with the trajectory.
  • the triggering an event based on the trajectory further comprises: selecting a displacement value from the trajectory; and triggering the event based on the selected displacement value.
  • the selecting a displacement value from the trajectory comprises: selecting a maximum displacement value from the trajectory; or selecting a displacement value corresponding to the maximum pressure value during the gesture.
  • the triggering an event based on the trajectory further comprises: determining, based on the selected displacement value, one of a plurality of regions that are defined based on grouping of a plurality of displacement values; and triggering the event based on the determined region.
  • the region includes: a first region that is located in a center of the touch surface, and a second region that laterally surrounds the first region.
  • the detecting a gesture comprises: detecting a pressure applied on the interacting body; and in response to a peak value of the pressure exceeding a threshold pressure and a time duration of the pressure being below the threshold duration, determining that the gesture is detected.
  • the detecting a gesture comprises: stopping the cursor movement in response to the pressure exceeding a threshold pressure; recording a coordinate of an expected position of a cursor on a display of the device to which the cursor is to move in response to the gesture.
  • the detecting a gesture further comprises: in response to the time duration of the pressure being below the threshold duration, discarding the recorded coordinate; and in response to the time duration of the pressure exceeding the threshold duration, initiating movement of the cursor to the expected position.
  • a pointing stick comprises: an interacting body; a plurality of sensors coupled to the interacting body and operable to: detect a gesture applied on the interacting body; convert the gesture into a trajectory, the trajectory indicating a continuous displacement value during the gesture in a coordinate system defined by the plurality of sensors; and provide the trajectory to a processing unit coupled to the plurality of sensors to trigger an event based on the trajectory.
  • the triggering an event based on the trajectory further comprises: triggering the event based on the trajectory and a pressure of the gesture associated with the trajectory.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Position Input By Displaying (AREA)

Abstract

Implementations of the subject matter described herein provide a solution in which a pointing stick can be used for achieving various interaction purposes. In the solution, a gesture applied on an interacting body of a pointing stick is detected. The gesture is converted into a trajectory, and the trajectory indicates a continuous displacement value during the gesture in a coordinate system defined by a plurality of sensors. The trajectory is provided to a processing unit coupled to the plurality of sensors to trigger an event based on the trajectory. In this way, the user can trigger different events by applying gestures with different trajectories. As such, the functionality of the pointing stick is expanded.

Description

EXPANDING FUNCTIONALITIES OF POINTING STICK

BACKGROUND
Pointing devices are input equipment allowing users to input into electronic devices such as computers. A pointing stick is a kind of widely-used pointing device. A pointing stick is usually in the form of a small joystick that can be integrated on a keyboard as a pointing device. The pointing stick is different from a touchpad, another kind of pointing device, in terms of structure and operating principle. Typically, the pointing stick includes a force sensor for sensing the direction and magnitude of a force applied on the pointing stick by a tool such as a user’s finger. The force sensor can translate the sensed force into a control signal for controlling movement of a cursor or any other object on a user interface (UI). Conventionally, pointing sticks provide only a few manners of controlling the UI object. Although the pointing stick can be used in connection with mouse buttons on laptop computers, extra space is required for arranging the mouse buttons, and such space on smaller electronic devices may be quite limited or even unavailable.
SUMMARY
In general, two commonly used pointing devices are touchpads and pointing sticks, which work on the basis of different mechanisms. A touchpad typically includes a capacitive sensor for sensing user input. The touch position of a user’s finger on the touchpad is determined based on a varying capacitance detected by the capacitive sensor. In this way, the capacitive sensor can continuously detect a movement path of the user’s finger on the touchpad. However, the touchpad often requires the user to repeatedly reposition his/her finger when the user desires to move a cursor a long distance across a UI. Additionally, a touchpad occupies a relatively large area on the host device. In comparison, pointing sticks are more flexible, but conventionally provide only a few manners of user-machine interaction.
Implementations of the subject matter described herein provide a solution in which a pointing stick can be used for achieving various control purposes, such as left-click, right-click, and drag operation. The pointing stick includes an interacting body and a plurality of sensors or electrodes. The sensors are configured to detect a gesture applied on the interacting body and to convert the gesture into a trajectory. Here the trajectory indicates a continuous displacement value during application of the gesture. The trajectory is represented in a coordinate system defined by the plurality of sensors. Such a trajectory can be used to trigger a respective event. In this way, the user can trigger different events by applying gestures with different trajectories. As such, the functionality of the pointing stick is expanded.
It is to be understood that the Summary is not intended to identify key or essential features of implementations of the subject matter described herein, nor is it intended to be used to limit the scope of the subject matter described herein. Other features of the subject matter described herein will become easily comprehensible through the description below.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other objectives, features and advantages of the subject matter described herein will become more apparent through more detailed depiction of example implementations of the subject matter described herein in conjunction with the accompanying drawings, wherein in the example implementations of the subject matter described herein, same reference numerals usually represent same components.
FIG. 1 is a perspective view of an electronic device with a pointing stick;
FIGs. 2a-2b show an example structure of a pointing stick according to an implementation of the subject matter described herein;
FIG. 3 illustrates a schematic sensor structure according to an implementation of the subject matter described herein;
FIG. 4a shows schematic side and top views of four different tap gestures applied on a pointing stick according to an implementation of the subject matter described herein;
FIGs. 4b-4d show corresponding sensor outputs based on the tap gestures shown in FIG. 4a;
FIGs. 5a-5c show example region division patterns on the touch surface of a pointing stick according to implementations of the subject matter described herein;
FIGs. 6A and 6B illustrate a first tap position of a tool on the touch surface and associated coordinate point of an expected touch position according to an implementation of the subject matter described herein;
FIGs. 7A and 7B illustrate a second tap position of a tool on the touch surface and associated coordinate point of the expected touch position according to an implementation of the subject matter described herein;
FIGs. 8A and 8B illustrate a third tap position of a tool on the touch surface and associated coordinate point of the expected touch position according to an implementation of the subject matter described herein;
FIG. 9 illustrates a flowchart of a method for detecting a tap of a tool on the touch surface of the pointing stick in accordance with one implementation of the subject matter described herein;
FIG. 10 is a graph showing a pressure applied on the touch surface by a tool with respect to time;
FIG. 11 illustrates a flowchart of a method for controlling a cursor on the UI in accordance with one implementation of the subject matter described herein;
FIG. 12 shows a graph showing a pressure applied on the touch surface of the pointing stick with respect to time; and
FIG. 13 illustrates a block diagram of an example implementation of the electronic device in which one or more implementations of the subject matter described herein may be implemented.
DETAILED DESCRIPTION
The subject matter described herein will now be discussed with reference to several example implementations. It should be understood that these implementations are discussed only for the purpose of enabling those skilled in the art to better understand and thus implement the subject matter described herein, rather than suggesting any limitations on the scope of the subject matter.
As used herein, the term “includes” and its variants are to be read as open terms that mean “includes, but is not limited to.” The term “based on” is to be read as “based at least in part on.” The terms “one implementation” and “an implementation” are to be read as “at least one implementation.” The term “another implementation” is to be read as “at least one other implementation.” The terms “first,” “second,” and the like may refer to different or same objects. Other definitions, explicit and implicit, may be included below. A definition of a term is consistent throughout the description unless the context clearly indicates otherwise.
Example Environment
FIG. 1 is a perspective view of an electronic device 10 such as a laptop computer with a pointing stick. The electronic device 10 has the shape of a laptop computer and has an upper housing 12A and a lower housing 12B with components such as a keyboard 16 and a touchpad 18. The electronic device 10 has hinge structures 20 (sometimes referred to as a clutch barrel) to allow the upper housing 12A to rotate in directions 22 about a rotational axis 24 relative to the lower housing 12B. A display 14 is mounted in the upper housing 12A for providing a user interface (UI) . The upper housing 12A, which may sometimes be referred to as a display housing or lid, is placed in a closed position by rotating the upper housing 12A towards the lower housing 12B about the rotational axis 24. In this example, a pointing stick 160 is integrated in the area of the keyboard 16.
FIGs. 2a and 2b schematically illustrate a side view of a pointing stick 160 in which the subject matter described herein can be implemented. As shown, the pointing stick 160 generally includes an interacting body or actuator 210 and a plurality of force sensitive sensors 230. The interacting body 210 can be of any suitable shape such as a stick, a disc, a dome or the like. The force sensitive sensors 230 are configured to sense the direction and magnitude of a force applied on the touch surface. Examples of the force sensitive sensors 230 include, but are not limited to, resistive, capacitive, strain-gauge, and/or optical sensors.
In the example implementation shown in FIGs. 2a and 2b, the pointing stick 160 works on the basis of a force sensitive resistor (FSR). This example design includes an “embossed” FSR film 220 and a “UFO”-shaped interacting body 210. In some implementations, the FSR film 220 is always in contact with the multiple sensors 230 embedded in the substrate 240 via a contact point P0. Such an example pointing stick 160 has a thin profile and can realize a quick response to the user’s interaction.
Principles of Operations
For ease of discussion, some fundamental principles of operation of the subject matter described herein will now be described with reference to FIG. 2. However, it is to be understood that the subject matter described herein can also be implemented in other types of pointing sticks with other detection mechanisms, which will be mentioned later.
In order to expand the functionalities of the pointing stick to provide more flexibility and to enhance the user experience, according to implementations of the subject matter described herein, the interacting body 210 is operable to receive a gesture by a user. The sensors 230 are coupled to the interacting body 210 and operable to detect the gesture and to convert the gesture into a trajectory. Given the trajectory, the processing unit of the host machine can trigger a respective functionality or event, thereby providing the user with more manners for interacting with the machine.
In the context of the present disclosure, a trajectory indicates a continuous displacement value during the whole tap gesture in a coordinate system defined by the plurality of sensors 230. Such continuous “trajectory” output can even be obtained from a very “short” tap gesture (indicated by the downward arrow F in FIG. 2b). This is because, no matter which lateral position P1 (x, y) other than the origin/center point P0 (0, 0) is touched on the touch surface of the interacting body 210, the contact point P0 will keep in contact with the surface of the sensors and continuously move along the lateral direction L from the origin P0 (that is, the default position with no force applied) to the new position P1 (x, y). Then, with a release of the pressure from the interacting body 210, the contact point will move from the new position P1 (x, y) back to the origin P0. For example, the contact point may bounce back from position P1 (x, y) to the origin P0 with the aid of the so-called “spring” effect.
The process described above would result in a continuous output at each sensor of the multiple sensors 230. In such a way, continuous displacement curves along the x direction and y direction with respect to time can be determined based on those output results from the sensors.
It is to be understood that although the above example is described with reference to an arrangement where there is a contact point, this is merely for illustration without suggesting any limitations as to the scope of the subject matter described herein. In another implementation, the pointing stick may work on the basis of movement of the gravity center of a contact region instead of a contact point. In yet another implementation, the sensors may work even without any contact point or contact region.
In some implementations, depending on the structure of the sensors, a continuous force/pressure result with respect to time can also be detected along with the determination of the displacement curves. In some implementations, as shown in FIG. 3, the pointing stick 160 includes multiple (in this case, four) sensors 300_1, 300_2, 300_3 and 300_4 (collectively referred to as “300”) which are symmetrically arranged (also indicated by X+, X-, Y+, and Y-). These sensors 300 can detect a force applied onto the pointing stick 160 and generate output signals (represented by [X+], [X-], [Y+], and [Y-]), each of which reflects the pressure detected by the corresponding sensor 300_1, 300_2, 300_3 or 300_4.
FIG. 4a shows schematic side and top views of four different tap gestures (labeled “A,” “B,” “C,” and “D”) applied on different positions of the touch surface of the pointing stick 160 with different amounts of force. FIGs. 4b-4d schematically illustrate the corresponding sensor outputs [X+], [X-], [Y+] and [Y-] associated with the different touch points. As shown in FIGs. 4a-4d, tap gestures “A” and “B” are both applied on the center position of the pointing stick 160, and the pressure of “A” is smaller than that of “B”. Tap gestures “C” and “D” are both applied on the upper-right region of the pointing stick 160, with the pressure of “C” smaller than that of “D”.
As illustrated in FIG. 4b, because tap gestures “A” and “B” are applied in the center, all four sensors 3001, 3002, 3003 and 3004 detect the same amount of pressure. In this case, the output curves from all four sensors 3001, 3002, 3003 and 3004 are substantially the same. Also, since the pressure of “A” is smaller than the pressure of “B”, the peak of output curve “A” is lower than that of curve “B.”
As illustrated in FIGs. 4c-4d, because the tap gestures “C” and “D” are applied on the upper-right edge of the touch surface of the interacting body 410, far away from the sensors 3002 and 3004 (or X- and Y-), the sensor outputs [X+] and [Y+] from sensors 3001 and 3003 are substantially above the sensor outputs [X-] and [Y-] from sensors 3002 and 3004.
Those sensor outputs [X+], [X-], [Y+] and [Y-] from the four sensors 3001, 3002, 3003 and 3004 can subsequently be used to determine the displacement in the x and y directions (dx, dy) in the coordinate system defined by the sensors, as well as the force (F) applied. Such displacement values (dx, dy) can further be used to determine a position on a display 14 of the electronic device 10 to which a cursor is expected to move in response to the detected gesture. In some implementations, non-limiting examples of converting equations for the four-sensor case include:
dx = fx([X+] - [X-])                (1)
dy = fy([Y+] - [Y-])                (2)
F = fz([X+] + [X-] + [Y+] + [Y-])   (3)
where fx, fy and fz represent predetermined functions.
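For illustration purposes only, the following Python sketch implements equations (1)-(3) under the assumption that fx, fy and fz are simple linear gain functions; an actual implementation would substitute calibrated (and possibly non-linear) functions for the particular sensor hardware.

    # Sketch of equations (1)-(3); kx, ky, kz stand in for the
    # predetermined functions fx, fy, fz (assumed linear here).
    def sensor_to_displacement(xp, xn, yp, yn, kx=1.0, ky=1.0, kz=1.0):
        dx = kx * (xp - xn)                 # equation (1)
        dy = ky * (yp - yn)                 # equation (2)
        force = kz * (xp + xn + yp + yn)    # equation (3)
        return dx, dy, force

    # A tap near the upper-right edge drives [X+] and [Y+] above [X-], [Y-]:
    print(sensor_to_displacement(xp=8.0, xn=2.0, yp=7.0, yn=3.0))
    # -> (6.0, 4.0, 20.0)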
As an example, the point (0, 0) is continuously reported while the surface is untouched. When the point (10, 10) on the touch surface of a pointing stick 160 is touched (given a tapping duration of 50ms and a sampling period of 5ms), a “trajectory” covering broadly distributed points will be reported, even though only such a short tap (with a 50ms tapping duration) is performed. For example, a trajectory of x and y displacements with respect to time can be represented as below:
..., (0, 0) , (0, 0) , <Start Tapping> (1, 1) , (5, 5) , (7, 7) , (9, 9) , (10, 10) , (8, 8) , (6, 6) , (4, 4) , (2, 2) , (1, 1) , <End Tapping> (0, 0) , (0, 0) , ...
where (0, 0) means “no touch. ”
With the detected pressure, the sequence above can be extended with a third dimension. In this case, during a tap with a maximum force value of 20, the sequence of detected displacements along with the detected force with respect to time can be represented as below:
..., (0, 0, 0) , (0, 0, 0) , <Start Tapping> (1, 1, 2) , (5, 5, 10) , (7, 7, 14) , (9, 9, 18) , (10, 10, 20) , (8, 8, 16) , (6, 6, 12) , (4, 4, 8) , (2, 2, 4) , (1, 1, 2) , <End Tapping> (0, 0, 0) , (0, 0, 0) , ...
This trajectory data may reflect characteristics of the tap gesture (not only the pointing location and pressure, but also the pointing style, e.g., which finger is used and which portion of the finger is used (fingertip or ball of the finger)), even when the duration of the tap gesture is very short. In addition, some sensor structures of pointing sticks, such as those shown in FIG. 1 and FIG. 2, can convert the tapping “direction” (or tapping “vector”) into sensor output, so that different “trajectories” may be generated from a plurality of tapping gestures that have the same tapping point but different tapping “directions”. Therefore, the proposed system can expand the functionality by recognizing different types of tapping gestures even when the tapped locations and pressures are close.
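As a hedged illustration of how such a trajectory could be consumed, the Python sketch below extracts a few simple features (peak displacement, peak force, and rise time) from a recorded tap sequence. The feature set itself is an assumption chosen for illustration; the implementations described above may equally feed the full time-synchronized curves to a pattern-recognition classifier.

    # Extract simple features from a tap trajectory of (x, y, force)
    # samples taken every period_ms milliseconds.
    def tap_features(traj, period_ms=5):
        peak = max(traj, key=lambda s: s[2])          # sample with maximum force
        return {
            "peak_xy": peak[:2],                      # where the tap landed
            "peak_force": peak[2],                    # how hard it was pressed
            "rise_ms": traj.index(peak) * period_ms,  # time to reach the peak
            "duration_ms": len(traj) * period_ms,
        }

    tap = [(1, 1, 2), (5, 5, 10), (7, 7, 14), (9, 9, 18), (10, 10, 20),
           (8, 8, 16), (6, 6, 12), (4, 4, 8), (2, 2, 4), (1, 1, 2)]
    print(tap_features(tap))
    # -> {'peak_xy': (10, 10), 'peak_force': 20, 'rise_ms': 20, 'duration_ms': 50}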
The touch pad 18 as shown in FIG. 1, however, does not support such a continuous “trajectory” feature, especially during a “short” tap gesture. As is known, a touch pad detects an absolute “pointed location” (for example, via an array of sensors with multiple sensor rows and columns) rather than a relative displacement as a pointing stick does, and thus no points are reported while the touch pad is untouched. In this case, when tapped briefly, a touch pad may report just one point, or a very narrowly distributed point group. In other words, no “transition” points (for example, from low values to the maximum values) can be reported.
As an example, a touch pad gives no output while no touch is applied, and when the point (10, 10) on the touch surface of a touch pad is touched (again, given a tapping duration of 50ms and a sampling period of 5ms), it will not continuously report a “trajectory” covering broadly distributed points. Instead, it will report only a single point or at most a very narrowly distributed point group (such a narrow point distribution may be caused by fluctuations of the operation or by environmental noise). For example, during a tap event with a 50ms tap duration, the sequence of detected positions with respect to time can be represented as below:
..., (-, -) , (-, -) , <Start Tapping> (9, 10) , (10, 10) , (10, 9) , (10, 10) , (10, 10) , (10, 10) , (9, 10) , (9, 9) , (9, 8) , (8, 8) , <End Tapping> (-, -) , (-, -) , ...
wherein (-, -) means “no point detected. ”
With the additional force sensing feature supported by some touch pads, such a sequence may similarly include a third dimension. In this case, during a tap with a maximum force value of 20, the sequence of detected positions along with the detected force with respect to time can be represented as below:
..., (-, -, 0) , (-, -, 0) , <Start Tapping> (9, 10, 19) , (10, 10, 20) , (10, 9, 19) , (10, 10, 19) , (10, 10, 20) , (10, 10, 21) , (9, 10, 20) , (9, 9, 19) , (9, 8, 19) , (8, 8, 18) , <End Tapping> (-, -, 0) , (-, -, 0), ...
Though such a sequence of detected points can still be obtained from a touch pad, it is to be noted that, compared to the point distribution (in a range from 1 to 10) from the example pointing stick, the sequence from a touch pad shows a very narrow point distribution (in a range from 8 to 10). In other words, the touch pad, due to its structure and sensing mechanism, cannot report as rich a “trajectory” as a pointing stick can, and such a narrow report cannot be used to expand the functionality associated with a “short” tap gesture.
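To make the contrast concrete, the toy Python check below distinguishes the two kinds of sequences by the spread of their reported coordinates. The 0.5 ratio threshold is an arbitrary value chosen for illustration and is not part of the described implementations.

    # True if the reported x values sweep a broad range relative to their
    # maximum, i.e. the sequence contains low-to-peak "transition" points.
    def has_transition_points(points, ratio=0.5):
        xs = [p[0] for p in points]
        return (max(xs) - min(xs)) >= ratio * max(xs)

    stick = [(1, 1), (5, 5), (7, 7), (9, 9), (10, 10), (8, 8), (4, 4), (1, 1)]
    pad = [(9, 10), (10, 10), (10, 9), (10, 10), (9, 9), (8, 8)]
    print(has_transition_points(stick))  # True: usable as a "trajectory"
    print(has_transition_points(pad))    # False: a single narrow point group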
Example Processes
In light of the inventor’s discovery and observations of the differences between the natures of the pointing stick and the touch pad as outlined above, implementations of the subject matter expand the functionality of the pointing stick by allowing various events to be triggered based on the detected trajectories and associated forces.
Specifically, as discussed above, the sensors detect a gesture applied on a touch surface of the pointing stick 160 and convert the gesture into a trajectory. The detected trajectory can then be provided to the processing unit of the device 10. The processing unit then triggers different events based on different trajectories. For example, in some implementations, a first event can be triggered based on a first trajectory. Further, a second event that is different from the first event can be triggered based on a second trajectory that is different from the first trajectory. For example, the first event may be a left-click event and the second event may be a right-click event. Of course, other events, such as a drag operation, are also possible in accordance with the user’s requirements.
Here, a gesture applied on the pointing stick 160 can be a “short” gesture, that is, a “tap” gesture with a duration below a threshold duration, such as 50ms. On the other hand, if a gesture is applied on the pointing stick 160 with a long time duration exceeding the threshold duration, it can be recognized as a normal touch, for example, a long click. Examples of the detected gesture include, but are not limited to, a tap, a quick click, and the like. For ease of discussion and without suggesting any limitations as to the scope of the subject matter described herein, a tap gesture will again be described as an example.
By using the sensors to further detect the pressure applied on the interacting body, in some implementations, a different event can be triggered based on the pressure in addition to the trajectory. With the pressure information supplementing the 2D trajectory, more gestures, especially some complex actions, can be achieved, rendering a further expansion of the functionality of pointing sticks.
For example, still with reference to FIGs. 4a and 4b, although tap gestures “A” and “B” are both applied on the same point (that is, the center of the sensors 300) , due to the different applied pressure, different events can still be triggered. For example, the tap gesture “A” may trigger a normal “left-click” event, while the tap gesture “B” may trigger a “double-click” event.
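A minimal dispatch sketch for this pressure-based distinction is given below; the threshold value and the particular event bindings are assumptions for illustration only.

    PRESSURE_THRESHOLD = 15  # illustrative boundary between "A" and "B"

    # Taps on the same (center) point are distinguished solely by how
    # hard the user pressed.
    def dispatch_center_tap(peak_pressure):
        if peak_pressure < PRESSURE_THRESHOLD:
            return "left-click"      # light tap, gesture "A"
        return "double-click"        # firm tap, gesture "B"

    print(dispatch_center_tap(10))   # gesture "A" -> left-click
    print(dispatch_center_tap(20))   # gesture "B" -> double-click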
Point-based Interaction
Basically, different tapping gestures may generate different sensor output sequences (or “trajectories”). Accordingly, any available pattern recognition technique for time-synchronized multiple signals can be used as a classifier (for example, machine learning). However, such classifiers may require heavy computation and thus may not be necessary for some applications. In those cases, a much simpler method can be used to separate a plurality of tapping patterns. For example, some representative points, or even a single point selected from the trajectory curve, might be enough to trigger a specific event.
In some implementations, an event can be triggered based on a selected displacement value (or a selected point) from a trajectory. In some implementations, the selected point corresponds to the maximum displacement value (Xmax, Ymax). The maximum displacement value (Xmax, Ymax) can usually reflect accurately the position on which the user actually taps (also referred to as a representative position). In such a case, the event triggering does not need to rely on the “whole” trajectory, but only on the representative point, that is, the maximum displacement (Xmax, Ymax), which enables an easy determination of the event.
In some other implementations, when the pointing stick supports the pressure detection feature, the selected displacement value may be the displacement value recorded at the time when the maximum pressure value (Pmax) during the tap gesture is detected. Normally, the displacement value corresponding to the maximum pressure value is a very good approximation of the maximum displacement (Xmax, Ymax). Therefore, it enables a quick point determination with little loss of detection accuracy.
In some other implementations, for example due to the presence of operating errors, the displacement value determined at the time when the peak value of the pressure is detected may no longer be a good approximation of the maximum displacement during the time duration of the pressure. Therefore, in alternative implementations, the actual tap position may be determined by considering both the maximum displacement and the maximum pressure. In this way, the position of the tap on the touch surface can be determined with improved detection accuracy.
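The three selection strategies described above can be sketched in Python as follows; the squared-Euclidean displacement measure and the simple averaging used to combine both cues are assumptions made for illustration.

    # traj is a list of (x, y, force) samples for one tap gesture.
    def point_at_max_displacement(traj):
        return max(traj, key=lambda s: s[0] ** 2 + s[1] ** 2)[:2]

    def point_at_max_pressure(traj):
        return max(traj, key=lambda s: s[2])[:2]

    # Blend both cues by averaging the two candidates, one simple
    # (assumed) way of considering displacement and pressure together.
    def representative_point(traj):
        d = point_at_max_displacement(traj)
        p = point_at_max_pressure(traj)
        return ((d[0] + p[0]) / 2, (d[1] + p[1]) / 2)

    tap = [(1, 1, 2), (5, 5, 10), (10, 10, 18), (9, 9, 20), (4, 4, 8)]
    print(point_at_max_displacement(tap))  # (10, 10)
    print(point_at_max_pressure(tap))      # (9, 9)
    print(representative_point(tap))       # (9.5, 9.5)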
Region-based Interaction
As discussed above, a touch pad can determine the finger’s position based on the absolute position of the user’s finger on the touch pad, owing to its relatively large touching area. On the contrary, the pointing stick 160 provides only a much smaller surface area for the user’s touch. Therefore, compared with large-size touch pads, it might sometimes be a challenge to precisely detect where the gesture is applied on the pointing stick. In such circumstances, users may be more interested in, or focused on, a region where the gesture is applied.
In some implementations, touch points or the various trajectories may be grouped or clustered to correspond to various regions on the touch surface of the interacting body, so as to facilitate the user’s operations. In this case, any touch points falling within the same region will be regarded as the same point or trajectory, and subsequently the same event will be triggered.
In some implementations, the first region and the second region can be predefined. For example, in some implementations as illustrated in FIG. 5A, the first region or region “A” is located in the center of the touch surface of the pointing stick, and the second region or region “B” laterally surrounds the first region. In this case, if the user’s finger taps the center area of the touch surface of the pointing stick, the first event, such as a left-click event, can be triggered by the processing unit. If the user’s finger taps the peripheral area of the touch surface of the pointing stick, the second event, such as a right-click event or a drag event, can be triggered by the processing unit. In this way, the user can easily execute two types of tap operations through the pointing stick, such as left-click and right-click.
In some other implementations, the first region corresponds to the left half of the touch surface of the pointing stick, and the second region corresponds to the right half. In this case, if the user’s finger taps the left half of the touch surface of the pointing stick, the first event, such as a left-click event, can be triggered by the processing unit. If the user’s finger taps the right half, the second event, such as a right-click event or a drag event, can be triggered by the processing unit. Likewise, the first region may correspond to the upper half of the touch surface of the pointing stick, and the second region to the lower half. In this way, the user can also easily execute two types of tap operations through the pointing stick, such as left-click and right-click.
In some other implementations, the first and second regions correspond only to a portion of the touch surface of the pointing stick, and the other portion of the touch surface is not predefined for triggering specific events. In this case, specific events can only be triggered by tapping the first and second regions, whereas tapping the other portion of the touch surface of the pointing stick will not trigger any event.
In some other implementations, the touch surface of the pointing stick can be divided into more than two regions. For example, as shown in FIG. 5B, the touch surface of the pointing stick may include a first region (or region “A”) located in the center, a second region (or region “B”) in the lower-left area, and a third region (or region “C”) in the lower-right area. In this case, tapping the first, second, and third regions can trigger different events, respectively. In some implementations, tapping the first region can trigger a left-click event, tapping the second region can trigger a right-click event, and tapping the third region can trigger a drag event. In other implementations, tapping the first, second, and third regions can trigger other events.
In some other implementations as shown in FIG. 5C, the touch surface of the pointing stick can be divided into four regions (or regions “A, ” “B, ” “C” and “D” ) that are uniformly and symmetrically distributed. Of course, the touch surface of the pointing stick can be further divided into more than four regions, such as five, six, seven, ..., depending on the specific requirements of the user or the applications.
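By way of a hedged illustration, the following Python sketch classifies a representative tap point into regions in the spirit of FIGs. 5A-5B. The radius of the center region “A” and the event bindings follow the examples above but are otherwise arbitrary assumptions.

    import math

    CENTER_RADIUS = 3.0  # assumed extent of region "A" around the origin

    # Map a representative tap point to one of the regions of FIG. 5B.
    def classify_region(x, y):
        if math.hypot(x, y) <= CENTER_RADIUS:
            return "A"                    # center region
        if y < 0:                         # lower half: split left/right
            return "B" if x < 0 else "C"  # lower-left / lower-right
        return None                       # area not bound to any event

    EVENTS = {"A": "left-click", "B": "right-click", "C": "drag"}

    for point in [(0.5, -1.0), (-6.0, -5.0), (6.0, -5.0), (0.0, 8.0)]:
        print(point, "->", EVENTS.get(classify_region(*point), "no event"))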
Hereinafter, some example tap positions of the user’s finger on the touch surface and the associated coordinate points of the expected position of the cursor on the UI will be described in connection with FIGs. 6A-8B. In the examples shown, the touch surface 1600 of the pointing stick 160 includes a first region, a second region, and a third region. By way of example, the first region is located in the center of the touch surface 1600, the second region is located to the bottom right of the first region, and the third region is located to the bottom left of the first region, as shown in FIG. 5B. As an example, tapping the first region can trigger a left-click event, tapping the second region can trigger a right-click event, and tapping the third region can trigger a drag event. As shown in FIGs. 6B, 7B and 8B, the associated coordinate points 610, 710, and 810 of the expected touch position on the interacting body are illustrated in a coordinate system.
As shown in FIG. 6A, the first tap position of the user’s finger 2000 on the touch surface 1600 of the pointing stick 160 is located in the first region. Accordingly, as shown in FIG. 6B, the associated coordinate point 610 of the expected touch position on the interacting body is at the origin of the coordinate system. In this case, in response to a tap on the center area of the touch surface 1600 of the pointing stick 160, the first event, such as a left-click event, can be triggered by the processing unit.
Referring to FIG. 7A, the second tap position of the user’s finger 2000 on the touch surface 1600 of the pointing stick 160 is located in the second region. Accordingly, as shown in FIG. 7B, the associated coordinate point 710 of the expected touch position on the interacting body is to the bottom right of the origin in the coordinate system. In this case, in response to a tap on the second region of the pointing stick 160, the second event, such as a right-click event, can be triggered by the processing unit.
In FIG. 8A, the third tap position of the user’s finger 2000 on the touch surface 1600 of the pointing stick 160 is located in the third region. Accordingly, as shown in FIG. 8B, the associated coordinate point 810 of the expected touch position on the interacting body is to the bottom left of the origin in the coordinate system. In this case, in response to a tap on the third region of the pointing stick 160, the third event, such as a drag event, can be triggered by the processing unit.
Tap Detection
As discussed above, a gesture may be a tap gesture with a short (or very short) duration. In such implementations, accurate detection of a tap gesture on the touch surface of the pointing stick is very important and may be implemented in various manners or under various criteria. Hereinafter, an example implementation of detecting the tap will be discussed in conjunction with FIGs. 9 and 10.
FIG. 9 illustrates a flowchart of a process 900 for detecting a tap of a tool on the touch surface of the pointing stick in accordance with one implementation of the subject matter described herein. For ease of discussion, a user’s finger will be described as an example of the touch tool. It is to be understood, however, that this example suggests no limitation as to the scope of the subject matter described herein. A pen, a stylus, or any other suitable tool can likewise be used.
As shown in FIG. 9, at 910, a pressure applied by the user on the touch surface of the pointing stick is detected. The pressure, as discussed above, can be detected by the force sensors in the pointing stick and sent to the processing unit. FIG. 10 is a graph showing a pressure applied on the touch surface by the tool over time. Referring to FIGs. 9 and 10, at 920, if the time duration TDUR of the pressure P is below the threshold duration and the peak value PPEAK of the pressure P at the time TPEAK exceeds a threshold pressure PTHD, a tap of the user’s finger on the touch surface of the pointing stick is detected. The threshold duration and the threshold pressure are predefined according to the user’s operating habits. Generally, the threshold duration corresponds to a small amount of time, for example, a 50ms duration. This means that a touch held on the touch surface of the pointing stick for a short time may be regarded as a tap.
Furthermore, a threshold pressure PTHD corresponding to a predefined pressure may be set by the user. In this case, only if the peak value PPEAK of the pressure P exceeds the threshold pressure PTHD and the touch on the touch surface of the pointing stick is held for a short time can the gesture be regarded as a tap. In other words, if the peak value PPEAK of the pressure P is below the threshold pressure PTHD, even if the touch on the touch surface of the pointing stick is held for a short time, the gesture will still not be regarded as a tap. In this way, an unintentional touch on the touch surface of the pointing stick will not be interpreted as any predetermined operation.
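The tap criterion of FIGs. 9 and 10 can be sketched as a simple predicate; the threshold values below are illustrative placeholders, not values prescribed by the implementations described herein.

    P_THD = 5.0      # threshold pressure (arbitrary units)
    T_THD_MS = 50.0  # threshold duration in milliseconds

    # pressure_curve holds the pressure samples of one touch episode,
    # taken every period_ms milliseconds.
    def is_tap(pressure_curve, period_ms=5.0):
        duration_ms = len(pressure_curve) * period_ms
        return max(pressure_curve) > P_THD and duration_ms < T_THD_MS

    print(is_tap([2, 8, 12, 8, 2]))          # True: strong and short (25ms)
    print(is_tap([1, 2, 3, 2, 1]))           # False: never exceeds P_THD
    print(is_tap([2, 8, 12, 12, 8, 2] * 2))  # False: held too long (60ms)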
Cursor Control
For the pointing stick 160, a tap on its touch surface 1600 may also cause a slight movement of the cursor on the UI. In order to avoid movement of the cursor on the UI during the time duration of the tap, some additional optional acts may be performed. In this regard, FIG. 11 illustrates a flowchart of a method 1100 for controlling a cursor on the UI in accordance with one implementation of the subject matter described herein. It will be appreciated that the acts in the method 1100 can be carried out after the acts involved in the process 900.
As shown in FIG. 11, at 1150, in response to the pressure applied on the touch surface of the pointing stick reaching the threshold pressure, the movement of the cursor is locked by the processing unit. Thus, if the pressure applied on the touch surface of the pointing stick initially reaches the threshold pressure, the cursor on the UI will not move.
At 1160, the coordinate of the expected position of the cursor is recorded in a memory unit of an electronic device, such as the electronic device 10 of FIG. 1. At 1170, in response to the time duration of the pressure being below the threshold duration, the recorded coordinate is discarded from the memory unit. At 1180, in response to the time duration of the pressure exceeding the threshold duration, the movement of the cursor to the expected position is initiated by the processing unit. In this way, if a tap is detected, movement of the cursor on the UI during the time duration of the tap can be prevented. Furthermore, if gestures with a long time duration are detected instead, the movement of the cursor can be initiated.
In some implementations, with reference to FIG. 12, when the pressure changes from zero to non-zero, a timer starts with a status of “not determined yet” 1210. Then, if the pressure value exceeds the threshold pressure PTHD within a threshold window TTHD, the status is changed to “tap candidate” 1220 and the cursor is temporarily stopped; otherwise, the status is changed to “non-tap” 1230. Next, if the pressure value in the “tap candidate” state returns to zero before the threshold duration DTHD expires, a tap gesture will be determined and issued (i.e., “tap issued” 1240). After the threshold duration DTHD, the remaining tap candidates are changed to “non-tap” 1230 and the saved cursor movement will be restored.
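A minimal sketch of the FIG. 12 state machine follows. It assumes one pressure sample per timer tick; the tick thresholds and the buffering of withheld cursor motion are assumptions made for illustration, since the figure itself only specifies the states and their transitions.

    P_THD = 5.0  # threshold pressure
    T_THD = 4    # ticks within which pressure must exceed P_THD
    D_THD = 10   # ticks after which a tap candidate expires

    class TapDetector:
        def __init__(self):
            self.state, self.ticks = "idle", 0
            self.withheld = []  # cursor motion saved while the cursor is stopped

        # Feed one pressure sample (and pending cursor motion) per tick.
        def feed(self, pressure, motion=(0, 0)):
            if self.state == "idle":
                if pressure > 0:                    # zero -> non-zero: start timer
                    self.state, self.ticks = "not determined yet", 0
            elif self.state == "not determined yet":
                self.ticks += 1
                if pressure > P_THD:                # strong enough within the window
                    self.state = "tap candidate"    # cursor temporarily stopped
                elif self.ticks > T_THD:
                    self.state = "non-tap"
            elif self.state == "tap candidate":
                self.ticks += 1
                self.withheld.append(motion)        # withhold cursor movement
                if self.ticks > D_THD:
                    self.state = "non-tap"          # expired: restore self.withheld
                elif pressure == 0:                 # released before D_THD
                    self.state, self.withheld = "idle", []
                    return "tap issued"
            elif self.state == "non-tap" and pressure == 0:
                self.state = "idle"
            return self.state

    det = TapDetector()
    for p in [0, 2, 8, 12, 8, 2, 0]:  # a quick, firm press-and-release
        print(det.feed(p))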
As described above, tapping different regions of the touch surface of the pointing stick can trigger various events, such as a left-click event, a right-click event, and a drag event. However, it is obvious to those skilled in the art that tapping different regions of the touch surface of the pointing stick can also trigger other types of events. In this way, the pointing stick can be used for additional control purposes.
Example Device
Hereinafter, an example implementation of the electronic device 10 is shown in FIG. 13. In this example, the electronic device 10 is in a form of a general-purpose computing device. Components of the electronic device 10 may include, but are not limited to, one or more processors or processing units 1310, a memory 1320, one or more input devices 1330, one or more output devices 1340, storage 1350, and one or more communication units 1360. The processing unit 1310 may be a real or a virtual processor and is capable of performing various processes in accordance with a program stored in the memory 1320. In a multi-processing system, multiple processing units execute computer-executable instructions to increase processing power.
The electronic device 10 typically includes a variety of machine readable media. Such media may be any available media accessible by the electronic device 10, including volatile and non-volatile media, and removable and non-removable media. The memory 1320 may be volatile memory (e.g., registers, cache, a random-access memory (RAM)), non-volatile memory (e.g., a read only memory (ROM), an electrically erasable programmable read only memory (EEPROM), a flash memory), or some combination thereof. The storage 1350 may be removable or non-removable, and may include machine readable media such as flash drives, magnetic disks or any other media which can be used to store information and which can be accessed within the electronic device 10.
The electronic device 10 may further include other removable/non-removable, volatile/non-volatile computing system storage media. Although not shown in FIG. 13, a disk driver for reading from or writing to a removable, non-volatile disk (e.g., a “floppy disk”), and an optical disk driver for reading from or writing to a removable, non-volatile optical disk can be provided. The memory 1320 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of various implementations of the subject matter described herein.
A program/utility tool 1322 having a set (at least one) of the program modules 1324 may be stored in, for example, the memory 1320. Such program modules 1324 include, but are not limited to, an operating system, one or more applications, other program modules, and program data. Each or a certain combination of these examples may include an implementation of a networking environment. The program modules 1324 generally carry out the functions and/or methodologies of implementations of the subject matter described herein, for example, the process 900 and the method 1100.
The input unit(s) 1330 may be one or more of various different input devices. For example, the input unit(s) 1330 may include a user device such as a mouse, keyboard, trackball, a pointing stick, etc. The input unit(s) 1330 may implement one or more natural user interface techniques, such as speech recognition or touch and stylus recognition. As other examples, the input unit(s) 1330 may include a scanning device, a network adapter, or another device that provides input to the electronic device 10. The output unit(s) 1340 may be a display, printer, speaker, network adapter, or another device that provides output from the electronic device 10. The input unit(s) 1330 and output unit(s) 1340 may be incorporated in a single system or device, such as a touch screen or a virtual reality system.
Where the input unit 1330 includes a pointing stick 160, the pointing stick 160 can be touched by a user’s finger. Upon being touched, the pointing stick 160 generates a pressure signal representing a pressure applied by the user on the touch surface of the pointing stick 160 and sends it to the processing unit 1310. The processing unit 1310 can detect the user’s tap on the touch surface of the pointing stick 160 by using the pressure signal. Upon detecting the user’s tap, the processing unit 1310 can trigger different events in response to different regions of the touch surface being tapped. Generally, all the methods described herein can be implemented by the processing unit 1310.
The communication unit(s) 1360 enable communication over communication media to another computing entity. Additionally, the functionality of the components of the electronic device 10 may be implemented in a single computing machine or in multiple computing machines that are able to communicate over communication connections. Thus, the electronic device 10 may operate in a networked environment using logical connections to one or more other servers, network personal computers (PCs), or another common network node. By way of example, and not limitation, communication media include wired or wireless networking techniques.
The electronic device 10 may also communicate, as required, with one or more external devices (not shown) such as a storage device, a display device, and the like, with one or more devices that enable a user to interact with the electronic device 10, and/or with any device (e.g., a network card, a modem, etc.) that enables the electronic device 10 to communicate with one or more other computing devices. Such communication may be performed via input/output (I/O) interface(s) (not shown).
The functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-Programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
Program code for carrying out methods of the subject matter described herein may be written in any combination of one or more programming languages. Such program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine readable medium may be any tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine readable medium may be a machine readable signal medium or a machine readable storage medium. A machine readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the machine readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are contained in the above discussions, these should not be construed as limitations on the scope of the subject matter described herein, but rather as descriptions of features that may be specific to particular implementations. Certain features that are described in the context of separate implementations may also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation may also be implemented in multiple implementations separately or in any suitable sub-combination.
Example Implementations
Hereinafter, some example implementations of the subject matter described herein will be listed.
In some implementations, there is provided a device. The device comprises: a processing unit; and a memory coupled to the processing unit and storing instructions thereon, the instructions, when executed by the processing unit, cause the device to perform acts including: detecting a gesture applied on an interacting body of a pointing stick, a time  duration of the gesture on the pointing stick being below a threshold duration; converting the gesture into a trajectory, the trajectory indicating a continuous displacement value during the gesture in a coordinate system defined by a plurality of sensors of the pointing stick; and triggering an event at least based on the trajectory.
In some implementations, the triggering an event based on the trajectory further comprises: triggering the event based on the trajectory and a pressure of the gesture associated with the trajectory.
In some implementations, the triggering an event based on the trajectory further comprises: selecting a displacement value from the trajectory; and triggering the event based on the selected displacement value.
In some implementations, the selecting a displacement value from the trajectory comprises: selecting a maximum displacement value from the trajectory; or selecting a displacement value corresponding to the maximum pressure value during the gesture.
In some implementations, the triggering an event based on the trajectory further comprises: determining, based on the selected displacement value, one of a plurality of regions that are defined based on grouping of a plurality of displacement values; and triggering the event based on the determined region.
In some implementations, the region includes: a first region that is located in a center of the touch surface, and a second region that laterally surrounds the first region.
In some implementations, the detecting a gesture comprises: detecting a pressure applied on the interacting body; and in response to a peak value of the pressure exceeding a threshold pressure and a time duration of the pressure being below the threshold duration, determining that the gesture is detected.
In some implementations, the detecting a gesture comprises: stopping the cursor movement in response to the pressure exceeding a threshold pressure; and recording a coordinate of an expected position of a cursor on a display of the device to which the cursor is to move in response to the gesture.
In some implementations, the detecting a gesture further comprises: in response to the time duration of the pressure being below the threshold duration, discarding the recorded coordinate; and in response to the time duration of the pressure exceeding the threshold duration, initiating movement of the cursor to the expected position.
In some implementations, there is provided a computer-implemented method. The method comprises: detecting a gesture applied on an interacting body of a pointing stick, a time duration of the gesture on the pointing stick being below a threshold duration; converting the gesture into a trajectory, the trajectory indicating a continuous displacement value during the gesture in a coordinate system defined by a plurality of sensors of the pointing stick; and triggering an event at least based on the trajectory.
In some implementations, the triggering an event based on the trajectory further comprises: triggering the event based on the trajectory and a pressure of the gesture associated with the trajectory.
In some implementations, the triggering an event based on the trajectory further comprises: selecting a displacement value from the trajectory; and triggering the event based on the selected displacement value.
In some implementations, the selecting a displacement value from the trajectory comprises: selecting a maximum displacement value from the trajectory; or selecting a displacement value corresponding to the maximum pressure value during the gesture.
In some implementations, the triggering an event based on the trajectory further comprises: determining, based on the selected displacement value, one of a plurality of regions that are defined based on grouping of a plurality of displacement values; and triggering the event based on the determined region.
In some implementations, the region includes: a first region that is located in a center of the touch surface, and a second region that laterally surrounds the first region.
In some implementations, the detecting a gesture comprises: detecting a pressure applied on the interacting body; and in response to a peak value of the pressure exceeding a threshold pressure and a time duration of the pressure being below the threshold duration, determining that the gesture is detected.
In some implementations, the detecting a gesture comprises: stopping the cursor movement in response to the pressure exceeding a threshold pressure; recording a coordinate of an expected position of a cursor on a display of the device to which the cursor is to move in response to the gesture.
In some implementations, the detecting a gesture further comprises: in response to the time duration of the pressure being below the threshold duration, discarding the  recorded coordinate; and in response to the time duration of the pressure exceeding the threshold duration, initiating movement of the cursor to the expected position.
In some implementations, there is provided a pointing stick. The pointing stick comprises: an interacting body; a plurality of sensors coupled to the interacting body and operable to: detect a gesture applied on the interacting body; convert the gesture into a trajectory, the trajectory indicating a continuous displacement value during the gesture in a coordinate system defined by the plurality of sensors; and provide the trajectory to a processing unit coupled to a plurality of sensors to trigger an event based on the trajectory.
In some implementations, the triggering an event based on the trajectory further comprises: triggering the event based on the trajectory and a pressure of the gesture associated with the trajectory.

Claims (20)

  1. A device comprising:
    a processing unit; and
    a memory coupled to the processing unit and storing instructions thereon, the instructions, when executed by the processing unit, cause the device to perform acts including:
    detecting a gesture applied on an interacting body of a pointing stick, a time duration of the gesture on the pointing stick being below a threshold duration;
    converting the gesture into a trajectory, the trajectory indicating a continuous displacement value during the gesture in a coordinate system defined by a plurality of sensors of the pointing stick; and
    triggering an event at least based on the trajectory.
  2. The device of claim 1, wherein the triggering an event based on the trajectory further comprises:
    triggering the event based on the trajectory and a pressure of the gesture associated with the trajectory.
  3. The device of claim 2, wherein the triggering an event based on the trajectory further comprises:
    selecting a displacement value from the trajectory; and
    triggering the event based on the selected displacement value.
  4. The device of claim 3, wherein the selecting a displacement value from the trajectory comprises:
    selecting a maximum displacement value from the trajectory; or
    selecting a displacement value corresponding to the maximum pressure value during the gesture.
  5. The device of claim 3, wherein the triggering an event based on the trajectory further comprises:
    determining, based on the selected displacement value, one of a plurality of regions that are defined based on grouping of a plurality of displacement values; and
    triggering the event based on the determined region.
  6. The device of claim 5, wherein the region includes:
    a first region that is located in a center of the touch surface; and
    a second region that laterally surrounds the first region.
  7. The device of claim 1, wherein the detecting a gesture comprises:
    detecting a pressure applied on the interacting body; and
    in response to a peak value of the pressure exceeding a threshold pressure and a time duration of the pressure being below the threshold duration, determining that the gesture is detected.
  8. The device of claim 7, wherein the detecting a gesture comprises:
    stopping the cursor movement in response to the pressure exceeding a threshold pressure; and
    recording a coordinate of an expected position of a cursor on a display of the device to which the cursor is to move in response to the gesture.
  9. The device of claim 8, wherein the detecting a gesture further comprises:
    in response to the time duration of the pressure being below the threshold duration, discarding the recorded coordinate; and
    in response to the time duration of the pressure exceeding the threshold duration, initiating movement of the cursor to the expected position.
  10. A computer-implemented method comprising:
    detecting a gesture applied on an interacting body of a pointing stick, a time duration of the gesture on the pointing stick being below a threshold duration;
    converting the gesture into a trajectory, the trajectory indicating a continuous displacement value during the gesture in a coordinate system defined by a plurality of sensors of the pointing stick; and
    triggering an event at least based on the trajectory.
  11. The method of claim 10, wherein the triggering an event based on the trajectory further comprises:
    triggering the event based on the trajectory and a pressure of the gesture associated with the trajectory.
  12. The method of claim 11, wherein the triggering an event based on the trajectory further comprises:
    selecting a displacement value from the trajectory; and
    triggering the event based on the selected displacement value.
  13. The method of claim 12, wherein the selecting a displacement value from the trajectory comprises:
    selecting a maximum displacement value from the trajectory; or
    selecting a displacement value corresponding to the maximum pressure value during the gesture.
  14. The method of claim 12, wherein the triggering an event based on the trajectory further comprises:
    determining, based on the selected displacement value, one of a plurality of regions that are defined based on grouping of a plurality of displacement values; and
    triggering the event based on the determined region.
  15. The method of claim 14, wherein the region includes:
    a first region that is located in a center of the touch surface; and
    a second region that laterally surrounds the first region.
  16. The method of claim 10, wherein the detecting a gesture comprises:
    detecting a pressure applied on the interacting body; and
    in response to a peak value of the pressure exceeding a threshold pressure and a time duration of the pressure being below the threshold duration, determining that the gesture is detected.
  17. The method of claim 16, wherein the detecting a gesture comprises:
    stopping the cursor movement in response to the pressure exceeding a threshold pressure;
    recording a coordinate of an expected position of a cursor on a display of the device  to which the cursor is to move in response to the gesture.
  18. The method of claim 17, wherein the detecting a gesture further comprises:
    in response to the time duration of the pressure being below the threshold duration, discarding the recorded coordinate; and
    in response to the time duration of the pressure exceeding the threshold duration, initiating movement of the cursor to the expected position.
  19. A pointing stick, comprising:
    an interacting body;
    a plurality of sensors coupled to the interacting body and operable to:
    detect a gesture applied on the interacting body, a time duration of the gesture on the pointing stick being below a threshold duration;
    convert the gesture into a trajectory, the trajectory indicating a continuous displacement value during the gesture in a coordinate system defined by the plurality of sensors; and
    provide the trajectory to a processing unit coupled to a plurality of sensors to trigger an event based on the trajectory.
  20. The pointing stick of claim 19, wherein the triggering an event based on the trajectory further comprises:
    triggering the event based on the trajectory and a pressure of the gesture associated with the trajectory.
PCT/CN2017/071179 2017-01-13 2017-01-13 Expanding functionalities of pointing stick WO2018129720A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/071179 WO2018129720A1 (en) 2017-01-13 2017-01-13 Expanding functionalities of pointing stick


Publications (1)

Publication Number Publication Date
WO2018129720A1 (en) 2018-07-19

Family ID: 62839233


Country Status (1)

Country Link
WO (1) WO2018129720A1 (en)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101349949A (en) * 2007-07-16 2009-01-21 旭达电脑(昆山)有限公司 Mouse with pointing rod
TW201324253A (en) * 2011-12-06 2013-06-16 Howay Corp Method of touch recognition and capacitive pointing stick device
JP2016066133A (en) * 2014-09-24 2016-04-28 レノボ・シンガポール・プライベート・リミテッド Method for processing input of pointing stick, computer and computer program
CN204390196U (en) * 2014-11-19 2015-06-10 杨丽 A kind of TrackPoint of low cost

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023097803A (en) * 2021-12-28 2023-07-10 レノボ・シンガポール・プライベート・リミテッド Information processing apparatus and control method
JP7335318B2 (en) 2021-12-28 2023-08-29 レノボ・シンガポール・プライベート・リミテッド Information processing device and control method
US11928266B2 (en) 2021-12-28 2024-03-12 Lenovo (Singapore) Pte. Ltd. Information processing apparatus and controlling method


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17891299

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17891299

Country of ref document: EP

Kind code of ref document: A1