EP3326052A1 - Apparatus and method for detecting gestures on a touchpad - Google Patents
Apparatus and method for detecting gestures on a touchpad
- Publication number
- EP3326052A1 (application EP16741039A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- touchpad
- user
- touch
- proximity
- gesture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
- G06F3/04166—Details of scanning methods, e.g. sampling time, grouping of sub areas or time sharing with display driving
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
Definitions
- the present invention relates to a touchpad and interpretation of gestures performed on the touchpad or in close proximity thereto.
- Recent applications include systems that may detect position of objects in proximity fields around an apparatus, also called “proximity sensing”.
- such a system is disclosed by Synaptics Inc., as published in US patent application number US 2007/10262951.
- US patent application number US2011/0279397 discloses a monitoring unit for monitoring a hand or finger in three dimensions in the vicinity of a touch screen such that the monitoring unit is working in contact mode or in contact-less mode.
- Other publications disclosing user interfaces, for example interpreting gestures, include US patent US8830181 and US patent applications US2008/0168403, US2010/0245289, and US20120162073.
- a gesture, for example a swiping action with a finger
- a gesture is interpreted according to a predetermined orientation and location of the user relative to the detector.
- the gesture has to be performed by the user in relation to the actual orientation of the user interface of the touchpad relative to the user.
- symbols or text need to be printed or otherwise applied to the surface of the detector to ensure the correct orientation of the detector relative to the user. This requires attention by the user and limits the user-friendliness. It would be desirable to increase the user-friendliness of touchpads and to remove the need for symbols or text on the surface of the detector.
- the present invention provides an apparatus and a method for detecting user-supplied control commands given via gestures on a touch-sensitive surface, also called a touchpad, of the apparatus, for example a multimedia apparatus, AV system, loudspeaker, remote control or media player.
- the touchpad comprises a proximity detection system for detecting the location of an object, for example a user's finger, in the proximity of the touchpad, and for detecting a movement performed with the object by the user in the proximity of the touchpad.
- the apparatus further comprises a touch detection system for detecting contact by said object with the surface of the touchpad and for detecting a gesture performed with the object by the user while the object is in contact with the touchpad.
- the function of the touch detection system is combined with the function of the proximity detection system, where the latter detects the presence of a finger or hand or pointer of a user before, during or after the gesture on the touchpad in order to determine the location of the user relative to the touchpad.
- This information is used to interpret the intended direction of the gesture. For example, when a user swipes a finger across the touchpad, the line of the swipe is calculated from the touch data, and the position of the user is calculated from the related proximity data. Thus, it is possible to determine if the user is swiping right or left as seen from the user's own perspective. For example, when the finger moves across the surface of the touchpad along a path being linear or curved and performed like a swipe from one position to another position, the left- or right-orientation of said object movement is interpreted to be left or right according to the actual user position in front of the omnidirectional touchpad.
- the movement is interpreted as being a swipe that is directed to the right, also called a right-swipe, not only if the user is on one side of the touchpad but also if the user is located on an opposite side of the touchpad.
- the system detects the location of the user relative to the touchpad and adjusts the gesture interpretation accordingly. This is in contrast to the prior art, where the user interface has to be oriented correctly, relative to the user, or the user has to adjust the gesture, for example swipe, to match the direction of the user interface. Also, in the invention, there is no need for symbols or text on the surface of the user interface. Typical applications are operation of multimedia apparatus, AV systems, loudspeakers, media players, remote controls and similar equipment.
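The left/right decision described above can be sketched in a few lines. This is an illustrative reconstruction, not the patent's implementation; it assumes the user faces the pad centre and that all points are given in the pad's fixed coordinate system with origin at the centre:

```python
def swipe_direction(user_pos, swipe_start, swipe_end):
    """Classify a swipe as 'left' or 'right' as seen from the user's
    own perspective, given the user's position around the touchpad.

    All points are (x, y) tuples in the pad's fixed coordinate system,
    origin at the pad centre. Assumes the user faces the pad centre.
    """
    # Direction the user is facing: from the user towards the origin.
    ux, uy = -user_pos[0], -user_pos[1]
    # Swipe vector in the fixed coordinate system.
    sx = swipe_end[0] - swipe_start[0]
    sy = swipe_end[1] - swipe_start[1]
    # Sign of the z-component of facing x swipe: negative means the
    # swipe goes to the user's right, positive to the user's left.
    cross = ux * sy - uy * sx
    return "right" if cross < 0 else "left"
```

The same physical swipe across the pad is thus reported as "right" for a user on one side and "left" for a user on the opposite side, matching the omnidirectional behaviour described above.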
- proximity sensing of a finger or hand is done by a capacitive sensor or by a light beam emitter in combination with a sensor that detects reflected light beams.
- the position of the person relative to the apparatus is sensed by a reflected light beam, especially an infrared (IR) reflected light beam.
- Detection of gestures is performed in a 3-dimensional space around the apparatus and by touching directly on the apparatus.
- Dedicated means are used for detecting the position of objects, for example a finger, a hand, or a pointer device, located at a distance close to the apparatus and in direct physical contact with the apparatus.
- the invention has properties making it suitable for mounting on printed circuit boards and improving the quality of the detection system such that user-supplied commands may certainly and unambiguously be detected, interpreted and executed as related functions in a given apparatus.
- the user-direction is found relative to a predetermined orientation of the touchpad, and the orientation of the detected gesture on the touchpad is adjusted according to this difference prior to interpreting the gesture with respect to a gesture-associated command and executing the command.
- this can be achieved by detecting a proximity-position P of an object in close proximity to the touchpad, detecting a touch-position T of the object while in contact with the touchpad, and from the proximity-position P and the touch-position T determining a user-direction towards the user.
- a sequence of touch positions T can be used, for example in the case of a swipe.
- the proximity-position P and/or the touch-position T are averaged positions, for example achieved by a weighted averaging, which in the following are called dominant proximity-position P and dominant touch-position T.
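The weighted averaging mentioned here might, as one hedged sketch (the sample format is an assumption, not taken from the patent), weight each detected position by its signal strength so that stronger readings dominate:

```python
def dominant_position(samples):
    """Weighted average of (x, y, weight) samples.

    The weights are assumed to be signal strengths (e.g. IR reflection
    levels), so stronger readings dominate the averaged position.
    """
    total = sum(w for _, _, w in samples)
    x = sum(px * w for px, _, w in samples) / total
    y = sum(py * w for _, py, w in samples) / total
    return (x, y)
```

Applied to a sequence of proximity readings this yields the dominant proximity-position P; applied to touch readings, the dominant touch-position T.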
- the method comprises
- the proximity detection system detecting a movement of the object in close proximity to the touchpad and averaging this movement to a dominant proximity-position P.
- the method also contains the step of detecting a gesture of the object by the touch detection system while in contact with the touchpad and averaging the gesture to a dominant touch-position T.
- the detected proximity movement of the object and the gesture on the touchpad are translated to coordinate sequences in a pre-defined coordinate system. In this case, a practical embodiment of the method comprises
- the method comprises adjusting the orientation of the gesture with respect to right-to-left or left-to-right direction prior to interpreting the gesture.
- the apparatus determines whether the swipe-path is a left-swipe or right-swipe depending on the determined user-direction in the coordinate-system.
- the proximity detection system comprises a plurality of proximity sensors organized along an outer perimeter of the touchpad, optionally a circular touchpad.
- the touch detection system comprises a plurality of touch sensors organized on the surface of the touchpad and surrounded by the proximity sensors.
- the proximity detection system comprises a plurality of infrared light emitters and a plurality of infrared light receivers, and said receivers are configured for measuring a background level of infrared light and correcting infrared proximity signals by subtracting the background level.
- the plurality of infrared light emitters are configured for one emitter of the plurality of emitters being active at a time, and wherein an electronic control circuit is configured for receiving a separate set of proximity signals from each one of the plurality of infrared receivers, for every subsequent activation of each further infrared emitter of the plurality of emitters.
- the method comprises receiving the proximity signals from two infrared receivers, one on either side of the corresponding emitter, for every subsequent activation of each further infrared emitter of the plurality of emitters.
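The one-emitter-at-a-time scan with background subtraction can be sketched as follows; the `measure` callback is a hypothetical abstraction over the IR hardware, not an API from the patent:

```python
def scan_proximity(emitters, receivers, measure):
    """One scan cycle of the proximity system.

    `measure(emitter, receiver)` returns the raw IR level seen at
    `receiver` while `emitter` alone is active; `measure(None, r)`
    is the background level with all emitters off. Returns the
    background-corrected signal per (emitter, receiver) pair.
    """
    background = {r: measure(None, r) for r in receivers}
    signals = {}
    for e in emitters:          # one emitter active at a time
        for r in receivers:     # every receiver read per activation
            signals[(e, r)] = measure(e, r) - background[r]
    return signals
```

Subtracting the background reading per receiver cancels sunlight and other ambient IR sources, leaving only the reflection caused by the object.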
- the apparatus comprises an input device having a primary surface in an X,Y plane.
- a plurality of capacitive means operable to generate a plurality of electric fields, wherein at least two of said capacitive means are positioned on the X,Y plane of said surface.
- At least one infrared transmitter means and at least two infrared receiver means are positioned on said surface and are configured to issue IR light beams, primarily orthogonally out from the said surface, and receive IR light beams caused by the reflection from the object, for example a finger of a user, primarily above and orthogonally to the X,Y plane of said surface, wherein the method comprises the steps of:
- An aspect of the invention is an omnidirectional touchpad, integrated into an apparatus, enabled to detect user-given commands and to determine if the user is making a right or left swipe gesture independent of where the user is positioned relative to the touchpad.
- the omnidirectional touchpad is configured with means to detect the proximity of an object, e.g. a user finger, and is configured with means to detect the touch pressure by said object onto the surface of the omnidirectional touchpad, this characterized by:
- a first X,Y position of the object is determined as sensed by the proximity means, and validated accordingly,
- a second X,Y position of the object is determined as sensed by the touch means and based on the values of the first X,Y position,
- a third X,Y position of the object is determined as sensed by the touch means, and validated accordingly,
- a resulting X,Y is calculated based on the second X,Y position/value and the third X,Y position/value
- a further aspect of the invention is:
- An even further aspect of the invention is an omnidirectional touchpad, integrated into an apparatus, enabled to detect user-given commands and to determine if the user is making a right or left swipe gesture independent of where the user is positioned relative to the touchpad.
- the omnidirectional touchpad is configured with means to detect the proximity of an object, e.g. a user finger or hand, and is configured with means to detect the touch/pressure by said object onto the surface of the omnidirectional touchpad, this characterized by:
- a first X,Y position of the object is determined as sensed by the proximity means, and validated accordingly,
- a second, third and subsequent X,Y positions of the object are determined as sensed by the touch means and validated accordingly, until no further user interaction is detected by the touch means,
- a second, third and subsequent X,Y positions of the object are determined as sensed by the proximity means and validated accordingly, until no further user interaction is detected by the proximity means,
- a resulting dominant X,Y touch position or a resulting X,Y touch swipe vector is calculated relative to the fixed orthogonal X,Y coordinate system, based on the sequence of detected touch X,Y values,
- a resulting dominant X,Y proximity position of the user's hand or finger is calculated relative to the fixed orthogonal X,Y coordinate system, based on the sequence of detected proximity X,Y values,
- a corrected dominant X,Y touch position or a corrected X,Y swipe vector is calculated relative to an orthogonal X,Y coordinate system, rotated towards the user,
- a command corresponding to the resulting corrected dominant X,Y touch position or the resulting corrected X,Y touch vector is interpreted by the apparatus and executed accordingly, and
- one or more of the capacitive means are divided into two or more segments, which are individually receptive to user input commands, and wherein the method includes the step of determining at which segment said user input command is detected.
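The "rotated towards the user" correction step can be sketched as a plane rotation. The angle convention here is an assumption of this sketch, not taken from the patent: the user-direction is rotated onto the negative Y axis so the user ends up "in front of" the corrected system:

```python
import math

def correct_swipe(swipe_vec, user_angle):
    """Rotate a swipe vector from the pad's fixed X,Y system into a
    system turned towards the user, so left/right can be read off the
    sign of the corrected X component.

    `user_angle` is the angle (radians) of the user-direction in the
    fixed system; the rotation places the user on the negative Y axis
    of the corrected system.
    """
    theta = -math.pi / 2 - user_angle   # maps user_angle onto -pi/2
    c, s = math.cos(theta), math.sin(theta)
    x, y = swipe_vec
    return (x * c - y * s, x * s + y * c)
```

A positive corrected X component is then a right-swipe from the user's perspective, regardless of where the user stands around the pad.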
- said input device comprises a substantially planar body being integrated into an apparatus, e.g. a media player, or alternatively the means are configured as a standalone product, e.g. a remote controller, smartphone, tablet or the like.
- the touch commands may be input on different adjacent surfaces of a three-dimensional object; such a system allows for a greater combination or arrangement of acceptable input commands.
- At least one of said infrared means is operable to generate an infrared light beam field substantially in front of said primary surface, wherein the method comprises the step of detecting that an object is moved into said at least one infrared light beam field and/or moved out of said at least one infrared light beam field or moved within said at least one infrared light beam field, wherein said step of generating a control command is based in part on said detection.
- at least one of said infrared means is operable to detect an infrared light beam field substantially in front of said primary surface.
- object may refer to an object or token held by a user, for example a pointer device, or may also refer to at least a portion of the user's body detected by the system, e.g. a finger.
- As detection can be made based on the movement of an object relative to a proximity field, this allows commands to be entered by the user without touching the input device.
- different gestures may be interpretable as different user commands.
- different commands may be generated based on how close the user/object is to the input device, e.g. a display image may be adjusted based on the user's proximity to the display.
- the method comprises the step of detecting that an object touches a touch sensitive field, and generating a related action.
- a touch-based input apparatus comprising:
- said control unit is operable to detect a user command in the form of a touch command or a non-touch command by a gesture remote from the touchpad surface and to generate a control command based on the user command detected.
- one or more of said capacitive means are divided into a plurality of segments individually receptive to user input commands, and wherein the said control unit is operable to determine at which segment said user input command is detected.
- a resistive based or other touch system may be applied to the capacitive means.
- said apparatus is configured with a substantially planar body selected from one of the following materials: a glass panel, a plastic panel, or a combination of the two materials.
- said control unit is operable to detect a touch command applied directly to the surface of said substantially planar body.
- two or more of the capacitive means are positioned in the same X,Y plane, each disposed along a line and mutually in parallel along the X-axis or along the Y-axis; or alternatively arranged within two or more concentric circles.
- two or more of the infrared means are positioned in the same X,Y plane, each disposed along a line arranged within two or more concentric circles.
- one or more of the infrared means are divided into two or more segments, which are individually receptive to active input signals.
- one or more of the infrared means are configured in one or more pairs, a pair including at least one IR sender and one IR receiver and/or one IR transceiver.
- the infrared emitting means and reception means detect user-supplied control commands issued in a remote field at a distance from the apparatus which is within the defined proximity field distance.
- the invention comprises use of any suitable capacitive sensor technology, e.g. surface capacitance, projected capacitive touch, etc.
- the invention operates with a number of functional properties:
- the invention operates with a number of control commands executed in the apparatus, control commands related to the detected user-supplied commands, and examples are, but not limited to:
- the surface of the device does not have an X-Y orientation as such; thus, the L/R commands given above are relative to the user's position in front of the device to be controlled, and with the user's finger at any position along the outer perimeter of the top surface of the device.
- the device type of the invention is defined to be an Omnidirectional Touchpad.
- An omnidirectional touchpad, integrated into an apparatus, is enabled to detect user-given commands and to determine if the user is making a right or left swipe gesture independent of where the user is positioned relative to the touchpad; the omnidirectional touchpad is configured with means to detect the proximity of an object, e.g. a user finger, and is configured with means to detect the touch pressure by said object onto the surface of the omnidirectional touchpad, this characterized by:
- a second X,Y position of the object is determined as sensed by the touch means and based on the values of the first X,Y position.
- a resulting X,Y is calculated based on the second X,Y position/value and the third X,Y position/value.
- a command corresponding to the resulting X,Y is interpreted by the apparatus and executed accordingly.
- the object moves across the surface of the touchpad along a path being linear or curved and performed like a swipe from one X,Y position to another X,Y position, and
- a start vector is initialized and oriented from one point P in a detected proximity X,Y position to another point T in a detected touch X,Y position, and
- a first movement vector is initialized and oriented from one point T in a detected touch X,Y position to another point T in another detected touch X,Y position, and
- a second movement vector is initialized and oriented from one point P in a detected proximity X,Y position to another point P in another detected proximity X,Y position, and
- the first movement vector is substantially parallel with the second movement vector.
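The "substantially parallel" condition between the touch-derived and proximity-derived movement vectors can be checked against a tolerance angle. The 15-degree default is an illustrative assumption, not a value from the patent:

```python
import math

def substantially_parallel(v1, v2, tol_deg=15.0):
    """True if two movement vectors point the same way within
    `tol_deg` degrees, e.g. a vector between successive touch
    positions T versus one between successive proximity positions P.
    Anti-parallel vectors fail, since both traces should move the
    same way during one swipe."""
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        return False                      # no direction to compare
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_a)) <= tol_deg
```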
- ASPECT 5. An omnidirectional touchpad according to any aspect above, where the proximity means are organized on or along an outer perimeter of the omnidirectional touchpad.
- ASPECT 6. An omnidirectional touchpad according to aspect 5, where the touch means are organized on the surface of the omnidirectional touchpad, and the touch means surrounded by the proximity means.
- ASPECT 7. An omnidirectional touchpad according to aspect 6, where the touch means are based on capacitive means, or resistive means, or a combination of the two.
- ASPECT 8. An omnidirectional touchpad according to aspect 7, where the proximity means are based on capacitive means, or light means, infrared or laser, or a combination of the two.
- ASPECT 9. An omnidirectional touchpad according to aspect 8, where proximity detection is implemented by one or more light emitters and a number of light receivers, and said receivers detect if an object is in proximity.
- ASPECT 10. An omnidirectional touchpad according to aspect 9, where one emitter is active at a time and thus the electronic control circuit gets a separate set of proximity signals from each receiver for every subsequent emitter activation.
- ASPECT 11. An omnidirectional touchpad according to aspect 10, where the emitters and receivers closest to the object give the highest signal.
- ASPECT 12. An omnidirectional touchpad according to aspect 11, where the touch area is:
- each conductive pad is connected to the input of a capacitance to digital converter (CDC), and the digital signals are fed into a microprocessor (µP).
- ASPECT 13. An omnidirectional touchpad according to aspect 12, where the sensing means, including the touch area and the proximity detectors, are scanned at a relatively high rate (50-100 Hz) and all data is continuously processed by the µP.
- Figure 2a shows the layout of the touchpad, and Figure 2b illustrates definitions of directions
- Figure 3 shows a block diagram of the electronics in the detection means
- Figures 4 and 5 show the layout of the touchpad and reflection caused by an object
- Figure 6 shows an alternative layout of the detection means
- Figure 7 displays principles of command detection
- the omnidirectional touchpad is primarily intended to be positioned in the horizontal plane.
- the omnidirectional touchpad can detect whether the user is making a right or left swipe gesture as seen from the user's own perspective, independent of where the user is positioned relative to the touch pad (see Figure 1).
- the omnidirectional touchpad therefore does not require any print of logos to indicate a certain swipe direction or touch area. This is a great advantage as compared to user interfaces where the user either has to perform the swiping action from a certain location or has to adjust the direction relative to the orientation of the user interface of the touchpad, which could possibly be upside-down for the user from the specific user position. Also, this eliminates constraints, especially for circular touchpads. If the omnidirectional touchpad is mounted vertically (e.g. on a wall), this property allows for simple mounting without the need for ensuring a certain orientation.
- the omnidirectional touchpad is realized by a combination of a touch area and a number of proximity detectors placed around the perimeter of the touch area as shown in Figure 2a.
- the number of IR emitters and receivers in the illustrated case is three, but a different number of emitters and receiver pairs is possible, such as two or four of each.
- One possible implementation of proximity detection is by means of IR technology.
- an IR proximity detection system consists of one IR emitter and one IR receiver. If a hand or finger is in proximity, the emitted IR light is reflected and the IR receiver can detect this reflection. The closer the hand is, the higher the reflection.
- background IR radiation will be present, for example due to sunlight or artificial IR sources in the room. Therefore, the proximity detection method needs to cancel out the contribution of such background IR radiation.
- the emitters and receivers closest to the hand or finger will give the highest signal.
- the µP can calculate from which side the hand is approaching.
- given receiver signals S1, S2, S3, ..., Sn, the criteria for a proximity sensing related to an X,Y position is:
- Figure 2b illustrates further details of a proximity detection system.
- the term "background" is used instead of the term "ambient", and the term "total" is used as a substitution for the term "ambient+reflected".
- the proximity detection system based on the IR emitter and IR receiver means is used to determine the position of the touching object (finger or hand) relative to the touch swipe or touch position on the omnidirectional touchpad.
- the proximity detection system continuously and repeatedly calculates a dominant X,Y position of the touching object, typically a finger or hand of the user.
- the procedure for calculating the dominant X,Y can be as follows.
- one position vector is found, representing the dominant position of the object (hand or finger), relative to the orthogonal X,Y coordinate system.
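One plausible way to compute such a position vector, assuming each perimeter sensor sits at a known angle and delivers a background-corrected signal (the exact weighting scheme is an assumption of this sketch, not the patent's prescription), is to sum the unit vectors towards the sensors weighted by their signals:

```python
import math

def dominant_direction(sensor_angles, signals):
    """Sum unit vectors towards each perimeter sensor, weighted by
    its background-corrected signal strength; the resultant vector
    points towards the dominant position of the hand or finger in
    the pad's fixed orthogonal X,Y coordinate system."""
    x = sum(s * math.cos(a) for a, s in zip(sensor_angles, signals))
    y = sum(s * math.sin(a) for a, s in zip(sensor_angles, signals))
    return (x, y)
```

Because the emitters and receivers closest to the hand give the highest signal, the resultant naturally leans towards the side the user is on.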
- An alternative technology for the proximity detection could be capacitive proximity technology.
- capacitive proximity technology is very sensitive to common mode noise entering the product (through e.g. the mains cord). If the common mode noise is low, capacitive proximity may work, but in many cases the common mode noise will interfere with the proximity signal, causing the detection to be unreliable.
- IR proximity is not sensitive to common mode noise and therefore IR proximity is in some cases preferred for a reliable detection.
- Suitable proximity detection technologies without this drawback are also, for example: ultrasonic sound or RF/radar.
- the touch area is implemented with known capacitive technology.
- Other suitable touch detection technologies are: resistive touch, Force-Sensing Resistor touch, acoustical touch (e.g. Surface Acoustic Wave), strain gauge, etc.
- both the touch area and the proximity detectors are scanned at a relatively high rate (50-100 Hz) and all data is continuously processed by the µP.
- the line of the swipe is calculated from the touch data, and the proximity data is used to calculate the position of the user.
- Other gestures, such as single tap, double tap etc., can also be detected.
- if the user is tapping off-center, it is possible to detect in which position (seen from the user's perspective) the tap is applied.
- Figure 3 further shows a block diagram of the omnidirectional touchpad circuit in an apparatus equipped with means for generating electric signals and fields used for detecting control commands issued by a user.
- the means are a combination of IR signal generators and IR detectors and electric fields generated via capacitive controlled signal generators.
- the IR system is used for detecting the presence of an object and/or movement in a remote field.
- a pulse-based IR proximity detection method is used here.
- An implementation can be based on a standard chip, e.g. the Si114x from Silicon Labs.
- a number of conductive pads are placed below the surface, as shown in Figure 3.
- Each conductive pad is connected to the input of a capacitance to digital converter (CDC) and the digital signals are fed into a microprocessor (µP).
- the "capacitive touch system" is based on conducting areas or conducting strips being applied to the printed circuit board (PCB) or other carrier, which is hidden behind the front plate of the apparatus, which may be a display screen, or a plate of glass, plastic or similar.
- the conducting strips can be made of copper, carbon or other conducting material, which is applied or vaporized on the PCB. Depending on the functional demands to a given apparatus, two or more conducting areas or strips are applied.
- the touch area is divided into a number of segments, each representing a touch sensitive area.
- the system may detect that the user touches one or more areas at the same time, and may detect a movement, like a sweep done by an object/finger across the touch sensitive surface.
- Figure 3 shows an apparatus having a touch field divided into 12 fields, each e.g. corresponding to a specific function, which is activated by the user's touch/swipe of the respective fields.
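Mapping a touch position to one of the 12 fields of a circular pad can be sketched as a simple angular lookup; the segment numbering convention here is an assumption for illustration, not taken from the figure:

```python
import math

def segment_index(x, y, n_segments=12):
    """Map a touch position (x, y) on a circular pad, origin at the
    centre, to one of `n_segments` equal angular fields, numbered
    0..n-1 counter-clockwise starting at the positive X axis."""
    angle = math.atan2(y, x) % (2.0 * math.pi)
    return int(angle // (2.0 * math.pi / n_segments)) % n_segments
```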
- the criteria for a capacitive sensing related to X,Y position is:
- Figure 4 displays how an object, e.g. a finger of a user, is detected by the proximity means at a given distance from the surface of the device to the finger.
- the user's physical touch on the surface is the trigger for a device command accordingly.
- Figure 5 displays how an object, e.g. a finger of a user, is detected by the proximity means at a given distance from the surface of the device to the finger.
- the object reflects an emitter light beam and a light sensor accordingly detects the presence of the object at this given position.
- Touchpad with resistive matrix means or capacitive means, and proximity with light means emitting from the edge of the surface of the device (Figure 6c).
- Light emission and detection from the edge of the device are, optionally, used for detection of the user's position relative to the touchpad. Accordingly, the gesture, for example a swipe action, by the user can be correctly interpreted by the apparatus with respect to a correct direction. For the latter, the detected gesture is rotated into a direction that matches the calculated location of the user relative to the touchpad. This is one way of interpreting the correct direction of the gesture, for example a swiping action.
- Figure 7 displays one embodiment on how the device detects and interprets a user given command.
- An object, e.g. the finger of a user
- P X,Y position
- T X,Y position
- the X,Y positions are relative to a fixed coordinate system with origin at the center of the device surface, or alternatively relative to a floating coordinate system with the origin created at the point of the detected proximity position.
- with origin in P (71), a start vector (75) connects P to T, the vector being substantially orthogonal to the X axis at the detected point P.
- Movements made by the finger, and detected by the proximity means and the touch means, define the movement vector (76, 77). The movement will typically be substantially along the X-axis, with predefined acceptance limit(s) on the related values along the Y-axis.
- the accept angles (v1, v2, v3, v4) define the tolerance of X,Y values within which detected touch positions and proximity positions are validated as legal values, applied in the evaluation of the X,Y values of P and T corresponding to a specific functional command.
- one or more intermediate sensor values are detected and applied in determining the executed path and the resulting X,Y positions of P and T.
- the resulting position X,Y of the object, e.g. the finger touching the surface from a given user position, is calculated from the X,Y position of the touch and the X,Y position as detected by the proximity means.
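One illustrative way of combining the two positions (an assumption for illustration, not the patent's exact formula): the vector from the proximity position P to the touch position T points away from the user, so its angle gives the bearing of the touch relative to the user.

```python
import math

def user_relative_bearing(p, t):
    """Angle in degrees of the touch T as seen from the proximity position P,
    both given as (x, y) tuples in the pad's coordinate system."""
    return math.degrees(math.atan2(t[1] - p[1], t[0] - p[0]))

# A user approaching from below the pad (negative Y) and touching the centre:
# the touch lies straight ahead, at a bearing of 90 degrees.
bearing = user_relative_bearing((0.0, -5.0), (0.0, 0.0))
```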
- a legal activation includes the user executing a single touch onto the surface at "a single point", without moving the finger to another touch position; a press performed with the finger at the same point, for a short or long period as applicable, may follow the touch.
- the touch and proximity concept as disclosed enables the user to activate commands on the user interface; such commands are interpreted as "Left to Right" or "Right to Left" relative to the user, with the user at any position along or around the border of the device being controlled.
- Figure 7 illustrates that commands executed at the lower half (80), or along the middle (90), or at the upper half (100) of a circular device are all interpreted equally as being from Left to Right. The same applies to commands operated from Right to Left.
- the method likewise covers operations performed by the user (110) along the complete perimeter of the device; see also Figure 1.
- Typical applications are support of operation of multimedia apparatuses, AV systems, media players, remote controllers and similar equipment.
- the invention has properties making it suitable for mounting on printed circuit boards and improving the quality of the detection system, such that user-supplied commands may be reliably and unambiguously detected, interpreted and executed as related functions in a given apparatus.
- Figure 8 further illustrates the proximity detection system when combined with the touch detection system, for example a capacitive touch detection system.
- the dominant position found with the proximity detection system will in the vast majority of cases be closer to the user than the positions touched on the touch detection system. This enables correct detection of the touch positions and/or touch swipes, regardless of the user's position and orientation relative to the omnidirectional touchpad.
- the user (110) operates a touch swipe T (200) from Left to Right over the surface of the touchpad, parallel to the X-axis.
- the measured averaged dominant position of the object (hand or finger) making the swipe is marked with P (201).
- with the touch swipe being predominantly in the X direction, the averaged dominant position of the object is evaluated relative to the Y axis to determine the position of the user. In most cases it is sufficient to detect whether the averaged dominant position of the object has a positive or negative Y coordinate.
- the Y-position of the averaged dominant position can be compared with the average Y-position of the touch swipe.
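The comparison above can be sketched as follows; this is a minimal illustration with assumed names, not the patent's implementation: the swipe is read as Left-to-Right or Right-to-Left depending on which side of the swipe the averaged dominant position P of the hand lies.

```python
def interpret_x_swipe(swipe_points, p):
    """swipe_points: (x, y) touch samples of a predominantly X-directed swipe;
    p: averaged dominant proximity position (x, y) of the hand."""
    dx = swipe_points[-1][0] - swipe_points[0][0]             # pad-frame X motion
    avg_y = sum(y for _, y in swipe_points) / len(swipe_points)
    user_below = p[1] < avg_y                                 # hand on the negative-Y side
    # For a user below the pad, increasing X reads as Left to Right;
    # for a user above the pad, the sense is mirrored.
    return "Left to Right" if (dx > 0) == user_below else "Right to Left"

swipe = [(-1.0, 0.1), (0.0, 0.0), (1.0, -0.1)]
print(interpret_x_swipe(swipe, (0.0, -4.0)))   # Left to Right (user below the pad)
print(interpret_x_swipe(swipe, (0.0, 4.0)))    # Right to Left (same swipe, user above)
```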
- the touch swipe will be tilted compared to the X-axis, even with the user aligned with the Y axis. This may be due to the omnidirectional touchpad being operated by either the user's left hand or right hand. In the middle drawing of Figure 8, the touchpad is operated from the left with the user's left hand.
- the touch swipe T (202) is tilted clockwise relative to the X axis.
- the measured averaged dominant position of the object (hand or finger) making the swipe is marked with P (203). With the touch swipe still being predominantly in the X direction, the averaged dominant position of the object is evaluated relative to the Y axis to determine the position of the user.
- the touchpad is operated from the right with the user's right hand.
- the touch swipe T (204) is tilted counterclockwise relative to the X axis.
- the measured averaged dominant position of the object (hand or finger) making the swipe is marked with P (205). Also in this case the same detection principle can be used.
- if the tilt angle of the touch swipe exceeds +45 or -45 degrees relative to the X-axis, the touch swipe will be predominantly in the Y direction. In that case, the averaged dominant position of the object must be evaluated relative to the X axis to determine the position of the user. Again, in most cases it will then be sufficient to detect whether the averaged dominant position of the object has a positive or negative X coordinate.
- the X-position of the averaged dominant position can be compared with the average X-position of the touch swipe.
- a tilt angle of the touch swipe exceeding +45 or -45 degrees relative to the X-axis will be consistent with a different position of the user, for example position 210 rather than position 110 indicated in Figure 8, so that the method for determining the position of the user remains valid.
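The axis-selection logic above can be sketched as follows; the function names and thresholds are illustrative assumptions: when the swipe is predominantly Y-directed (tilt beyond ±45 degrees), the dominant proximity position is evaluated against the X axis instead of the Y axis.

```python
def dominant_axis(swipe_points):
    """Return 'x' if the swipe is predominantly X-directed (tilt within
    +/-45 degrees of the X axis), else 'y'."""
    dx = swipe_points[-1][0] - swipe_points[0][0]
    dy = swipe_points[-1][1] - swipe_points[0][1]
    return "x" if abs(dx) >= abs(dy) else "y"

def user_side(swipe_points, p):
    """Side of the pad the user is on, evaluated on the axis perpendicular
    to the swipe's dominant direction; p is the averaged dominant position."""
    if dominant_axis(swipe_points) == "x":
        return "negative-Y" if p[1] < 0 else "positive-Y"
    return "negative-X" if p[0] < 0 else "positive-X"

swipe = [(0.2, -1.0), (0.0, 0.0), (-0.2, 1.0)]   # predominantly Y-directed
print(dominant_axis(swipe))            # y
print(user_side(swipe, (-3.0, 0.0)))   # negative-X
```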
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DKPA201500422 | 2015-07-20 | ||
PCT/EP2016/067335 WO2017013186A1 (en) | 2015-07-20 | 2016-07-20 | Apparatus and method for detecting gestures on a touchpad |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3326052A1 true EP3326052A1 (en) | 2018-05-30 |
Family
ID=56464223
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16741039.8A Withdrawn EP3326052A1 (en) | 2015-07-20 | 2016-07-20 | Apparatus and method for detecting gestures on a touchpad |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3326052A1 (en) |
CN (1) | CN107850969B (en) |
WO (1) | WO2017013186A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7442940B2 (en) | 2020-07-07 | 2024-03-05 | アルプスアルパイン株式会社 | Proximity detection device |
CN113190164A (en) * | 2021-05-14 | 2021-07-30 | 歌尔股份有限公司 | Operation method, system and equipment of equipment |
CN115856912B (en) * | 2023-02-06 | 2023-05-30 | 宜科(天津)电子有限公司 | Data processing system for detecting movement direction of object |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8339379B2 (en) * | 2004-04-29 | 2012-12-25 | Neonode Inc. | Light-based touch screen |
US7519223B2 (en) * | 2004-06-28 | 2009-04-14 | Microsoft Corporation | Recognizing gestures and using gestures for interacting with software applications |
US8558161B2 (en) * | 2010-08-10 | 2013-10-15 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Lens having multiple conic sections for LEDs and proximity sensors |
WO2013056157A1 (en) * | 2011-10-13 | 2013-04-18 | Autodesk, Inc. | Proximity-aware multi-touch tabletop |
US9223340B2 (en) * | 2013-08-14 | 2015-12-29 | Lenovo (Singapore) Pte. Ltd. | Organizing display data on a multiuser display |
2016
- 2016-07-20 WO PCT/EP2016/067335 patent/WO2017013186A1/en active Application Filing
- 2016-07-20 EP EP16741039.8A patent/EP3326052A1/en not_active Withdrawn
- 2016-07-20 CN CN201680042769.7A patent/CN107850969B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN107850969A (en) | 2018-03-27 |
CN107850969B (en) | 2021-06-08 |
WO2017013186A1 (en) | 2017-01-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: REQUEST FOR EXAMINATION WAS MADE |
20180201 | 17P | Request for examination filed | |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: EXAMINATION IS IN PROGRESS |
20190909 | 17Q | First examination report despatched | |
| RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: BANG & OLUFSEN A/S |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
20220104 | 18D | Application deemed to be withdrawn | |