
WO2010078996A2 - Device with an input device for inputting control commands - Google Patents

Device with an input device for inputting control commands

Info

Publication number
WO2010078996A2
Authority
WO
WIPO (PCT)
Prior art keywords
light
sensor
light source
input
input member
Prior art date
Application number
PCT/EP2009/065734
Other languages
German (de)
English (en)
Other versions
WO2010078996A3 (fr)
Inventor
Robert ECKMÜLLER
Martin Paulus
Original Assignee
Continental Automotive Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive Gmbh filed Critical Continental Automotive Gmbh
Priority to CN2009801510598A, published as CN102257461A (zh)
Priority to EP09784066A, published as EP2380077A2 (fr)
Publication of WO2010078996A2 (fr)
Publication of WO2010078996A3 (fr)

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by opto-electronic means
    • G06F 3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, by opto-electronic means, by interrupting or reflecting a light beam, e.g. optical touch-screen

Definitions

  • The invention relates to a device with an input device for inputting control commands.
  • Such devices are known in the prior art, in which the position of an input member can be detected and a control command can be generated from that position; the input device has at least one light source for emitting light beams and at least one sensor for receiving the emitted light beams.
  • In known devices, such light sources and sensors are arranged above a display in such a way that an input member, for example a finger of the operator or a stylus, touches the display and thereby interrupts one or more light beams running across the surface of the display, so that a control command can be generated.
  • A disadvantage of this arrangement is that the display is touched and therefore becomes soiled, particularly when it is operated with the fingers.
  • The object of the invention is therefore to provide a device with an input device in which a user interface, for example a display, is not soiled.
  • This object is achieved in that at least one light source and at least one sensor are arranged such that the light of the at least one light source is reflected by the input member to the at least one sensor, and in that the input device comprises an evaluation unit which, from the difference between the light emitted by the light source and the light received by the sensor, determines the respective total distance travelled by the light from the respective light source via the input member to the respective sensor, or a quantity proportional to that total distance.
  • This ensures that the input member can be detected even at a certain distance from the device, without the input member touching the device and, in particular, without it touching any user interface or display that may be present.
  • The distance between the input member and the light source and/or the sensor can then be determined from the total distance or distances that the light has travelled from the respective light source via the input member to the respective sensor.
  • Three light sources emit their light beams, which are reflected by the input member and received by the sensor. This results in three total distances: the first total distance from the first light source via the input member to the sensor, the second total distance from the second light source via the input member to the sensor, and the third total distance from the third light source via the input member to the sensor.
  • The structure is particularly simple when the light sources emit their light alternately, in particular modulated light, so that the total distances can be determined one after the other.
  • The sequence of individual measurements takes place at such short time intervals that the input member changes its spatial position only insignificantly, even when it is moving.
  • Since the distance from the input member to the sensor is the same for all three total distances, the distances from the input member to the individual light sources can be determined from the total distances, and the spatial position of the input member can thus be determined exactly (a numerical position-solving sketch follows at the end of this description).
  • The construction described above can also be modified by swapping the roles of the light sources and the sensor, so that, for example, a single light source is present whose light beams are reflected by the input member and received by three sensors. This structure is, however, likely to be more expensive.
  • The position of the input member can also be determined within a single area only, the surface to be monitored being defined by confining the light beams to the corresponding plane. This surface may, for example, be at right angles to the device.
  • The determination of the respective total distance is particularly simple when the light source or light sources emit modulated light, since the respective total distance can then be determined from the phase shift between the emitted light and the light received by the sensor or sensors; the respective phase shift is proportional to the total distance (a phase-to-distance sketch follows at the end of this description). If the position determination is repeated within a limited period of time, the movement curve of the input member can be determined.
  • The determined movement curve of the input member can be compared with stored movement curves by means of a comparison device; if a movement curve matches, the control process associated with that curve is triggered (a simple curve-matching sketch follows at the end of this description).
  • This comparison device can be arranged within the evaluation unit or designed as a separate component.
  • When the light sources and the sensor are arranged behind a cover, the device can be given an attractive design.
  • The cover must not attenuate the light beams too strongly, so that the sensor or sensors can still receive a sufficient amount of light.
  • If the cover is configured as a liquid crystal display, it is possible to indicate to the operator which control commands can currently be given. For example, a menu structure can be displayed on the screen and a submenu can be selected.
  • A liquid crystal cell transmits infrared light well enough that the sensor or sensors receive sufficient light for evaluation.
  • FIG. 1 shows an exemplary embodiment of a device according to the invention with an input device with three light sources and one sensor,
  • FIG. 2 shows a second embodiment of a device according to the invention with two light sources and a sensor.
  • FIG. 1 shows a device 1 with a display 2.
  • The device furthermore has light sources 3, 4, 5, a sensor 6 and an evaluation unit 7.
  • FIG. 1 also shows an input member O and the distances a, b, c, d.
  • The light sources 3, 4, 5 emit light, which is reflected by the input member O to the sensor 6.
  • The path of the light from the individual light sources 3, 4, 5 to the sensor 6 is represented by the distances a, b, c and d.
  • The total distance A, which the light travels from the light source 3 via the input member O to the sensor 6, consists of the distances a and d; the total distance B, corresponding to the path from the light source 4 via the input member O to the sensor 6, consists of the distances b and d; and the total distance C, corresponding to the path from the light source 5 via the input member O to the sensor 6, consists of the distances c and d.
  • The light sources 3, 4, 5 successively emit modulated light, which is reflected by the input member O to the sensor 6.
  • The evaluation unit 7 is connected to the light sources 3, 4, 5 and to the sensor 6 and detects the phase shift between the light leaving the light sources 3, 4, 5 and the light arriving at the sensor 6. This phase shift is proportional to the total distance A, B or C, respectively, travelled by the light.
  • In this way, a movement curve can be recorded and compared with a movement curve stored in the evaluation unit 7 or in another component.
  • If the curves match, a control command assigned to this movement curve can be executed.
  • Selectable operating functions or operating menus can be displayed on the display 2; depending on the detected position of the input member O or a detected movement curve of the input member O, these operating menus change to a submenu or a higher-level menu, or a control function corresponding to the position or movement curve is executed.
  • The device 1 in FIG. 2 has a display 2, light sources 3 and 4, a sensor 6 and an evaluation unit 7. An input member O is again shown, which may be, for example, a finger of an operator.
  • The light sources 3, 4 are designed such that they can emit light only in a plane spanned by the distances a and b. In this way, an unambiguous position determination of the input member O can be achieved within this plane (an in-plane position sketch follows at the end of this description). It is also possible to design the sensor 6 so that it can receive light only from this plane.
  • A cover may be arranged in front of the light sources and the sensor.
  • The light sources 3 to 5 and the sensor 6 can also be arranged behind the display 2 or behind a cover, so that a harmonious overall appearance results.
  • The cover may, for example, bear symbols or characters and thus indicate to the operator the operating functions that can be executed.
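
The exact position determination from three total distances, referred to above, can be illustrated with a small numerical sketch. Each total distance constrains the input member to an ellipsoid whose foci are the respective light source and the sensor, and the input member lies where the three ellipsoids intersect. The coordinates of the light sources and the sensor below are purely illustrative assumptions, not values from this application, and the least-squares solve is only one of several ways to compute the intersection.

```python
# Minimal sketch: recover the input member position O from three measured
# total path lengths A, B, C (light source -> O -> sensor), assuming purely
# illustrative coordinates for the light sources and the sensor.
import numpy as np
from scipy.optimize import least_squares

sources = np.array([[0.00, 0.00, 0.0],   # light source 3 (assumed position, metres)
                    [0.10, 0.00, 0.0],   # light source 4
                    [0.05, 0.10, 0.0]])  # light source 5
sensor = np.array([0.05, 0.05, 0.0])     # sensor 6 (assumed position)

def residuals(o, totals):
    """Difference between modelled and measured total path lengths."""
    o = np.asarray(o)
    modelled = np.linalg.norm(sources - o, axis=1) + np.linalg.norm(o - sensor)
    return modelled - totals

# Simulate the three measurements for a known test position, then solve for it again.
o_true = np.array([0.03, 0.04, 0.08])
totals = residuals(o_true, np.zeros(3))            # modelled path lengths A, B, C
fit = least_squares(residuals, x0=[0.0, 0.0, 0.05], args=(totals,))
print("recovered position:", fit.x)                # should be close to [0.03, 0.04, 0.08]
```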
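
The proportionality between phase shift and total distance can likewise be written out directly: for light modulated at a frequency f, a phase shift Δφ corresponds to a total path length of c·Δφ/(2πf), unambiguously only within one modulation period. The following sketch assumes an example modulation frequency of 20 MHz; it is a generic illustration of the relation, not a description of the hardware of this application.

```python
# Minimal sketch: convert a measured phase shift of the modulated light into
# the total path length (light source -> input member -> sensor).
# The 20 MHz modulation frequency is an assumed example value.
import math

C = 299_792_458.0          # speed of light in m/s
F_MOD = 20e6               # assumed modulation frequency in Hz

def total_distance_from_phase(delta_phi_rad: float) -> float:
    """Total path length for a given phase shift, valid within one modulation period."""
    wavelength = C / F_MOD                       # one full 2*pi of phase, about 15 m here
    return (delta_phi_rad / (2 * math.pi)) * wavelength

# Example: a phase shift of 0.1 rad corresponds to roughly 0.24 m of total path.
print(total_distance_from_phase(0.1))
```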
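
The comparison of a determined movement curve with stored movement curves can be carried out in many ways. One simple possibility, sketched below under the assumption that the curves are available as sequences of 3-D positions, is to resample both curves to the same length and select the stored curve with the smallest mean point-to-point distance. The template names and the threshold are illustrative assumptions, not taken from this application.

```python
# Minimal sketch: match a recorded movement curve of the input member against
# stored movement curves by resampling and mean point-to-point distance.
# The stored gestures, their names and the threshold are assumed examples.
import numpy as np

def resample(curve: np.ndarray, n: int = 32) -> np.ndarray:
    """Resample an (m, 3) point sequence to n points by linear interpolation."""
    t_old = np.linspace(0.0, 1.0, len(curve))
    t_new = np.linspace(0.0, 1.0, n)
    return np.stack([np.interp(t_new, t_old, curve[:, k]) for k in range(3)], axis=1)

def match_gesture(recorded, templates, threshold=0.05):
    """Return the name of the closest stored curve, or None if nothing is close enough."""
    r = resample(np.asarray(recorded))
    best_name, best_dist = None, np.inf
    for name, template in templates.items():
        d = np.mean(np.linalg.norm(r - resample(np.asarray(template)), axis=1))
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist < threshold else None

# Example with two assumed templates: a left-to-right swipe and a downward swipe.
templates = {
    "swipe_right": np.column_stack([np.linspace(0, 0.1, 20), np.zeros(20), np.zeros(20)]),
    "swipe_down":  np.column_stack([np.zeros(20), np.linspace(0, -0.1, 20), np.zeros(20)]),
}
recorded = np.column_stack([np.linspace(0, 0.09, 15), np.zeros(15), np.zeros(15)])
print(match_gesture(recorded, templates))   # -> "swipe_right"
```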
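
For the in-plane arrangement of FIG. 2, two total distances suffice: each one confines the input member O to an ellipse whose foci are the respective light source and the sensor 6, and O lies at the intersection of the two ellipses within the monitored plane. The 2-D coordinates below are again assumed example values; the two-equation numerical solve is just one way of computing that intersection.

```python
# Minimal sketch: in-plane position of the input member O from two total
# path lengths (FIG. 2 geometry), with assumed 2-D coordinates for the
# light sources 3 and 4 and the sensor 6.
import numpy as np
from scipy.optimize import fsolve

source3 = np.array([0.00, 0.00])
source4 = np.array([0.10, 0.00])
sensor6 = np.array([0.05, 0.00])

def equations(o, total_a, total_b):
    """Both modelled total path lengths must match the measured ones."""
    o = np.asarray(o)
    return [np.linalg.norm(source3 - o) + np.linalg.norm(o - sensor6) - total_a,
            np.linalg.norm(source4 - o) + np.linalg.norm(o - sensor6) - total_b]

# Simulate the two measurements for a known in-plane position, then solve.
o_true = np.array([0.03, 0.06])
total_a = np.linalg.norm(source3 - o_true) + np.linalg.norm(o_true - sensor6)
total_b = np.linalg.norm(source4 - o_true) + np.linalg.norm(o_true - sensor6)
print(fsolve(equations, x0=[0.05, 0.05], args=(total_a, total_b)))  # close to [0.03, 0.06]
```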

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention relates to a device with an input device for inputting control commands into the device. The position of an input member can be detected, and a control command can be generated on the basis of this position. The input member is not mechanically connected to the device. The input device has at least one light source for emitting light beams and at least one sensor for receiving the emitted light beams. The light source and the sensor are arranged such that the light from the light source can be reflected by the input member towards the sensor. The input device has an evaluation unit which, from the difference between the light emitted by the light source and the light received by the sensor, determines each of the total distances travelled by the light from the respective light source via the input member to the respective sensor, or a quantity proportional to the total distance travelled.
PCT/EP2009/065734 2008-12-18 2009-11-24 Appareil muni d'un dispositif de saisie pour saisir des instructions de commande WO2010078996A2 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN2009801510598A CN102257461A (zh) 2008-12-18 2009-11-24 具有用于输入控制指令的输入装置的设备
EP09784066A EP2380077A2 (fr) 2008-12-18 2009-11-24 Appareil muni d'un dispositif de saisie pour saisir des instructions de commande

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102008062715A DE102008062715A1 (de) 2008-12-18 2008-12-18 Gerät mit einer Eingabevorrichtung zur Eingabe von Steuerbefehlen
DE102008062715.1 2008-12-18

Publications (2)

Publication Number Publication Date
WO2010078996A2 (fr) 2010-07-15
WO2010078996A3 WO2010078996A3 (fr) 2011-01-27

Family

ID=42194048

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2009/065734 WO2010078996A2 (fr) 2008-12-18 2009-11-24 Appareil muni d'un dispositif de saisie pour saisir des instructions de commande

Country Status (4)

Country Link
EP (1) EP2380077A2 (fr)
CN (1) CN102257461A (fr)
DE (1) DE102008062715A1 (fr)
WO (1) WO2010078996A2 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102012022882A1 (de) 2012-11-23 2014-05-28 Heidelberger Druckmaschinen Ag Gestensteuerung für Druckmaschinen
DE202016002469U1 (de) 2016-04-18 2016-07-14 Kastriot Merlaku Augenschutz-System für Fernsehgeräte oder Bilderzeugungsvorrichtungen jeglicher Art

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2626769A1 (fr) * 2012-02-10 2013-08-14 Research In Motion Limited Procédé et dispositif pour la réception d'entrées à base de réflexion
CN104750317B (zh) * 2013-12-30 2017-10-17 北京壹人壹本信息科技有限公司 光学触控定位方法、装置及终端
CN107491227A (zh) * 2017-07-14 2017-12-19 北京汇冠触摸技术有限公司 一种通过光学测距实现的触摸识别装置及方法

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1470294A (en) * 1974-09-25 1977-04-14 Cetec Systems Ltd Optical digitising system
US5900863A (en) * 1995-03-16 1999-05-04 Kabushiki Kaisha Toshiba Method and apparatus for controlling computer without touching input device
DE19539955A1 (de) * 1995-10-26 1997-04-30 Sick Ag Optische Erfassungseinrichtung
US6313825B1 (en) * 1998-12-28 2001-11-06 Gateway, Inc. Virtual input device
IL136432A0 (en) * 2000-05-29 2001-06-14 Vkb Ltd Data input device
US20060001656A1 (en) * 2004-07-02 2006-01-05 Laviola Joseph J Jr Electronic ink system
US20060158424A1 (en) * 2005-01-19 2006-07-20 Tong Xie Optical slide pad
JP4502933B2 (ja) * 2005-10-26 2010-07-14 Nec液晶テクノロジー株式会社 バックライトユニット及び液晶表示装置
DE102007023290A1 (de) * 2007-05-16 2008-11-20 Volkswagen Ag Multifunktionsanzeige- und Bedienvorrichtung und Verfahren zum Betreiben einer Multifunktionsanzeige- und Bedienvorrichtung mit verbesserter Auswahlbedienung

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102012022882A1 (de) 2012-11-23 2014-05-28 Heidelberger Druckmaschinen Ag Gestensteuerung für Druckmaschinen
EP2735947A1 (fr) 2012-11-23 2014-05-28 Heidelberger Druckmaschinen AG Commande gestuelle pour imprimantes
US9898690B2 (en) 2012-11-23 2018-02-20 Heidelberger Druckmaschinen Ag Gesture control for printing presses
DE202016002469U1 (de) 2016-04-18 2016-07-14 Kastriot Merlaku Augenschutz-System für Fernsehgeräte oder Bilderzeugungsvorrichtungen jeglicher Art

Also Published As

Publication number Publication date
DE102008062715A1 (de) 2010-06-24
EP2380077A2 (fr) 2011-10-26
CN102257461A (zh) 2011-11-23
WO2010078996A3 (fr) 2011-01-27

Similar Documents

Publication Publication Date Title
EP2016480B1 (fr) Dispositif optoélectronique pour saisir la position et/ou le mouvement d'un objet et procédé associé
EP1657819B1 (fr) Commande de plaque de cuisson
EP3194197B1 (fr) Dispositif d'affichage et de commande, notamment pour un véhicule à moteur, élément de commande et véhicule à moteur
EP2462497B1 (fr) Procédé permettant de faire fonctionner un dispositif de commande et dispositif de commande dans un vehicule
DE102015210657A1 (de) Verfahren zur Erkennung einer Stellbewegung eines auf einer Anzeigefläche befindlichen Stellelementes in einem Kraftfahrzeug und Vorrichtung zur Durchführung des Verfahrens
DE102006037156A1 (de) Interaktive Bedienvorrichtung und Verfahren zum Betreiben der interaktiven Bedienvorrichtung
WO2010078996A2 (fr) Appareil muni d'un dispositif de saisie pour saisir des instructions de commande
EP1691486A1 (fr) Méthode de commande pour un dispositif électrique
DE102010041088A1 (de) Eingabeerfassungsvorrichtung sowie Verfahren zum Betreiben einer Eingabeerfassungsvorrichtung
DE102016108899A1 (de) Kombiniertes Ein- und Ausgabegerät und Verfahren zur Bedienung eines Ein- und Ausgabegerätes
EP1810405B1 (fr) Arrangement comprenant un plan de travail et une plaque en vitrocéramique arrangée dedans
EP2811318A1 (fr) Capteur optoélectronique
DE102015202459A1 (de) Verfahren und Vorrichtung zum Bedienen einer Nutzerschnittstelle in einem Fahrzeug
DE19918072A1 (de) Bedienverfahren und Bedienvorrichtung für einen bildschirmgesteuerten Prozeß
DE102014008296A1 (de) Schaltwippenvorrichtung, Lenkradvorrichtung mit der Schaltwippenvorrichtung sowie Steuerungsvorrichtung mit der Schaltwippen- oder Lenkradvorrichtung
DE3843454C1 (fr)
DE102006040572A1 (de) Vorrichtung zum Bedienen von Funktionen eines Gerätes
DE20117645U1 (de) Bedieneinrichtung
DE10359561B4 (de) Bedienelement für ein Haushaltsgerät
DE102016207611B4 (de) Anzeige- und Bediensystem
DE102019002884A1 (de) Bedienvorrichtung
DE102010026181A1 (de) Bedieneinheit für ein Fahrzeug
DE102014016020A1 (de) Steuerungsanordnung mit Lenkradbedienung
DE102018205445A1 (de) Optische Anzeigevorrichtung sowie Verfahren zur Gestenerkennung von Nutzern auf einer optischen Anzeigeeinrichtung
EP0802086A2 (fr) Dispositif de commande multifonctions

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980151059.8

Country of ref document: CN

121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 09784066

Country of ref document: EP

Kind code of ref document: A2

WWE Wipo information: entry into national phase

Ref document number: 2009784066

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE