
GB2498833A - Ultrasonic gesture recognition for vehicle - Google Patents

Ultrasonic gesture recognition for vehicle

Info

Publication number
GB2498833A
GB2498833A GB1221465.6A GB201221465A GB2498833A GB 2498833 A GB2498833 A GB 2498833A GB 201221465 A GB201221465 A GB 201221465A GB 2498833 A GB2498833 A GB 2498833A
Authority
GB
United Kingdom
Prior art keywords
text
vehicle
signal transit
stored
ultrasonic sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1221465.6A
Other versions
GB201221465D0 (en)
GB2498833B (en)
Inventor
Andreas Eisele
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of GB201221465D0 publication Critical patent/GB201221465D0/en
Publication of GB2498833A publication Critical patent/GB2498833A/en
Application granted granted Critical
Publication of GB2498833B publication Critical patent/GB2498833B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S7/539Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/50Systems of measurement, based on relative movement of the target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/50Systems of measurement, based on relative movement of the target
    • G01S15/52Discriminating between fixed and moving objects or between objects moving at different speeds
    • G01S15/523Discriminating between fixed and moving objects or between objects moving at different speeds for presence detection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/86Combinations of sonar systems with lidar systems; Combinations of sonar systems with systems not using wave reflection
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Traffic Control Systems (AREA)
  • Power-Operated Mechanisms For Wings (AREA)

Abstract

A vehicle carries one or more ultrasonic sensors, wherein the ultrasonic sensor emits and receives ultrasonic signals depending on at least one operating condition of the vehicle. A control unit calculates, from signal transit-times of reflected ultrasonic signals, distances from objects in the surround field of the vehicle, and compares changes of signal transit-times with stored changes of signal transit-times corresponding to specific user-defined gestures. If the measured and stored transit times correlate, at least one actuator is driven to open a car door such as the tail-gate (boot, trunk). Different sensors can open different doors. The user can be recognised by a video camera, and the system only operated when the driver is near the vehicle. A radio key may also be used.
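By way of illustration only (not part of the patent text), the overall flow summarised in the abstract can be sketched roughly as follows in Python; every name here (read_echo_transit_time, driver_is_nearby, open_tailgate, the sample period and tolerance) is a hypothetical placeholder, not an interface disclosed by the application:

    import time

    SPEED_OF_SOUND = 343.0  # m/s, approximate value in air at 20 degrees C

    def transit_time_to_distance(transit_time_s):
        # The echo travels to the object and back, so the path is halved.
        return SPEED_OF_SOUND * transit_time_s / 2.0

    def matches_stored_gesture(history, stored_pattern, tolerance=0.05):
        # Very simple concordance test: compare the most recent samples
        # against a stored distance pattern, sample by sample.
        if len(history) < len(stored_pattern):
            return False
        recent = history[-len(stored_pattern):]
        return all(abs(m - s) <= tolerance for m, s in zip(recent, stored_pattern))

    def gesture_loop(read_echo_transit_time, driver_is_nearby, open_tailgate,
                     stored_pattern, sample_period_s=0.05):
        # Poll one ultrasonic sensor and open the tailgate on a match.
        history = []
        while True:
            if driver_is_nearby():               # e.g. radio key detected
                t = read_echo_transit_time()     # seconds, from the sensor
                history.append(transit_time_to_distance(t))
                if matches_stored_gesture(history, stored_pattern):
                    open_tailgate()
                    history.clear()
            time.sleep(sample_period_s)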

Description

Title
Process for controlling at least one actuator on the basis of changes of signal transit-time of at least one ultrasonic sensor

The invention takes as its starting-point a process and an apparatus for an assistance system of a vehicle with at least one ultrasonic sensor, according to the precharacterising portion of the independent claims.
State of the Art

From DE 10 2008 025 668 A1 a process and an apparatus are known for operating a vehicle-closing device. Via various sensors of the vehicle the approach of potential vehicle occupants is established. When a potential vehicle occupant is present, a door, sliding door or hinged cover of the vehicle is opened if a response of the potential vehicle occupant is detected. Depending on the embodiment, a vehicle door is opened completely via an actuator or is only opened so far that it can be opened, for example, by an elbow of the potential vehicle occupant.
From DE 10 2004 005 225 A1 a process and an apparatus are known for warning a driver of the opening of a door. For the purpose of detecting obstacles in the opening region of a door, ultrasonic sensors can be utilised which are also already present for a parking-assistance system.
From DE 10 2006 001 666 A1 a process and an apparatus are known for the purpose of checking closing hinged covers, in particular automatically actuated boot lids of a motor vehicle. Obstacles in the opening region are registered via sensors operating capacitively. The closing of a hinged cover may be brought about via a gesture, whereby use may be made, inter alia, of ultrasonic sensors for the purpose of registration.
Disclosure of the Invention
Advantages of the Invention
The procedure according to the invention with the characterising features of the independent claims has, in contrast, the advantage that concordances are recognised by comparing changes of signal transit-time in relation to at least one ultrasonic sensor of a vehicle with at least one suitable change of signal transit-time, whereby advantageously a periodically fluctuating change of signal transit-time is assigned to a swaying back and forth of an object by way of gesture of a person located in the surround field of the vehicle, and whereby, depending on the gesture, at least one actuator is driven.
Further advantageous embodiments of the present invention are the subject-matter of the dependent claims.
A periodically fluctuating change of signal transit-time is, for example, a distance of an object within the detection range of at least one ultrasonic sensor of a vehicle, the at least one change of signal transit-time having been stored in a memory of a control unit. An actuator is, for example, a closing element for opening and/or closing a tailgate of the vehicle.
Advantageously an evaluation of the sensor data of an ultrasonic sensor is effected independently of sensor data of other ultrasonic sensors, for example by filtering the sensor data. As a result, the sensor data can be used by different ultrasonic sensors for the purpose of driving different actuators. Accordingly, a person located in the surround field of the vehicle is able to drive various actuators with only one gesture, by the person performing the gesture within the detection range of ultrasonic sensors that have been assigned to particular actuators.
The person located in the surround field of the vehicle can, for example, open and/or close left or right doors by a gesture in front of an ultrasonic sensor of a vehicle that has been fitted on the left or right side, respectively, of the vehicle in a bumper of the vehicle, and can open a boot by the same gesture within the detection range of ultrasonic sensors that have been fitted centrally in a bumper.
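A minimal sketch of the per-sensor dispatch just described; the sensor identifiers, the actuator names and the assumption that gesture recognition has already been carried out independently per sensor are illustrative only:

    # Hypothetical mapping from the bumper position of a sensor to the
    # actuator that a gesture recognised in front of that sensor drives.
    SENSOR_TO_ACTUATOR = {
        "rear_left_corner": "left_door",
        "rear_right_corner": "right_door",
        "rear_centre_left": "boot_lid",
        "rear_centre_right": "boot_lid",
    }

    def dispatch_gesture(sensor_id, gesture_recognised):
        # Each sensor is evaluated independently, so the same gesture can
        # drive different actuators depending on where it is performed.
        if gesture_recognised:
            return SENSOR_TO_ACTUATOR.get(sensor_id)
        return None

    print(dispatch_gesture("rear_centre_left", True))   # boot_lid
    print(dispatch_gesture("rear_left_corner", True))   # left_door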
In a further embodiment, the sensor data from other ultrasonic sensors, for example in the form of cross-echoes, may be drawn upon for the purpose of calculating changes of signal transit-time, whereby gestures that are performed within the detection range of several ultrasonic sensors are advantageously recognised by this means.
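For illustration, one hedged way to evaluate such a cross-echo geometrically, assuming two sensors on a straight bumper section, a known direct echo from the transmitting sensor and a constant speed of sound; the patent itself does not prescribe this calculation:

    import math

    SPEED_OF_SOUND = 343.0  # m/s, assumed constant

    def object_position(x1, x2, t_direct_1, t_cross_12):
        # Sensors sit at (x1, 0) and (x2, 0) in the bumper plane.
        # Direct echo of sensor 1:      2 * r1   = c * t_direct_1
        # Cross-echo 1 -> object -> 2:  r1 + r2  = c * t_cross_12
        r1 = SPEED_OF_SOUND * t_direct_1 / 2.0
        r2 = SPEED_OF_SOUND * t_cross_12 - r1
        d = x2 - x1
        # Intersect the two range circles around the sensors.
        a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)
        y_squared = r1 ** 2 - a ** 2
        if y_squared < 0:
            return None  # inconsistent echoes, no common reflection point
        return (x1 + a, math.sqrt(y_squared))  # point in front of the bumper

    # Example: sensors 0.4 m apart, object roughly 1 m away, slightly off-centre.
    print(object_position(0.0, 0.4, t_direct_1=0.0060, t_cross_12=0.0061))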
In a further embodiment, the user can freely define gestures for controlling at least one actuator of the vehicle. Consequently the ease of use for the user is enhanced, since he/she can choose motions that are easiest for him/her.
By virtue of the free definition of gestures, in a further embodiment the person within the detection range of an arbitrary ultrasonic sensor is able to drive a particular actuator by a gesture assigned to this actuator.
On the basis of, for example, a radio key that a person located in the surround field of the vehicle is carrying, or by virtue of further sensors on the vehicle, such as a video camera for example, the control unit recognises that the driver is in the vicinity of the vehicle.
Advantageously the comparison of the changes of the signal transit-times is carried out only when the driver is in the vicinity. This prevents, for example, a battery of the vehicle from discharging in the parked state of the vehicle and making a restart of the vehicle impossible.
Additionally, further operating conditions, such as, for example, a stopping of the engine of the vehicle, can be examined.
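A hedged sketch of such a gating condition (the condition names are hypothetical; the application only states that the comparison is carried out when the driver is nearby and that further operating conditions may be examined):

    def gesture_evaluation_enabled(radio_key_in_range, camera_sees_driver,
                                   engine_stopped):
        # Compare transit-time changes only while the driver is nearby and
        # the vehicle is parked, so the sensors are not polled continuously
        # and the vehicle battery is not drained.
        driver_nearby = radio_key_in_range or camera_sees_driver
        return driver_nearby and engine_stopped

    print(gesture_evaluation_enabled(True, False, True))   # True
    print(gesture_evaluation_enabled(False, False, True))  # False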
With a view to more precise recognition of the gestures, use may additionally be made of a camera fitted to the vehicle in addition to the ultrasonic sensor. In order to ensure, in poor lighting conditions, a reliable recognition by the camera, for example a reversing camera, use may be made of an additional light-source. By way of additional light-source, use is made of an LED that emits light in the visible region, an infrared LED and/or a registration-plate illumination of the vehicle.
The apparatus according to the invention can be used directed both forwards and backwards on a vehicle.
Brief Description of the Figures
Exemplary embodiments of the invention are represented in the drawing and elucidated in more detail in the following description.
Shown are:
Figure 1: a schematic representation of an embodiment of an apparatus according to the state of the art for ascertaining objects;
Figure 2a: a schematic representation of a first embodiment of the present invention for registering changes of signal transit-time of at least one ultrasonic sensor;
Figure 2b: a schematic representation of a second embodiment of the present invention for registering changes of signal transit-time of at least one ultrasonic sensor;
Figure 2c: a schematic representation of a third embodiment of the present invention for registering changes of signal transit-time of at least one ultrasonic sensor;
Figure 3: a sketch for an example of a sequence of motions of an object as basis of the process according to the invention;
Figure 4a: a schematic representation of an example of a signal form as basis of the process according to the invention, in which an object is moved in temporally limited manner approximately with the same frequency and amplitude in a registered region along the vehicle axis;
Figure 4b: a schematic representation of an example of a signal form as basis of the process according to the invention, in which an object is firstly moved approximately with the same frequency and amplitude and then with a numerically changed amplitude in a registered region along the vehicle axis;
Figure 4c: a schematic representation of an example of a signal form as basis of the process according to the invention, in which an object is moved between a non-registered second region and a first registered region at right angles to the vehicle axis;
Figure 5: a schematic representation of an embodiment of the apparatus according to the invention, in which an object is moved between a registered second region and a first registered region at right angles to the vehicle axis.
Detailed Description of the Exemplary Embodiments
In all the Figures, identical reference symbols denote the same articles.
Figure 1 shows an embodiment of an apparatus for ascertaining objects according to the state of the art.
A vehicle (101) includes at least one control unit (102). Data are made available to the control unit (102) from at least one sensor (104). Further sensors (104 (1), 104 (2), ... 104 (n)) may be present. The sensor data may be communicated to the control unit (102) directly via an interface or alternatively via a bus system such as, for example, a CAN bus. The control unit contains at least one memory (106) for storing and processing the sensor data, the kinematic operating data and the vehicle data. The control unit (102) drives at least one actuator (103). Further actuators (103 (1), 103 (2), ... 103 (m)) may be, for example, a starter or a closing element. The vehicle (101) and the control unit (102) may also exhibit further interfaces (107), for example for the interaction with a driver in the form of queries and prompts.
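Purely as an illustration of the architecture sketched for Figure 1, the following data-structure sketch (hypothetical field names, not the patent's design) groups sensors, actuators and stored patterns under one control unit:

    from dataclasses import dataclass, field
    from typing import Callable, Dict, List

    @dataclass
    class UltrasonicSensor:
        sensor_id: str
        read_transit_time: Callable[[], float]   # seconds; hypothetical driver call

    @dataclass
    class Actuator:
        actuator_id: str
        drive: Callable[[], None]                # e.g. opens a closing element

    @dataclass
    class ControlUnit:
        sensors: List[UltrasonicSensor]
        actuators: Dict[str, Actuator]
        memory: Dict[str, list] = field(default_factory=dict)  # stored gesture patterns

        def store_pattern(self, name, pattern):
            self.memory[name] = pattern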
The apparatus represented in Figure 2a includes, in a first embodiment, simplified in a top view, an ultrasonic sensor (4 (1)) with an acoustic acceptance angle (8 (1)). The distances dmin and dmax specify a minimal and maximal detection distance, respectively, of the ultrasonic sensor (4 (1)), i.e. the minimal and maximal distance of an object from the ultrasonic sensor (4 (1)) at which a distance from an object can be calculated correctly on the basis of reflected signals.
With the aid of the at least one ultrasonic sensor (4 (1) ) objects can be detected. For this purpose, signals are transmitted and received by the at least one sensor (4 (1)), and, from the signal transit-times, ranges are calculated and space coordinates relating to the objects are ascertained. The emitted signals are represented as half-waves (5 (1)) in Figure 2a.
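As a worked illustration of this range calculation (assuming a constant speed of sound of roughly 343 m/s and hypothetical detection limits), the transit-time of the reflected signal is halved because the sound travels to the object and back:

    SPEED_OF_SOUND = 343.0   # m/s, assumed constant

    # Hypothetical detection limits of one sensor (metres).
    D_MIN, D_MAX = 0.25, 4.0

    def echo_to_distance(transit_time_s):
        # Half the sound path sensor -> object -> sensor; echoes outside
        # the usable detection range [D_MIN, D_MAX] are rejected.
        d = SPEED_OF_SOUND * transit_time_s / 2.0
        return d if D_MIN <= d <= D_MAX else None

    print(echo_to_distance(0.004))    # ~0.686 m -> valid
    print(echo_to_distance(0.0005))   # ~0.086 m -> None, below D_MIN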
A first region (6) is delimited by the minimal distance and a distance (db) stored in a memory (106) of the control unit (102). A second region (7) is situated outside a maximal detection distance. By means of the regions (6) and (7), the regions within which signal transit-times are taken into account can, for example, be defined.
In a second embodiment, which is shown in Fig. 2b, a first region (6) is likewise delimited by the minimal distance dmin and a distance (db) stored in the memory (106) of the control unit (102). A second region (7) is situated outside the acceptance angle of the ultrasonic sensor (4 (1)), i.e. an object within the second region (7) is not recognised by the ultrasonic sensor (4 (1)).

Fig. 2c shows a third embodiment, in which a first region (6) is defined by the minimal detection distance of the ultrasonic sensor (4 (1)) and a distance (db) stored in the control unit, the stored distance being smaller than the maximal distance dmax. The second region (7) is delimited approximately by a distance of the object dobj from the vehicle and a value stored in the memory (106) of the control unit (102). This value may be an absolute distance from the ultrasonic sensor (4 (1)). Alternatively, the value may, for example, represent a width of the region, so that the region is delimited by the minimal detection distance dmin of the ultrasonic sensor (4 (1)) and by the sum of the distance of the object dobj from the vehicle and the value stored in the memory (106) of the control unit (102).

Figure 3 shows an embodiment with two superposed regions (6) and (7). A control unit (1) of a vehicle (9) possesses at least one ultrasonic sensor (4). The ultrasonic sensor is advantageously an ultrasonic sensor that has already been provided for a parking-assistance system.
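A hedged sketch of how a distance reading might be assigned to the first region (6) or the second region (7) of the first embodiment; the numerical boundaries are invented for illustration:

    # Hypothetical boundaries (metres) for the first embodiment: region 1
    # lies between the minimal detection distance and a stored distance db,
    # region 2 beyond the maximal detection distance (no usable echo).
    D_MIN, D_B, D_MAX = 0.25, 1.0, 4.0

    def classify(distance):
        # Assign a distance reading (None = no echo) to region 1, region 2
        # or neither.
        if distance is None or distance > D_MAX:
            return "region_2"   # outside the maximal detection distance
        if D_MIN <= distance <= D_B:
            return "region_1"
        return None             # between db and dmax: counted for neither region

    print(classify(0.6))    # region_1
    print(classify(None))   # region_2
    print(classify(2.5))    # None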
Depending on further operating conditions, distances (dobj) from objects (10) in the surround field of the vehicle (9) are calculated. For this purpose the ultrasonic sensor (4) emits signals. On the basis of the received signals the control unit (1) calculates distances (dobj) from objects (10). The control unit (1) compares changes of signal transit-times over time with changes of signal transit-times stored in the control unit (1). In the case of a concordance, at least one actuator is driven.
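One possible reading of "concordance" is a similarity measure between the recent distance samples and a stored signal form; the normalised cross-correlation below is an illustrative stand-in, not the comparison actually specified by the application:

    import math

    def concordance(measured, stored, threshold=0.9):
        # Normalised cross-correlation between the most recent measured
        # distance samples and a stored signal form; values near 1 mean
        # the two shapes agree.
        if len(measured) < len(stored):
            return False
        m = measured[-len(stored):]
        mean_m = sum(m) / len(m)
        mean_s = sum(stored) / len(stored)
        num = sum((a - mean_m) * (b - mean_s) for a, b in zip(m, stored))
        den = math.sqrt(sum((a - mean_m) ** 2 for a in m) *
                        sum((b - mean_s) ** 2 for b in stored))
        return den > 0 and num / den >= threshold

    # Three back-and-forth swings, roughly as in Figure 4a, against a template.
    stored = [1.0, 0.6, 1.0, 0.6, 1.0, 0.6, 1.0]
    measured = [1.2, 1.02, 0.63, 0.98, 0.58, 1.05, 0.61, 0.99]
    print(concordance(measured, stored))   # True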
By means of the regions (6) and (7), the regions within which signal transit-times are taken into account can, for example, be defined. For instance, an oscillation back and forth of an object with small amplitude and without change of the regions (6) and (7) can be ignored as a gesture.
In Figure 3 an oscillating of an object (10) along the vehicle axis, for example, is recognised. In a position of rest, the object (10) is located within a second region (7). In the course of the motion of the object along the vehicle axis the object is firstly located within the second region (7) and is then, for example, moved in the direction of the vehicle (9). As a result, the object is no longer located in the second region (7) but is only located in the first region (6). In the course of a subsequent motion contrary to the direction of the vehicle the object is moved through the second region (7) before the object is once again only in the first region (6). The control unit (1) compares the changes of signal transit-time with the changes of transit-time stored in the control unit (1).

In Figures 4a, 4b and 4c diagrams are represented, whereby on the vertical axis the calculated distances dobj of an object between the minimal detection distance dmin and the maximal detection distance dmax of the ultrasonic sensor have been plotted over a time axis. Moreover, the first region (6) and the second region (7) have been drawn in.
In the diagrams a possible calculated signal form of distances from objects in the surround field of the vehicle is represented over time. On the time axis, two markers (t1, t2) have furthermore been drawn in, which show the temporal region within which the control unit has recognised a concordance with signal forms stored in the control unit.
The diagrams in Figures 4a and 4b represent a possible signal form when an object (10) is moved, for example, as in Figure 3. The signal form in the diagram from Figure 4a shows within the two temporal markers, for example, a three-times-repeated oscillating back and forth of the object (10) along the vehicle axis.
From the diagram it is evident that a second region (7) is delimited approximately symmetrically with respect to a distance of the object dobj from the vehicle and by dimensions of the object. The maximal displacement of the object lies in a first region (6) which, for example, is delimited by the minimal and maximal detection distances dmin and dmax, respectively. The diagram in Figure 4b shows, for example, a twice-repeated oscillating forwards in the vehicle direction, followed by a simple oscillating backwards of the object (10) along the vehicle axis. Here too, there is a first region (6) and a second region (7), within which the signal form moves.
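A hedged sketch of recognising such a repeated oscillation by counting transitions between the two regions (which also ignores a small swaying that never leaves one region, as noted above); the required number of alternations is an invented parameter:

    def count_region_alternations(region_sequence):
        # Count how often the object switches between region 1 and region 2.
        # Samples falling in neither region are ignored, so a small swaying
        # that never leaves one region produces no alternation at all.
        filtered = [r for r in region_sequence if r in ("region_1", "region_2")]
        return sum(1 for a, b in zip(filtered, filtered[1:]) if a != b)

    # Three full back-and-forth swings as in Figure 4a: six region changes.
    seq = ["region_2", "region_1", "region_2", "region_1",
           "region_2", "region_1", "region_2"]
    print(count_region_alternations(seq))        # 6
    print(count_region_alternations(seq) >= 6)   # gesture recognised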
The diagram in Figure 4c shows a possible signal form which arises when an object from a region that cannot be registered by the ultrasonic sensor penetrates into the region of the acceptance angle of the ultrasonic sensor. A first region (6) is delimited by the minimal detection distance dmin and a value stored in the control unit. A second region (7) lies outside the maximal detection distance dmax and/or outside the acceptance angle of the ultrasonic sensor. A signal form of such a type is brought about, for example, by a motion at right angles to the vehicle axis.

In Figure 5 a possible embodiment of the present invention is represented in a top view, not true to scale. In a rear bumper (12) of a vehicle four ultrasonic sensors (4 (1) to 4 (4)) have been fitted. These have a specific acceptance angle; for example, sensor (4 (3)) possesses an acceptance angle (8 (3)) and transmits and receives ultrasonic signals (5 (3)). For the other sensors a corresponding labelling has been dispensed with, with a view to better clarity.
In Figure 5 a person (11) located in the surround field of the vehicle, with an object (10) in his/her right hand, is standing in front of the rear bumper. On the basis of, for example, a radio key that the person located in the surround field of the vehicle is carrying, or by virtue of further sensors on the vehicle, such as a video camera for example, the control unit recognises that, for example, a driver is in the vicinity of the vehicle. Additionally, further operating conditions, such as stopping of the engine for example, can be examined. The ultrasonic sensors (4 (1) to 4 (4)) now emit ultrasonic signals and communicate the measured signal transit-times to the control unit (1) via the at least one interface (3).

As is evident from Figure 5, the object (10) is moved at right angles to the vehicle axis by the person (11) located in the surround field of the vehicle. This motion generates a signal form as represented in exemplary manner in Figure 4c. The control unit (1) drives at least one actuator (13) via the interface (3) if, for example, a change of signal transit-time has been recognised that corresponds to a change of signal transit-time stored in a memory (2) in the control unit (1). By way of actuator (13), a closing element of a tailgate, of a bonnet, of a door, of a sliding roof, and/or hydraulic elements, for example, is/are driven.

Claims (17)

  1. Process for an assistance system of a vehicle with at least one ultrasonic sensor, wherein the ultrasonic sensor emits and receives ultrasonic signals depending on at least one operating condition, wherein a control unit calculates, from signal transit-times of reflected ultrasonic signals, distances from objects in the surround field of the vehicle, characterised in that the control unit compares changes of signal transit-times with stored changes of signal transit-times, whereby in the event of a concordance with at least one stored change of signal transit-time at least one actuator is driven.
  2. Process according to Claim 1, characterised in that a comparison of the changes of the signal transit-times with stored changes of signal transit-times is undertaken when a distance of the object in the surround field of the vehicle during the changes of the signal transit-times lies approximately within a first region and a further distance of the object lies approximately within a second region in the surround field of the vehicle.
  3. Process according to one of the preceding claims, characterised in that changes of the signal transit-times correspond to a succession of changes of signal transit-times stored in the control unit.
  4. Process according to one of the preceding claims, characterised in that changes of the signal transit-times correspond substantially to a periodic sequence with a frequency stored in the control unit and/or to a signal form stored in the control unit.
  5. Process according to one of the preceding claims, characterised in that the first region is delimited by a minimal detection distance of the ultrasonic sensor and by a maximal distance from the ultrasonic sensor, stored in the control unit, and the second region lies outside a maximal detection distance and/or outside the acceptance angle of the ultrasonic sensor.
  6. Process according to Claim 4, characterised in that the first region is defined by a minimal detection distance of the ultrasonic sensor and a maximal distance stored in the control unit, and the second region is delimited by approximately the distance of the object from the vehicle at the start of the periodic sequence and/or of the stored signal form.
  7. Process according to one of the preceding claims, characterised in that a gesture of a person located in the surround field of the vehicle is assigned to a stored change of signal transit-time.
  8. Process according to Claim 7, characterised in that the gesture of the person located in the surround field of the vehicle corresponds to an oscillating back and forth of the object perpendicular to, along and/or diagonally relative to the detection direction of the ultrasonic sensor.
  9. Process according to Claim 7 or 8, characterised in that with a view to more precise recognition of the gestures use is additionally made of a camera fitted to the vehicle in addition to the ultrasonic sensor.
  10. Process according to Claim 9, characterised in that with a view to more precise recognition of the gestures in poor lighting conditions use is made of an additional light-source in addition to the ultrasonic sensor and the camera fitted to the vehicle.
  11. Process according to one of the preceding claims, characterised in that the camera that is present is a reversing camera and the additional light-source is an LED that emits light in the visible region, or an infrared LED.
  12. Apparatus for an assistance system of a vehicle with at least one ultrasonic sensor, wherein the ultrasonic sensor emits and receives ultrasonic signals depending on at least one operating condition, wherein a control unit calculates, from signal transit-times of reflected ultrasonic signals, distances from objects in the surround field of the vehicle, characterised in that the control unit compares changes of signal transit-times with stored changes of signal transit-times, whereby in the event of a concordance with at least one stored change of signal transit-time at least one actuator is driven.
  13. Apparatus according to Claim 12, characterised in that the at least one actuator is a closing element of a boot lid and/or of at least one door of the vehicle.
  14. Computer program with program-code means in order to implement all the steps of a process according to one of Claims 1 to 11 when the computer program is executed on a computer or on an appropriate arithmetic unit.
  15. Computer-program product with program-code means that have been stored on a computer-readable data carrier, in order to implement all the steps of a process according to one of Claims 1 to 11 when a computer program according to Claim 14 is executed on a computer or on an appropriate arithmetic unit.
  16. A process for an assistance system of a vehicle substantially as herein described with reference to the accompanying drawings.
  17. Apparatus for an assistance system of a vehicle substantially as herein described with reference to the accompanying drawings.
GB1221465.6A 2011-11-29 2012-11-28 Process for controlling at least one actuator on the basis of changes of signal transit-time of at least one ultrasonic sensor Active GB2498833B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE102011087347.3A DE102011087347B4 (en) 2011-11-29 2011-11-29 Method for controlling at least one actuator based on changes in the signal propagation time of at least one ultrasonic sensor

Publications (3)

Publication Number Publication Date
GB201221465D0 GB201221465D0 (en) 2013-01-09
GB2498833A true GB2498833A (en) 2013-07-31
GB2498833B GB2498833B (en) 2016-12-14

Family

ID=47560871

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1221465.6A Active GB2498833B (en) 2011-11-29 2012-11-28 Process for controlling at least one actuator on the basis of changes of signal transit-time of at least one ultrasonic sensor

Country Status (3)

Country Link
DE (1) DE102011087347B4 (en)
FR (1) FR2983308B1 (en)
GB (1) GB2498833B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107531212A (en) * 2015-05-08 2018-01-02 大众汽车有限公司 Utilize manipulation of the sonac to the closure member of vehicle
US10546423B2 (en) 2015-02-03 2020-01-28 Globus Medical, Inc. Surgeon head-mounted display apparatuses
US11348257B2 (en) 2018-01-29 2022-05-31 Philipp K. Lang Augmented reality guidance for orthopedic and other surgical procedures
US11452568B2 (en) 2016-03-12 2022-09-27 Philipp K. Lang Augmented reality display for fitting, sizing, trialing and balancing of virtual implants on the physical joint of a patient for manual and robot assisted joint replacement
US11553969B1 (en) 2019-02-14 2023-01-17 Onpoint Medical, Inc. System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures
US11786206B2 (en) 2021-03-10 2023-10-17 Onpoint Medical, Inc. Augmented reality guidance for imaging systems
US11857378B1 (en) 2019-02-14 2024-01-02 Onpoint Medical, Inc. Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets
DE102022213045A1 (en) 2022-12-02 2024-06-13 Continental Automotive Technologies GmbH Sensor system
US12053247B1 (en) 2020-12-04 2024-08-06 Onpoint Medical, Inc. System for multi-directional tracking of head mounted displays for real-time augmented reality guidance of surgical procedures

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10154239B2 (en) 2014-12-30 2018-12-11 Onpoint Medical, Inc. Image-guided surgery with surface reconstruction and augmented reality visualization
DE102015204280A1 (en) 2015-03-10 2016-09-15 Robert Bosch Gmbh A method for activating an actuator of a motor vehicle, device configured for carrying out the method and computer program product
CA3049662A1 (en) 2017-01-16 2018-07-19 Philipp K. Lang Optical guidance for surgical, medical, and dental procedures
US10852427B2 (en) * 2017-06-30 2020-12-01 Gopro, Inc. Ultrasonic ranging state management for unmanned aerial vehicles
WO2019051464A1 (en) 2017-09-11 2019-03-14 Lang Philipp K Augmented reality display for vascular and other interventions, compensation for cardiac and respiratory motion
CN109324331B (en) * 2018-12-10 2024-03-15 株洲中车特种装备科技有限公司 Monorail manned tourist car anti-collision system and method based on ultrasonic ranging

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2267363A (en) * 1992-05-25 1993-12-01 Toshiba Kk A ventilator with a sensor which detects human actions
US20080247275A1 (en) * 2007-03-26 2008-10-09 Furuno Electric Company, Limited Underwater detection apparatus
WO2012016868A1 (en) * 2010-08-03 2012-02-09 Valeo Schalter Und Sensoren Gmbh Method and device for monitoring the surroundings of a vehicle

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2774186B2 (en) * 1990-07-25 1998-07-09 松下電工株式会社 Ultrasonic sensor
DE102004005225A1 (en) 2004-02-03 2005-08-18 Robert Bosch Gmbh Driver assistance device
DE102006001666A1 (en) 2006-01-12 2007-08-16 Ident Technology Ag Method and control system for closing flaps
US8091280B2 (en) * 2007-06-01 2012-01-10 GM Global Technology Operations LLC Arms full vehicle closure activation apparatus and method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2267363A (en) * 1992-05-25 1993-12-01 Toshiba Kk A ventilator with a sensor which detects human actions
US20080247275A1 (en) * 2007-03-26 2008-10-09 Furuno Electric Company, Limited Underwater detection apparatus
WO2012016868A1 (en) * 2010-08-03 2012-02-09 Valeo Schalter Und Sensoren Gmbh Method and device for monitoring the surroundings of a vehicle

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10546423B2 (en) 2015-02-03 2020-01-28 Globus Medical, Inc. Surgeon head-mounted display apparatuses
CN107531212A (en) * 2015-05-08 2018-01-02 大众汽车有限公司 Utilize manipulation of the sonac to the closure member of vehicle
US20180118161A1 (en) * 2015-05-08 2018-05-03 Volkswagen Aktiengesellschaft Actuation of a locking element for a vehicle using an ultrasound sensor
US10730478B2 (en) * 2015-05-08 2020-08-04 Volkswagen Aktiengesellschaft Actuation of a locking element for a vehicle using an ultrasound sensor
CN107531212B (en) * 2015-05-08 2021-06-08 大众汽车有限公司 Method and device for actuating a closing element of a vehicle and vehicle
US11452568B2 (en) 2016-03-12 2022-09-27 Philipp K. Lang Augmented reality display for fitting, sizing, trialing and balancing of virtual implants on the physical joint of a patient for manual and robot assisted joint replacement
US11602395B2 (en) 2016-03-12 2023-03-14 Philipp K. Lang Augmented reality display systems for fitting, sizing, trialing and balancing of virtual implant components on the physical joint of the patient
US11348257B2 (en) 2018-01-29 2022-05-31 Philipp K. Lang Augmented reality guidance for orthopedic and other surgical procedures
US11727581B2 (en) 2018-01-29 2023-08-15 Philipp K. Lang Augmented reality guidance for dental procedures
US12086998B2 (en) 2018-01-29 2024-09-10 Philipp K. Lang Augmented reality guidance for surgical procedures
US11553969B1 (en) 2019-02-14 2023-01-17 Onpoint Medical, Inc. System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures
US11857378B1 (en) 2019-02-14 2024-01-02 Onpoint Medical, Inc. Systems for adjusting and tracking head mounted displays during surgery including with surgical helmets
US12053247B1 (en) 2020-12-04 2024-08-06 Onpoint Medical, Inc. System for multi-directional tracking of head mounted displays for real-time augmented reality guidance of surgical procedures
US11786206B2 (en) 2021-03-10 2023-10-17 Onpoint Medical, Inc. Augmented reality guidance for imaging systems
DE102022213045A1 (en) 2022-12-02 2024-06-13 Continental Automotive Technologies GmbH Sensor system

Also Published As

Publication number Publication date
FR2983308B1 (en) 2020-07-17
DE102011087347A1 (en) 2013-05-29
DE102011087347B4 (en) 2022-06-09
FR2983308A1 (en) 2013-05-31
GB201221465D0 (en) 2013-01-09
GB2498833B (en) 2016-12-14

Similar Documents

Publication Publication Date Title
GB2498833A (en) Ultrasonic gesture recognition for vehicle
US12054169B2 (en) Vehicular cabin monitoring system
CN107128282B (en) Moving device control of electric vehicle door
US11225822B2 (en) System and method for opening and closing vehicle door
US20170234054A1 (en) System and method for operating vehicle door
WO2021037052A1 (en) Control apparatus and method
KR101719529B1 (en) Object detection device for a vehicle
US20180363359A1 (en) Vehicle door non-contact open apparatus
US20190122056A1 (en) Ultrasonic object detection system for motor vehicles and method of operation thereof
US20190145150A1 (en) Vehicle opening-closing body control device and vehicle opening-closing body control method
JP6393537B2 (en) Vehicle apparatus, vehicle control system, and vehicle control method
US20170167180A1 (en) Hands-free rear vechicle access system and improvements thereto
US20170185763A1 (en) Camera-based detection of objects proximate to a vehicle
US20180328094A1 (en) Intelligent Vehicle Access Point Opening System
US11898396B2 (en) System and method for detecting operator characteristic to adjust position of power actuated movable panel
US11898382B2 (en) Vehicle having powered door control
US10253531B2 (en) Detection sensor
JP2014214472A (en) Drive control device for opening/closing body for vehicle
JP6451342B2 (en) Opening and closing body drive device for vehicle
KR100892511B1 (en) Power tailgate system for vehicles
KR20150022030A (en) Apparatus and method for controlling power trunk or power tailgate using hall sensor
US20240301738A1 (en) Method and surroundings monitoring system for detecting objects in the surroundings of a vehicle, and method for determining a movement limit for a movement of a movable component of a vehicle, and door opening system or collision warning system for vehicle doors for carrying out the method
US11932200B2 (en) Vehicle and method of controlling a powered door based on user identification
US20210094484A1 (en) Vehicle information conveying device
KR20190027649A (en) Electronic device and method for protecting vehicle