
CN114616140A - Control apparatus and method - Google Patents

Control apparatus and method

Info

Publication number
CN114616140A
CN114616140A (application CN202080060963.4A)
Authority
CN
China
Prior art keywords
user
response
sensing device
target object
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080060963.4A
Other languages
Chinese (zh)
Inventor
傅继奋
张涛
郭宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Asu Tech Co ltd
Original Assignee
Beijing Asu Tech Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Asu Tech Co ltd
Publication of CN114616140A

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00 - Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/20 - Means to switch the anti-theft system on or off
    • B60R25/2054 - Means to switch the anti-theft system on or off by foot gestures
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 - Individual registration on entry or exit
    • G07C9/00174 - Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00 - Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/20 - Means to switch the anti-theft system on or off
    • B60R25/25 - Means to switch the anti-theft system on or off using biometry
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C2209/00 - Indexing scheme relating to groups G07C9/00 - G07C9/38
    • G07C2209/60 - Indexing scheme relating to groups G07C9/00174 - G07C9/00944
    • G07C2209/63 - Comprising locating means for detecting the position of the data carrier, i.e. within the vehicle or within a certain distance from the vehicle
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 - Individual registration on entry or exit
    • G07C9/00174 - Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
    • G07C9/00309 - Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys operated with bidirectional data transmission between data carrier and locks

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)
  • Lock And Its Accessories (AREA)
  • Power-Operated Mechanisms For Wings (AREA)

Abstract

A control apparatus and a control method are provided. The control apparatus (001) comprises a first sensing device (200), an optical device (300), and a second sensing device (400), each communicatively connected to a controller (100). The first sensing device (200) detects whether a user (U) is near a target object (T) and, if so, transmits object detection data to the controller (100), which then determines whether the user (U) is within a preset distance of the target object (T) and, if so, controls the optical device (300) to project a preset image. The second sensing device (400) then detects the user's response to the preset image and transmits user response data to the controller (100). The controller (100) further determines whether the response satisfies a predetermined criterion and, if so, controls the target object (T) to perform a task. The control apparatus can be used, for example, to achieve hands-free opening of a vehicle door.

Description

Control apparatus and method
Cross Reference to Related Applications
This application claims priority to Chinese patent application No. 201910789846.0, filed on August 26, 2019, the disclosure of which is incorporated herein by reference in its entirety.
Technical Field
The present invention relates generally to the field of control technology, more particularly to a control apparatus and method applicable to various fields including smart vehicle technology, and still more particularly to a vehicle door control apparatus and method.
Background
In the field of intelligent vehicle technology, how to intelligently open a vehicle door (i.e., a rear door), or more specifically a powered trunk door, is an important research topic. Currently, the main ways of opening a vehicle trunk door include: pressing a "trunk open" button on the center console inside the vehicle, pressing a "trunk open" button on the vehicle key, or pressing a switch button provided on the trunk door itself.
All three are essentially conventional ways of opening the trunk door. When the user's hands are occupied and cannot operate a car key or press a button, the trunk door cannot be opened automatically.
Disclosure of Invention
In a first aspect, the present disclosure provides a control device.
The control apparatus includes a controller, a first sensing device, an optical device, and a second sensing device. Each of the first sensing device, the optical device, and the second sensing device is communicatively coupled to the controller.
The first sensing device is configured to detect whether the user is near the target object and, if so, to send the user's object detection data to the controller. The controller is configured to determine whether the user is within a preset distance from the target object based on the object detection data, and if so, to send a first control command to the optical apparatus. The optical device is configured to project a preset image to a preset area of the surface for presentation to a user upon receiving a first control command. The second sensing device is configured to detect a response of the user to the preset image and then transmit user response data to the controller. The controller is further configured to determine, based on the user response data, whether the response from the user meets a predetermined criterion, and if so, to send a second control command to the target object for performing the task.
Here, according to some preferred embodiments, the target object is a door and the task is to open the door. Alternatively, the target object can be another object, such as an elevator, and the task is to stop the elevator at the user's floor in preparation for the user to ride. The target object may also be a robotic assistant, and the task is to move close to the user to provide a service.
More preferably, the target object is a vehicle door, and the predetermined area of the surface is a ground area near the vehicle door. Here, the vehicle door can be any door, such as a rear trunk door, a front engine cover, or any powered door that provides access to the interior of the vehicle.
In the control apparatus, the second sensing device optionally comprises an obstacle detector configured to detect whether, as a result of the user's response to the image, an obstacle appears between the target object and the image projected on the surface by the optical device.
Here, the obstacle detector may include at least one of a camera, a radar sensor, a non-contact capacitive sensor, an infrared sensing device, or a ToF (time-of-flight) detecting device.
The second sensing device may optionally comprise a user behavior detector configured to detect a behavior of the user in response to the preset image. The user behavior detector can be a camera, in which case the response detected by the camera includes at least one of the user's motion, gesture, or facial expression. The user behavior detector can also be a microphone, in which case the response detected by the microphone comprises the user's voice.
In the control apparatus, the predetermined criterion may simply be the detection of any response performed by the user, which may include at least one of a motion, a gesture, a facial expression, or speech.
In the control apparatus, the controller may optionally be further configured to perform feature recognition based on the user response data, and the predetermined criterion comprises a substantial match between a result of the feature recognition and a pre-stored record. In this document, the terms "substantially", "essentially", and the like shall be considered interchangeable with the phrase "in most details, if not completely", i.e., defined as a match level of more than 80%.
In the control apparatus, the first sensing device can optionally include at least one of an ultrasonic sensing device, a radar sensor, a camera, a wireless signal detector, an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, or a VCSEL (vertical cavity surface emitting laser) sensor.
According to some embodiments, the first sensing device comprises a wireless signal detector configured to detect a wireless signal emitted from an on-body device carried by the user, wherein the wireless signal comprises at least one of a Radio Frequency (RF) signal, a WiFi signal, a bluetooth signal, a 4G signal, or a 5G signal.
Here, the carry-on device carried by the user includes at least one of a car key, a mobile phone, or a wireless signal transmitter.
In the control apparatus as described above, the preset distance can be about 0.01 to 10 meters, preferably about 0.1 to 5 meters, and further preferably about 0.2 to 1 meter.
According to some embodiments of the control apparatus, the first sensing device is essentially a functional module embedded in the controller.
According to some embodiments of the control apparatus, the optical device is further configured to transmit time stamp information to the controller when it starts projecting the preset image; and the controller is further configured to calculate an operating time of the optical device based on the time stamp information, determine whether the operating time is longer than a preset threshold, and if so, and if no response has been received from the user, send a stop command to the optical device to stop the projection.
Here, the preset threshold value can be about 2 seconds to 30 minutes, preferably about 5 seconds to 5 minutes, and further preferably about 10 seconds to 1 minute.
In a second aspect, the present disclosure also provides a control method.
The control method comprises the following steps:
(1) determining whether the user is within a preset distance from the target object;
(2) if so, controlling the optical device to project a preset image to a preset area of the surface for presentation to the user;
(3) detecting a response of a user to a preset image;
(4) based on the response, determining whether a predetermined criterion is satisfied; and
(5) if so, controlling the target object to execute the task.
Here, the target object can optionally be a door and the task can be to open the door.
According to some embodiments of the control method, the target object is a vehicle door, and the predetermined area of the surface is a ground area near the vehicle door.
According to some embodiments of the control method, the step (1) of determining whether the user is within a preset distance from the target object comprises the sub-steps of:
(a) acquiring object detection data of a user; and
(b) determining whether the user is within the preset distance from the target object based on the object detection data.
Here, optionally, the sub-step (a) of acquiring object detection data of the user can be performed by means of a first sensing device. The first sensing device can include at least one of an ultrasonic sensing device, a radar sensor, a camera, a wireless signal detector, an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, or a VCSEL (vertical cavity surface emitting laser) sensor.
In certain embodiments of the control method, the first sensing device comprises an ultrasonic sensing device, and in sub-step (b) of determining whether the user is within a preset distance from the target object based on the object detection data, the distance of the user from the target object is estimated by at least one of an intensity of an echo signal received by the ultrasonic sensing device or a time period during which the echo signal is received.
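The echo-based estimate above follows the standard time-of-flight relation: one-way distance is the speed of sound times the round-trip delay, divided by two. A minimal sketch in Python, assuming a hypothetical `echo_delay_s` reading supplied by the ultrasonic hardware and a speed of sound of roughly 343 m/s; the 1 m default preset distance is likewise illustrative:

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 degrees C


def distance_from_echo(echo_delay_s: float) -> float:
    """Estimate target distance from the round-trip delay of an ultrasonic echo.

    The pulse travels to the target and back, so the one-way distance is
    half of (speed * delay).
    """
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2.0


def user_within_preset_distance(echo_delay_s: float, preset_m: float = 1.0) -> bool:
    """Compare the estimated distance against the preset threshold."""
    return distance_from_echo(echo_delay_s) <= preset_m
```

For example, an echo returning after 4 ms corresponds to roughly 0.69 m, well within a 1 m preset distance, while a 10 ms echo (about 1.7 m) would not trigger.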
Optionally, the first sensing means comprises a camera and in sub-step (b) of determining whether the user is within a preset distance from the target object based on the object detection data, the distance of the user from the target object is estimated by analyzing an image of the user.
Optionally, the first sensing device comprises a wireless signal detector configured to detect a wireless signal emitted from an on-body device carried by the user, and in the sub-step (b) of determining whether the user is within a preset distance from the target object based on the object detection data, the distance of the user from the target object is estimated by the strength of the wireless signal detected by the wireless signal detector. Here, the carry-on device carried by the user includes at least one of a car key, a mobile phone, or a wireless signal transmitter.
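Estimating distance from received signal strength, as described above, is commonly done with a log-distance path-loss model. The sketch below is an illustrative assumption, not the disclosed implementation; the reference RSSI at 1 m (-50 dBm) and the path-loss exponent (2.0, free space) are hypothetical calibration values:

```python
def distance_from_rssi(rssi_dbm: float,
                       rssi_at_1m_dbm: float = -50.0,
                       path_loss_exponent: float = 2.0) -> float:
    """Estimate distance in meters from received signal strength.

    Log-distance path-loss model: RSSI(d) = RSSI(1 m) - 10 * n * log10(d),
    solved here for d. Both default parameters are hypothetical and would
    need per-vehicle, per-transmitter calibration.
    """
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))
```

With these assumed parameters, a reading of -70 dBm maps to about 10 m; weaker signals map to larger distances, which is the monotonic relationship the controller needs for the threshold check.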
In the control method as described above, the preset distance is about 0.01 to 10 meters, preferably about 0.1 to 5 meters, and further preferably about 0.2 to 1 meter.
In the control method as described above, the step (3) of detecting the user's response to the preset image is performed by the second sensing device. The second sensing device may include at least one of an obstacle detector or a user behavior detector. The obstacle detector is configured to detect whether, as a result of the user's response to the preset image, an obstacle has appeared between the target object and the preset image projected on the surface by the optical device. The user behavior detector is configured to detect a behavior of the user in response to the preset image.
According to some embodiments of the control method, the second sensing device comprises an obstacle detector, which may comprise at least one of a camera, a radar sensor, a non-contact capacitive sensor, an infrared sensing device, or a ToF detecting device. Preferably, the second sensing device comprises a ToF detecting device.
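One plausible way a ToF-based obstacle detector could distinguish a deliberate foot sweep from a static obstruction is to look for a brief dip in the measured distance that then recovers. The sketch below is an assumption for illustration, not the patented algorithm; the frame counts and the 10 cm drop threshold are hypothetical tuning values:

```python
def detect_kick(tof_readings_m, baseline_m, drop_m=0.10,
                min_frames=2, max_frames=15):
    """Return True if the ToF distance dips below (baseline - drop_m) for a
    short run of consecutive frames and then recovers, consistent with a
    foot sweeping through the projected image.

    A dip that never recovers (e.g., a parked object) is rejected, as is a
    single-frame glitch shorter than min_frames.
    """
    run = 0
    for reading in tof_readings_m:
        if reading < baseline_m - drop_m:
            run += 1
        elif min_frames <= run <= max_frames:
            return True  # dip ended after a plausible kick duration
        else:
            run = 0  # reading never dipped, or the dip was too short/long
    return False
```

A real detector would also filter sensor noise and possibly require the dip to cross the projected image's location, which this sketch omits.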
According to some embodiments of the control method, the second sensing device comprises a user behavior detector. Here, the user behavior detector can include at least one of a camera or a microphone.
In an embodiment of the control method in which the user behavior detector comprises a camera, the response detected by the camera comprises at least one of a user's action, gesture or facial expression.
In an embodiment of the control method wherein the user behavior detector comprises a microphone, the response detected by the microphone comprises the voice of the user.
Here, the predetermined criterion may optionally be the detection of any response performed by the user, which may include at least one of a motion, a gesture, a facial expression, or speech.
According to some embodiments of the control method, the step (4) of determining whether the predetermined criterion is met based on the response comprises the sub-step of performing feature recognition based on the user response data. Here, the predetermined criterion includes a substantial match of the result of the feature recognition with a pre-stored record.
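The "substantial match" criterion is quantified earlier in the disclosure as a match level of more than 80%. A toy sketch of such a threshold check; the element-wise similarity measure is purely illustrative (real feature recognition would compare embedding vectors or biometric templates, not raw lists):

```python
def feature_similarity(observed, stored):
    """Toy similarity: fraction of positions at which two equal-length
    feature vectors agree. Stands in for a real recognition score in [0, 1].
    """
    matches = sum(1 for a, b in zip(observed, stored) if a == b)
    return matches / len(stored)


def substantially_matches(similarity: float, threshold: float = 0.80) -> bool:
    """'Substantial match' per the description: strictly more than 80%."""
    return similarity > threshold
```

Note the strict inequality: a score of exactly 0.80 does not count as "more than 80%", matching the definition given in the disclosure.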
According to some embodiments, after step (2) of controlling the optical device to project the preset image onto the preset area of the surface for presentation to the user, the control method further comprises the steps of:
calculating the working time of the optical device; and
controlling the optical device to stop projecting if the working time is longer than a preset threshold and no response from the user has been detected.
Here, the preset threshold value can be about 2 seconds to 30 minutes, preferably about 5 seconds to 5 minutes, and further preferably about 10 seconds to 1 minute.
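The timeout behavior described in the two steps above reduces to a simple predicate: stop projecting once the device has been working longer than the threshold with no response observed. A minimal sketch, with a hypothetical 30-second default (the disclosure allows anywhere from about 2 seconds to 30 minutes):

```python
def should_stop_projection(start_time_s: float, now_s: float,
                           response_detected: bool,
                           threshold_s: float = 30.0) -> bool:
    """True when the optical device's working time exceeds the preset
    threshold and no user response has been detected yet."""
    working_time_s = now_s - start_time_s
    return working_time_s > threshold_s and not response_detected
```

In practice the controller would evaluate this on a timer tick using the time stamp reported by the optical device when projection started.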
Any of the embodiments of the control method described above can be performed by the control apparatus according to the first aspect.
Drawings
In the following, to more clearly illustrate the various embodiments of the invention provided in the present disclosure, drawings of some embodiments are briefly introduced.
These drawings should be understood as covering only some of the embodiments disclosed herein, not all of them; based on these drawings, one of ordinary skill in the art can obtain other embodiments without inventive effort, and those are likewise within the scope of the present disclosure.
Fig. 1A and 1B illustrate a block diagram of a control apparatus and a control method using the control apparatus, respectively, according to some embodiments of the present disclosure;
fig. 2 shows a vehicle door control apparatus and a control method based on the control apparatus and the control method shown in fig. 1A and 1B;
fig. 3A and 3B show a specific application scenario of the vehicle door control device and a work flow of a vehicle door control method using the vehicle door control device, respectively;
FIG. 4 illustrates a block diagram of a controller, according to certain embodiments of the present disclosure; and
fig. 5 illustrates a block diagram of a controller provided by some embodiments of the present disclosure.
Detailed Description
Hereinafter, technical solutions provided by various embodiments of the present invention will be described in more detail with reference to the accompanying drawings attached to the present disclosure. It should be noted that the embodiments provided in the present disclosure should be considered as merely representative of a part of the embodiments covered by the present disclosure, not all of them, and should not be considered as imposing any limitation on the scope of protection of the present disclosure. Other embodiments, which have slight variations in design based on the embodiments provided herein, are considered to be within the scope of the present disclosure as long as they follow the inventive subject matter disclosed herein and can be readily obtained by one of ordinary skill in the art without involving any inventive step.
In certain aspects, the present disclosure provides a control apparatus and a control method. The control method is basically performed by a control device.
Fig. 1A is a block diagram of a control device according to some embodiments of the present disclosure. As shown, the control apparatus 001 includes a controller 100, a first sensing device 200, an optical device 300, and a second sensing device 400. Each of the first sensing device 200, the optical device 300, and the second sensing device 400 is communicatively coupled to the controller 100.
The first sensing device 200 is essentially an object detector configured to detect whether the user U is near the target object T and then transmit object detection data to the controller 100.
The controller 100 is configured to determine whether the user U is within a preset distance from the target object T based on the object detection data from the first sensing device 200, and is further configured to transmit a first control command (i.e., "first control command" in fig. 1) to the optical device 300 if it is determined that the user U is within the preset distance.
The optical device 300 is configured to project a preset image onto a preset area of the surface for presentation to the user U upon receiving a first control command.
Prompted by the image projected by the optical device 300, the user U may exhibit a response to it.
The second sensing device 400 is configured to detect a response of the user U to the projected image and then transmit user response data to the controller 100.
The controller 100 is further configured to determine whether the response from the user U satisfies a predetermined criterion based on the user response data from the second sensing device 400, and is further configured to transmit a second control command (i.e., "second control command" in fig. 1) to the target object T for performing a task corresponding to the second control command.
Accordingly, fig. 1B illustrates a flow chart of a control method using the control device 001, according to some embodiments of the present disclosure. As shown in fig. 1B, the control method includes the steps of:
S10: determining whether the user is within a preset distance from the target object;
S20: if so, controlling the optical device to project a preset image to a preset area of the surface for presentation to the user;
S30: detecting a response of the user to the preset image;
S40: determining whether a predetermined criterion is satisfied based on the user's response;
S50: if so, controlling the target object to execute the corresponding task.
As used throughout this disclosure, the term "user" generally refers to a human, but can also be extended to refer to an animal, a robot, a machine, or anything that can respond to an image projected by an optical device.
In one such scenario, which also represents a preferred embodiment of the present disclosure, the control apparatus 001 is disposed in a vehicle (e.g., a passenger car) and is configured to control the vehicle to perform vehicle-related tasks, such as opening a door (e.g., a trunk lid or trunk door). In this particular scenario, the target object is the vehicle, the user can be a driver or passenger approaching the vehicle who wishes to open the door of interest (e.g., a trunk door or tailgate), and the task to be performed is opening that door.
Thus, by means of the control apparatus 001 and the control method described above, the following can be achieved: when a user approaches the trunk of a vehicle (e.g., a passenger car) intending to open the trunk door, and the control apparatus detects that the user is sufficiently close, an optical device (e.g., a projector) provided on the vehicle projects an image (e.g., a light spot, or an optical pattern such as a light ring or a specific mark) onto a ground area (i.e., "a preset area of the surface") directly in front of the trunk. Prompted by the optical pattern projected on the ground, the user can sweep a foot over the pattern. Upon detecting this user response, the control apparatus controls the trunk door to open. In this way, the user can open the trunk door without using the hands, which is particularly useful when the user is holding many things (e.g., a large box) with both hands and intends to put them in the trunk.
Hereinafter, a more detailed description will be provided of the above-described specific embodiments and other related embodiments of the control apparatus and control method for controlling opening of the vehicle door.
Fig. 2 shows a vehicle door control device and a vehicle door control method, which are based essentially on the control device and method as explained above and shown in fig. 1A and 1B.
As shown in fig. 2, the door control apparatus 001A includes a vehicle controller 110, an object detector 210, an optical device 310, and a user response detector 410, which correspond to the controller 100, the first sensing device 200, the optical device 300, and the second sensing device 400 of the control apparatus 001 shown in fig. 1A.
The various steps of the door control method are also shown in FIG. 2. In short, the object detector 210 detects whether the user U is approaching the door T of interest (e.g., a trunk door, see S100), and then transmits object detection data to the vehicle controller 110 (S200). Then, based on the object detection data, the vehicle controller 110 determines whether the user U is within a preset distance from the door T, and if so, transmits a first control command to the optical device 310 (S300), thereby controlling the optical device 310 to project a preset image onto the ground (S400). At the prompt of the projected image, the user U may exhibit a response, which is then detected by the user response detector 410 (S500), and the user response detector 410 further transmits user response data to the vehicle controller 110 (S600). Then, based on the user response data, the vehicle controller 110 further determines whether the response from the user U satisfies a predetermined criterion, and if so, controls the door T to open (S700).
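The S100-S700 sequence above can be sketched as a single control pass. The six callables below are hypothetical stand-ins for the hardware interfaces (object detector, projector, response detector, door actuator); the disclosure does not prescribe this decomposition:

```python
def run_door_control(detect_user_distance, project_image, stop_projection,
                     detect_response, response_ok, open_door,
                     preset_distance_m=1.0):
    """One pass of the door-control flow of Fig. 2.

    detect_user_distance() -> float or None      (S100/S200: object detection)
    project_image()                              (S300/S400: project preset image)
    detect_response() -> response or None        (S500/S600: e.g., a foot sweep)
    response_ok(response) -> bool; open_door()   (S700: criterion check, open)
    """
    distance = detect_user_distance()
    if distance is None or distance > preset_distance_m:
        return "idle"  # no user within the preset distance
    project_image()
    response = detect_response()
    stop_projection()
    if response is not None and response_ok(response):
        open_door()
        return "opened"
    return "no-response"
```

A real implementation would run this in a loop and debounce the detectors; the sketch only fixes the ordering of the steps.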
As used herein, the term "vehicle" generally refers to a transportation machine that transports people or cargo, which can include motor vehicles such as passenger cars, automobiles, Sport Utility Vehicles (SUVs), motorcycles, buses, trucks, or various commercial vehicles, and can also include hybrid vehicles, electric vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles, and other alternative-fuel vehicles (e.g., fueled from resources other than petroleum). Within the broader scope of the present disclosure, the term can also include other types of transportation, such as bicycles, rail vehicles (e.g., trains and trams), watercraft (e.g., ships or boats), aircraft, and spacecraft.
As used herein, the term "door" should be considered equivalent to a "gate," "cover," etc., and refers to a movable portion of a vehicle that, if open, provides an opening into a particular compartment, and if closed, closes the opening of a particular compartment, such as the cargo trunk (or trunk for short, below), the interior driver/passenger space, or the engine compartment of the vehicle. In the embodiment shown in fig. 2, the vehicle door is a trunk door provided at the rear of the passenger vehicle, however, it should be noted that alternatively the "vehicle door" could be a powered door leading to the interior driver/passenger space or a lid/door leading to other compartments of the vehicle (e.g., the engine compartment). Such a vehicle door can be a hinged door or a sliding door, but can also be another type of door.
As used herein, a "user" can be a driver or passenger who intends to ride a vehicle, or can be a guest or third party person who does not intend to ride a vehicle.
As used herein, the "preset distance" from the vehicle door can be preset by a technician as needed. For example, in a typical passenger car, a user is considered to have a need to open the door only when the user is within about 1m (meter) of the door. Thus, according to some embodiments, the preset distance is set to a range with a radius of 1.0 meter. Alternatively, however, the preset distance can be set to 0.5m, 1.5m, 2.0m, or any other distance. In the present disclosure, the preset distance can be generally set to 0.1m to 5m, preferably 0.5m to 2m, and more preferably about 1 m.
In both the general control apparatus shown in fig. 1A and the specific embodiment of the vehicle door control apparatus shown in fig. 2, the first sensing device 200 or the object detector 210 can have different embodiments according to specific modes or operation mechanisms. For the sake of brevity, only a description of the object detector 210 in the vehicle door control apparatus shown in fig. 2 is provided, but it should be noted that such description will also be applied without any limitation to the first sensing device 200 in the general control apparatus shown in fig. 1A.
In certain embodiments, the object detector 210 comprises an ultrasonic sensing device capable of periodically emitting an ultrasonic signal. When a user approaches the vehicle door, echo signals reflected by the user are received by the ultrasonic sensing device. The distance of the user from the vehicle door can then be measured or estimated based on a predetermined relationship involving the strength of the echo signal and/or the time at which the echo signal is received. In other embodiments, the object detector 210 similarly comprises a radar sensing device (i.e., a radar sensor), such as a millimeter-wave (mmWave) radar sensor, which can be used for relatively short-range (e.g., about 30 m) or relatively long-range (e.g., about 200 m) object detection as needed.
In some embodiments, the object detector 210 measures or estimates the user's distance from the vehicle door based on images it captures. For example, in one such embodiment, the object detector 210 includes a camera (e.g., a video camera) that periodically photographs the scene near the vehicle door. As the user approaches, the camera takes a series of pictures of the user, and the user's distance from the vehicle door can be measured or estimated from each picture. This can be done, for example, by feature recognition: measuring the size of the user, or of parts of the user (e.g., certain feature points on the head or body), in each picture, and then calculating or estimating the distance based on a predetermined size-distance relationship. Other distance estimation methods are also possible. Further, the series of pictures may optionally capture certain features of the user (such as facial features, gestures, and/or actions), and through feature comparison and deep learning, the identity of the user may further be determined for other purposes, such as authentication.
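The size-distance relationship mentioned above is, in its simplest form, the pinhole-camera relation distance = f * H / h, where f is the focal length in pixels, H the real-world size of the feature, and h its apparent size in the image. A sketch under those assumptions; the default focal length (800 px) and person height (1.7 m) are illustrative calibration values, not figures from the disclosure:

```python
def distance_from_pixel_size(pixel_height: float,
                             real_height_m: float = 1.7,
                             focal_length_px: float = 800.0) -> float:
    """Pinhole-camera size-distance relation: distance = f * H / h.

    pixel_height: apparent height of the user (or a tracked feature) in the
    image, in pixels. Both default parameters are hypothetical and would be
    obtained by camera calibration in practice.
    """
    return focal_length_px * real_height_m / pixel_height
```

The relation is inverse: as the user approaches the door, the apparent size grows and the estimated distance shrinks, which is what the controller compares against the preset distance.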
In some embodiments, the object detector 210 relies on wireless signals captured thereby to measure or estimate the user's distance from the vehicle door, and thus the object detector 210 may optionally include a wireless signal detector configured to detect wireless signals emitted from an on-body device carried by an approaching user. Here, the "wireless signal" can be a Radio Frequency (RF) signal, a WiFi signal, a bluetooth signal, a 4G/LTE signal, a 5G signal, etc., which can be transmitted by a carry-on device, such as a car key, a mobile phone, or a dedicated wireless signal transmitter, which is basically a wireless signal transmitting device. Such wireless signal transmitting devices are typically carried by a vehicle user and can therefore be used to estimate the user's distance as he or she is approaching the vehicle door.
In one such particular embodiment, the object detector 210 includes an RF detector (such as a wireless sensing module sometimes embedded in the vehicle controller 110) configured to sense or detect RF signals emitted from a vehicle key. When a user carrying such a car key is approaching the car door, the wireless sensing module can estimate the distance of the user from the car door based on the strength of the RF signal. Alternatively, the vehicle key may transmit other types of wireless signals (e.g., bluetooth signals), and the corresponding object detector may be configured to receive these types of wireless signals and operate in a similar manner.
In another such embodiment, the object detector 210 comprises a wireless signal detector configured to detect one or more of an RF signal, a Bluetooth signal, a WiFi signal, a 4G/LTE signal, a 5G signal, etc., transmitted from a mobile phone. When a user carrying a mobile phone is approaching a vehicle door, the signal detector can estimate the distance of the user from the vehicle door based on the received wireless signal strength.
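Estimating distance from received signal strength is commonly done with a log-distance path-loss model; the sketch below assumes a hypothetical reference RSSI at 1 m and a free-space path-loss exponent of 2, neither of which is specified by this disclosure:

```python
def rssi_to_distance_m(rssi_dbm, rssi_at_1m_dbm=-50.0, path_loss_exponent=2.0):
    """Log-distance path-loss model: d = 10 ** ((RSSI_1m - RSSI) / (10 * n)).

    Each drop of 10 * n dB in received strength corresponds to a tenfold
    increase in estimated distance.
    """
    return 10.0 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))
```

In practice the exponent and reference RSSI would be calibrated per vehicle and per signal type (RF key fob, Bluetooth, WiFi, etc.), and the raw RSSI is usually smoothed before conversion.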
In other embodiments, the object detector may include other types of sensing devices that may be used to measure or estimate the distance of an approaching user from the vehicle door, such as infrared sensing devices, pyroelectric sensors, ToF (time-of-flight) cameras, VCSEL (vertical cavity surface emitting laser) sensors, and the like.
Here, there can optionally be different ways of providing the object detector 210 and the vehicle controller 110 in the vehicle.
According to some embodiments, the object detector 210 and the vehicle controller 110 are physically separate but communicatively connected. The vehicle controller 110 can determine whether the user is present within a preset distance from the vehicle door according to the object detection signal emitted from the object detector 210.
According to some other embodiments, the object detector 210 may be functionally and/or physically embedded in the vehicle controller 110. For example, the vehicle controller 110 itself may have a function of detecting a user. For example, the wireless sensing module configured in the vehicle controller 110 can be basically used as the object detector 210 for sensing or acquiring information of the user.
Alternatively, the object detector 210 may comprise any combination of the different embodiments described above. In one such embodiment, the object detector 210 of the door control device 001A may include a camera and an RF signal detector (e.g., a wireless sensing module embedded in the vehicle controller 110). The camera is configured to capture an image of an approaching user, and the RF signal detector is configured to detect an RF signal emitted from a car key carried by the user. Such combined use may enable better security, robustness (i.e., if one fails, the other still works), higher effectiveness (e.g., the first method acts as a trigger to activate the second method), or a combined function of distance estimation and user authentication. More specifically, if the only detection is that some object is present within a preset distance from the vehicle door, that object may be a person who does not carry a vehicle key, and there is a risk of falsely opening the vehicle door. Therefore, to ensure better safety, it can be determined simultaneously whether an object is present within a preset distance from the vehicle door and whether an RF signal from a vehicle key is detected. If both are detected, a drive command can be sent to the optical device.
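The simultaneous determination described above amounts to a logical AND of the two detections; a minimal sketch (with a hypothetical 2 m preset distance) might look like:

```python
def should_project(user_distance_m, rf_key_detected, preset_distance_m=2.0):
    """Drive the optical device only when BOTH conditions hold:
    an object is within the preset distance AND the car key's RF signal
    is detected -- reducing the risk of falsely opening the door."""
    return (user_distance_m is not None
            and user_distance_m <= preset_distance_m
            and rf_key_detected)
```

Either detection alone (a nearby pedestrian without the key, or a key signal from a user still far away) leaves the optical device idle.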
Here, in S200 of the vehicle door control method shown in fig. 2, the object detection data obtained by the object detector 210 and transmitted to the vehicle controller 110 can be any data type (such as an electric signal, a data packet, or a binary code, etc.), and can have different embodiments.
According to some embodiments, the object detection data may include initial data (e.g., images, wireless signal strength, echo reflection time, etc.) periodically captured by the object detector 210 while the user is approaching the vehicle door. Thus, such captured initial data is periodically transmitted to the vehicle controller 110, which is able to calculate the distance by a calculation module in the processor, and is also able to determine whether the user has reached within a preset distance from the vehicle door by a determination module in the processor. Thus, in these embodiments, the object detector 210 does not process the object detection data, but rather transmits all of the object detection data to the vehicle controller 110 for processing and determination.
According to some other embodiments, the object detection data may include only the determination result. Here, in addition to periodically capturing distance-related data (e.g., images, wireless signal strength, echo reflection time, etc.), the object detector 210 itself is able to determine whether the captured data meets certain criteria (e.g., the strength of the wireless signal is above a predetermined threshold, or the round-trip time of the echo signal is less than a predetermined threshold), which is substantially equivalent to determining whether an approaching user is within a preset distance from the vehicle door. Thus, the object detection data may be in the form of protocol data. For example, the object detection data may include only a binary code "1" indicating that the user is within the preset distance. Accordingly, as the user approaches the door, the object detector 210 periodically acquires data related to the user, determines whether the approaching user is within the preset distance from the door, and transmits the binary code "1" to the vehicle controller 110 only when it determines that the user is within the preset distance. Upon receiving the binary code "1", the vehicle controller 110 accordingly determines that the user is within the preset distance and can further send a first control command to the optical device 310 to control the optical device to project the preset image onto the ground. Thus, in these embodiments, the object detector 210 fully processes the detected data about the user locally and makes the determination itself, transmitting only the determination result, which substantially constitutes the object detection data, to the vehicle controller 110.
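The detector-side determination and one-bit protocol data can be sketched as follows; the threshold value and the use of RSSI as the captured quantity are illustrative assumptions:

```python
def detection_message(rssi_dbm, rssi_threshold_dbm=-60.0):
    """Compare captured signal strength against a threshold (equivalent to
    the preset-distance test) and emit only the protocol code: the binary
    code "1" when the user is judged within range, otherwise nothing."""
    return "1" if rssi_dbm >= rssi_threshold_dbm else None
```

The controller then needs no distance calculation of its own: receiving "1" directly triggers the first control command to the optical device.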
It should be noted that in addition to the two different embodiments described above, there can be an intermediate type of object detection data, i.e., the object detector 210 partially processes detected data about the user and then transmits such processed data as object detection data to the vehicle controller 110, whereupon the vehicle controller 110 further determines whether the approaching user is within a preset distance from the vehicle door.
In the door control apparatus and method shown in fig. 2, the optical device 310 can be a lighting device installed at a suitable position on the vehicle and directed in a suitable direction toward a preset area of the ground. Here, different embodiments can exist for the optical device 310 and for the preset image it projects onto the ground (S400).
In certain embodiments, the optical device 310 comprises a projector (e.g., a laser projector, an LED projector, etc.). Here, the image projected by the projector can be of any shape, any color, any form, or for any purpose. Alternatively, the image may be a dot, a ring, an arrow of a particular pattern (such as a logo), or include text. Further alternatively, the image may be a monochrome image or a color image. Further alternatively, the image may be a still image, a flashing image, an alternating series of images, a moving image, or a video (e.g., a commercial).
Further alternatively, the image may serve a particular purpose. In one example, the image may include prompt text such as "please slide your foot over or step on the image," which essentially prompts the user to perform a particular motion or action, as prompted, to open the door.
In another example, the image may include a prompt such as "please perform an action as a password to open the trunk door," and the user may perform a previously agreed action (e.g., a counterclockwise movement of the foot) as the password. The action may then be captured by the camera (i.e., the user response detector 410), and the vehicle controller 110 may further determine whether the action performed by the user matches a pre-stored key; if so, the vehicle controller 110 may control the door to open, and if not, the vehicle controller 110 may further control the optical device 310 to project another preset image as a prompt, requesting the user to retry. This action thus essentially serves as a layer of authentication.
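The match-against-a-pre-stored-key step can be sketched as a comparison of recognized motion labels; the label vocabulary ("ccw" for a counterclockwise foot movement) and the command names here are hypothetical:

```python
def evaluate_action(recognized_actions, stored_key):
    """Return the controller's next step: open the door when the performed
    action sequence matches the pre-stored key, otherwise project a retry
    prompt."""
    if list(recognized_actions) == list(stored_key):
        return "open_door"
    return "project_retry_prompt"
```

A real implementation would compare recognized motion features with some tolerance rather than exact labels, but the control flow — match opens the door, mismatch re-prompts — is the same.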
In certain other embodiments, the optical device 310 comprises a spotlight (e.g., a general spotlight, an LED light, etc.) configured to simply project a light beam onto the ground, and the preset image formed thereby may include only the spots formed by the light beam on the ground.
In S500 of the door control method shown in fig. 2, the user exhibits a response after observing the image projected on the ground by the optical device 310, the response being detected by the user response detector 410. Further, in S600, the user response detector 410 transmits the user response data to the vehicle controller 110. Then, based on the user response data, the vehicle controller 110 further determines whether the response from the user U satisfies a predetermined criterion, and if so, controls the door T to open (S700).
In the present disclosure, there can be various embodiments of the user's response to the projected image that can be detected depending on the type of image projected by the optical device 310.
In some embodiments, the image projected onto the ground by the optical device 310 may accordingly include a still image (e.g., a light spot, a light ring, a logo, and/or prompt text such as "please slide your foot over or step on top of the image"). Thus, prompted by the projected image, the user can perform the motion or action the image specifies, such as sliding or stepping his/her foot over the image. The user response detector 410 may then include an obstacle detection device (i.e., an obstacle detector), working together with the vehicle controller 110, configured to detect whether an obstacle is present between the image projected on the ground by the optical device 310 and the door of the vehicle. Here, the presence or absence of an obstacle therebetween essentially constitutes the predetermined criterion for controlling the opening of the door.
In one embodiment, the obstacle detector comprises a camera. When a shadow occurs in the area of the projected image, which is caused by the user's foot sliding over or on the image, the camera (i.e., user response detector 410) can take one or more pictures (essentially user response data) that are then transmitted to vehicle controller 110 for analysis and determination. Here, more specifically, the vehicle controller 110 can detect whether there is any pixel change in the projected image based on the received one or more photographs, thereby determining whether there is an obstacle between the projected image and the door, or more specifically, whether there is an obstacle in the optical path of the projected image.
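Pixel-change detection of this kind is essentially frame differencing over the projected-image region; the thresholds below are illustrative, and the frames are modeled as flat lists of grayscale values:

```python
def obstacle_in_projection(reference_frame, current_frame,
                           pixel_delta=30, min_changed_pixels=50):
    """Flag an obstacle (e.g., a shadow cast by the user's foot) when enough
    pixels in the projected-image region differ from the reference frame."""
    changed = sum(1 for ref, cur in zip(reference_frame, current_frame)
                  if abs(ref - cur) >= pixel_delta)
    return changed >= min_changed_pixels
```

Requiring both a per-pixel delta and a minimum count of changed pixels makes the check robust to sensor noise and small lighting fluctuations.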
In other embodiments, the obstacle detector may comprise an infrared sensing device or a TOF detecting device. When the blocking of the infrared light beam or the fluctuation of the TOF detection signal is detected in the region where the preset image is projected, it is regarded that an obstacle has occurred.
In other embodiments, the obstacle detector may alternatively comprise a radar sensor (e.g. a mmWave radar sensor, a short range radar sensor or an obstacle detection radar) or a contactless capacitive sensor.
In other embodiments, the user's response to the projected image may be a behavior (e.g., an action, a gesture, a facial expression, a voice, etc.) performed or exhibited by the user while the user sees the image projected by the optical device. The user response detector 410 may accordingly comprise a user behavior detection device or a user behavior detector (e.g., an imaging device such as a camera or a voice recording device such as a microphone) capable of detecting a response performed by the user. Then, upon receiving the user response data transmitted from the user response detector 410, the vehicle controller 110 determines whether a predetermined criterion is satisfied, and if so, further controls the door to be opened.
In one example, the projected image may prompt the user to make an action (e.g., move a leg or move a foot counterclockwise, etc.), and a motion detector (e.g., a camera) may be used as the user response detector 410 to detect whether movement of the user's leg is detected or whether the user's action can be recognized (i.e., a predetermined criterion). If so, the vehicle controller 110 can further control the door opening.
In another example, the projected image may prompt the user to direct his/her face toward a particular portion of the vehicle door (e.g., a camera mounted on the trunk door), and the camera may be used as the user response detector 410 to capture the user's face. The vehicle controller 110 then determines whether the user's face is detected, or even whether the user's face can be identified (i.e., a predetermined criterion), and if so, further controls the doors to open.
In yet another example, the projected image may prompt the user to speak or speak a particular word, and a microphone mounted in the vehicle may be used as the user response detector 410 for capturing the user's voice. The vehicle controller 110 then determines whether the user's voice is detected, or even whether the user's voice can be recognized (i.e., a predetermined criterion), and if so, further controls the doors to open.
In any of the above embodiments, the predetermined criteria applied by the vehicle controller 110 may vary according to specific needs. In one example, the predetermined criterion may be a simple one without any feature recognition functionality (i.e., a "yes" versus "no" depending on whether user behavior is detected). That is, upon detecting a user's behavior (such as motion, voice, face, or gesture), the vehicle controller 110 may determine that the predetermined criterion is satisfied. In another example, the predetermined criteria applied by the vehicle controller 110 may be complex criteria involving feature recognition. That is, after the user response detector 410 detects the user's motion, voice, face, gesture, or the like, the vehicle controller 110 further performs feature recognition (e.g., face recognition, motion/gesture recognition, voice recognition, or the like), and determines that the predetermined criterion is satisfied only when the result of the feature recognition indicates that the user's features match a pre-stored record (e.g., the user's identity can be recognized). These latter embodiments thus essentially add a user authentication step before opening the door.
In any of the above embodiments, optionally, the user response data transmitted from the user response detector 410 to the vehicle controller 110 can be periodically transmitted to the vehicle controller. As such, the user response data may include raw data that records the user response captured by the user response detector 410, such as image data captured by a camera or voice data captured by a microphone. The vehicle controller 110 is equipped with a function of analyzing the raw data to determine whether a predetermined criterion is satisfied. Such analysis and determination by the vehicle controller 110 may be simple (e.g., yes or no determination) or complex (e.g., feature recognition).
Alternatively, the user response detector 410 itself may have the ability to analyze and determine based on the raw response data without resorting to the vehicle controller 110. Thus, the user response data transmitted from the user response detector 410 to the vehicle controller 110 may include only the determination results that can be in the form of protocol data. For example, a code "1" means that a predetermined criterion is satisfied, and a code "0" means that the predetermined criterion is not satisfied.
In certain embodiments, when the optical device 310 receives the first control command from the vehicle controller 110, the optical device 310 can also send time stamp information to the vehicle controller 110, the time stamp information recording the time at which the optical device 310 starts projecting the preset image. Upon receiving the time stamp information, the vehicle controller 110 can calculate the operating time of the optical device 310 based on the time stamp information.
If the optical device 310 has been operating for longer than a preset threshold (e.g., about 2 seconds to 30 minutes, preferably 5 seconds to 5 minutes, and more preferably about 10 seconds to 1 minute) while the user response detector 410 has not received a response from the user, the vehicle controller 110 can further send a stop command to the optical device 310, thereby controlling the optical device 310 to stop projecting. A user approaching the door does not necessarily intend to open it; in such a case, after the optical device 310 projects the preset image, no user response may be detected for a certain period of time. By tracking the operating period of the optical device 310, the vehicle controller 110 can send a stop-projection command to the optical device 310. This feature can therefore save power and extend the operational life of the optical device 310.
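The timeout behavior can be sketched as a simple check the controller runs each cycle; the 60-second threshold is one of the optional values mentioned above:

```python
def should_stop_projection(start_time_s, now_s, response_received,
                           timeout_s=60.0):
    """Stop projecting when the optical device has run past the preset
    threshold without any user response being detected."""
    return (not response_received) and (now_s - start_time_s) > timeout_s
```

`start_time_s` corresponds to the time stamp the optical device reports when it begins projecting; a detected user response cancels the timeout regardless of elapsed time.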
There can be various different ways in which the vehicle controller 110 controls the door to open depending on the different mechanisms by which the door is opened.
In some embodiments, the vehicle door is opened by means of a hydraulic lever operatively connected to and driven by the drive motor. Therefore, after the vehicle controller 110 issues the second control command to the drive motor, the drive motor can drive the hydraulic rod to move in a specific direction, thereby opening the vehicle door.
In another embodiment, the door is locked by means of a latch. When the vehicle controller 110 sends a second control command to an actuator (e.g., a motor or electromagnetic device), the actuator can then release the latch, which then spring pushes or pulls the door open.
In certain embodiments, the vehicle controller 110 can be further configured to record the period of time for which the door remains open. If this period is longer than a preset threshold, the vehicle controller can control the door to close. In a particular embodiment in which the vehicle door is opened or closed by means of a hydraulic lever, this can be achieved by the drive motor, which, upon receiving a reverse drive signal from the vehicle controller, drives the hydraulic lever in the opposite direction, thereby closing the vehicle door.
The vehicle door control apparatus and method disclosed herein can be used to conveniently achieve hands-free opening of a vehicle door. The object detector 210 and the vehicle controller 110 can detect and determine whether a user is approaching and present within a preset distance from the vehicle door. If so, the vehicle controller 110 controls the optical device 310 to project a preset image onto the ground. The user response detector 410 is capable of recording the user's response. Based on the user response received from user response detector 410, vehicle controller 110 further determines whether a predetermined criterion is met, and if so, controls the doors to open.
In order to better understand the above vehicle door control apparatus and method, a specific application scenario as shown in fig. 3A and 3B is provided below.
As shown in fig. 3A, when a user U enters the sensing range a of the ultrasonic sensing device 211 (i.e., one type of object detector 210 shown in fig. 2), the optical device 311 is triggered to project a light beam B, forming a projected image C on the ground. After the user observes the projected image on the ground, he or she can perform an action (e.g., slide a foot or step) above or onto the projected image C. After the TOF detecting device (i.e., an obstacle detector, which is a type of user response detector 410 shown in fig. 2 but not shown in fig. 3) detects an action, the vehicle controller (i.e., the vehicle controller 110 shown in fig. 2 but not shown in fig. 3) controls the trunk door T to open.
FIG. 3B further provides a more detailed description of a door control method using the door control apparatus shown in FIG. 3A. As shown in fig. 3B, the object detector 211 (e.g., the ultrasonic sensing device in fig. 3A) continuously and periodically detects whether an object (e.g., a user) is approaching and present within a preset distance from the trunk door T, and transmits object detection data to the vehicle controller 111. If it is determined that the user is present within the preset distance, the vehicle controller 111 transmits a first control command to the optical device 311 to project the preset image onto the ground. An obstacle detector 411 (e.g., the TOF detecting means in fig. 3A) continuously and periodically detects whether an obstacle exists between the projected image and the vehicle door, and transmits obstacle detection data (i.e., one type of the user response data shown in fig. 1A) to the vehicle controller 111. If it is determined that an obstacle exists (for example, the user steps on the projected image as shown in fig. 3A), the vehicle controller 111 transmits a second control command to the trunk door T, thereby opening it. Here, since the vehicle controller 111 can determine in a timely manner that an obstacle exists between the projected image and the door, responsiveness when opening the door is improved.
Hereinafter, a more detailed description will be provided of the controller in the control device 001 as described above and shown in fig. 1A and 1B.
Fig. 4 shows a block diagram of the controller of the control device 001. As shown, the controller includes a receiving module 401, a determination module 402, a control module 403, a transmission module 404, and optionally a feature recognition module 405.
The receiving module 401 is configured to receive detection data from one or more detection devices in the control apparatus 001, including receiving object detection data from the first sensing device 200 and receiving user response data from the second sensing device 400. Optionally, the receiving module 401 comprises a first receiving sub-module 4011 and a second receiving sub-module 4012 configured to receive object detection data from the first sensing arrangement 200 and user response data from the second sensing arrangement 400, respectively.
The determination module 402 is configured to make determinations based on the one or more detection data received from the one or more detection devices, including determining whether a user is present within a preset distance from a target object based on the object detection data, and determining whether a response from the user meets a predetermined criterion based on the user response data. Optionally, the determination module 402 includes a first determination submodule 4021 and a second determination submodule 4022 configured to make the above two determinations based on the object detection data and the user response data, respectively.
The control module 403 is configured to generate a first control command configured to control the optical device to project the preset image if the determination module 402 determines that the user is present within the preset distance from the target object, and to generate a second control command configured to control the target object to perform a corresponding task if the determination module 402 determines that the response from the user meets the predetermined criterion. Optionally, the control module 403 includes a first control sub-module 4031 and a second control sub-module 4032 configured to generate a first control command and a second control command, respectively.
The transmission module 404 is configured to transmit the first control command to the optical device and the second control command to the target object. Optionally, the transmission module 404 includes a first transmission submodule 4041 and a second transmission submodule 4042 configured to transmit the first control command and the second control command, respectively.
Optionally, the controller further comprises a feature recognition module 405 configured to perform feature recognition based on the user response data. Here, the feature recognition module 405 may include any one of voice recognition, motion recognition, face recognition, or the like corresponding to the user response data, which may additionally facilitate the user authentication process.
In some embodiments, to save power and extend the operational life of the optical device, the controller is further configured to effect a calculation of the optical device operating time such that if the optical device operating time is longer than a preset time period (e.g., 1 minute) while no response is received from the user, the controller controls the optical device to stop projecting. Thus, the receiving module 401 is configured to receive the time stamp information from the optical device and thereby record the moment when the optical device starts projection. The controller further comprises a calculation module 406 configured to calculate an operating time of the optical device based on the time stamp information. The determination module 402 is further configured to determine whether the operating time of the optical device is longer than a preset threshold (e.g., 1 minute). The control module 403 is further configured to generate a stop command for controlling the optical apparatus to stop projecting if the operation time of the optical apparatus is longer than a preset threshold and no response is received from the user. The transmission module 404 is further configured to transmit the stop command generated by the control module 403 to the optical apparatus.
In some embodiments, the controller described above can be customized as a vehicle controller for a vehicle door control device as described above, and the detailed description of each customized functional module can refer to the vehicle door control device and method as described above, and will not be repeated here.
As used herein, each of the terms "module," "sub-module," and the like refers to a computer-implemented functional entity that can include both hardware components (e.g., a processor or memory) and software components. The combined operation of certain hardware components and software components allows the prescribed functions corresponding to certain functional modules to be performed in the controller.
Fig. 5 illustrates a block diagram of a controller provided by some embodiments of the present disclosure. As shown, the controller includes a memory 501 and a processor 502. The memory 501 is configured to store a computer program comprising executable instructions that, when executed by the processor, perform one or more steps of a control method (such as a vehicle door control method) as provided by the present disclosure. The processor 502 is configured to execute the computer program stored in the memory 501.
Here, examples of the memory 501 can include a Random Access Memory (RAM) and/or a nonvolatile memory (NVM, e.g., disk memory). Alternatively, the memory 501 can be remote from the processor 502.
The processor 502 can be a general-purpose processor, such as a Central Processing Unit (CPU), a Network Processor (NP), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a programmable logic device (e.g., a Field Programmable Gate Array (FPGA)), a discrete gate or transistor logic device, or discrete hardware components, or the like.
In certain aspects, the present disclosure also provides a non-volatile storage medium readable by a computer. The storage medium is configured to store computer executable instructions which, when executed by the processor, cause the processor to perform the steps of the control method according to any of the embodiments described above.
Here, the machine-readable nonvolatile storage medium can be a portable hard disk (i.e., HDD), a flash drive, a solid state disk (i.e., SSD), an optical disk (e.g., CD or DVD), or a magnetic tape, etc.
It should be noted, however, that the control device and control method as provided herein are not limited to applications for controlling the opening of a vehicle door, but can be applied to many other different application scenarios.
In one such scenario, similar to the one described above, the target object is a door of a building (not a vehicle), and the object can be a person intended to enter the building through the door. By means of the control device, it is achieved that a user can open the door into the building conveniently without the need to use his hands.
It is noted that throughout this disclosure, relational terms such as "first," "second," and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. As used herein, the terms "comprises," "comprising," "includes," "including," and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, steps, acts, operations, and the like.
As used herein, the terms "about," "approximately," and the like refer to a number, level, value, frequency, percentage, dimension, size, degree, weight, or length that varies by as much as 30%, 25%, 20%, 15%, 10%, 9%, 8%, 7%, 6%, 5%, 4%, 3%, 2%, or 1% compared to the reference number, level, value, frequency, percentage, dimension, size, degree, weight, or length. In particular embodiments, the term "about" or "approximately" preceding a value indicates a range of the value plus or minus 15%, 10%, 5%, or 1%.
Throughout this disclosure, all embodiments are described in a related manner. Descriptions of the same or similar parts, such as components, members, steps or processes, of each embodiment may be referenced to each other. Each embodiment is described with emphasis on differences from the other embodiments. In particular, the description is relatively simple for the apparatus, system, vehicle controller and machine-readable storage medium since they are substantially similar to the method, with respect to which reference is made to the description of the method.
All of the embodiments provided and described above should be understood as merely representative of the relatively preferred embodiments of the present invention disclosed in this disclosure and are not intended to limit the scope of the disclosure. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention disclosed herein should be considered as being included in the protection scope of the present invention.

Claims (35)

1. A control device, comprising:
a controller; and
a first sensing device, an optical device, and a second sensing device, each communicatively coupled to the controller;
wherein:
the first sensing device is configured to detect whether a user is near a target object, and if so, to send object detection data of the user to the controller;
the controller is configured to determine whether a user is within a preset distance from the target object based on the object detection data, and if so, to send a first control command to the optical apparatus;
the optical device is configured to project a preset image to a preset area of a surface for presentation to a user upon receiving the first control command;
the second sensing device is configured to detect a user response to the preset image and then transmit user response data to the controller;
the controller is further configured to determine, based on the user response data, whether a response from the user meets a predetermined criterion, and if so, to send a second control command to the target object for performing a task.
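The interaction sequence of claim 1 can be sketched as a short control loop. All names below (`control_cycle`, `meets_criterion`, and the device objects) are hypothetical illustrations, not part of the claimed apparatus; the sketch simply assumes each device exposes a polling interface.

```python
# Minimal sketch of the claim-1 control flow. All class and method names
# are hypothetical illustrations, not part of the claimed apparatus.

PRESET_DISTANCE_M = 0.5  # example value within the claimed 0.01-10 m range

def control_cycle(first_sensor, optical_device, second_sensor, target_object):
    """One pass of the claimed sequence: proximity -> projection -> response -> task."""
    detection = first_sensor.detect()            # object detection data, or None
    if detection is None or detection["distance_m"] > PRESET_DISTANCE_M:
        return False                             # user not within the preset distance
    optical_device.project()                     # first control command: project the preset image
    response = second_sensor.detect_response()   # user response data, or None
    if response is not None and meets_criterion(response):
        target_object.perform_task()             # second control command, e.g. open the door
        return True
    return False

def meets_criterion(response):
    # Placeholder criterion: any detected behavior counts (cf. claim 9).
    return bool(response.get("behavior"))
```

In a vehicle, `perform_task` would correspond to actuating the trunk or power door latch; the predetermined criterion can be replaced by the feature-recognition match of claim 10.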
2. The control apparatus according to claim 1, wherein the target object is a door, and the task is opening the door.
3. The control apparatus according to claim 2, wherein the target object is a vehicle door, and the preset area of the surface is a ground area near the vehicle door.
4. The control device according to claim 3, wherein the vehicle door is at least one of a trunk door or a power door.
5. The control apparatus according to any one of claims 1 to 4, wherein the second sensing device comprises an obstacle detector configured to detect whether an obstacle appears between the target object and an image projected on a surface by the optical device, as a result of a response of a user to the image.
6. The control apparatus of claim 5, wherein the obstacle detector comprises at least one of a camera, a radar sensor, a contactless capacitive sensor, an infrared sensing device, or a TOF detection device.
8. A control apparatus according to claim 6, wherein said obstacle detector comprises a TOF detection device.
8. The control device of any one of claims 1 to 4, wherein the second sensing arrangement comprises a user behavior detector configured to detect a behavior of a user in response to an image, wherein the user behavior detector comprises at least one of:
a camera, wherein the response detected by the camera comprises at least one of a user's action, gesture, or facial expression; or
a microphone, wherein the response detected by the microphone comprises a voice of the user.
9. The control device of claim 8, wherein the predetermined criterion comprises detecting any behavior performed by the user, wherein the behavior comprises at least one of an action, a gesture, a facial expression, or speech.
10. The control device of claim 8, wherein the controller is further configured to perform feature recognition based on the user response data, and the predetermined criteria comprises a substantial match between a result of the feature recognition and a pre-stored record.
11. The control apparatus according to any one of claims 1 to 10, wherein the first sensing device comprises at least one of an ultrasonic sensing device, a radar sensor, a camera, a wireless signal detector, an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, or a VCSEL (vertical cavity surface emitting laser) sensor.
12. The control apparatus of claim 11, wherein the first sensing device comprises a wireless signal detector configured to detect a wireless signal emitted from an on-body device carried by a user, wherein the wireless signal comprises at least one of a Radio Frequency (RF) signal, a WiFi signal, a bluetooth signal, a 4G signal, or a 5G signal.
13. The control apparatus of claim 12, wherein the user-carried on-body device comprises at least one of a car key, a mobile phone, or a wireless signal transmitter.
14. The control device according to any one of claims 1 to 13, wherein the preset distance is about 0.01 to 10 meters, preferably about 0.1 to 5 meters, further preferably about 0.2 to 1 meter.
15. A control apparatus according to any one of claims 1 to 14, wherein the first sensing device is substantially a functional module embedded in the controller.
16. The control apparatus according to any one of claims 1 to 15, wherein:
the optical apparatus is further configured to transmit timestamp information to the controller when the projection of the preset image is started; and
the controller is further configured to calculate an on-time of the optical apparatus based on the timestamp information, determine whether the on-time is longer than a preset threshold, and if so and no response has been received from the user, send a stop command to the optical apparatus to stop the projection.
17. The control device according to claim 16, wherein the preset threshold value is about 2 seconds to 30 minutes, preferably about 5 seconds to 5 minutes, further preferably about 10 seconds to 1 minute.
18. A control method, comprising:
determining whether the user is within a preset distance from the target object;
if so, controlling the optical device to project a preset image to a preset area of the surface for presentation to the user;
detecting a response of a user to the preset image;
determining whether a predetermined criterion is satisfied based on the response; and
if so, controlling the target object to perform the task.
19. The control method according to claim 18, wherein the target object is a door, and the task is opening the door.
20. The control method according to claim 19, wherein the target object is a vehicle door, and the preset area of the surface is a ground area near the vehicle door.
21. The control method of any one of claims 18 to 20, wherein determining whether the user is within a preset distance from the target object comprises:
acquiring object detection data of a user; and
determining whether a user is within a preset distance from the target object based on the object detection data.
22. The control method of claim 21, wherein the acquiring object detection data of the user is performed by means of a first sensing device comprising at least one of an ultrasonic sensing device, a radar sensor, a camera, a wireless signal detector, an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera or a VCSEL (vertical cavity surface emitting laser) sensor.
23. The control method according to claim 22, wherein the first sensing device includes an ultrasonic sensing device, and in determining whether the user is within a preset distance from the target object based on the object detection data, the distance of the user from the target object is estimated by at least one of an intensity of an echo signal received by the ultrasonic sensing device or a time period during which the echo signal is received.
24. The control method of claim 22, wherein the first sensing device comprises a camera, and the distance of the user from the target object is estimated by analyzing an image of the user in determining whether the user is within a preset distance from the target object based on the object detection data.
25. The control method according to claim 22, wherein the first sensing device includes a wireless signal detector configured to detect a wireless signal emitted from an on-body device carried by the user, a distance of the user from the target object being estimated by a strength of the wireless signal detected by the wireless signal detector in determining whether the user is within a preset distance from the target object based on the object detection data.
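For the wireless-signal case of claim 25, distance is commonly estimated from received signal strength with a log-distance path-loss model. The claims do not specify a model; the sketch below assumes a calibrated 1-meter reference RSSI (`tx_power_dbm`) and a path-loss exponent of 2 (free space) — both parameters are illustrative calibration values.

```python
def rssi_distance_m(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Log-distance path-loss estimate (cf. claim 25).

    tx_power_dbm is the RSSI expected at 1 m; path_loss_exp depends on the
    environment (~2 in free space, higher with obstructions).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))
```

With these assumed parameters, an RSSI equal to the 1-meter reference yields 1 m, and each 20 dB of additional loss multiplies the estimated distance by 10.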
26. The control method according to claim 25, wherein the user-carried on-body device includes at least one of a car key, a mobile phone, or a wireless signal transmitter.
27. The control method according to any one of claims 18 to 26, wherein the preset distance is about 0.01 to 10 meters, preferably about 0.1 to 5 meters, and further preferably about 0.2 to 1 meter.
28. The control method of any one of claims 18 to 27, wherein detecting the user's response to the preset image is performed by a second sensing device comprising at least one of:
an obstacle detector configured to detect whether an obstacle appears between the target object and the preset image projected on a surface by the optical device as a result of a response of a user to the preset image; or
a user behavior detector configured to detect a behavior of the user in response to the preset image.
29. The control method of claim 28, wherein the second sensing device comprises an obstacle detector comprising at least one of a camera, a radar sensor, a contactless capacitive sensor, an infrared sensing device, or a TOF detecting device.
30. A control method according to claim 29, wherein the second sensing device comprises a TOF detection device.
31. The control method of claim 28, wherein the second sensing device comprises a user behavior detector comprising at least one of:
a camera, wherein the response detected by the camera comprises at least one of a user's action, gesture, or facial expression; or
a microphone, wherein the response detected by the microphone comprises a voice of the user.
32. The control method of claim 31, wherein the predetermined criterion comprises detecting any behavior performed by the user, wherein the behavior comprises at least one of an action, a gesture, a facial expression, or speech.
33. The control method of claim 32, wherein determining whether a predetermined criterion is satisfied based on the response comprises:
performing feature recognition based on the user response data, wherein the predetermined criteria comprises a substantial match between a result of the feature recognition and a pre-stored record.
34. The control method of any one of claims 18 to 33, further comprising, after the step of controlling the optical device to project the preset image to the preset area of the surface for presentation to the user:
calculating the working time of the optical device; and
if the working time is longer than a preset threshold and no response from a user is detected, controlling the optical device to stop the projection.
35. The control method according to claim 34, wherein the preset threshold value is about 2 seconds to 30 minutes, preferably about 5 seconds to 5 minutes, and further preferably about 10 seconds to 1 minute.
CN202080060963.4A 2019-08-26 2020-08-26 Control apparatus and method Pending CN114616140A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN2019107898460 2019-08-26
CN201910789846.0A CN110525377A (en) 2019-08-26 2019-08-26 Automobile trunk door control method and device
PCT/CN2020/111329 WO2021037052A1 (en) 2019-08-26 2020-08-26 Control apparatus and method

Publications (1)

Publication Number Publication Date
CN114616140A true CN114616140A (en) 2022-06-10

Family

ID=68662819

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201910789846.0A Pending CN110525377A (en) 2019-08-26 2019-08-26 Automobile trunk door control method and device
CN202080060963.4A Pending CN114616140A (en) 2019-08-26 2020-08-26 Control apparatus and method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201910789846.0A Pending CN110525377A (en) 2019-08-26 2019-08-26 Automobile trunk door control method and device

Country Status (3)

Country Link
EP (1) EP4021767A4 (en)
CN (2) CN110525377A (en)
WO (1) WO2021037052A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116181188A (en) * 2022-12-22 2023-05-30 重庆长安汽车股份有限公司 Control method and system for opening vehicle door and vehicle

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110525377A (en) * 2019-08-26 2019-12-03 北京一数科技有限公司 Automobile trunk door control method and device
CN111391783B (en) * 2020-04-28 2024-06-18 一汽奔腾轿车有限公司 Automatic control system for automobile tail door
CN111497737A (en) * 2020-04-28 2020-08-07 一汽奔腾轿车有限公司 Automobile door control device and method
CN111411859A (en) * 2020-04-28 2020-07-14 一汽奔腾轿车有限公司 Automatic control device for automobile door
CN111691786A (en) * 2020-05-11 2020-09-22 富晟(广东)汽车电子有限公司 Tail gate light and shadow assembly control method and device
CN114103871B (en) * 2021-11-03 2024-02-20 长春富晟汽车电子有限公司 Light and shadow one-foot kick interaction control method for vehicle tail door
CN116409247A (en) * 2021-12-29 2023-07-11 博泰车联网(南京)有限公司 Control method and control system for vehicle trunk and vehicle
CN114291034B (en) * 2021-12-31 2023-08-08 佛山市安驾科技有限公司 Skirting control method and control system for electric tail door of automobile
CN115126353A (en) * 2022-05-30 2022-09-30 北京一数科技有限公司 Vehicle door control method, vehicle controller, vehicle door control system, and storage medium
DE102022129015A1 (en) 2022-11-03 2024-05-08 Valeo Schalter Und Sensoren Gmbh METHOD AND DEVICE FOR CONTACTLESS PROVISION OF A FUNCTION IN A MOTOR VEHICLE
DE102022129019A1 (en) 2022-11-03 2024-05-08 Valeo Schalter Und Sensoren Gmbh METHOD AND DEVICE FOR CONTACTLESS PROVISION OF A FUNCTION IN A MOTOR VEHICLE
CN116605176B (en) * 2023-07-20 2023-11-07 江西欧迈斯微电子有限公司 Unlocking and locking control method and device and vehicle

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103998299A (en) * 2011-09-12 2014-08-20 法雷奥安全座舱公司 Method for opening a movable panel of a motor vehicle
DE102014101661A1 (en) * 2014-02-11 2015-08-13 Brose Fahrzeugteile Gmbh & Co. Kommanditgesellschaft, Hallstadt Method for controlling a closure element arrangement of a motor vehicle
CN105335144A (en) * 2014-07-31 2016-02-17 比亚迪股份有限公司 Vehicle trunk automatic opening system and control method therefor
CN105644465A (en) * 2014-09-17 2016-06-08 戴姆勒大中华区投资有限公司 Automatic opening control system for vehicle trunk
CN105781278A (en) * 2016-03-01 2016-07-20 福建省汽车工业集团云度新能源汽车股份有限公司 Car trunk opening control method and system
CN108204187A (en) * 2016-12-19 2018-06-26 大众汽车(中国)投资有限公司 Method and apparatus for unlocking the trunk of a vehicle
CN109505482A (en) * 2018-11-21 2019-03-22 北京长城华冠汽车科技股份有限公司 Control system for automatically opening a vehicle trunk, and vehicle

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4569305B2 (en) * 2005-01-31 2010-10-27 マツダ株式会社 Smart entry system for vehicles
KR101316873B1 (en) * 2012-07-04 2013-10-08 현대자동차주식회사 System and method for operating gate
KR101962728B1 (en) * 2012-12-26 2019-03-27 현대모비스 주식회사 Apparatus for Controlling automobile Trunk and Door
EP2860704B1 (en) * 2013-10-10 2016-04-27 U-Shin France SAS Method for opening a movable panel of the motor vehicle and corresponding opening control device
DE102014101208A1 (en) * 2014-01-31 2015-08-06 Huf Hülsbeck & Fürst Gmbh & Co. Kg mounting module
DE112015001401T5 (en) * 2014-03-26 2017-02-16 Magna Mirrors Of America, Inc. Vehicle function control system using projected characters
EP2930071B1 (en) * 2014-04-10 2018-11-14 U-Shin France Method for opening a movable panel of the motor vehicle and corresponding opening control device
DE102014116171A1 (en) * 2014-11-06 2016-05-12 Valeo Schalter Und Sensoren Gmbh Device with external motion sensor and illuminated marking for a motor vehicle
JP6649036B2 (en) * 2015-10-22 2020-02-19 株式会社ユーシン Door opening and closing device
US10563448B2 (en) * 2015-11-10 2020-02-18 Ford Global Technologies, Llc Approach activated closure entry system for a motor vehicle
JP6634345B2 (en) * 2016-05-31 2020-01-22 株式会社ミツバ Touch sensor unit
WO2019043769A1 (en) * 2017-08-29 2019-03-07 河西工業株式会社 Tailgate opening and closing device
CN107719481B (en) * 2017-09-02 2019-06-07 浙江吉润汽车有限公司 Induction-triggered automobile trunk opening method and device
CN107905676B (en) * 2017-10-10 2019-06-25 吉利汽车研究院(宁波)有限公司 Vehicle trunk automatic opening control system, method, and vehicle
CN109747587A (en) * 2019-03-18 2019-05-14 上海科世达-华阳汽车电器有限公司 Method, apparatus and system for intelligently opening an automobile trunk
CN110525377A (en) * 2019-08-26 2019-12-03 北京一数科技有限公司 Automobile trunk door control method and device


Also Published As

Publication number Publication date
EP4021767A4 (en) 2024-01-24
CN110525377A (en) 2019-12-03
EP4021767A1 (en) 2022-07-06
WO2021037052A1 (en) 2021-03-04

Similar Documents

Publication Publication Date Title
CN114616140A (en) Control apparatus and method
US11225822B2 (en) System and method for opening and closing vehicle door
US10407968B2 (en) System and method for operating vehicle door
US10814866B2 (en) Input signal management for vehicle park-assist
CN107128282B (en) Moving device control of electric vehicle door
US10227813B2 (en) Device and method for opening trunk of vehicle, and recording medium for recording program for executing method
US11518341B2 (en) Method for controlling a locking element of a vehicle
US7477138B2 (en) Function operation warning device
US11267394B2 (en) Projection apparatus for indicating a recommended position to observe a movable body, portable device, and recording medium
US10829978B2 (en) System and method for operating vehicle door
US20190308614A1 (en) Input signal management for vehicle park-assist
US20190255989A1 (en) Turn by turn activation of turn signals
CN105599724B (en) Control device and control method
CN111152232B (en) Service robot and method for operating the same
GB2498833A (en) Ultrasonic gesture recognition for vehicle
US20160039302A1 (en) Method and Apparatus for Operating a Contactless Charging Device for the Disturbance Free Operation of the Keyless Entry System
CN114233120B (en) Hidden door handle control method, hidden door handle control device, hidden door handle control equipment and storage medium
KR102126021B1 (en) Automatic Car Door Opening-and-Closing System Using AVM and Method thereof
US11878654B2 (en) System for sensing a living being proximate to a vehicle
KR102429499B1 (en) Appartus and method for preventing clash slidong door of vehicle
US12090824B2 (en) System for a vehicle with a trailer coupled thereto
US20240294141A1 (en) Vehicle control apparatus, vehicle control method, and storage medium
CN114809833B (en) Control method for opening vehicle door, vehicle door control device and vehicle door control system
US20240294140A1 (en) Vehicle control apparatus, vehicle control method, and storage medium
US11390249B2 (en) Vehicle vision system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination