
CN114516270B - Vehicle control method, apparatus, electronic device, and computer-readable medium - Google Patents

Vehicle control method, apparatus, electronic device, and computer-readable medium

Info

Publication number
CN114516270B
CN114516270B (application CN202210237918.2A)
Authority
CN
China
Prior art keywords
brightness
glasses
vehicle
user
screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210237918.2A
Other languages
Chinese (zh)
Other versions
CN114516270A (en)
Inventor
杨骏涛
陈小莹
倪凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heduo Technology Guangzhou Co ltd
Original Assignee
HoloMatic Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HoloMatic Technology Beijing Co Ltd filed Critical HoloMatic Technology Beijing Co Ltd
Priority to CN202210237918.2A
Publication of CN114516270A
Application granted
Publication of CN114516270B
Legal status: Active
Anticipated expiration


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B60R11/0229Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for displays, e.g. cathodic tubes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/20Optical features of instruments
    • B60K2360/33Illumination features
    • B60K2360/349Adjustment of brightness

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

Embodiments of the present disclosure disclose a vehicle control method, apparatus, electronic device, and computer-readable medium. One embodiment of the method comprises the following steps: controlling an on-board camera of a driving vehicle to capture a user image of the driving user, and determining whether the user shown in the user image wears glasses; in response to determining that the user shown in the user image wears glasses, extracting the area image of the glasses shown in the user image as a glasses area image; and adjusting the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the glasses area image. With this embodiment, the user can see the display and the instrument panel clearly, which improves the safety of the autonomous vehicle.

Description

Vehicle control method, apparatus, electronic device, and computer-readable medium
Technical Field
Embodiments of the present disclosure relate to the field of vehicle control, and in particular, to a vehicle control method, apparatus, electronic device, and computer readable medium.
Background
Currently, the brightness of displays and instrument panels in vehicles is usually adjusted in the following way: the display brightness of the display and the instrument panel is adjusted based on the ambient brightness inside the vehicle.
However, this approach generally has the following technical problems:
first, the display brightness of the display and the instrument panel cannot be adaptively adjusted according to the glasses worn by the user, so the display and the instrument panel end up too bright or too dark, which reduces the safety of driving the vehicle;
second, when the user wears sunglasses, the user cannot see the display and the instrument panel clearly, which reduces the safety of driving the vehicle;
third, the eyeglass power and astigmatism of the user's glasses are not considered, so when the display brightness of the display and the instrument panel is too low or too high, the user cannot see them clearly, which reduces the safety of driving the vehicle.
Disclosure of Invention
This summary is provided to introduce concepts in a simplified form that are further described below in the detailed description. It is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a vehicle control method, apparatus, electronic device, and computer-readable medium to solve one or more of the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a vehicle control method applied to a driving vehicle, the method including: controlling an on-board camera of the driving vehicle to capture a user image of the driving user, and determining whether the user shown in the user image wears glasses; in response to determining that the user shown in the user image wears glasses, extracting the area image of the glasses shown in the user image as a glasses area image; and adjusting the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the glasses area image.
In a second aspect, some embodiments of the present disclosure provide a vehicle control apparatus applied to a driving vehicle, the apparatus comprising: a control unit configured to control the on-board camera of the driving vehicle to capture a user image of the driving user and to determine whether the user shown in the user image wears glasses; an extracting unit configured to extract, in response to determining that the user shown in the user image wears glasses, the area image of the glasses shown in the user image as a glasses area image; and an adjusting unit configured to adjust the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the glasses area image.
In a third aspect, some embodiments of the present disclosure provide an electronic device comprising: one or more processors; a storage device having one or more programs stored thereon; the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect described above.
In a fourth aspect, some embodiments of the present disclosure provide a computer readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first aspect above.
The above embodiments of the present disclosure have the following advantageous effects: with the vehicle control method of some embodiments of the present disclosure, the display brightness of the display and the instrument panel can be adjusted adaptively when the user wears glasses, so that the user can see the display and the instrument panel clearly, which improves the safety of the autonomous vehicle. Specifically, the safety of the autonomous vehicle is reduced because the display brightness of the display and the instrument panel cannot be adjusted adaptively according to the glasses worn by the user, so the display and the instrument panel end up too bright or too dark. Based on this, the vehicle control method of some embodiments of the present disclosure first controls the on-board camera of the driving vehicle to capture a user image of the driving user and determines whether the user shown in the user image wears glasses; then, in response to determining that the user shown in the user image wears glasses, it extracts the area image of the glasses shown in the user image as a glasses area image. This provides data support for adjusting the screen brightness. Finally, the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle is adjusted according to the glasses area image. In this way, the screen brightness of the vehicle-mounted screen and the instrument panel can be adjusted based on the glasses image, so the user can see the display and the instrument panel clearly and the safety of the autonomous vehicle is improved.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a flow chart of some embodiments of a vehicle control method according to the present disclosure;
FIG. 2 is a schematic structural view of some embodiments of a vehicle control device according to the present disclosure;
fig. 3 is a schematic structural diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings. Embodiments of the present disclosure and features of embodiments may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "one", "a plurality" and "a plurality" in this disclosure are intended to be illustrative rather than limiting, and those of ordinary skill in the art will appreciate that "one or more" is intended to be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
FIG. 1 illustrates a flow 100 of some embodiments of a vehicle control method according to the present disclosure. The vehicle control method is applied to a driving vehicle and comprises the following steps:
Step 101, controlling an on-board camera of the driving vehicle to capture a user image of the driving user, and determining whether the user shown in the user image wears glasses.
In some embodiments, the execution subject of the vehicle control method (for example, an on-board controller mounted on the driving vehicle) may control the on-board camera of the driving vehicle to capture a user image of the driving user and determine whether the user shown in the user image wears glasses. Here, the on-board camera may be a camera built into the driving vehicle for photographing the driver, for example a DMS (Driver Monitoring System) camera. In practice, the execution subject may determine whether the eye region of the user shown in the user image is covered by glasses through a neural network model for recognizing glasses (for example, a convolutional neural network model) or through the DMS camera.
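As a concrete illustration of this step, the sketch below checks for glasses in a captured frame with one of OpenCV's stock Haar cascades. This is only a hedged stand-in for the neural network model or DMS-camera recognition described above, not the claimed implementation; the cascade file and detection thresholds are assumptions.

```python
# Hypothetical sketch of step 101: deciding whether the driver in a captured
# frame appears to wear glasses, using OpenCV's bundled eye-with-glasses Haar
# cascade as a stand-in for the recognition model named in the text.
import cv2

def driver_wears_glasses(user_image_bgr) -> bool:
    """Return True if at least one eye-with-glasses region is detected."""
    gray = cv2.cvtColor(user_image_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye_tree_eyeglasses.xml")
    detections = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(detections) > 0
```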
Step 102, in response to determining that the user shown in the user image wears glasses, extracting the area image of the glasses shown in the user image as a glasses area image.
In some embodiments, the execution subject may extract the area image of the glasses shown in the user image as the glasses area image in response to determining that the user shown in the user image wears glasses. In practice, this extraction can be done in various ways, for example through an OpenCV-based partial-image extraction technique or through a neural network model pre-trained to extract glasses images.
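A hedged sketch of the extraction in step 102 follows. It crops the union of the detected eye-with-glasses boxes as the glasses area image; the cascade, padding value, and box-union heuristic are illustrative assumptions, since the text equally allows an OpenCV partial-image extraction technique or a pre-trained neural network.

```python
# Illustrative sketch of step 102: cropping the glasses area image from the
# user image. The padded union of detected boxes is an assumption, not the
# patented extraction method.
import cv2

def extract_glasses_region(user_image_bgr, pad: int = 10):
    """Return the cropped glasses-area image, or None if no glasses are found."""
    gray = cv2.cvtColor(user_image_bgr, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye_tree_eyeglasses.xml")
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(boxes) == 0:
        return None
    h_img, w_img = user_image_bgr.shape[:2]
    x0 = max(min(int(x) for x, y, w, h in boxes) - pad, 0)
    y0 = max(min(int(y) for x, y, w, h in boxes) - pad, 0)
    x1 = min(max(int(x + w) for x, y, w, h in boxes) + pad, w_img)
    y1 = min(max(int(y + h) for x, y, w, h in boxes) + pad, h_img)
    return user_image_bgr[y0:y1, x0:x1]
```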
Step 103, adjusting the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the glasses area image.
In some embodiments, the execution subject may adjust the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the glasses area image. In practice, the execution subject may first recognize, through the DMS (Driver Monitoring System) camera, the color value of the driving user's skin covered by the glasses in the glasses area image as a first color value. Next, it may recognize the color value of the driving user's facial skin outside the glasses area as a second color value. The ratio of the first color value to the second color value may then be determined as the glasses transmittance. Based on the glasses transmittance, a screen brightness corresponding to that transmittance is selected as the target screen brightness from a preset transmittance-based screen brightness adjustment table, and the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle is adjusted to the target screen brightness. Here, the transmittance-based screen brightness adjustment table is a preset table that maps glasses transmittance to screen brightness; it may consist of glasses transmittance intervals and the screen brightness corresponding to each interval. For example, the table may be:
Glasses transmittance interval    Screen brightness (cd/m²)
>= 0.9                            2
>= 0.8 and < 0.9                  3
>= 0.7 and < 0.8                  4
In practice, the screen brightness corresponding to the glasses transmittance interval that contains the computed glasses transmittance may be selected from the transmittance-based screen brightness adjustment table as the target screen brightness. A minimal code sketch of this computation is given below.
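The sketch follows the computation just described: the glasses transmittance is estimated as the ratio of the mean grey value of skin seen through a lens (first color value) to that of uncovered facial skin (second color value), and the target screen brightness is looked up in a table that copies the example intervals above. The region inputs and the interval-matching helper are assumptions made only for illustration.

```python
# Hypothetical sketch of step 103: transmittance from two skin patches, then
# a table lookup. Table rows copy the example screen brightness adjustment
# table in the text (2, 3, 4 cd/m²).
import cv2

# (lower bound inclusive, upper bound exclusive, screen brightness in cd/m²)
TRANSMITTANCE_TO_SCREEN_BRIGHTNESS = [
    (0.9, float("inf"), 2.0),
    (0.8, 0.9, 3.0),
    (0.7, 0.8, 4.0),
]

def estimate_transmittance(lens_skin_bgr, bare_skin_bgr) -> float:
    """Ratio of skin brightness seen through the lens to uncovered skin brightness."""
    first = cv2.cvtColor(lens_skin_bgr, cv2.COLOR_BGR2GRAY).mean()
    second = cv2.cvtColor(bare_skin_bgr, cv2.COLOR_BGR2GRAY).mean()
    return first / second if second > 0 else 1.0

def lookup_interval(table, value):
    """Return the output of the first interval that contains value, else None."""
    for low, high, out in table:
        if low <= value < high:
            return out
    return None

def target_screen_brightness(lens_skin_bgr, bare_skin_bgr):
    transmittance = estimate_transmittance(lens_skin_bgr, bare_skin_bgr)
    return lookup_interval(TRANSMITTANCE_TO_SCREEN_BRIGHTNESS, transmittance)
```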
In some optional implementations of some embodiments, the execution subject may adjust the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the glasses area image through the following steps (a code sketch of this sunglasses branch is given after the fifth step):
In the first step, the type of glasses shown in the glasses area image is identified, and the ambient brightness inside the driving vehicle is identified. In practice, the execution subject may do both in various ways. Here, the glasses type refers to the category of the glasses and may include, but is not limited to, a sunglasses type and a non-sunglasses type. For example, the glasses type shown in the glasses area image may be identified by a neural network model pre-trained to identify glasses types (for example, a fully convolutional network, FCN), and the ambient brightness inside the driving vehicle may be identified by a brightness detector or by the DMS (Driver Monitoring System) camera.
In the second step, in response to the glasses type being the sunglasses type, the transmittance and the wear degree of the glasses shown in the glasses area image are identified. For the detection of the transmittance, see the implementation described in step 103. In practice, to recognize the wear degree, the worn area of the lenses shown in the glasses area image may first be recognized by a neural network model pre-trained to recognize worn lens areas in images (for example, a convolutional neural network model); the ratio of the worn area to the total lens area of the glasses shown in the glasses area image may then be determined as the wear degree.
In the third step, a transmittance-based brightness mapping table corresponding to the ambient brightness is acquired from the on-board database of the driving vehicle. Here, this mapping table is a preset table of brightness values that vary with transmittance within a given ambient-brightness interval; it may consist of transmittance intervals and the brightness corresponding to each interval. For example, when the ambient brightness is 5 cd/m², the transmittance-based brightness mapping table may be:
Transmittance interval    Brightness (cd/m²)
>= 0.9                    2.5
>= 0.8 and < 0.9          3
>= 0.7 and < 0.8          3.5
In the fourth step, the brightness corresponding to the transmittance is selected from the transmittance-based brightness mapping table as the target brightness. In practice, the brightness corresponding to the transmittance interval that contains the transmittance may be selected.
In the fifth step, in response to determining that the wear degree is smaller than a first preset wear degree, the screen brightness of the vehicle-mounted screen and the instrument panel is adjusted to the target brightness. The value of the first preset wear degree is not limited here; for example, it may be 0.9.
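The following sketch strings the five steps above together for the sunglasses branch. The single ambient-brightness bucket (5 cd/m²) and its rows copy the example transmittance-based brightness mapping table; the wear-degree helper follows the area-ratio definition in the second step, and the 0.9 threshold matches the example first preset wear degree. Everything else, including the bucket-keyed dictionary, is an assumption.

```python
# Hedged sketch of the sunglasses branch (steps one to five above).
FIRST_PRESET_WEAR_DEGREE = 0.9  # example value given in the text

# ambient brightness bucket (cd/m²) -> [(t_low, t_high, brightness in cd/m²)]
TRANSMITTANCE_BRIGHTNESS_MAPS = {
    5.0: [(0.9, float("inf"), 2.5), (0.8, 0.9, 3.0), (0.7, 0.8, 3.5)],
}

def wear_degree(worn_area: float, total_lens_area: float) -> float:
    """Ratio of the recognized worn lens area to the total lens area."""
    return worn_area / total_lens_area if total_lens_area > 0 else 0.0

def sunglasses_target_brightness(ambient: float, transmittance: float, wear: float):
    """Return the target brightness, or None if the optional wear branch applies."""
    table = TRANSMITTANCE_BRIGHTNESS_MAPS.get(ambient)
    if table is None:
        return None  # no mapping table stored for this ambient-brightness bucket
    target = next((b for lo, hi, b in table if lo <= transmittance < hi), None)
    if target is not None and wear < FIRST_PRESET_WEAR_DEGREE:
        return target  # step five: set screen and instrument panel to this value
    return None  # wear degree at or above the threshold: see the optional branch below
```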
Optionally, in response to determining that the wear degree is greater than or equal to the first preset wear degree, a brightness increase value mapping table based on the wear degree of the glasses is obtained from the vehicle-mounted database.
In some embodiments, the execution subject may acquire the brightness increase value mapping table based on the glasses wear degree from the on-board database in response to determining that the wear degree is greater than or equal to the first preset wear degree. Here, the brightness increase value mapping table is a table of brightness increase values that vary with the wear degree of the glasses; it may consist of glasses wear degree intervals and the brightness increase value corresponding to each interval. For example, the brightness increase value mapping table may be:
Glasses wear degree interval    Brightness increase value (cd/m²)
>= 0.9                          0.1
>= 0.8 and < 0.9                0.2
>= 0.7 and < 0.8                0.3
Optionally, a brightness increase value corresponding to the wear degree is selected from the brightness increase value mapping table as the target brightness increase value.
In some embodiments, the execution subject may select, from the brightness increase value mapping table, the brightness increase value corresponding to the glasses wear degree interval that contains the wear degree as the target brightness increase value.
Optionally, the sum of the target brightness and the target brightness increase value is determined as the brightness adjustment value.
In some embodiments, the execution subject may determine the sum of the target brightness and the target brightness increase value as the brightness adjustment value.
Optionally, the screen brightness of the vehicle-mounted screen and the instrument panel is adjusted to the brightness adjustment value.
In some embodiments, the execution subject may adjust the screen brightness of the vehicle-mounted screen and the instrument panel to the brightness adjustment value. A short sketch of this optional branch follows.
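The sketch below covers this optional branch under the same assumptions as above: the brightness increase value is looked up in a table copying the example glasses wear degree intervals and added to the target brightness obtained from the sunglasses branch.

```python
# Hedged sketch of the wear-degree branch: brightness adjustment value =
# target brightness + increase value from the wear-degree mapping table.
WEAR_DEGREE_BRIGHTNESS_INCREASE = [
    (0.9, float("inf"), 0.1),
    (0.8, 0.9, 0.2),
    (0.7, 0.8, 0.3),
]

def brightness_adjustment_value(target_brightness: float, wear: float) -> float:
    increase = next(
        (inc for lo, hi, inc in WEAR_DEGREE_BRIGHTNESS_INCREASE if lo <= wear < hi),
        0.0,  # assumption: no increase when the wear degree falls outside the table
    )
    return target_brightness + increase
```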
The content of these optional implementations is an inventive point of the present disclosure and solves the second technical problem mentioned in the background: when the user wears sunglasses, the user cannot see the display and the instrument panel clearly, which reduces the safety of driving the vehicle. If this factor is addressed, the safety of driving the vehicle can be improved. To achieve this effect, the present disclosure first identifies the type of glasses shown in the glasses area image and the ambient brightness inside the driving vehicle, which provides data support for adjusting the screen brightness. Next, in response to the glasses type being the sunglasses type, the transmittance and the wear degree of the glasses shown in the glasses area image are identified, which makes it convenient to adjust the screen brightness according to the transmittance and wear degree of the glasses so that the user can see the display and the instrument panel clearly. Then, the transmittance-based brightness mapping table corresponding to the ambient brightness is acquired from the on-board database of the driving vehicle, which provides data support for determining the adjusted screen brightness, and the brightness corresponding to the transmittance is selected from that table as the target brightness. Finally, in response to determining that the wear degree is smaller than the first preset wear degree, the screen brightness of the vehicle-mounted screen and the instrument panel is adjusted to the target brightness. In this way, when the user wears sunglasses, the screen brightness of the vehicle-mounted screen and the instrument panel can be adjusted according to the transmittance of the sunglasses, so a driver wearing sunglasses can see the display and the instrument panel clearly and the safety of driving the vehicle is improved.
In other optional implementations of some embodiments, the execution subject may adjust the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the glasses area image through the following steps (a code sketch of this branch is given after the eighth step):
In the first step, in response to the glasses type being the non-sunglasses type, the wear degree of the glasses shown in the glasses area image is identified as the target wear degree. The target wear degree may be recognized in the same way as the wear degree described above. Here, the non-sunglasses type may represent, for example, prescription (myopia) glasses.
In the second step, it is determined whether the user shown in the user image is the target user. Here, the target user may refer to the owner of the vehicle (the owner user).
In the third step, in response to determining that the user shown in the user image is the target user, it is determined whether glasses information corresponding to the target user exists in the on-board database. Here, the glasses information may represent information about the non-sunglasses-type glasses worn by the target user.
In the fourth step, in response to determining that glasses information corresponding to the target user exists in the on-board database, the glasses information is acquired from the on-board database, where the glasses information includes the eyeglass power and the astigmatism degree.
In the fifth step, in response to the eyeglass power being greater than or equal to a preset power and the astigmatism degree being greater than or equal to a preset astigmatism degree, a brightness mapping table based on eyeglass power and a brightness increase value mapping table based on astigmatism, both corresponding to the ambient brightness, are acquired from the on-board database. Here, the eyeglass-power-based brightness mapping table corresponding to the ambient brightness is a preset table of brightness values that vary with eyeglass power within a given ambient-brightness interval, consisting of eyeglass power intervals and the brightness corresponding to each interval. The astigmatism-based brightness increase value mapping table is a preset table of brightness increase values that vary with astigmatism within a given ambient-brightness interval, consisting of astigmatism intervals and the brightness increase value corresponding to each interval. For example, the eyeglass-power-based brightness mapping table may be:
Eyeglass power interval    Brightness (cd/m²)
<= 100°                    1
> 100° and <= 200°         1.5
> 200° and <= 300°         2
For example, the astigmatism-based brightness increase value mapping table may be:
Astigmatism interval    Brightness increase value (cd/m²)
<= 50°                  0.5
> 50° and <= 100°       1
> 100° and <= 150°      1.5
In the sixth step, the brightness corresponding to the eyeglass power is selected from the eyeglass-power-based brightness mapping table as the first adjustment brightness. In practice, the brightness corresponding to the eyeglass power interval that contains the eyeglass power may be selected.
In the seventh step, the brightness increase value corresponding to the astigmatism degree is selected from the astigmatism-based brightness increase value mapping table as the first brightness increase value. In practice, the brightness increase value corresponding to the astigmatism interval that contains the astigmatism degree may be selected.
In the eighth step, the sum of the first adjustment brightness and the first brightness increase value is determined as the first brightness adjustment value, and the screen brightness of the vehicle-mounted screen and the instrument panel is adjusted to the first brightness adjustment value.
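The sketch below covers the fifth to eighth steps of this non-sunglasses branch. The two tables copy the example eyeglass-power and astigmatism tables above; the preset power and preset astigmatism thresholds are illustrative assumptions, since the text leaves them unspecified.

```python
# Hedged sketch: first brightness adjustment value for a prescription-glasses
# wearer = power-based brightness + astigmatism-based increase value.
PRESET_POWER = 100.0        # assumed preset eyeglass power, in degrees
PRESET_ASTIGMATISM = 50.0   # assumed preset astigmatism degree, in degrees

# (lower bound exclusive, upper bound inclusive, value), matching the tables above
POWER_BRIGHTNESS = [(0.0, 100.0, 1.0), (100.0, 200.0, 1.5), (200.0, 300.0, 2.0)]
ASTIGMATISM_INCREASE = [(0.0, 50.0, 0.5), (50.0, 100.0, 1.0), (100.0, 150.0, 1.5)]

def first_brightness_adjustment(power: float, astigmatism: float):
    """Return the first brightness adjustment value, or None if another branch applies."""
    if power < PRESET_POWER or astigmatism < PRESET_ASTIGMATISM:
        return None  # handled by the optional branches that follow
    first = next((b for lo, hi, b in POWER_BRIGHTNESS if lo < power <= hi), None)
    inc = next((i for lo, hi, i in ASTIGMATISM_INCREASE if lo < astigmatism <= hi), None)
    if first is None or inc is None:
        return None  # outside the tabulated ranges
    return first + inc
```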
Optionally, in response to the eyeglass power being greater than or equal to the preset power and the astigmatism degree being less than the preset astigmatism degree, the screen brightness of the vehicle-mounted screen and the instrument panel is adjusted to the first adjustment brightness.
In some embodiments, the execution subject may adjust the screen brightness of the vehicle-mounted screen and the instrument panel to the first adjustment brightness in response to the eyeglass power being greater than or equal to the preset power and the astigmatism degree being less than the preset astigmatism degree.
Optionally, in response to the eyeglass power being less than the preset power and the astigmatism degree being greater than or equal to the preset astigmatism degree, a brightness adjustment table based on ambient brightness is acquired from the on-board database.
In some embodiments, the execution subject may acquire the ambient-brightness-based brightness adjustment table from the on-board database in response to the eyeglass power being less than the preset power and the astigmatism degree being greater than or equal to the preset astigmatism degree. Here, the ambient-brightness-based brightness adjustment table is a preset table of brightness values that vary with the ambient brightness, consisting of ambient-brightness intervals and the brightness corresponding to each interval. For example, the ambient-brightness-based brightness adjustment table may be:
Ambient brightness interval (cd/m²)    Brightness (cd/m²)
<= 1                                   1.5
> 1 and <= 2                           2.5
> 2 and <= 3                           3.5
Optionally, the brightness corresponding to the ambient brightness is selected from the ambient-brightness-based brightness adjustment table as the second adjustment brightness.
In some embodiments, the execution subject may select, from the ambient-brightness-based brightness adjustment table, the brightness corresponding to the ambient-brightness interval that contains the ambient brightness as the second adjustment brightness.
Optionally, the sum of the first brightness increase value and the second adjustment brightness is determined as the second brightness adjustment value, and the screen brightness of the vehicle-mounted screen and the instrument panel is adjusted to the second brightness adjustment value.
In some embodiments, the execution subject may determine the sum of the first brightness increase value and the second adjustment brightness as the second brightness adjustment value and adjust the screen brightness of the vehicle-mounted screen and the instrument panel to the second brightness adjustment value. A short sketch of this branch follows.
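A sketch of this second optional branch, with the ambient-brightness rows copied from the example table and the same interval-lookup convention as before; the function simply adds the first brightness increase value to the second adjustment brightness.

```python
# Hedged sketch: second brightness adjustment value for low eyeglass power
# but significant astigmatism.
AMBIENT_BRIGHTNESS_TABLE = [(0.0, 1.0, 1.5), (1.0, 2.0, 2.5), (2.0, 3.0, 3.5)]

def second_brightness_adjustment(ambient: float, first_increase: float):
    """Return the second brightness adjustment value, or None if ambient is out of range."""
    second = next((b for lo, hi, b in AMBIENT_BRIGHTNESS_TABLE if lo < ambient <= hi), None)
    if second is None:
        return None
    return second + first_increase
```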
Optionally, in response to determining that the user shown in the user image is not the target user, a glasses information collection voice instruction is generated, and the sound playing device of the driving vehicle is controlled to play the instruction.
In some embodiments, the execution subject may generate a glasses information collection voice instruction in response to determining that the user shown in the user image is not the target user, and control the sound playing device of the driving vehicle to play it. Here, the glasses information collection voice instruction refers to a voice instruction that asks for the glasses information of the driving user, and the sound playing device refers to a device with a voice playing function, such as a vehicle loudspeaker.
Optionally, in response to receiving target glasses information input by the user within a target time period, the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle is adjusted according to the target glasses information.
In some embodiments, the execution subject may adjust the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the target glasses information in response to receiving the target glasses information input by the user within the target time period. Here, the target time period refers to the length of time to wait for the user to input the target glasses information after the glasses information collection voice instruction is played, and the target glasses information refers to the glasses information of the driving user. The specific way of adjusting the screen brightness according to the target glasses information is as described in step 103 and is not repeated here.
The content of these optional implementations is an inventive point of the present disclosure and solves the third technical problem mentioned in the background: the eyeglass power and astigmatism of the user are not considered, so when the display brightness of the display and the instrument panel is too low or too high, the user cannot see them clearly, which reduces the safety of driving the vehicle. If this factor is addressed, the safety of driving the vehicle can be improved. To achieve this effect, the present disclosure first identifies, in response to the glasses type being the non-sunglasses type, the wear degree of the glasses shown in the glasses area image as the target wear degree, which provides initial data support for adjusting the screen brightness. Next, it is determined whether the user shown in the user image is the target user, so that it can be determined whether the person currently driving the vehicle is a temporary user. Then, in response to determining that the user shown in the user image is the target user, it is determined whether glasses information corresponding to the target user exists in the on-board database, and if so the glasses information, which includes the eyeglass power and the astigmatism degree, is acquired; this makes it convenient to adjust the screen brightness according to the user's glasses data. Then, in response to the eyeglass power being greater than or equal to the preset power and the astigmatism degree being greater than or equal to the preset astigmatism degree, the eyeglass-power-based brightness mapping table and the astigmatism-based brightness increase value mapping table corresponding to the ambient brightness are acquired from the on-board database, which makes it easier to determine the screen brightness that needs to be set. The brightness corresponding to the eyeglass power is then selected from the eyeglass-power-based brightness mapping table as the first adjustment brightness, and the brightness increase value corresponding to the astigmatism degree is selected from the astigmatism-based brightness increase value mapping table as the first brightness increase value, which provides data support for adjusting the screen brightness. Finally, the sum of the first adjustment brightness and the first brightness increase value is determined as the first brightness adjustment value, and the screen brightness of the vehicle-mounted screen and the instrument panel is adjusted to that value. In this way, the display brightness of the display and the instrument panel can be adaptively adjusted according to the eyeglass power and astigmatism of the user, so the user can see the display and the instrument panel clearly and the safety of driving the vehicle is improved.
The above embodiments of the present disclosure have the following advantageous effects: with the vehicle control method of some embodiments of the present disclosure, the display brightness of the display and the instrument panel can be adjusted adaptively when the user wears glasses, so that the user can see the display and the instrument panel clearly, which improves the safety of the autonomous vehicle. Specifically, the safety of the autonomous vehicle is reduced because the display brightness of the display and the instrument panel cannot be adjusted adaptively according to the glasses worn by the user, so the display and the instrument panel end up too bright or too dark. Based on this, the vehicle control method of some embodiments of the present disclosure first controls the on-board camera of the driving vehicle to capture a user image of the driving user and determines whether the user shown in the user image wears glasses; then, in response to determining that the user shown in the user image wears glasses, it extracts the area image of the glasses shown in the user image as a glasses area image. This provides data support for adjusting the screen brightness. Finally, the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle is adjusted according to the glasses area image. In this way, the screen brightness of the vehicle-mounted screen and the instrument panel can be adjusted based on the glasses image, so the user can see the display and the instrument panel clearly and the safety of the autonomous vehicle is improved.
With further reference to fig. 2, as an implementation of the methods shown in the above figures, the present disclosure provides embodiments of a vehicle control apparatus. These apparatus embodiments correspond to the method embodiments shown in fig. 1, and the apparatus is particularly applicable in various electronic devices.
As shown in fig. 2, the vehicle control apparatus 200 of some embodiments includes: a control unit 201, an extraction unit 202, and an adjustment unit 203. Wherein the control unit 201 is configured to control the vehicle-mounted camera of the driving vehicle to shoot a user image of a driving user and determine whether the user displayed by the user image wears glasses; an extracting unit 202 configured to extract, as a glasses area image, an area image of glasses displayed in the user image in response to determining that the user displayed by the user image wears glasses; and an adjusting unit 203 configured to adjust screen brightness of the on-board screen and dashboard of the driving vehicle based on the glasses area image.
It will be appreciated that the units described in the apparatus 200 correspond to the respective steps of the method described with reference to fig. 1. Thus, the operations, features, and benefits described above for the method are equally applicable to the apparatus 200 and the units contained therein and are not repeated here.
Referring now to fig. 3, a schematic diagram of an electronic device 300 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 3 is merely an example and should not impose any limitations on the functionality and scope of use of embodiments of the present disclosure.
As shown in fig. 3, the electronic device 300 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 301 that may perform various suitable actions and processes in accordance with a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage means 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data required for the operation of the electronic apparatus 300 are also stored. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
In general, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera (front/rear of the vehicle), a microphone, an accelerometer, a gyroscope, and the like; an output device 307 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 308 including, for example, magnetic tape, hard disk, etc.; and communication means 309. The communication means 309 may allow the electronic device 300 to communicate with other devices wirelessly or by wire to exchange data. While fig. 3 shows an electronic device 300 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead. Each block shown in fig. 3 may represent one device or a plurality of devices as needed.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via communications device 309, or from storage device 308, or from ROM 302. The above-described functions defined in the methods of some embodiments of the present disclosure are performed when the computer program is executed by the processing means 301.
It should be noted that, in some embodiments of the present disclosure, the computer readable medium may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, the computer-readable signal medium may comprise a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer-readable medium may be included in the above electronic device, or it may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: control the on-board camera of the driving vehicle to capture a user image of the driving user, and determine whether the user shown in the user image wears glasses; in response to determining that the user shown in the user image wears glasses, extract the area image of the glasses shown in the user image as a glasses area image; and adjust the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the glasses area image.
Computer program code for carrying out operations for some embodiments of the present disclosure may be written in one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The described units may also be provided in a processor, for example, described as: a processor includes a control unit, an extraction unit, and an adjustment unit. The names of these units do not constitute a limitation on the unit itself in some cases, and for example, the extracting unit may also be described as "a unit that extracts, as a glasses area image, an area image of glasses displayed in the above-described user image in response to determining that the user displayed the above-described user image wears glasses".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
The foregoing description covers only the preferred embodiments of the present disclosure and explains the technical principles employed. Those skilled in the art will appreciate that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features; it also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example technical solutions formed by replacing the above features with (but not limited to) features having similar functions disclosed in the embodiments of the present disclosure.

Claims (9)

1. A vehicle control method applied to a driving vehicle, comprising:
controlling the vehicle-mounted camera of the driving vehicle to shoot a user image of a driving user, and determining whether the user displayed by the user image wears glasses or not;
extracting an area image of glasses displayed in the user image as a glasses area image in response to determining that the user displayed in the user image wears glasses;
according to the glasses area image, adjusting the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle;
the adjusting the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the glasses area image comprises the following steps:
identifying a type of glasses displayed by the glasses area image, and identifying an ambient brightness within the driving vehicle;
in response to the glasses type being a sunglasses type, identifying the transmittance and the wear degree of the glasses displayed in the glasses area image;
acquiring a transmittance-based brightness mapping table corresponding to the ambient brightness from a vehicle-mounted database of the driving vehicle;
selecting the brightness corresponding to the transmittance from the transmittance-based brightness mapping table as a target brightness;
and adjusting the screen brightness of the vehicle-mounted screen and the instrument panel to the target brightness in response to determining that the wear degree is smaller than a first preset wear degree.
2. The method of claim 1, wherein the method further comprises:
in response to determining that the wear degree is greater than or equal to the first preset wear degree, acquiring a brightness increase value mapping table based on the wear degree of the glasses from the vehicle-mounted database;
selecting a brightness increase value corresponding to the wear degree from the brightness increase value mapping table as a target brightness increase value;
determining the sum of the target brightness and the target brightness increase value as a brightness adjustment value;
and adjusting the screen brightness of the vehicle-mounted screen and the screen brightness of the instrument panel to the brightness adjustment value.
3. The method of claim 1, wherein said adjusting the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the glasses area image comprises:
identifying a wear degree of the glasses displayed in the glasses area image as a target wear degree in response to the glasses type being a non-sunglasses type;
determining whether the user displayed by the user image is a target user;
determining, in response to determining that the user displayed by the user image is the target user, whether glasses information corresponding to the target user exists in the vehicle-mounted database;
in response to determining that the glasses information corresponding to the target user exists in the vehicle-mounted database, acquiring the glasses information, wherein the glasses information comprises an eyeglass power and an astigmatism degree;
in response to the eyeglass power being greater than or equal to a preset power and the astigmatism degree being greater than or equal to a preset astigmatism degree, acquiring, from the vehicle-mounted database, a brightness mapping table based on eyeglass power and a brightness increase value mapping table based on astigmatism corresponding to the ambient brightness;
selecting the brightness corresponding to the eyeglass power from the brightness mapping table based on eyeglass power as a first adjustment brightness;
selecting a brightness increase value corresponding to the astigmatism degree from the brightness increase value mapping table based on astigmatism as a first brightness increase value;
and determining the sum of the first adjustment brightness and the first brightness increase value as a first brightness adjustment value, and adjusting the screen brightness of the vehicle-mounted screen and the instrument panel to the first brightness adjustment value.
4. A method according to claim 3, wherein the method further comprises:
and adjusting the screen brightness of the vehicle-mounted screen and the instrument panel to the first adjustment brightness in response to the eyeglass power being greater than or equal to the preset power and the astigmatism degree being less than the preset astigmatism degree.
5. A method according to claim 3, wherein the method further comprises:
acquiring a brightness adjustment table based on ambient brightness from the vehicle-mounted database in response to the eyeglass power being less than the preset power and the astigmatism degree being greater than or equal to the preset astigmatism degree;
selecting the brightness corresponding to the ambient brightness from the brightness adjustment table based on ambient brightness as a second adjustment brightness;
and determining the sum of the first brightness increase value and the second adjustment brightness as a second brightness adjustment value, and adjusting the screen brightness of the vehicle-mounted screen and the instrument panel to the second brightness adjustment value.
6. A method according to claim 3, wherein the method further comprises:
generating a glasses information collection voice command in response to determining that the user displayed by the user image is not the target user, and controlling a sound playing device of the driving vehicle to play the glasses information collection voice command;
and in response to receiving target glasses information input by a user within a target time period, adjusting the screen brightness of the vehicle-mounted screen and the instrument panel of the driving vehicle according to the target glasses information.
7. A vehicle control apparatus, applied to a driving vehicle, comprising:
a control unit configured to control an in-vehicle camera of the driving vehicle to capture a user image of a driving user, and to determine whether a user displayed by the user image wears glasses;
an extracting unit configured to extract, as a glasses area image, an area image of glasses displayed in the user image in response to determining that the user displayed by the user image wears glasses;
an adjusting unit configured to adjust screen brightness of a vehicle-mounted screen and an instrument panel of the driving vehicle according to the glasses area image, wherein the adjusting unit is further configured to:
identifying a type of glasses displayed by the glasses area image, and identifying an ambient brightness within the driving vehicle;
identifying, in response to the glasses type being a sunglasses type, a transmittance and a wear degree of the glasses displayed in the glasses area image;
acquiring a transmittance-based brightness mapping table corresponding to the ambient brightness from a vehicle-mounted database of the driving vehicle;
selecting, from the transmittance-based brightness mapping table, a brightness corresponding to the transmittance as a target brightness;
and adjusting, in response to determining that the wear degree is less than a first preset wear degree, the screen brightness of the vehicle-mounted screen and the instrument panel to the target brightness.
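For the sunglasses branch of claim 7, the target brightness comes from a transmittance-based table and is applied only while the lenses are not heavily worn. The sketch below is a hypothetical rendering of that rule; the table rows and the wear-degree threshold are assumed values.

```python
# Hypothetical sketch of the sunglasses branch in claim 7; the table rows and
# the wear-degree threshold are assumptions, not values from the patent.

# (upper bound of lens transmittance, target screen brightness in percent)
TRANSMITTANCE_BRIGHTNESS_TABLE = [(0.2, 90), (0.5, 75), (1.0, 60)]
FIRST_PRESET_WEAR_DEGREE = 0.5  # assumed threshold on a 0..1 wear scale


def sunglasses_target_brightness(transmittance, wear_degree):
    """Return the target brightness, or None when the wear degree is at or
    above the first preset wear degree (claim 7 only covers the lower case)."""
    if wear_degree >= FIRST_PRESET_WEAR_DEGREE:
        return None
    for upper_bound, brightness in TRANSMITTANCE_BRIGHTNESS_TABLE:
        if transmittance <= upper_bound:
            return brightness
    return None


print(sunglasses_target_brightness(0.3, 0.2))  # -> 75 with the assumed table
```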
8. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-6.
9. A computer-readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method of any one of claims 1-6.
CN202210237918.2A 2022-03-11 2022-03-11 Vehicle control method, apparatus, electronic device, and computer-readable medium Active CN114516270B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210237918.2A CN114516270B (en) 2022-03-11 2022-03-11 Vehicle control method, apparatus, electronic device, and computer-readable medium

Publications (2)

Publication Number Publication Date
CN114516270A CN114516270A (en) 2022-05-20
CN114516270B true CN114516270B (en) 2023-09-12

Family

ID=81599716

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210237918.2A Active CN114516270B (en) 2022-03-11 2022-03-11 Vehicle control method, apparatus, electronic device, and computer-readable medium

Country Status (1)

Country Link
CN (1) CN114516270B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013127535A (en) * 2011-12-19 2013-06-27 Canon Inc Display device, method and program of controlling the same, and recording medium
CN107640160A (en) * 2016-07-14 2018-01-30 奥迪股份公司 Safe driving of vehicle accessory system and its control method
CN108196366A (en) * 2018-01-03 2018-06-22 京东方科技集团股份有限公司 A kind of method and apparatus of adjusting display brightness
US10633007B1 (en) * 2019-01-31 2020-04-28 StradVision, Inc. Autonomous driving assistance glasses that assist in autonomous driving by recognizing humans' status and driving environment through image analysis based on deep neural network
CN111179880A (en) * 2019-12-26 2020-05-19 恒大新能源汽车科技(广东)有限公司 Brightness adjusting method and device of display screen, electronic equipment and system
CN111273765A (en) * 2018-12-05 2020-06-12 丰田自动车株式会社 Display control device for vehicle, display control method for vehicle, and storage medium
CN212305727U (en) * 2020-04-15 2021-01-05 苏州瑞地测控技术有限公司 Light control system of automobile instrument panel
CN112829584A (en) * 2021-01-12 2021-05-25 浙江吉利控股集团有限公司 Brightness adjusting method and system for vehicle display device
WO2021237425A1 (en) * 2020-05-25 2021-12-02 深圳传音控股股份有限公司 Screen brightness adjustment method, terminal and storage medium

Also Published As

Publication number Publication date
CN114516270A (en) 2022-05-20

Similar Documents

Publication Publication Date Title
CN110148294B (en) Road condition state determining method and device
US9852496B2 (en) Systems and methods for rendering a display to compensate for a viewer's visual impairment
US11120707B2 (en) Cognitive snapshots for visually-impaired users
US9696798B2 (en) Eye gaze direction indicator
US20110305375A1 (en) Device function modification method and system
CN110044638B (en) Method and device for testing lane keeping function and storage medium
CN109144250B (en) Position adjusting method, device, equipment and storage medium
CN111432245B (en) Multimedia information playing control method, device, equipment and storage medium
CN114236834B (en) Screen brightness adjusting method and device of head-mounted display equipment and head-mounted display equipment
CN111027506A (en) Method and device for determining sight direction, electronic equipment and storage medium
CN112630799A (en) Method and apparatus for outputting information
CN107924229A (en) Image processing method and device in a kind of virtual reality device
CN111324202A (en) Interaction method, device, equipment and storage medium
CN107622241A (en) Display methods and device for mobile device
CN114516270B (en) Vehicle control method, apparatus, electronic device, and computer-readable medium
WO2023241214A1 (en) Display method and apparatus, and electronic rearview mirror system
CN112489006A (en) Image processing method, image processing device, storage medium and terminal
CN113495616A (en) Terminal display control method, terminal, and computer-readable storage medium
CN111402154A (en) Image beautifying method and device, electronic equipment and computer readable storage medium
CN115489402A (en) Vehicle cabin adjusting method and device, electronic equipment and readable storage medium
US10893388B2 (en) Map generation device, map generation system, map generation method, and non-transitory storage medium including instructions for map generation
CN113099101B (en) Camera shooting parameter adjusting method and device and electronic equipment
CN113362260A (en) Image optimization method and device, storage medium and electronic equipment
CN112950516A (en) Method and device for enhancing local contrast of image, storage medium and electronic equipment
CN113960788A (en) Image display method, image display device, AR glasses, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address
Address after: 201, 202, 301, No. 56-4 Fenghuang South Road, Huadu District, Guangzhou City, Guangdong Province, 510806
Patentee after: Heduo Technology (Guangzhou) Co.,Ltd.
Address before: 100099 101-15, 3rd floor, building 9, yard 55, zique Road, Haidian District, Beijing
Patentee before: HOLOMATIC TECHNOLOGY (BEIJING) Co.,Ltd.